Drudge Retort: The Other Side of the News
Sunday, October 13, 2024

In late September, Shield AI co-founder Brandon Tseng swore that weapons in the U.S. would never be fully autonomous -- meaning an AI algorithm would make the final decision to kill someone. "Congress doesn't want that," the defense tech founder told TechCrunch. "No one wants that."

Comments

Admin's note: Participants in this discussion must follow the site's moderation policy. Profanity will be filtered. Abusive conduct is not allowed.

More from the article...

... But Tseng spoke too soon. Five days later, Anduril co-founder Palmer Luckey expressed an openness to autonomous weapons -- or at least a heavy skepticism of arguments against them. The U.S.'s adversaries "use phrases that sound really good in a sound bite: Well, can't you agree that a robot should never be able to decide who lives and dies?" Luckey said during a talk earlier this month at Pepperdine University. "And my point to them is, where's the moral high ground in a landmine that can't tell the difference between a school bus full of kids and a Russian tank?"

When asked for further comment, Shannon Prior, a spokesperson for Anduril, said that Luckey didn't mean that robots should be programmed to kill people on their own, just that he was concerned about "bad people using bad AI." ...


#1 | Posted by LampLighter at 2024-10-12 05:39 PM | Reply

War Pigs Suck.......

War Pigs Suck.........

WAR PIGS SUCK!!!!!!!

#2 | Posted by Effeteposer at 2024-10-12 06:17 PM | Reply

@#2

Why doesn't your current alias push that thought upstream to your apparent boss in Moscow?

#3 | Posted by LampLighter at 2024-10-12 06:27 PM | Reply | Newsworthy 1

#2

Putin does indeed suck.

#4 | Posted by LegallyYourDead at 2024-10-12 07:49 PM | Reply

#2 | Posted by Effeteposer (with the Kumquat Pol Pot) at 2024-10-12 06:17 PM | Reply | Flag: MAGAts post and re-post 100% BS because they believe that everyone is as gullible and stupid as they are

#5 | Posted by Hans at 2024-10-12 07:54 PM | Reply

Should AI weapons be allowed to opt to kill?

Yes. I foresee no possible problems.

Doggun

#6 | Posted by censored at 2024-10-12 09:33 PM | Reply | Newsworthy 1

That stupid I, Robot movie talked about how machines can't be trusted with this kind of decision.

#7 | Posted by Tor at 2024-10-13 05:04 PM | Reply

The U.S. does not ban companies from making fully autonomous lethal weapons, nor does it explicitly ban them from selling such things to foreign countries. Last year, the U.S. released updated guidelines for AI safety in the military that have been endorsed by many U.S. allies and require top military officials to approve any new autonomous weapon; yet the guidelines are voluntary (Anduril said it is committed to following them), and U.S. officials have continuously said it's "not the right time" to consider any binding ban on autonomous weapons.
Stop funding these monstrosities.

I can imagine the confusion on where the bribes went come Empty Bank Day.

#8 | Posted by redlightrobot at 2024-10-13 05:32 PM | Reply

War takes a lot of innocent people's lives, destroys resources and societal work, and makes the big tech war mongers, political class, and neocons very rich. That's why they hate freedom of speech, and why serious negotiation is rarely done by those who should be negotiating.

#9 | Posted by Robson at 2024-10-13 07:38 PM | Reply

- the big tech war mongers, political class and neocons very rich.

Elon Musk, Peter Thiel, and Leonard Leo on line 1 for you.

#10 | Posted by Corky at 2024-10-13 08:48 PM | Reply | Newsworthy 1

Too late, that genie is out of the bottle: in Ukraine they already have independent drones, and the American military is well known for its hidden DARPA projects.

#11 | Posted by Hughmass at 2024-10-14 07:05 AM | Reply

Ukraine proved it's necessary for drones to kill based on machine-vision AI, with human decision-making impossible due to jamming.

#12 | Posted by sitzkrieg at 2024-10-14 09:47 AM | Reply | Newsworthy 1

Yes. I foresee no possible problems.

"Would you like to play a game of Global Thermonuclear War? "

#13 | Posted by donnerboy at 2024-10-14 01:49 PM | Reply | Funny: 2

Re 11 & 12

THIS is why the Singularity cannot be stopped.

Everyone will find an excuse to allow AI more and more independent control of its own actions.

Humans apparently still don't realize (or don't care) that introducing a powerful alien intelligence into our midst will have incalculable results.

#14 | Posted by donnerboy at 2024-10-14 01:53 PM | Reply | Newsworthy 2

AI needs to be used to answer questions about ethics and morals, not be expected to decide if someone lives or dies.

#15 | Posted by Tor at 2024-10-14 02:42 PM | Reply

And "Silicon Valley" should not be allowed to introduce an advanced alien species into our midst without a vigorous democratic debate and, at a minimum, congressional approval.

#16 | Posted by donnerboy at 2024-10-14 03:28 PM | Reply

#15 | Posted by Tor

Um, that is both an ethical and a moral question.

#17 | Posted by GalaxiePete at 2024-10-14 05:39 PM | Reply

#5 | Posted by Hans

Effet is not a MAGAt. However, he (I assume) is a useful tool.

#18 | Posted by GalaxiePete at 2024-10-14 05:43 PM | Reply

Well, since we are no longer in charge of our country, but instead find ourselves with a political system run by those with the big bucks, what we, the general populace, think is irrelevant. When an enemy fields a tool, say a drone programmed to be a suicide missile that finds a target and kills, then we have to adopt that weapon. And by the way, Ukraine already has them.

#19 | Posted by Hughmass at 2024-10-15 07:40 AM | Reply

#14 AI can't reason. It's not independent. It's a math formula. When was the last time you took linear algebra?

#20 | Posted by sitzkrieg at 2024-10-15 07:56 AM | Reply

#6 | Posted by censored

Soon to be guarding a gated community near you.

#21 | Posted by Whatsleft at 2024-10-15 12:38 PM | Reply

Domestically...HELL NO!

On foreign soil outside of a war zone (like something being policed by multinational forces)...No, I would say.

In an active combat site/war zone, they would almost have to be.

#22 | Posted by earthmuse at 2024-10-15 02:50 PM | Reply

Comments are closed for this entry.
