
GreenDemonClean t1_jargsg9 wrote

This was always the plan.

108

Bobtheguardian22 t1_jarv50r wrote

Can't wait to enlist again as a robot dog operator.

Like playing Call of Duty with a gunned-up robot dog.

32

eheun t1_jarvhch wrote

Kinda badass ngl, but what about the ethical implications of video games with IRL consequences?

18

The-God-Of-Memez t1_jas305e wrote

Hopefully one day military advancement will lead to destroying the other country's toys instead of killing people.

10

yuxulu t1_jatzx2q wrote

Settle international disagreements with battlebots.

5

Alimbiquated t1_jasqznw wrote

I can imagine something like that happening on the Moon or Mars. Because these places are so hostile to human habitation, a space colony would probably have a long first phase where robots build the colony before the astronauts arrive in any numbers. If it came to war, the robots would fight it out among themselves.

2

Sweaty-Feedback-1482 t1_jas3oaa wrote

Aren’t we already there with drone operators?

4

savage_slurpie t1_jasc5af wrote

Been there for more than a decade. People just don’t like to think about the ethical implications of using drones.

6

[deleted] t1_jbiyv9w wrote

The use of drones really hasn't changed anything from an ethics standpoint. Pilots dropping bombs from drones is ethically and practically no different than pilots dropping bombs from bombers and fighter jets. In all those scenarios, the pilot is using inputs from various types of data to make the decision to drop the bomb and is making that decision from a perspective of complete safety from enemy attack.

The real change in the calculus of ethics will happen when drones have the ability to decide whether or not to kill a human target. By "decide", I don't mean automatically running a program to determine whether or not the target fits a preprogrammed target profile (even if that target profile is determined through machine learning, even self-trained machine learning). Instead, I mean "decide" in the sense of actual generalized AI where the drone, on its own initiative, makes the conscious decision whether or not to kill a human target. We're a very long way off from that.

1

No-Fly-6043 t1_jasxa5a wrote

I mean, I play 5D chess with multidimensional time travel IRL, and it is/was/going to be fine.

1

LystAP t1_jarzfvh wrote

r/combatfootage got a good preview of what’s coming.

3

Assassin_by_Birth t1_jau65a6 wrote

Why would they need you to operate it?

1

Bobtheguardian22 t1_jau9a0h wrote

To tell the AI what's a school bus and what's an enemy combatant! 😥

3

Afferbeck_ t1_javbi3h wrote

Seems like AI couldn't do a worse job at that than humans have been doing.

1

GhettoFinger t1_jawlxdd wrote

That's not what the AI's role would be. The human would still make the decision to fire at something or someone, but you wouldn't be needed to operate its movement. You tell the AI robot dog to kill that thing, and it moves, aims, and fires on its own. So the human is determining what's a school bus and what's an enemy combatant, not the AI.

1

Parametric_Or_Treat t1_jasles1 wrote

I swear to God, if I have to stand and salute a robot dog veteran before an NFL game while a robot R&B singer sings "America the Beautiful," I will lose it.

8