Responses: 4
Capt Daniel Goodman
I can guess....
TSgt David L.
I don't know what DARPA or private firms have in the works, but all of the big robotic systems we now employ are manned, and likely will be for the foreseeable future. I know they are working on swarm-type, ground-based systems that can patrol or search, but those (as far as we know) are not armed. If we can't launch weapons from a drone without a lawyer OKing it, then I doubt we'll have killer robots without a man in the loop anytime soon.
SGT Combat Engineer
Edited 6 y ago
A vast number of people, especially in media and academia (apparently), understand neither the current state of what is optimistically called "AI" nor the nature of human conflict (or human nature).
1. There is no AI along the lines of what is shown in science fiction. It doesn't exist yet, and it isn't going to exist anytime soon.
2. Just as with nukes, it is possible to build autonomous systems with lethal capabilities. So nations (and non-state actors as well) will build lethal autonomous systems if they see an advantage in doing so. Period. You can't wish it away. This ain't a Disney movie. Welcome to reality; I don't know any other way to say it.
So, do we (the United States) want to have autonomous systems with lethal capabilities in the inventory of technology and equipment available to our armed forces? That's the general question, but it really should be a more specific question, posed at a lower level:

- Do we want to use an autonomous system with lethal capabilities for a particular use case X (in other words, for this situation, for this particular purpose, for this particular type of mission, etc.)?

Now, we already know about both Law of Armed Conflict issues and about the operational effects of doing things in armed conflict that are perceived especially badly (effects in the categories of information and politics). That's not news. Academics might think it's news, but I never held any serious rank and I'm aware of these issues. So, it might seem to Georgia Tech or Popular Mechanics that not shooting at vehicles displaying a Geneva Conventions symbol is something the military hasn't thought about, but I think the military is way ahead of them; they've heard of this before and have given it some thought, thanks. And to the peaceniks at Google: Dunning-Kruger much?

Some ideas will be good ideas; others, maybe not so much.