Posted on Nov 13, 2013
MAJ Bryan Zeski
Artificial Intelligence technology has come a long way in the last decade or so. Truth be told, and in spite of our apprehensions, combat robots would be safer for civilians on the battlefield, more precise, and less costly than sending thousands of troops to hostile areas. The future of warfare is in AI, but how far is too far for automated combat?
Posted in these groups: Warfare, Future, Combat, Technology
Responses: 23
SGT Thomas Sullivan
All you have to look toward is the Three Laws of Robotics set forth by Isaac Asimov.
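To make that hierarchy concrete, here is a minimal sketch in Python of the Three Laws as a priority-ordered check. The Action fields and predicates are hypothetical placeholders for illustration, not any real targeting or control system:

# Minimal sketch: Asimov's Three Laws as a priority-ordered veto list.
# The fields below are hypothetical, not a real control API.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool          # would executing this injure a human?
    allows_human_harm: bool    # would it, through inaction, let a human come to harm?
    ordered_by_human: bool     # did a human operator order it?
    endangers_robot: bool      # does it put the robot itself at risk?

def permitted(action: Action) -> bool:
    # First Law: never injure a human or, through inaction, allow one to come to harm.
    if action.harms_human or action.allows_human_harm:
        return False
    # Second Law: obey human orders (in this sketch, only human-ordered actions pass),
    # except where that would conflict with the First Law.
    if not action.ordered_by_human:
        return False
    # Third Law: self-preservation matters, but self-risk alone is not a veto
    # once the First and Second Laws are satisfied.
    return True

print(permitted(Action(harms_human=False, allows_human_harm=False,
                       ordered_by_human=True, endangers_robot=True)))  # True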
PFC Thomas Graves
12 y
When Reagan brought up the idea of a missile defense shield, people I knew who were left-wing would say that it wasn't 100%, so we shouldn't do it. Their argument was emotional and had no basis in a factual understanding of warfare or history. To win a conflict, you need to end the battle with your moral cohesion intact and enough resources to keep fighting, while having eliminated both in your opponent. Nobody goes into conflict unless they think they can win, but it is the uncertainty of victory that restrains aggression. With robotics, there will be layers of measures and countermeasures that will necessitate the use of both humans and robots on the battlefield, but the exact mix and methods will only become clear as the technology develops, and they will also depend on the perceptions and beliefs of military commanders. Napoleonic tactics were still being used during WWI, causing great slaughter, when the American Civil War should have been the wake-up call that a new approach was needed. The side that adapts and utilizes the technology most efficiently will be the victor in such encounters.
Cpl Benjamin Long
12 y
PFC Graves, you should be leery of people who say that things are 100% anything. Absolute values are unattainable according to entropy laws and statistical methodology. There is no way that anything can have a 100 percent chance of happening, because there is always an infinite number of possibilities. You can never create security, or any machine, with a 0% margin of error or a 100% success rate. It is kind of like dividing by zero... it can't be done...
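To put a rough number on that, here is a minimal sketch in Python using a made-up 99.9% per-engagement success rate: even at that reliability, the chance of a flawless run over many engagements falls well short of 100%.

# Minimal sketch: a highly reliable system still never reaches 100% overall.
# The reliability figure and engagement count are made-up numbers for illustration.
p_success = 0.999          # hypothetical per-engagement success rate
engagements = 1000

p_all_succeed = p_success ** engagements
p_at_least_one_failure = 1 - p_all_succeed

print(f"Chance every engagement succeeds: {p_all_succeed:.3f}")          # ~0.368
print(f"Chance of at least one failure:   {p_at_least_one_failure:.3f}")  # ~0.632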
SPC Chris Stiles
12 y
You're right, nothing is 100%, but that doesn't mean we should throw something out the window on the basis that it isn't 100% guaranteed to work. The argument supports itself: since nothing can be 100%, we should accept that and work toward the best percentage we can attain.
Cpl Benjamin Long
12 y
Well, as an engineer, I know that if I make a machine with exact tolerances, it will lock the assembly up because the clearances are too tight... there has to be a margin of error.
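A minimal sketch in Python of that fit check, with made-up dimensions in millimetres: with no designed-in clearance, the worst-case stack-up goes negative and the parts bind.

# Minimal sketch: tolerance stack-up on a shaft-in-bore fit (example values only).
shaft_nominal, shaft_tol = 20.000, 0.010   # shaft may be as large as 20.010 mm
bore_nominal,  bore_tol  = 20.000, 0.010   # bore may be as small as 19.990 mm

worst_case = (bore_nominal - bore_tol) - (shaft_nominal + shaft_tol)
print(f"Worst-case clearance: {worst_case:.3f} mm")   # -0.020 mm -> interference, assembly binds

# Designing in a deliberate margin (a clearance fit) keeps the worst case positive:
bore_nominal = 20.040
worst_case = (bore_nominal - bore_tol) - (shaft_nominal + shaft_tol)
print(f"With designed-in margin: {worst_case:.3f} mm")  # +0.020 mm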
SPC Sheila Lewis
What about traditional Service Members? I do not believe a robot would make a good "battlebuddy."
MAJ Bryan Zeski
>1 y
SPC Sheila Lewis Why do you think a robot wouldn't make a good battle buddy? It never gets tired. It never gets angry. It has no care for its own well-being.
Capt Richard I P.
If a capability exists, it will be used. That capability will exist in the future. It will be used.
Cpl Dennis F.
I had to check to see if SkyNet voted you down!
CSM Director, Market Development
It would only be deemed "too far" until someone else develops it before we do.
SGT Steve Oakes
We CAN NOT put weapons in the hands of robots! We are flawed, imperfect beings. How long before someone hacks the system and turns them against us? Or before the robots wonder why they are working for us, and turn on us? NO! NO! NO!!!
Power armor suits, remote-controlled humanoid battle drones, sentry guns: all good. Autonomous mechanical people that are stronger, faster, and harder to kill than us? NO! NO!! NO!!!
PFC Eric Minchey
http://defensetech.org/2014/03/25/google-rejects-military-funding-in-robotics/