Posted on Jun 16, 2018
SFC Marc W.
As tech companies throughout the world toy with artificial intelligence, who should develop the values, morals, or guidelines and what should they include?
Responses: 3
Barry Davidson
True AI is decades to centuries away, if we ever attain it at all. That being said, what is moral? Each person has their own definition, and that would translate into the programming. That would make them hopelessly unstable, in my opinion, especially since they learn on their own and humanity isn't a good example.

Some would say that the Three Laws of Robotics are a good starting place. I'd disagree.
SFC Marc W.
>1 y
That's why I'm bringing this up for discussion; there's so much to it.
LTC Stephan Porter
Two questions then:

1) How would you describe the development of AI as a “moral” issue?

2) The golden rule can mean different things to different people. Can you be more specific?
SPC Joseph Wojcik
Well, the moral thing to do would be to not program AI in the first place.
But I guess if they insist on it, AI should be programmed to respect natural law (think golden rule).
SFC Marc W.
>1 y
I'm very wary of AI, but I think it can be done right and that it can be a big help in many ways.