Responses: 3
True AI is decades to centuries away, if we ever attain it at all. That said, what is moral? Each person has their own definition, and that would carry over into the programming. That would make them hopelessly unstable, in my opinion — especially since they learn on their own, and humanity isn't a good example.
Some would say that the 3 laws of robotics is a good starting place. I'd disagree.
Two questions then:
1) How would you describe the development of AI as a "moral" issue?
2) The golden rule can mean different things to different people. Can you be more specific?
Well, the moral thing to do would be not to program AI in the first place.
But, I guess if they insist on it, AI should be programmed to respect natural law (think golden rule).