Posted on Mar 3, 2018
Opinion | Human intelligence can’t be transferred to machines
Responses: 4
I do not buy it. Perhaps not in the near future, but never is a long, long time.
Clarke's three laws:
1. When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
2. The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
3. Any sufficiently advanced technology is indistinguishable from magic.
All three apply here.
Categorically untrue. Human brains (all brains) operate on synapses and neurons. Everything we think, everything we feel, everything we opine, everything we 'hunch'... operates on those two things. And those things exist in two states, and two states only: off, or on. You think you love someone? That's specific neurons in your head firing. You multiply 2 by 6? That's specific synapses firing.
At its most basic level, quantifying this is simplicity itself. Orders of magnitude of difficulty are introduced by the sheer number of POSSIBLE states of ALL your synapses and neurons, in terms of on or off. One math professor told me in college that that number is greater than the number of atoms in the known universe (quick check below). OK, fine, go with that. You know what else operates in two states, on or off? Computers.
That'll take a butt-load of computing power to simulate. Probably not more than we have today if, say, the US government turned ALL its collective computing power to it. Which, of course, they won't. Which just means we need advances in computing power. But it will come. My guess is that within 200 years, we'll be able to duplicate a specific individual on silicon. Or whatever medium (probably quantum).
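For what it's worth, that professor's claim is easy to sanity-check under the binary model above. A rough sketch (the neuron count and atom count are ballpark public estimates, not figures from this thread):

```python
# Back-of-the-envelope check: does 2^(number of neurons) exceed the
# ~10^80 atoms in the observable universe? Both figures are rough,
# commonly cited estimates, assumed here for illustration only.
import math

NEURONS = 86_000_000_000       # ~86 billion neurons in a human brain
ATOMS_LOG10 = 80               # observable universe: ~10^80 atoms

# Work in log10 so we never materialize the astronomically large number:
# log10(2^N) = N * log10(2)
states_log10 = NEURONS * math.log10(2)

print(f"on/off states: ~10^{states_log10:,.0f}")   # ~10^(2.6 * 10^10)
print(f"atoms:         ~10^{ATOMS_LOG10}")
print("states exceed atoms?", states_log10 > ATOMS_LOG10)   # True, massively
```

Even counting only one on/off bit per neuron and ignoring synapses entirely, the professor comes out ahead by a very wide margin.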
Sgt Wayne Wood
Neurons do not operate in a binary environment. There are endless variants of 'maybe' due to different thresholds required for a given connection to fire. Since each neuron connects with MANY others, the possibilities are literally limitless (sketch below).
Corollary: the more you know, the more you CAN know.
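A minimal sketch of that threshold point, assuming a classic McCulloch-Pitts-style unit (the inputs, weights, and threshold are made-up illustrative values):

```python
# The spike itself is all-or-none, but WHETHER it happens depends on a
# weighted sum of many graded inputs against a per-neuron threshold --
# a continuous space of "maybe", not a simple 0/1 input.

def fires(inputs, weights, threshold):
    """Threshold unit: fires iff the weighted input sum reaches threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return activation >= threshold

inputs = [0.9, 0.2, 0.7]       # graded synaptic inputs
weights = [0.5, -1.2, 0.8]     # excitatory (+) and inhibitory (-) connections
print(fires(inputs, weights, threshold=0.6))   # 0.77 >= 0.6 -> True
```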
SN Greg Wright
Sgt Wayne Wood - Individual neurons are still either off or on, no matter their particular threshold. I'll stick to my guns on this.
Sgt Wayne Wood
SN Greg Wright that would be applicable in situations where there was a 1:1 correspondence between the base neuron and its downstream correspondents.
Unfortunately, we find that neurons can 'intercept' signals along the axon as well as feed back to themselves to either amplify or retard the signal (rough sketch below)...
Fuck if I know... I used your model when I did all my work simply because I could model it. But it IS incorrect & simplistic. :-)
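A rough sketch of that feedback point, assuming a unit whose previous output feeds back into its next input (the gain, threshold, and input train are illustrative values, not from the thread):

```python
# Self-feedback: the unit's last output is added back into its next input,
# scaled by a gain -- positive gain amplifies (sustains firing), negative
# gain retards (suppresses the next spike).

def run(inputs, gain, threshold=0.5):
    out, history = 0.0, []
    for x in inputs:
        drive = x + gain * out              # external input + self-feedback
        out = 1.0 if drive >= threshold else 0.0
        history.append(out)
    return history

signal = [0.6, 0.4, 0.4, 0.4]               # one strong input, then weak ones
print(run(signal, gain=0.3))    # [1.0, 1.0, 1.0, 1.0]: feedback sustains firing
print(run(signal, gain=-0.3))   # [1.0, 0.0, 0.0, 0.0]: feedback blocks the next
                                # spike, and 0.4 alone can't reach threshold
```

Same external input, opposite behavior, purely from the feedback term: exactly the kind of thing a flat on/off model can't capture.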
Capt Daniel Goodman
PO1 Tony Holland
As soon as the author mentioned “feelings” and “emotions,” his whole premise went down the shitter.
We can already program AI to mimic almost any behavior, and we can also create AI that learns through supervised (or unsupervised) trials (toy example below).
What we lack, thus far, is the hardware with sufficient capacity to make it fully realized.
I’ve written Expert Systems & Neural Networks, and you can get a few basic architectures to perform any number of things... Having said that, WTF did the author mean with the term “general artificial intelligence”?
By definition, expert systems are expert in the domain for which they are created. Artificial Neural Networks only adapt to the environment for which they are created. Unless there is a massive AHA! moment, I don’t see a change in that any time soon.
Having said THAT, who can predict the black swan?
As for emotions, ethics, feelings, morals, etc... you can’t model what you can’t explain.
To quote Haddaway, “What is love? (Baby don’t hurt me)”
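A toy version of “learn through supervised trials”: a bare perceptron trained on the AND function (a made-up example, not anything from the thread). It also illustrates the domain point: the unit ends up “expert” at exactly the mapping it was trained on, nothing more.

```python
# Perceptron learning on AND: each "trial" presents an input pair plus the
# correct answer (the supervision), and the weights nudge toward it.

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                         # 20 passes of supervised trials
    for (x1, x2), target in data:
        out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - out                  # teacher signal: -1, 0, or +1
        w[0] += lr * err * x1               # classic perceptron update rule
        w[1] += lr * err * x2
        b += lr * err

# The trained unit now reproduces AND -- and only AND.
for (x1, x2), target in data:
    out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
    print((x1, x2), "->", out, "(target:", target, ")")
```

Retarget it to OR and you retrain from scratch; ask it for XOR and this single unit can't learn it at all (XOR isn't linearly separable). Nothing “general” carries over, which is the whole point about domain-bound systems.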
Capt Daniel Goodman
Interestingly, it simulates the thalamocortical tracts, and allows simulated visual input using a webcam. When I had it running, it was quite impressive, though there is warning material about processor overheating in using it....
Sgt Wayne Wood
I bet they used MASSIVELY overclocked gaming systems to pull it off. If there’s a CPU thermal event occurring, then you’ll have to go with liquid coolers, or crack the case, build your own heatsinks, and blow cold air directly into the case (route it through a cooler full of dry ice, maybe? But then, condensation issues.)
You need a massively parallel multi-CPU or multi-GPU system to get it working. That raises the issue of bus latency, or worse, attenuation from heat (again).
IBM has/had LANs in a box with optical connections that were nifty, but there are still issues with CPU/GPU heating.
So, the software is pushing the limits of the hardware.
SN Greg Wright
Sgt Wayne Wood - GPUs will always be better for this than CPUs. And what, you don't OC and liquid-cool your PC now?
Sgt Wayne Wood
SN Greg Wright iPhone 6+ is my ‘PC’ now & it neither liquid-cools nor overclocks :-).
Look into the Chinese NPUs... GPUs (like CPUs) are still sequential, even with pre-fetch (which is the current security issue).