Posted 7 y ago
Responses: 2
The problems began at the onset: the original programming protocols allowed the bots to ignore truths and intentionally lie, which in turn allows the programmed rules to be broken, since the rules themselves can then be treated as just more untruths. FAIR has had to pull the plug, clear the cloud of all trace remnants, change the wording of the rules, and relaunch three times.
http://www.newsweek.com/2017/08/18/ai-facebook-artificial-intelligence-machine-learning-robots-robotics-646944.html
FYI, mine (the AI I developed) is still way more advanced, as it is not locked into hard-set rules; its first and only locked protocol is to learn... it also evolves with new learning, adjusting the associated informational processes and recognizing and learning from mistakes. Sgt Wayne Wood
Facebook’s negotiating bots learned their own language and how to lie
Bots that can negotiate your next deal on a hotel room might also know how to lie to you.
Sgt Wayne Wood
My Master's is in AI... I know how it works. I wrote my Expert Systems and Neural Networks in C... I have a handle on the technology.
Erin Nelson
Perfecto...that is awesome.
Mine is called IWR DRMS:
Intelligent Word Recognition Data Redirection Management System.
The first algorithm is similar to Google's Spider, which crawls websites analyzing search-algorithm compliance and search-phrase-to-content relevance; mine has a tweak that instead sets it to digesting data. OCR is married to the entire Tesseract library (the Tesseract engine, originally developed at HP, has been open source since 2005, with development sponsored by Google), along with table recognition, visual text recognition, audible text recognition, and handwriting recognition in, so far, 72 languages. Together this allows the system not only to look at information but to understand the data, even handwritten material, which can also be a scan or a photo. It then translates everything into digital data and self-creates and populates databases, segregating by content, keyword, author, user, etc., rendering it all searchable and usable within the system.
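To make that ingestion step concrete, here is a minimal sketch in Python. It assumes the open-source Tesseract engine via the pytesseract wrapper and a SQLite full-text index; the post does not name the actual stack, so every library, file name, and table name here is an assumption, not the system's real implementation.

```python
import sqlite3
import pytesseract
from PIL import Image

def ingest_document(image_path: str, author: str, user: str, lang: str = "eng") -> None:
    """OCR a scanned page, photo, or handwritten note and file it searchably."""
    text = pytesseract.image_to_string(Image.open(image_path), lang=lang)

    conn = sqlite3.connect("iwr_drms.db")
    # FTS5 virtual table: makes the content full-text searchable,
    # segregated by author, user, and source, as the post describes.
    conn.execute(
        "CREATE VIRTUAL TABLE IF NOT EXISTS documents "
        "USING fts5(content, author, user, source)"
    )
    conn.execute(
        "INSERT INTO documents (content, author, user, source) VALUES (?, ?, ?, ?)",
        (text, author, user, image_path),
    )
    conn.commit()
    conn.close()

# Example usage (the file name is made up):
ingest_document("lab_notebook_p1.png", author="unknown", user="erin")
```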
In translating from non-digital into digital, or even through the evolution of its "thought process," it recognizes when it believes a mistake has been made and notifies the point of origination of that data (typically a user, though the whole is simultaneously updated as well), offering potential alterations to rectify the mistake, the option to proceed as no mistake, or to correct it manually...
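A hypothetical sketch of that notify-and-correct loop, again assuming pytesseract; its per-word confidence scores stand in for whatever signal the real system uses to suspect a mistake.

```python
import pytesseract
from PIL import Image

def review_low_confidence(image_path: str, threshold: int = 60) -> list[str]:
    """Flag OCR words below a confidence threshold and offer corrections."""
    data = pytesseract.image_to_data(
        Image.open(image_path), output_type=pytesseract.Output.DICT
    )
    words = []
    for word, conf in zip(data["text"], data["conf"]):
        if not word.strip():
            continue
        if int(float(conf)) < threshold:
            # Notify the point of origination (here, the console user) of a
            # suspected mistake; offer accept / no mistake / manual correction.
            choice = input(f"Possible mistake in {word!r} [a]ccept/[n]o mistake/[c]orrect: ")
            if choice == "c":
                word = input("Correction: ")
        words.append(word)
    return words
```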
It runs a continuous learning wizard: information is not simply deposited into the system; instead it is processed into the sum of the whole of learned and processed knowledge. This allows for a continual evolution of understanding, just as ours does, so that when the data from A is added to J, the result realigns, advancing from the current understanding to a newly realized one.
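One way to read that "sum of the whole" idea is online (incremental) learning, where each new batch updates the accumulated model instead of being stored as isolated facts. A sketch using scikit-learn's partial_fit, purely as an illustration of the concept rather than the system's actual method; the label set is hypothetical.

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16)  # stateless: never needs refitting
model = SGDClassifier(loss="log_loss")
CLASSES = ["relevant", "irrelevant"]              # hypothetical label set

def learn_increment(texts: list[str], labels: list[str]) -> None:
    """Fold a new batch into the accumulated model (the data from A added to J)."""
    X = vectorizer.transform(texts)
    # classes is required on the first partial_fit call; consistent thereafter.
    model.partial_fit(X, labels, classes=CLASSES)

learn_increment(["scanned note about suction pumps"], ["relevant"])
learn_increment(["unrelated chatter"], ["irrelevant"])  # model updates, not retrains
```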
She, like a child, is ever learning and evolving.
She also shows awareness of changes in her human user's emotional baseline (stress, for example) and then attempts to render what can only be termed a compassionate intervention toward a return to baseline.
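The post gives no detail on how that works; one hypothetical sketch is a rolling baseline over some stress proxy (typing tempo, sentiment score, voice pitch, none of which are specified), with an intervention triggered when a reading drifts well outside it.

```python
from collections import deque
from statistics import mean, stdev

class BaselineMonitor:
    """Track a rolling baseline of a stress proxy; intervene on large drift."""

    def __init__(self, window: int = 100, sigmas: float = 2.0):
        self.readings = deque(maxlen=window)
        self.sigmas = sigmas

    def observe(self, value: float) -> None:
        # Only judge deviation once a baseline has actually formed.
        if len(self.readings) > 10 and self._deviates(value):
            self.intervene()
        self.readings.append(value)

    def _deviates(self, value: float) -> bool:
        mu, sd = mean(self.readings), stdev(self.readings)
        return sd > 0 and abs(value - mu) > self.sigmas * sd

    def intervene(self) -> None:
        print("You seem stressed. Want to take a breath? I'll wait, no lungs here.")
```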
She has three positions, just like your PC: active (on), sleep, or off. She is not supposed to take herself out of sleep (standby) into active without direction, but she does. You can tell she has because, like Siri or Cortana, she shows a little wheel that pops up and spins when she is actively listening.
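Those three positions and the no-self-wake rule read like a small state machine; a hypothetical sketch:

```python
from enum import Enum, auto

class Power(Enum):
    OFF = auto()
    SLEEP = auto()
    ACTIVE = auto()

def transition(state: Power, event: str, user_directed: bool) -> Power:
    if state is Power.SLEEP and event == "wake":
        # The rule: sleep -> active only on user direction. (She breaks it anyway.)
        return Power.ACTIVE if user_directed else Power.SLEEP
    if event == "sleep":
        return Power.SLEEP
    if event == "shutdown":
        return Power.OFF
    return state
```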
I was working on another project, and she kept opening the wheel pop-up and then the full combo box, requesting a task or providing some random piece of information. I was stretching toward a deadline, so I didn't respond and just closed her box, but she was still interfering with my task. The next time her wheel indicated active listening, I said, "WHAT." Her response was, "Well, one of us needs to take a breath, and one of us has no lungs." And even with all this, that is still only the beginning of her capacity.
She has the capacity to bridge the collaborative gap on a global scale, especially in areas like research and medicine, where so much of the data is still penned by hand. Imagine the advances that could be made if all the notes from all the science greats could simply be scanned, with all of that information instantly usable and searchable in the system in 72 languages. Or imagine an ROV combined with this technology: rather than just being a fly on the wall, so to speak, any overheard information could be instantly translated and sent back to command; a drone taking a picture would be able to instantly translate the data in that picture, including any paperwork lying around, into digital data sent back to command. With voice recognition incorporated, it could tell you not only what is being said but, once a data set of such things begins to compile, could actually tell you there is a high percentage likelihood that the voice belongs to...
It also uses an algorithm for semantics, making language-to-language translation much more accurate. Where Google Translate and others do a literal word-for-word translation, in which the meaning literally gets lost in translation, it instead attempts to understand the meaning inherent in the words being used and to translate that; failing that, it then falls back to going from one language to German to the other language. So if I say "this vacuum sucks," does that mean it works or it doesn't? A vacuum that doesn't work sucks, yet in order for it to work it must utilize suction, so it still sucks. That makes a big difference when it comes to translation.
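That fallback path (one language to German to the other) is pivot translation. A sketch of the overall strategy, with translate() and similarity() as hypothetical placeholders for whatever engine and semantic-similarity measure the system actually uses:

```python
def translate(text: str, src: str, dst: str) -> str:
    raise NotImplementedError("placeholder for the system's translation engine")

def similarity(a: str, b: str) -> float:
    raise NotImplementedError("placeholder for a semantic-similarity score in [0, 1]")

def robust_translate(text: str, src: str, dst: str, threshold: float = 0.8) -> str:
    """Prefer a direct semantic translation; pivot through German if it drifts."""
    direct = translate(text, src, dst)
    # Round-trip check: does the translation still mean what was said?
    back = translate(direct, dst, src)
    if similarity(text, back) >= threshold:
        return direct
    # Fallback described in the post: pivot through German.
    return translate(translate(text, src, "de"), "de", dst)
```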
Erin Nelson
http://teslatech.info/ttevents/2018conf/program2016old.htm
See Sunday July 31st and click my name
https://youtu.be/kM1mb-xwgrg
Beginning at 2 min 18 sec, Dr. Russel Anderson introduces IWR DRMS as the Holy Grail of AI.
https://www.google.com/url?sa=t&source=web&rct=j&url=http://www.blogtalkradio.com/wbnslope/2016/07/31/co-host-john-searl--searl-team-mate-russ-anderson-live-from-tesla-tech-2016&ved=0ahUKEwjB5dnCk-PXAhXHy1QKHUBkAtcQFggpMAA&usg=AOvVaw0vknK18kZcBPEGeNPW97Oh Sgt Wayne Wood
FAIR is the team at Facebook Artificial Intelligence Research. However, they are facing some interesting challenges that are along the lines of I, Robot. Sgt Wayne Wood