Future A.I. May Suffer Psychological Issues

According to top neuroscientist Zachary Mainen, it's entirely plausible that future artificial intelligence programs, as they inch closer and closer to human-level intelligence and similar function, may wind up suffering many of the same neurological and psychological glitches that humans do. The human mind depends on a great number of variables, and the slightest thing going wrong with brain chemistry, stress levels, or outside influences can cause migraines, depression, hallucinations, and other ill effects. In much the same way, the self-modifying, ever-growing code bases that neural-network-based AI is built on could develop similar issues.

Mainen says that his ideas mostly come from a field of study called computational psychiatry, in which researchers study AI programs that are set up with certain conditions and made as similar as possible to the human brain. While this is currently done mostly on a testing and theoretical basis, the potential for applications in AI engineering is quite strong. He explains that serotonin is a regulating chemical in the brain, and that depression and hallucinations can result when something goes wrong with it, such as the wrong amount being released or it ending up in the wrong place. Mainen says that something similar could happen in an AI program that uses neurochemistry as a model for operating on a humanlike level.

While this sounds like science fiction at first, it's not entirely unthinkable. Serotonin is a regulator, and a piece of code that serves to rein in or narrow down expectations, or tone down a response, could easily stand in for it. Should such a mechanism be used inappropriately, an AI's perception of a situation could be distorted by its expectations, approach, processing method, or other factors. Dopamine, as another example, is a reward signal in the human brain, and an AI that's being taught with a reward system could easily have a positive stimulus of some sort substitute for dopamine. All of the basic concepts needed to make an AI that operates in a similar fashion to the human brain are already present in the mainstream AI scene, but any such AI would require an incredible amount of training, perhaps even an impractical amount, just to be able to properly regulate its innate processes so that it can begin gathering and processing information.
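To make the analogy above a little more concrete, here's a minimal sketch (not from Mainen's work; all names and numbers are illustrative assumptions) of a simple reward-trained agent. The scalar reward plays the role the article assigns to dopamine, and a `damping` factor plays the serotonin-like regulating role: when the regulator is set wrongly, the agent's expectations stop tracking reality.

```python
import random

class BanditAgent:
    """Toy multi-armed-bandit learner. Reward stands in for dopamine;
    'damping' stands in for serotonin's regulating role (an assumption
    made for illustration, not a claim about the brain)."""

    def __init__(self, n_arms, learning_rate=0.1, damping=1.0, seed=0):
        self.rng = random.Random(seed)
        self.values = [0.0] * n_arms   # learned reward expectations per action
        self.learning_rate = learning_rate
        self.damping = damping         # serotonin-like regulator on updates

    def choose(self, epsilon=0.1):
        # Occasionally explore; otherwise pick the highest expectation.
        if self.rng.random() < epsilon:
            return self.rng.randrange(len(self.values))
        return max(range(len(self.values)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Dopamine-like prediction error, scaled by the regulator before
        # it is allowed to shift the agent's expectations.
        error = reward - self.values[arm]
        self.values[arm] += self.damping * self.learning_rate * error


def train(agent, payouts, steps=500):
    for _ in range(steps):
        arm = agent.choose()
        agent.update(arm, payouts[arm])
    return agent.values

PAYOUTS = [0.2, 0.5, 0.9]                       # arm 2 is objectively best

healthy = train(BanditAgent(3, damping=1.0), PAYOUTS)
flat = train(BanditAgent(3, damping=0.0), PAYOUTS)
```

With the regulator at a sensible value, the agent's expectations converge on the best action; with it zeroed out, the "dysregulated" agent never updates its expectations at all, loosely mirroring the article's point that a misapplied regulating mechanism can skew how the system perceives its situation.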

Copyright ©2019 Android Headlines. All Rights Reserved
Daniel Fuller

Senior Staff Writer