As someone with autism, I have always been drawn to the logic-oriented characters of science fiction movies, such as Star Wars' C-3PO, Sonny from I, Robot, or Baymax from Big Hero 6. They tend to be extremely knowledgeable about certain subjects, yet struggle to understand human social cues such as body language or sarcasm. That is a struggle for many individuals like myself.
Recent advances in artificial intelligence have been addressing such challenges, and they may lead to robots becoming more human than most science fiction predicts.
A form of artificial intelligence called deep learning is being used to read and interpret human body language and facial expressions — and this software could help individuals with autism become more sociable.
Stanford University is developing such software for Google Glass, a wearable computer that the user wears like a pair of glasses. The software runs interactive applications that train wearers to identify emotions in their environment. According to Annett Hahn Windgassen of the San Francisco area, her son, Erik, became more engaged with his peers in 2016 thanks to this technology.
This could be an effective tool for autistic individuals. In my experience at meetups with fellow autistics, most of us are drawn to video games. Many individuals with autism also have a special interest, a topic they focus on intensely and come to know encyclopedically. A video game that asks the player to identify human emotions would feel right at home to many of us.
This software would also be an ideal fit for robots, since computers essentially think like those with autism to begin with.
At Vanderbilt University, computer science professor Maithilee Kunda and her team have developed software that functions much like the minds of individuals such as autism advocate and animal science professor Temple Grandin, whose thought processes are, according to Kunda, "much on the visual side." The software scans visual quizzes, which it then solves.
Kunda also notes that the relationship between artificial intelligence and autism is symbiotic. Creating these programs will help researchers to better understand the autistic brain, and understanding autism will better inform software built for not only those with autism, but also others with unique modes of learning and perceiving the world.
I am excited about the potential of emotionally intelligent robots in the near future. It would be every science fiction nerd's dream to have their very own C-3PO: someone who can "speak" autistic and would not lose patience with me if I do not understand sarcasm or accidentally use body language that neurotypical humans would find offensive.
On the flip side, robots and autistic humans should not be considered one and the same.
An old adage states that "if you have met one person with autism, you have met one person with autism." What is a challenge for some of us may not be a challenge for other autistic human beings. If I use Google Glass to help me better navigate social situations, that does not mean I am the same as a robot that lacks the proper software. We are not robots with machine learning, and we are not robots without it. Comparing robots to autistic humans risks dehumanizing us, making us seem, to neurotypicals, like nothing more than fleshy computers.
So the question is: will deep learning help people with autism integrate better into the neurotypical community, or will we be seen as meaty robots?