I recently wrote an article for Scientific American called 'Robots with Heart'. In the piece, I described our work on incorporating an 'empathy module' into robots so that they can better serve the emotional and physical needs of humans. While many readers offered ideas on how we might apply these empathetic robots to medical or other applications, some objected to the very idea of making robots recognize and empathize with human emotions. One reader opined that, as emotions are what make humans human, we really should not build robots with that very human trait and let them take over the care-giving jobs that humans do so well. On the other hand, others are so enthusiastic about this very idea that they ask me, "If robots are intelligent and can feel, will they one day have a conscience?"
Perhaps it is important for us to understand what is meant by robot intelligence and feeling. It is important for us to understand, first of all, how and why humans feel.
What is the role of emotion in the evolution of our species? Research has shown that humans bond with other humans by establishing a rapport. The survival of a species depends on that bonding, and much of this bonding is enabled by emotion. We also signal our intent with emotion.
Our feelings and emotions are triggered by stimuli, either external or internal (such as a memory), and manifest themselves in physical signs: pulse rate, perspiration, facial expressions, gestures, and tone of voice. We might cry or laugh, shudder in disgust, or shrink in defeat. Unlike language, many of these emotions are expressed spontaneously and automatically, without any conscious control. We learn to recognize emotions in other human beings from birth. Babies are soothed by the gentle humming of a lullaby even before they are born. They respond to the smiling face of a parent at birth and are certainly capable of expressing their own emotions from day one.
Industrial robots build our cars and our smartphones. Rehabilitation robots help people walk again. Machine teaching assistants can answer student questions. Software programs can write legal documents, and can even grade your essays. Software systems can write stories for newspapers. An Artificial Intelligence (AI) program recently beat a top human player at Go, one of the most complex board games. IBM Watson beat human champions at Jeopardy. Machines can paint well enough to fool humans into believing the result is the work of a professional artist. Machines can compose music. Robots can obviously be built to be stronger, faster, and smarter than humans in specific areas. But do they need to feel like we do?
In early 2016, our team announced the first known system that can recognize a dozen human emotions from tone of speech in real time. Prior to this work, recognizing emotions from tone of voice incurred a processing delay due to a step called 'feature engineering', a delay that is unnatural in a human-robot communication scenario. To understand how we achieved this, we need to understand machine learning.
Every robot runs on a hardware platform driven by software algorithms. An algorithm is designed by humans to tell the machine how to respond to certain stimuli, for example, or how to answer a question, or how to navigate around a room. Much like an architect building a house, an AI engineer looks at the whole picture of the task the machine is supposed to achieve, and builds software 'blocks' to make it achieve that task. What is called programming is simply the implementation of the code that realizes these blocks. One of the most important blocks is machine learning: algorithms that enable machines to learn and simulate human-like responses, such as making a chess move or answering a question.
What has fueled real breakthroughs in artificial intelligence is machine learning. Instead of being programmed to respond in certain, predictable ways, machines are programmed to learn from large amounts of real-world examples of stimulus-response pairs. If a machine looks at thousands of cat pictures labelled 'cat', it can use any one of many machine learning algorithms to recognize a cat in a picture it has never seen. If a machine looks at vast numbers of web pages and their translations, it can learn to translate, as Google Translate does.
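The learn-from-labelled-examples idea can be shown with a deliberately tiny sketch. The feature values, labels, and the nearest-neighbour rule below are all invented for illustration; they are not part of our actual system:

```python
import math

# Toy labelled examples: each "picture" is reduced to two invented
# feature values (say, ear pointiness and whisker density).
train = [
    ((0.9, 0.8), "cat"),
    ((0.8, 0.9), "cat"),
    ((0.2, 0.1), "dog"),
    ((0.1, 0.3), "dog"),
]

def classify(features):
    """1-nearest-neighbour learning: label an unseen example with the
    label of the closest training example."""
    nearest = min(train, key=lambda ex: math.dist(ex[0], features))
    return nearest[1]
```

Given an unseen point such as `(0.85, 0.75)`, this classifier returns 'cat'. With more data and richer models, the same principle of generalizing from labelled examples scales up to real images and speech.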
A critical part of machine learning is learning a representation of the characteristics, called features, of the physical input. A cat is represented by its contour, edges, and facial and body features. Speech input is represented by the frequency components of the audio. Emotions in speech are represented not just by the pitch, but also by the chroma, tempo, and speed of the voice. Classical machine learning must first perform feature engineering to extract these characteristics. For tone of voice, feature engineering typically extracts 1,000 to 2,500 characteristics from the input audio, and this process slows down the whole emotion recognition pipeline. These thousands of features are carefully designed by humans, and each of them requires processing time.
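To illustrate what hand-engineered audio features look like, here is a minimal sketch computing two classic ones, short-time energy and zero-crossing rate, over frames of a waveform. The frame length and the synthetic test tone are arbitrary choices for this example; a real system extracts thousands of such hand-designed features, which is where the processing delay comes from:

```python
import math

def frame_features(samples, frame_len=160):
    """Hand-engineered features per audio frame: short-time energy and
    zero-crossing rate (a crude proxy for pitch and voicing)."""
    feats = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        energy = sum(s * s for s in frame) / frame_len
        # Count sign changes between neighbouring samples.
        zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / frame_len
        feats.append((energy, zcr))
    return feats

# A 440 Hz tone sampled at 16 kHz: a higher pitch shows up directly
# as a higher zero-crossing rate.
tone = [math.sin(2 * math.pi * 440 * t / 16000) for t in range(1600)]
features = frame_features(tone)
```

Each frame yields one small feature vector; multiply this by hundreds of feature types and the cost of the feature engineering stage becomes clear.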
Recent breakthroughs in neural networks, also known as deep learning, enabled by both faster machines and massive amounts of training data, have led to vast improvements in machine learning. To start with, some deep learning methods, such as convolutional neural networks (CNNs), can automatically learn the characteristics during the learning process, without an explicit, slow feature engineering step or human design. This is perhaps the most important contribution of deep learning to the field of AI.
Coming back to our system for recognizing emotion from tone of voice: what we did was replace feature engineering and classifier learning with a single convolutional neural network, which learns just as well as, if not better than, classical machine learning approaches, and is much faster because it does not require an explicit and slow feature engineering process. Similarly, facial expression recognition can be done in real time with a CNN.
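The core operation that lets a CNN learn its own features is convolution: a small kernel of weights slides along the raw input. A stripped-down sketch follows; the toy kernel and signal here are invented for illustration, and in a real network the kernel weights start random and are learned by gradient descent rather than set by hand:

```python
def conv1d(signal, kernel):
    """Slide one convolutional filter across a 1-D signal; each output
    value is the dot product of the kernel with a window of the input."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """The usual non-linearity applied after a convolutional layer."""
    return [max(0.0, x) for x in xs]

# A hand-set kernel that happens to respond to rising edges; training
# would discover kernels like this (and far subtler ones) from data.
kernel = [-1.0, 0.0, 1.0]
feature_map = relu(conv1d([0, 0, 1, 1, 1, 0, 0], kernel))
```

Stacking many such learned filters, non-linearities, and a final classifier is what lets a single network go from raw audio to an emotion label without a separate feature engineering stage.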
In addition, researchers are working to enable robots to express emotions: changing the pitch of their synthetic voices, or using dozens to hundreds of tiny motors to control synthetic facial muscles. The androids Sophia and Erica are two examples of humanoid robots with facial expressions.
Human-robot bonding and the Fourth Industrial Revolution
The Fourth Industrial Revolution is upon us, and technology seems poised to replace humans in many areas. Skills that took years, maybe decades, to acquire seem to become obsolete overnight. Most of the population was unaware of the pace of progress in AI and robotics prior to the current torrent of publicity, and many extrapolate from what they see today to predict that robots will take over in 30 or 50 years. There is a great deal of anxiety in society over whether and when robots will 'take over' from humans.
The truth is, this kind of prediction has been around for a long time. It surfaced during all previous industrial revolutions, when people feared steam engines or computers would render humans redundant. What has always happened is that people simply learned different skills to manage these machines, and more.
Nevertheless, as AI and robots find more applications, a new kind of relationship between humans and machines needs to evolve. For humans to be less fearful of, and to trust, a walking, talking, gesturing and weight-carrying robot, we need to have mutual empathy with the robot. What sets robots apart from mere electronic appliances is their advanced machine intelligence, and their emotions. To understand the cry of a baby, or the painful groan in a patient's voice, is critical for home care robots. For robots to be truly intelligent, they need to 'have a heart'.
Still, will robots be conscious?
If a robot develops analytical skills, learning ability, communication, and even emotional intelligence, will it have a conscience? Will it be sentient? Can it dream?
The neural networks mentioned above, unlike other machine learning algorithms, remind people of our own brains. Neural networks can even generate random, dream-like images, leading some to believe that robots, too, can dream.
The real question is: do we understand what makes us sentient? Is it just the combination of our sensory perception and the thinking process? Or is there more to it? AI researchers cannot answer this question, but we do believe that to make 'good' robots we have to teach them values: a set of decision-making rules that follow our ethical and moral norms. As robot intelligence expands, teaching values to machines will become as important as teaching them to human children. Our next challenge will be to enable automatic machine learning of such values, once machines have the prerequisite emotion recognition and communication skills.
Pascale Fung, Professor of Electronic and Computer Engineering, Hong Kong University of Science and Technology
The views expressed in this article are those of the author alone and not the World Economic Forum.
Do you remember how Anakin Skywalker was seriously burnt and lost his legs in the third episode of Star Wars, Revenge of the Sith? And do you also remember how robot surgeons did the best they could to save him? In the very near future, similarly amazing robots might come to healthcare to save our lives, too.
Medical robots do not exist only in sci-fi movies and the distant future; they are coming to healthcare, and all stakeholders must prepare for them. Robots can support, assist and extend the services health workers offer. In jobs with repetitive and monotonous functions, they might even come to replace humans completely.
Thus, medical professionals and caretakers would do well to learn more about medical robots: what they are capable of, how to work with them, and in what ways they might complement the tasks performed daily. Otherwise, human medical workers risk being replaced, or growing frustrated when they find that robots can do their jobs and they have not reshaped their own roles into something irreplaceable.
Here are the most exciting medical robot facts:
1) 70% Drop in Hospital Acquired Infections due to Xenex Robot
Statistics from the Centers for Disease Control and Prevention show that in the United States 1 in every 25 patients will contract a hospital-acquired infection (HAI) such as MRSA (methicillin-resistant Staphylococcus aureus) or C. diff (Clostridium difficile), and 1 in 9 of those infected will die.
The Xenex Robot might represent the next level of hygiene. It allows fast, effective, and systematic disinfection of any space within a healthcare facility, destroying the deadly microorganisms that cause HAIs with UV light. Because the Xenex Robot causes more cellular damage to microorganisms than other disinfection devices, it may reduce the number of HAIs more effectively. Westchester Medical Center reported a 70 per cent drop in Intensive Care Unit C. diff infections with the use of Xenex Robots.
2) Two Belgian Hospitals “Hired” Pepper Robots as Receptionists
Pepper, the 1.2-meter-tall humanoid “social robot”, will be “employed” as a receptionist in two Belgian hospitals. It’s a fascinating idea, because let’s be honest: almost everyone has at some point been greeted by a grumpy receptionist during a hospital visit, or got lost on a hospital floor following directions hastily provided by kind but tired nurses at the end of their shift.
Pepper can recognise human speech in 20 languages and can detect whether it is talking to a man, woman or child. These skills enable Pepper to “work” as a receptionist in huge hospitals and to accompany visitors to the correct department so they do not get lost while trying to see their loved ones. “Social robots” such as Pepper or the smaller Nao might also assist in exercise sessions and help children overcome their fears of surgery.
Thus, the monotonous and repetitive job of a receptionist might be taken over by Pepper, whose programmed smile will greet everyone in the same way, while the person behind the reception desk can move on to more creative tasks.
3) By 2020, surgical robotics sales are expected to almost double to $6.4 billion
What would you ask from a good fairy before undergoing an operation? You would ask for a successful procedure and the doctor being in his or her best shape, wouldn’t you? The da Vinci Surgical System does exactly this: it enables the surgeon to operate with enhanced vision, precision and control. Thus it contributes greatly to a successful procedure.
This robotic system features a magnified 3D high-definition vision system and tiny wristed instruments that bend and rotate far more than the human hand can. With the da Vinci Surgical System, surgeons operate through just a few small incisions. The surgeon is 100% in control of the robotic system at all times, and he or she is able to carry out more precise operations than previously thought possible. As recent reports suggest, this industry is about to boom.
4) 750,000 Remote Clinical Encounters Through Intouch Health
Imagine you are at home with your dad and he suddenly develops a severe headache, dizziness, and slurred speech unlike anything he has experienced before. Of course you would immediately call an ambulance. But what if you live in a rural area or frontier town where it takes far too long for the doctor and nurses to arrive?
InTouch Health and its telehealth network could help in such situations. Through this vast network, patients in remote areas have access to high-quality emergency consultations for stroke, cardiovascular, and burn services at the exact moment they need them. Moreover, with telehealth, medical professionals in such towns and rural areas also have access to specialty services, and patients can be treated in their own communities.
Through this network, the “telemedical robot” has already enabled over 750,000 remote clinical encounters that would not otherwise have been possible.
5) TUG Robot Able To Carry Around More Than 400 Kilograms of Medication
The TUG robot is the robust and muscular big brother of Pepper, able to haul racks, carts or bins weighing up to 453 kilograms, loaded with medications, laboratory specimens or other sensitive materials. The TUG is sent or requested using a touch-screen interface and, upon completing its “mission”, returns to the charging dock for a sip of energy while it is loaded for the next job.
And the benefits? These robots work around the clock, so fewer employees are needed for the burdensome night shifts. Staff can spend more time with patients or assist with nursing instead of transporting goods through the hospital. Moreover, nurses do not have to carry heavy loads and can avoid the related injuries.
6) Bear-Shaped Robot Can Lift Patients Out of Bed 40 Times a Day
RIBA, or Robot for Interactive Body Assistance, is somewhat similar to the TUG robot; however, it is used mainly in homes with patients who need care. Its successor, Robear, is shaped like a giant, gentle bear with a cartoonish head. Both can lift and move patients in and out of bed and into a wheelchair, help patients stand, and turn them to prevent bed sores, as often as needed.
These robots not only promise to make up for the shortage of carers, but to save human personnel from having to carry out strenuous tasks, such as lifting patients out of bed 40 times a day.
7) Microbots Smaller Than a Millimeter Deliver Drugs Through the Bloodstream
Remember the 1960s science fiction movie Fantastic Voyage, where a submarine and the people inside it were shrunk to microscopic dimensions and were injected into a person’s bloodstream? Now, reality has come one step closer to this scenario.
Researchers from the Max Planck Institute have been experimenting with exceptionally small robots (smaller than a millimeter) that literally swim through your bodily fluids and could be used to deliver drugs or other medical relief in a highly targeted way. These scallop-like microbots are designed to swim through non-Newtonian fluids, like your bloodstream, around your lymphatic system, or across the slippery goo on the surface of your eyeballs.
8) Veebot Draws Blood In Less Than a Minute
There is hardly any adult in the developed world who has never had blood drawn, and many have serious fears about it. On the one hand, it can be scary that it is carried out with a needle. On the other hand, it sometimes takes a long time and more than one attempt before the nurse or phlebotomist finds an appropriate vein. Veebot, a blood-drawing robot, helps with the latter and speeds up the unpleasant experience.
With Veebot, the whole process takes about a minute, and tests show that it can correctly identify the best vein with approximately 83% accuracy, which is about as good as an experienced human phlebotomist.
9) Cuddly Animal-Shaped PARO Robot Reduces Stress for Patients
It is widely known that pets and cute animals help to ease stress, divert attention from pain and reduce feelings of loneliness. Unfortunately, not every hospital or extended care facility allows animals to live next to patients. AIST, a leading Japanese industrial automation pioneer, offers a solution.
PARO is an advanced interactive robot developed by AIST. Shaped like a baby harp seal and covered with soft artificial fur, it makes people feel as if they are touching a real animal, bringing the documented benefits of animal therapy to patients in medical environments. This therapeutic robot has been found to reduce the stress experienced both by patients and by their caregivers.