
EMOTIONAL ROBOTS

Robots are real. For much of society, this idea carries severe, negative connotations. When many people think of robots, they likely picture a Terminator-like being: a product of the singularity bent on razing humanity.


Developing emotional intelligence in robots is a difficult task, combining computer "vision" that interprets objects and people with software that can respond accordingly.

As robots become smart enough to detect our feelings and respond appropriately, they could have something like emotions of their own. But that won’t necessarily make them more like humans.

One of the coolest attractions is a robot that plays ping-pong.

The robot, called Forpheus, does more than play a mean game of table tennis: it can read body language to gauge its opponent's ability, and offer advice and encouragement.

Honda, the Japanese auto giant, launched a new robotics program, "Empower, Experience, Empathy," which includes its new 3E-A18 robot that "shows compassion to humans with a variety of facial expressions," according to a statement.

Although empathy and emotional intelligence do not necessarily require a humanoid form, some robot makers have been working on form as well as function.

Robots conquer fires, jungles, and sandstorms at a new Navy training ground.

Octavia, a humanoid robot designed to fight fires on Navy ships, has mastered an impressive range of facial expressions.

When she’s turned off, she looks like a human-size doll. She has a smooth white face with a snub nose. Her plastic eyebrows sit evenly on her forehead like two little capsized canoes.

What’s amazing is that her emotional affect is an accurate response to her interactions with humans. She looks pleased, for instance, when she recognizes one of her teammates. She looks surprised when a teammate gives her a command she wasn’t expecting. She looks confused if someone says something she doesn’t understand.

She can show appropriate emotional affect because she processes massive amounts of information about her environment. She can see, hear, and touch. She takes visual stock of her surroundings using the two cameras built into her eyes and analyzes characteristics like facial features, complexion, and clothing. She can detect people’s voices, using four microphones and a voice-recognition program called Sphinx. She can identify 25 different objects by touch, having learned them by using her fingers to physically manipulate them into various possible positions and shapes. Taken together, these perceptual skills form a part of her “embodied cognitive architecture,” which allows her—according to her creators at the Navy Center for Applied Research in Artificial Intelligence—to “think and act in ways similar to people.”
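The mapping described above—from what the robot perceives to the expression it shows—can be sketched as a simple decision rule. This is a toy illustration only; the function and flag names are hypothetical and do not reflect the Navy's actual embodied cognitive architecture.

```python
# Toy sketch: map multimodal perceptual flags to an affective display,
# mirroring the behaviors described in the text (pleased on recognizing a
# teammate, surprised at an unexpected command, confused when speech
# isn't understood). All names are illustrative assumptions.

def choose_affect(face_recognized: bool, command_expected: bool,
                  utterance_understood: bool) -> str:
    """Pick a facial expression from simple perceptual flags."""
    if not utterance_understood:
        return "confused"      # speech detected but not parsed
    if face_recognized and command_expected:
        return "pleased"       # known teammate, routine command
    if face_recognized and not command_expected:
        return "surprised"     # known teammate, unexpected command
    return "neutral"

print(choose_affect(True, False, True))  # surprised
```

A real system would derive these flags from the camera, microphone, and touch pipelines the article describes, but the expression-selection step can remain this legible.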

Other robots, such as Qihan Technology's Sanbot and SoftBank Robotics' Pepper, are being "humanized" by teaching them to read and react to people's emotional states.

Pepper is the world’s first social humanoid robot able to recognize faces and basic human emotions. 

Pepper is "capable of interpreting a smile, a frown, your tone of voice, as well as the lexical field you use and non-verbal language such as the angle of your head," according to SoftBank.
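One common way to combine cues like these—a smile, a frown, tone of voice, word choice, head angle—is to let each observed cue cast a weighted vote for an emotion. The sketch below is a hypothetical illustration of that idea; the cue names and weights are assumptions, not SoftBank's actual model.

```python
# Minimal weighted-vote fusion of emotional cues. Each cue votes for an
# emotion with some weight; the highest-scoring emotion wins.
from collections import Counter

# (emotion voted for, vote weight) per cue -- illustrative values only
CUE_VOTES = {
    "smile": ("happy", 2.0),
    "frown": ("sad", 2.0),
    "upbeat_tone": ("happy", 1.0),
    "flat_tone": ("sad", 0.5),
    "negative_words": ("sad", 1.5),
    "head_tilted_down": ("sad", 0.5),
}

def infer_emotion(observed_cues):
    scores = Counter()
    for cue in observed_cues:
        if cue in CUE_VOTES:
            emotion, weight = CUE_VOTES[cue]
            scores[emotion] += weight
    return scores.most_common(1)[0][0] if scores else "neutral"

print(infer_emotion(["frown", "flat_tone", "negative_words"]))  # sad
```

Production systems use trained classifiers rather than hand-set weights, but the fusion principle—many weak cues combined into one judgment—is the same.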

ROBOTS IN HUMAN SHOES

"It's not just about technology, it's about psychology and trust."

"Empathy is the goal: the robot is putting itself in the shoes of the human, and that's about as hard as it gets," said Patrick Moorhead, a technology analyst with Moor Insights & Strategy.

"In some ways it can be a bit creepy if you're crying and the robot is trying to console you," he said.

"We've been working very hard to have an emotional robot," said Jean-Michel Mourier of French-based Blue Frog Robotics, which makes the companion and social robot called Buddy, set to be released later this year.

The Buddy robot features a camera, ultrasound, infrared and thermal sensors, a range-finder, a temperature sensor, and ground detectors.

"He has a complex brain," Mourier said at a CES event. "It will ask for a caress or it will get mad if you poke him in the eye."

"If you have no friends, the next best thing is a friend robot, and introverts might feel more comfortable talking to a robot."

Robotics company RoboKind makes Milo, an expressive classroom robot that the company says "never gets tired, never gets frustrated, and is always consistent." Milo uses icons on his chest to reinforce concepts taught in a lesson, such as how to recognize emotions like hurt or anger in other people.

Milo demonstrates various facial expressions and delivers lessons verbally, while a screen on the robot’s chest shows symbols meant to help the student understand what is being said. The robot's ability to repeat something again and again in the same tone, without tiring, is particularly suited to helping children with autism spectrum disorder (ASD) learn. Milo is also being used in special education classrooms, showing benefits for children with Down syndrome, ADHD, trauma, and other social or emotional diagnoses.

“The robots can do some things people can’t do,” says Richard Margolin, the CEO of RoboKind.

Beyond helping children with special challenges, robots are being designed to help tutor and encourage learning more generally. RoboKind, for example, has created a curriculum for the Milo robot to teach coding skills. Another project, named Minnie, out of the University of Wisconsin–Madison, is designed to encourage reading among middle-schoolers through back-and-forth interaction.


In each case, the approach is centered on providing a social component similar to a study buddy or tutor—not a substitute teacher—but with extra patience built in.

Today, many robots are being designed to live among us, to be social and adept caretakers, tutors, and companions. And unlike the robots that hollowed out factory jobs in the first wave of automation, the majority of these social robots are being designed to solve worker shortages and assist, rather than replace, human workers. It remains to be seen whether things will play out this way, of course. In the meantime, fears that robots will take jobs that involve human interaction away from people who need them—and can do them better—will likely persist.
