Researchers from the RIKEN Guardian Robot Project in Japan have made an android child named Nikola that successfully conveys six basic emotions. The new study, published in Frontiers in Psychology, tested how well people could identify six facial expressions—happiness, sadness, fear, anger, surprise, and disgust—which were generated by moving “muscles” in Nikola’s face. This is the first time that the quality of android-expressed emotion has been tested and verified for these six emotions.
Rosie the robot maid was considered science fiction when she debuted on The Jetsons cartoon over 50 years ago. Although the helpful household robot is now more science fact than fiction, many challenges remain, including the ability to detect and express emotions. The recent study, led by Wataru Sato from the RIKEN Guardian Robot Project, focused on building a humanoid robot, or android, that can use its face to express a variety of emotions. The result is Nikola, an android head that looks like a hairless boy.
We give meaning to our world through the categorisation of objects. When and how does this process begin? By studying the gaze of one hundred infants, scientists at the Institut des Sciences Cognitives Marc Jeannerod (CNRS/Université Claude Bernard Lyon 1) have demonstrated that, by the age of four months, babies can assign objects that they have never seen to the animate or inanimate category. These findings, published in PNAS on 15 February 2022, reveal measurable changes in neural organisation, which reflect the transition from simply viewing the world to understanding it.
The way babies look at the world remains a great mystery. What do they really see? What information do they extract from what they see? One might think they look at whatever stands out the most—by virtue of size or colour, for example. But when do babies begin to see and interpret the world as adults do?