A fascinating challenge in the field of human–robot interaction is endowing robots with emotional intelligence so that interaction becomes more intuitive, genuine, and natural. A critical step toward this goal is the robot's ability to infer, interpret, and display human emotions. With FORET, we are exploring ways to establish trust between humans and robots through the development of novel emotional communication channels.
By exploring emotion recognition and response through music and motion on non-anthropomorphic robots, I hope to find new ways to help humans experience emotions and to build trust and connection with robot dancers.
I developed a real-time music mood recognition tool that determines the emotional quadrant to which the target music belongs; the quadrant is then used to select the robots' gestural response. I also mapped a series of emotional gestures from humans onto robots, performing emotion recognition on human activity using several SVM classification methods and a novel motion dataset recorded in 2021 in collaboration with Israeli choreographer Avshalom Pollak.
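To illustrate the quadrant step, here is a minimal sketch that assigns an audio clip to one of the four quadrants of Russell's valence–arousal circumplex, a common framing in music mood recognition. The feature choices, thresholds, and quadrant labels below are illustrative assumptions, not the exact pipeline used in FORET:

```python
import librosa
import numpy as np

def music_quadrant(audio_path: str) -> str:
    """Assign a music clip to a valence-arousal quadrant.

    Assumption: RMS energy proxies arousal and spectral brightness
    proxies valence; a real system would learn these mappings from data.
    """
    y, sr = librosa.load(audio_path, mono=True)

    # Arousal proxy: RMS energy (louder, more energetic -> higher arousal).
    arousal = float(np.mean(librosa.feature.rms(y=y)))

    # Valence proxy: spectral centroid (brighter timbre -> higher valence).
    valence = float(np.mean(librosa.feature.spectral_centroid(y=y, sr=sr)))

    # Illustrative cutoffs; in practice these come from labeled training data.
    high_arousal = arousal > 0.1
    high_valence = valence > 2000.0

    if high_valence and high_arousal:
        return "happy/excited"   # Q1: +valence, +arousal
    if high_valence:
        return "calm/content"    # Q4: +valence, -arousal
    if high_arousal:
        return "angry/tense"     # Q2: -valence, +arousal
    return "sad/depressed"       # Q3: -valence, -arousal
```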
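For the gesture-recognition step, the sketch below shows SVM classification on motion features, assuming a dataset of fixed-length, pose-derived feature vectors with emotion labels; the feature layout and synthetic data are hypothetical, and looping over kernels stands in for comparing different SVM classification methods:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: each row is a flattened motion descriptor
# (e.g., joint angles over a short window); labels are emotion classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 60))        # 200 motion clips x 60 features
y = rng.integers(0, 4, size=200)      # 4 emotion classes (one per quadrant)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Compare several SVM variants; features are standardized before fitting.
for kernel in ("linear", "rbf", "poly"):
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel, C=1.0))
    clf.fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))
```

The predicted emotion class can then drive the same gestural-response mapping that the music quadrant does, so both channels feed one response policy.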