@itu.dk
Computer Science
IT University of Copenhagen
Morten is a software developer and robotics engineer with a keen interest in affective computing, synthetic emotions and artificial intelligence.
His research focuses primarily on affective robotics, robotics & affective interaction design, and constructing emotionally intelligent systems. With a focus on the engineering aspects of robot construction, Morten’s projects revolve around improving how robots communicate affective information. This is achieved by expanding knowledge of the means by which robots express themselves and the software that controls their behaviour.
PhD, Robotics, IT University of Copenhagen
MSc, Algorithm Design
Scopus Publications
Morten Roed Frederiksen, Kasper Stoy, and Maja Matarić
IEEE
A common denominator for most therapy treatments for children who suffer from an anxiety disorder is daily practice routines to learn the techniques needed to overcome anxiety. However, applying those techniques while experiencing anxiety can be highly challenging. This paper presents the design, implementation, and pilot study of a tactile hand-held pocket robot, “AffectaPocket”, designed to work alongside therapy as a focus object to facilitate coping during an anxiety attack. The robot does not require daily practice to be used, has a small form factor, and has been designed for children 7 to 12 years old. The pocket robot works by sensing when it is being held and attempting to shift the child's focus by presenting a simple three-note rhythm-matching game. We conducted a pilot study of the pocket robot involving four children aged 7 to 10 years, and then a main study with 18 children aged 6 to 8 years; neither study involved children with anxiety. Both studies aimed to assess the reliability of the robot's sensor configuration, its design, and the effectiveness of the user tutorial. The results indicate that the morphology and sensor setup performed adequately and that the tutorial process enabled the children to use the robot with little practice. This work demonstrates that the presented pocket robot could represent a step toward developing low-cost, accessible technologies to help children suffering from anxiety disorders.
Morten Roed Frederiksen and Kasper Støy
Springer Nature Switzerland
Morten Roed Frederiksen and Kasper Stoy
IEEE
To improve user personalization of robots in social situations, robots can benefit from inferring information about the humans with whom they interact. Physical human behaviors and personality traits have previously been touted as possible sources of such information but often involve complex processing or sensing requirements. This paper investigates posing specific questions related to extrovert behaviors as an alternative source of this information. It aims to highlight significant relationships between questions aimed at behavioral reactions in specific scenarios and the speech and movement attributes obtained by a robot in a one-on-one social interaction. The paper describes an experiment in which participants interacted with a robot through a non-scripted conversation, during which the robot gathered information on the speech and movement characteristics and estimated arousal/valence levels of the participant. The experiment was followed by a series of specific questions aimed at outlining the extroversion level of the participants. The results showed multiple significant but weak correlations (p < .05) between the recorded attributes, including correlations between the average determined valence and the average recorded velocity of speech, and between the average answer reaction time and average answer length. The results also showed correlations between arousal levels, average pause duration, and the answers recorded for individual questions of the questionnaire. The results suggest that introducing specific questions in human-robot interactions can potentially decrease the processing and sensor demands of robots and offer user personalization using only a limited set of sensors.
Morten Roed Frederiksen, Katrin Fischer, and Maja Matarić
IEEE
This paper describes a between-subjects Amazon Mechanical Turk study (n = 220) that investigated how a robot’s affective narrative influences its ability to elicit empathy in human observers. We first conducted a pilot study to develop and validate the robot’s affective narratives. Then, in the full study, the robot used one of three different affective narrative strategies (funny, sad, neutral) while becoming less functional at its shopping task over the course of the interaction. As the functionality of the robot degraded, participants were repeatedly asked if they were willing to help the robot. The results showed that conveying a sad narrative significantly influenced the participants’ willingness to help the robot throughout the interaction and determined whether participants felt empathetic toward the robot throughout the interaction. Furthermore, a higher amount of past experience with robots also increased the participants’ willingness to help the robot. This work suggests that affective narratives can be useful in short-term interactions that benefit from emotional connections between humans and robots.
Morten Roed Frederiksen and Kasper Stoy
IEEE
Speech characteristics have shown potential as a tool to identify personality factors in humans but usually demand longer interactions or elaborate sensor setups. This paper presents a novel robot system that uses speech and body movement characteristics to recognize and distinguish between user groups interacting with it through small sets of interactions. It clusters people with similar characteristics together and measures the affective impact of specific robot behaviors. The system was tested using a custom-created affective robot through 36 interactions with 6 human participants aged 11 to 70. 186 samples were collected in two different physical contexts, and the similarity of the samples for each user was compared. The preliminary results indicate that speech and movement characteristics have potential as a tool to recognize specific users and as a guide for forming user groups. This was achieved using only basic sensors available in most robots and a limited set of interactions. The results further highlight significant differences between measurements for the same users in different physical contexts, meaning that the participants moved and talked differently in each context. This paper suggests combining speech and movement characteristics with information on the physical context to achieve better user adaptation in robot behaviors for future projects.
Morten Roed Frederiksen, Massimiliano Leggieri, and Henrik Hautop Lund
Atlantis Press
Morten Roed Frederiksen and Kasper Stoy
IEEE
This paper investigates the specific scenario of high-intensity confrontations between humans and robots to understand how robots can defuse the conflict. It focuses on the effectiveness of using five different affective expression modalities as the main drivers for defusing the conflict. The aim is to discover any strengths or weaknesses in using each modality to mitigate the hostility that people feel towards a poorly performing robot. The situation is defused by making the robot better at acknowledging the conflict and by letting it express remorse. To facilitate the tests, we used a custom affective robot in a simulated conflict situation with 105 test participants. The results show that all tested expression modalities can successfully be used to defuse the situation and convey an acknowledgment of the confrontation. The ratings were remarkably similar, although the movement modality differed significantly (p < .05) from the other modalities. The test participants also had similar affective interpretations of how impacted the robot was by the confrontation across all expression modalities. This indicates that defusing a high-intensity interaction may not demand special attention to the expression abilities of the robot, but rather attention to the robot's ability to be socially aware of the situation and to react in accordance with it.
Morten Roed Frederiksen and Kasper Stoy
IEEE
In an effort to improve how robots function in social contexts, this paper investigates whether a robot that actively shares a reaction to an event with a human alters how the human perceives the robot’s affective impact. To verify this, we created two different test setups: one to highlight and isolate the reaction element of affective robot expressions, and one to investigate the effects of applying specific timing delays to a robot reacting to a physical encounter with a human. The first test was conducted with two different groups (n=84) of human observers, a test group and a control group, both interacting with the robot. The second test was performed with 110 participants, using increasingly longer reaction delays for the robot with every ten participants. The results show a statistically significant change (p < .05) in perceived affective impact for the robots when they react to an event shared with a human observer rather than reacting at random. The results also show that, for shared physical interaction, near-human reaction times from the robot are most appropriate for the scenario. The paper concludes that a delay of around 200 ms may have the biggest impact on human observers for small-sized non-humanoid robots. It further concludes that a slightly shorter reaction time of around 100 ms is most effective when the goal is to make the human observers feel they made the biggest impact on the robot.
Morten Roed Frederiksen, Massimiliano Leggieri, and Henrik Hautop Lund
ALife Robotics Corporation Ltd.
Morten Roed Frederiksen and Kasper Stoy
IEEE
This paper provides a survey of the different means of expression employed by robots conveying affective states to human recipients. The paper introduces a model of affective expression modalities (MOAM) that describes and compares the emphasis on specific means of expression and applies it to the surveyed robots. Using the model entails viewing the effect of applied expression modalities in light of how well the robot responds to external stimuli and how well aligned the robot’s means of affective expression are with the intended working scenario. The model-based survey shows that a majority (85%) of the surveyed robots have a category with room for additional affective means of expression, and that a quarter (25.6%) of the robots use only one or two affective expression modalities to convey affective states. The results of the survey indicate an under-researched opportunity in exploring synergies between the different expression modalities to amplify the overall affective impact of a robot.
Morten Roed Frederiksen and Kasper Stoy
IEEE
This paper investigates the potential benefits of augmenting audio-based affective means of expression to strengthen the perceived intentions of a robot. Robots are often viewed as simple machines with limited capabilities of communication. Changing how a robot is perceived, towards a more affective interpretation of its intentions, requires careful consideration of the means of expression available to the robot. It also requires alignment between these means to ensure they work in coordination with each other to make the robot easier to understand. In an effort to strengthen the affective interpretation of a soft robotic arm, we altered its overall expression by changing the available audio-based expression modalities. The system mitigated the naturally occurring noise from actuators and pneumatic systems and used a custom sound that supported the movement of the robot. The robot was tested in interactions with human observers (n=78) and was perceived as significantly more curious, happier, and less angry when augmented by audio that aligned with the naturally occurring robot sounds. The results show that the audio-based expression modality of robots is a valuable communication tool to consider augmenting when designing robots that convey affective information.