Morten Roed Frederiksen

Computer Science
IT-University of Copenhagen


Morten is a software developer and robotics engineer with a keen interest in affective computing, synthetic emotions, and artificial intelligence.
His research focuses primarily on affective robotics, robotics and affective interaction design, and constructing emotionally intelligent systems. With a focus on the engineering aspects of robot construction, Morten's projects revolve around improving how robots communicate affective information. This is achieved by expanding knowledge of the means by which robots express themselves and of the software that controls their behaviour.


PhD in Robotics, IT-University of Copenhagen
MSc in Algorithm Design


Scopus Publications

  • A minimalistic approach to user group adaptation of robot behaviors using movement and speech analysis
    Morten Roed Frederiksen and Kasper Stoy

    2021 30th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2021, Pages: 1110-1116, Published: 8 August 2021 IEEE
    Speech characteristics have shown potential as a tool for identifying personality factors in humans, but usually demand longer interactions or elaborate sensor setups. This paper presents a novel robot system that uses speech and body-movement characteristics to recognize and distinguish between user groups interacting with it through small sets of interactions. It clusters people with similar characteristics together and measures the affective impact of specific robot behaviors. The system was tested using a custom-built affective robot through 36 interactions with 6 human participants aged 11 to 70. 186 samples were collected in two different physical contexts, and the similarity of the samples for each user was compared. The preliminary results indicate that speech and movement characteristics have potential as a tool for recognizing specific users and as a guide for forming user groups. This was found using only basic sensors available in most robots, through a limited set of interactions. The results further highlight significant differences between measurements for the same users in different physical contexts, meaning that participants move and talk differently in each context. The paper suggests combining speech and movement characteristics with information on the physical context to achieve better user adaptation of robot behaviors in future projects.
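    The abstract does not spell out the clustering procedure. As a hedged illustration only, grouping users by per-interaction speech and movement features could look like the following two-group k-means sketch; the feature names (pitch, speech rate, movement speed) are hypothetical stand-ins for whatever the basic sensors provide:

    ```python
    import numpy as np

    def cluster_two_groups(features, iters=20):
        """Naive two-cluster k-means over per-interaction feature vectors,
        e.g. (mean_pitch_hz, words_per_s, movement_m_per_s)."""
        X = np.asarray(features, dtype=float)
        # Normalize each feature so no single scale dominates the distance.
        X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)
        # Initialize centers with the first sample and the sample farthest from it.
        far = int(np.argmax(((X - X[0]) ** 2).sum(axis=1)))
        centers = np.stack([X[0], X[far]])
        for _ in range(iters):
            # Assign every sample to its nearest center.
            dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = dists.argmin(axis=1)
            # Recompute centers; keep the old center if a cluster empties.
            centers = np.stack([X[labels == j].mean(axis=0) if (labels == j).any()
                                else centers[j] for j in (0, 1)])
        return labels

    # Two loosely separated hypothetical "user groups":
    samples = [(210, 3.1, 0.4), (205, 2.9, 0.5), (120, 1.8, 1.2), (115, 2.0, 1.1)]
    labels = cluster_two_groups(samples)
    ```

    With well-separated samples like these, the first two and last two interactions end up in the same groups, which is the kind of user-group signal the paper measures.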

  • Playware ball – development of an intelligent ball
    Morten Roed Frederiksen, Massimiliano Leggieri, and Henrik Hautop Lund

    Journal of Robotics, Networking and Artificial Life, eISSN: 23526386, Pages: 217-221, Published: March 2021 Atlantis Press

  • Robots can defuse high-intensity conflict situations
    Morten Roed Frederiksen and Kasper Stoy

    IEEE International Conference on Intelligent Robots and Systems, ISSN: 21530858, eISSN: 21530866, Pages: 11376-11382, Published: 24 October 2020 IEEE
    This paper investigates the specific scenario of high-intensity confrontations between humans and robots, to understand how robots can defuse the conflict. It focuses on the effectiveness of using five different affective expression modalities as the main drivers for defusing the conflict. The aim is to discover strengths or weaknesses in using each modality to mitigate the hostility that people feel towards a poorly performing robot. The situation is defused by making the robot better at acknowledging the conflict and by letting it express remorse. To facilitate the tests, we used a custom affective robot in a simulated conflict situation with 105 test participants. The results show that all tested expression modalities can successfully be used to defuse the situation and convey an acknowledgment of the confrontation. The ratings were remarkably similar, but the movement modality differed significantly from the other modalities (ANOVA, p < .05). The test participants also had similar affective interpretations of how affected the robot was by the confrontation across all expression modalities. This indicates that defusing a high-intensity interaction may not demand special attention to the expression abilities of the robot, but rather attention to its ability to be socially aware of the situation and to react in accordance with it.

  • On the causality between affective impact and coordinated human-robot reactions
    Morten Roed Frederiksen and Kasper Stoy

    29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020, Pages: 488-494, Published: August 2020 IEEE
    In an effort to improve how robots function in social contexts, this paper investigates whether a robot that actively shares a reaction to an event with a human alters how the human perceives the robot's affective impact. To verify this, we created two test setups: one to highlight and isolate the reaction element of affective robot expressions, and one to investigate the effects of applying specific timing delays to a robot reacting to a physical encounter with a human. The first test was conducted with two groups (n = 84) of human observers, a test group and a control group, both interacting with the robot. The second test was performed with 110 participants, using increasingly longer reaction delays for the robot with every ten participants. The results show a statistically significant change (p < .05) in perceived affective impact for the robots when they react to an event shared with a human observer rather than reacting at random. The results also show that, for shared physical interaction, near-human reaction times from the robot are most appropriate for the scenario. The paper concludes that a delay of around 200 ms may make the biggest impact on human observers for small non-humanoid robots. It further concludes that a slightly shorter reaction time of around 100 ms is most effective when the goal is to make human observers feel they made the biggest impact on the robot.
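    The paper's control software is not described in this abstract; as a minimal sketch under assumed names, a fixed reaction delay of the kind tested (e.g. ~200 ms after a detected physical event) could be implemented in a robot's control loop like this:

    ```python
    import time

    class ReactionScheduler:
        """Triggers a robot's affective reaction a fixed delay after a
        detected event (class and method names are illustrative)."""

        def __init__(self, delay_s=0.2, clock=time.monotonic):
            self.delay_s = delay_s  # ~0.2 s: the near-human delay the paper found effective
            self.clock = clock      # injectable clock, so timing is testable
            self._due = None

        def on_event(self):
            # A bump or touch was detected; schedule the reaction.
            self._due = self.clock() + self.delay_s

        def poll(self):
            # Call from the control loop; returns True exactly once,
            # when the reaction should fire.
            if self._due is not None and self.clock() >= self._due:
                self._due = None
                return True
            return False
    ```

    Injecting the clock keeps the delay logic deterministic in tests while `time.monotonic` drives it on the real robot.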

  • Playware ball – initial development impressions of an intelligent ball
    Morten Roed Frederiksen, Massimiliano Leggieri, and Henrik Hautop Lund

    Proceedings of International Conference on Artificial Life and Robotics, eISSN: 24359157, Volume: 2020, Pages: 14-18, Published: 2020 ALife Robotics Corporation Ltd.

  • A Systematic Comparison of Affective Robot Expression Modalities
    Morten Roed Frederiksen and Kasper Stoy

    IEEE International Conference on Intelligent Robots and Systems, ISSN: 21530858, eISSN: 21530866, Pages: 1385-1392, Published: November 2019 IEEE
    This paper provides a survey of the different means of expression employed by robots conveying affective states to human recipients. It introduces a model of affective expression modalities (MOAM) that describes and compares the emphasis on specific means of expression, and applies it to the surveyed robots. Using the model entails viewing the effect of the applied expression modalities in light of how well the robot responds to external stimuli, and with attention to how well aligned the robot's means of affective expression are with the intended working scenario. The model-based survey shows that a majority (85%) of the surveyed robots have a category with room for additional affective means of expression, and that about a quarter (25.6%) of the robots use only one or two affective expression modalities to convey affective states. The results of the survey indicate an under-researched opportunity in exploring synergies between the different expression modalities to amplify the overall affective impact of a robot.
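    The MOAM categories themselves are not listed in this abstract. Purely as an illustration, with hypothetical modality names, the survey's tallies (e.g. the share of robots using at most two modalities) could be computed like this:

    ```python
    # Hypothetical modality categories, not the paper's actual MOAM taxonomy.
    MODALITIES = {"sound", "movement", "facial_expression", "color", "haptics"}

    def fraction_with_few_modalities(robots, at_most=2):
        """robots: dict mapping robot name -> set of modalities it uses.
        Returns the fraction of robots using at most `at_most` modalities."""
        few = sum(1 for used in robots.values() if len(used & MODALITIES) <= at_most)
        return few / len(robots)

    survey = {
        "robot_a": {"sound", "movement"},
        "robot_b": {"sound", "movement", "color"},
        "robot_c": {"facial_expression"},
        "robot_d": {"sound", "movement", "haptics", "color"},
    }
    ```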

  • Augmenting the audio-based expression modality of a non-affective robot
    Morten Roed Frederiksen and Kasper Stoy

    2019 8th International Conference on Affective Computing and Intelligent Interaction, ACII 2019, Published: September 2019 IEEE
    This paper investigates the potential benefits of augmenting audio-based affective means of expression to strengthen the perceived intentions of a robot. Robots are often viewed as simple machines with limited capabilities for communication. Changing how a robot is perceived, towards a more affective interpretation of its intentions, requires careful consideration of the means of expression available to the robot. It also requires alignment between these means to ensure they work in coordination with each other, making the robot easier to understand. In an effort to strengthen the affective interpretation of a soft robotic arm, we altered its overall expression by changing its available audio-based expression modalities. The system mitigated the naturally occurring noise from actuators and pneumatic systems and used a custom sound that supported the movement of the robot. The robot was tested in interactions with human observers (n = 78) and was perceived as significantly more curious, happier, and less angry when augmented with audio that aligned with the naturally occurring robot sounds. The results show that the audio-based expression modality of robots is a valuable communication tool to consider augmenting when designing robots that convey affective information.
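    The paper's actual sound design is not reproduced here. As a hedged sketch of the general idea of audio that "supports the movement", one simple mapping (entirely hypothetical parameters) ties the pitch and volume of a masking tone to actuator speed so the added audio stays aligned with the robot's motion:

    ```python
    def supporting_sound(motor_speed, base_hz=220.0, max_speed=1.0):
        """Map actuator speed to (pitch_hz, volume) for a movement-aligned
        masking tone. Illustrative mapping, not the paper's design."""
        # Clamp normalized speed to [0, 1].
        s = max(0.0, min(abs(motor_speed) / max_speed, 1.0))
        pitch_hz = base_hz * (1.0 + s)   # faster motion -> higher pitch
        volume = 0.2 + 0.8 * s           # louder during motion, never fully silent
        return pitch_hz, volume
    ```

    A synthesizer driven by such a mapping rises and falls with the arm's motion, which is one way the custom sound could mask pneumatic noise rather than clash with it.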