ANIMATAS – JUSThink

Project Overview

ANIMATAS (MSCA-ITN-2017, grant agreement no. 765955) is a Marie Skłodowska-Curie European Training Network funded by Horizon 2020 (the European Union’s Framework Programme for Research and Innovation) and coordinated by Université Pierre et Marie Curie (Paris, France).

ANIMATAS will establish a leading European Training Network (ETN) devoted to developing a new generation of creative and critical research leaders and innovators, equipped with a skill set tailored to creating the social capabilities needed for step changes in intuitive human-machine interaction (HMI) in educational settings.

This will be achieved through a transnational network of universities and industrial partners that will supervise early stage researchers (ESRs) and deliver specialized training to them, and through the cross-fertilization of state-of-the-art methods from social robotics, embodied virtual characters, and the social and educational sciences, so as to develop the skills needed to design machines capable of sustained, intuitive encounters with teachers and children.

The ETN will take an integrative approach to the development of new capabilities, with a view to their impact on whole-system performance across the complete HMI loop. This will be done by building industry-guided showcases that integrate the social capabilities developed by the ESRs. The participation of industrial partners will support the translation of new academic results to the marketplace and a better transfer of knowledge between sectors. Exposure of the non-academic sector to the ESRs also has great market potential, which our industrial partners aim to capitalize on by recruiting young talent after the end of the project and by adopting ANIMATAS’ advances in intuitive HMI in future product lines. This will greatly benefit the ESRs, who will gain new career perspectives in the social robotics and ed-tech industries.

The ETN will strengthen Europe’s capacity in research and innovation by nurturing a new generation of highly skilled ESRs with an entrepreneurial mind-set and an understanding of intuitive HMI and potential products in these emerging markets.

More information: http://animatas.eu/

Sub-projects

Teacher orchestration of child-robot interaction

Objectives: As teachers manage their classrooms, technological tools could allow them to better orchestrate individual learning. In this context, the aim of this ESR project is to model a mixed-agent group in which one or more robots are involved in a learning task with children and the teaching session is orchestrated by the teacher. Several in-school scenarios will be investigated, involving one or many robots interacting with one or many children and exploring various orchestration strategies that enable teachers to manage and visualise learning in the classroom. After reviewing the literature on joint attention, mutual modelling and mixed-agent modelling, the ESR will investigate methods for adapting to the child in a learning task orchestrated by a teacher or therapist, taking into account both the learning needs of the child and the orchestration inputs from the teacher. The proposed approaches will be evaluated in real experiments with children in learning contexts.

in association with Ana Paiva (INESC-ID)
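
To make the intended adaptation loop concrete, here is a minimal sketch in Python (illustrative only; the names, thresholds and actions are assumptions, not the project’s implementation) of how a robot’s next action for a child might be selected from both the child’s estimated learning needs and the teacher’s orchestration inputs, with the teacher’s inputs taking priority.

    # Hypothetical sketch, not the project's API: one step of an adaptation
    # loop combining the child's estimated learning needs with the teacher's
    # orchestration inputs.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LearnerState:
        child_id: str
        success_rate: float      # recent task performance in [0, 1]
        idle_seconds: float      # time since the child's last meaningful action

    @dataclass
    class TeacherInput:
        pause: bool = False                 # teacher freezes this robot/child pair
        difficulty: Optional[str] = None    # teacher overrides difficulty ("easy"/"hard")

    def next_robot_action(state: LearnerState, teacher: TeacherInput) -> str:
        # Orchestration inputs from the teacher take precedence over autonomous adaptation.
        if teacher.pause:
            return "wait"
        if teacher.difficulty is not None:
            return f"set_difficulty:{teacher.difficulty}"
        # Otherwise adapt to the child's estimated learning needs.
        if state.idle_seconds > 60:
            return "prompt_child"           # re-engage a disengaged child
        if state.success_rate < 0.3:
            return "give_hint"              # scaffold when the child is struggling
        if state.success_rate > 0.8:
            return "increase_difficulty"    # keep the task challenging
        return "encourage"

    print(next_robot_action(LearnerState("c1", success_rate=0.2, idle_seconds=10),
                            TeacherInput()))                      # -> give_hint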

Which mutual-modelling mechanisms to optimize child-robot interaction

Objectives: In collaborative learning with a robot, the ability to establish a mental model of the other is crucial for interpreting and responding in an appropriate manner. Humans exercise this skill of mutual modelling in most interactions by attributing goals and beliefs to others. The aim of this ESR project is to investigate ways to endow a robot with mutual-modelling abilities in a collaborative learning task with a child, and to study the impact of the proposed mutual-modelling strategies on the child’s engagement and motivation in the learning task, as well as on potential learning gains. After reviewing the literature on mutual modelling, engagement and motivation, the ESR will investigate methods for collaborative learning in human-human interaction and for co-learning with robots. The aim will be to propose strategies that motivate and engage the child in co-learning while maximizing learning gain. The proposed approaches will be evaluated in real experiments with children in learning contexts.

in association with Ginevra Castellano (UU) and Chloé Clavel (IMT)
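
As an illustration of what a mutual model could look like computationally, the following minimal Python sketch (illustrative only; the structure and names are assumptions, not the project’s code) keeps a first-order model of the child’s task beliefs and a second-order model of what the child believes about the robot, and uses the divergence between the robot’s own beliefs and the modelled child beliefs to choose an intervention.

    # Hypothetical sketch, not the project's code: first- and second-order
    # mutual modelling plus a toy intervention policy.

    from dataclasses import dataclass, field

    @dataclass
    class MutualModel:
        # First order: what the robot believes the child believes about the task,
        # e.g. {"edge_cost(A,B)": "low"} in a graph-based problem.
        child_beliefs: dict = field(default_factory=dict)
        # Second order: what the robot believes the child believes about the robot,
        # e.g. whether the child considers the robot's suggestions reliable.
        child_model_of_robot: dict = field(default_factory=dict)

        def divergence(self, robot_beliefs: dict) -> list:
            """Task facts on which the robot and its model of the child disagree."""
            return [k for k, v in robot_beliefs.items()
                    if k in self.child_beliefs and self.child_beliefs[k] != v]

    def choose_action(model: MutualModel, robot_beliefs: dict) -> str:
        conflicts = model.divergence(robot_beliefs)
        if not conflicts:
            return "follow"                     # no disagreement: let the child lead
        if model.child_model_of_robot.get("trusts_robot", True):
            return f"explain:{conflicts[0]}"    # resolve one disagreement verbally
        return f"demonstrate:{conflicts[0]}"    # show rather than tell if distrusted

    # Example: the robot believes an edge is expensive, while its model of the
    # child says the child believes it is cheap, so the robot addresses that.
    model = MutualModel(child_beliefs={"edge_cost(A,B)": "low"},
                        child_model_of_robot={"trusts_robot": True})
    print(choose_action(model, {"edge_cost(A,B)": "high"}))  # -> explain:edge_cost(A,B)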

Automatic assessment of engagement during multi-party interactions

Objectives: Robots in educational institutions face the challenge of interacting in a socially appropriate way with (i) previously unseen users and (ii) multi-party groups with changing numbers of partners. Such complex social situations require advanced models of interpersonal interaction for the robot to adapt effectively. Among its multimodal communication skills, the robot has to identify the partner(s) it should adapt to, follow, help or encourage. Detecting engagement or disengagement in such situations is challenging and requires continuously assessing the dynamics of the interaction, which are usually driven by the characteristics of the individuals (role, social traits, social attitude, dominance) and by the task to be achieved. The concept of interpersonal synchrony will be employed to model group interactions. Using automatic behaviour-understanding techniques, we will derive a number of quantifiers that characterize different aspects of synchrony between partners as well as of the interaction itself. This can be done at a low level (e.g., body-movement synchrony) and at a high level (e.g., contingency of engagement and attitudes in the learning task). These quantifiers will be used to predict engagement at the individual and group levels, and the robot’s adaptation will exploit the resulting metrics. Various adaptation strategies (e.g., the robot’s role) will be investigated and evaluated with machine-learning metrics and joint-action achievement metrics. The model will be implemented in educational settings in collaboration with Interactive Robotics, and the evaluation will include users’ feedback.

in association with Mohamed Chetouani (UPMC) and Ginevra Castellano (UU)
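
As an illustration of a low-level synchrony quantifier of the kind mentioned above, the following Python sketch (illustrative only, not the project’s implementation) computes a windowed, lag-limited cross-correlation between two partners’ body-movement signals; such per-window scores could then serve as features for predicting engagement at the individual or group level.

    # Hypothetical sketch: a low-level synchrony quantifier based on windowed,
    # lag-limited cross-correlation of two partners' motion-energy signals.

    import numpy as np

    def windowed_synchrony(motion_a, motion_b, fps=30, win_s=3.0, max_lag_s=0.5):
        """Return per-window synchrony scores in [0, 1] between two equally
        sampled 1-D movement signals (e.g., frame-wise motion energy)."""
        a = np.asarray(motion_a, dtype=float)
        b = np.asarray(motion_b, dtype=float)
        win = int(win_s * fps)
        max_lag = int(max_lag_s * fps)
        scores = []
        for start in range(0, min(len(a), len(b)) - win + 1, win):
            wa = a[start:start + win]
            wb = b[start:start + win]
            # Best absolute correlation over a small range of lags captures
            # "following" behaviour in either direction.
            best = 0.0
            for lag in range(-max_lag, max_lag + 1):
                if lag >= 0:
                    x, y = wa[lag:], wb[:win - lag]
                else:
                    x, y = wa[:win + lag], wb[-lag:]
                if x.std() > 0 and y.std() > 0:
                    r = np.corrcoef(x, y)[0, 1]
                    best = max(best, abs(r))
            scores.append(best)
        return np.array(scores)

    # A group-level engagement feature could then be as simple as the mean
    # pairwise synchrony over all partner pairs in the interaction.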

Publications

Efficacy of a ‘Misconceiving’ Robot to Improve Computational Thinking in a Collaborative Problem Solving Activity: A Pilot Study

U. Norman; A. Chin; B. Bruno; P. Dillenbourg 

2022. 31st IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Naples, Italy, August 29 – September 2, 2022. DOI : 10.1109/RO-MAN53752.2022.9900775.

Studying Alignment in a Collaborative Learning Activity via Automatic Methods: The Link Between What We Say and Do

U. Norman; T. Dinkar; B. Bruno; C. Clavel 

Dialogue & Discourse. 2022-08-06. Vol. 13, num. 2, p. 1-48. DOI : 10.5210/dad.2022.201.

Mutual Modelling Ability for a Humanoid Robot: How can it improve my learning as we solve a problem together?

U. Norman; B. Bruno; P. Dillenbourg 

2021-03-12. Robots for Learning Workshop at the 16th Annual IEEE/ACM Conference on Human-Robot Interaction (HRI 2021), Virtual Conference, March 9-11, 2021.

When Positive Perception of the Robot Has No Effect on Learning

J. Nasir; U. Norman; B. Bruno; P. Dillenbourg 

2020-08-31. 29th IEEE International Conference on Robot and Human Interactive Communication (IEEE RO-MAN), Virtual Conference, Aug 31 – Sept 4, 2020. p. 313-320. DOI : 10.1109/RO-MAN47096.2020.9223343.

Is There ‘ONE way’ of Learning? A Data-driven Approach

J. Nasir; B. Bruno; P. Dillenbourg 

2020. 22nd ACM International Conference on Multimodal Interaction, Virtual event, Netherlands, October 25-29, 2020. p. 388–391. DOI : 10.1145/3395035.3425200.

Engagement in Human-Agent Interaction: An Overview

C. Oertel; G. Castellano; M. Chetouani; J. Nasir; M. Obaid et al. 

Frontiers in Robotics and AI. 2020-08-04. Vol. 7, p. 92. DOI : 10.3389/frobt.2020.00092.

You Tell, I Do, and We Swap until we Connect All the Gold Mines!

J. Nasir; U. Norman; B. Bruno; P. Dillenbourg 

ERCIM News. 2020-01-01. Vol. 2020, num. 120, p. 22-23.

Robot Analytics: What Do Human-Robot Interaction Traces Tell Us About Learning?

J. Nasir; U. Norman; W. Johal; J. K. Olsen; S. Shahmoradi et al. 

2019-10-14. IEEE RoMan 2019 – The 28th IEEE International Conference on Robot & Human Interactive Communication, New Delhi, India, October 14-18, 2019. DOI : 10.1109/RO-MAN46459.2019.8956465.

What Teachers Need for Orchestrating Robotic Classrooms

S. Shahmoradi; A. Kothiyal; B. Bruno; P. Dillenbourg; J. K. Olsen 

2020. EC-TEL 2020, Heidelberg, Germany, September 14-17, 2020. p. 87-101. DOI : 10.1007/978-3-030-57717-9_7.

Orchestration of Robotic Activities in Classrooms: Challenges and Opportunities

S. Shahmoradi; J. K. Olsen; S. Haklev; W. Johal; U. Norman et al. 

2019-09-09. p. 640-644. DOI : 10.1007/978-3-030-29736-7_57.

Applying IDC theory to education in the Alps region: A response to Chan et al.’s contribution

P. Dillenbourg; K. G. Kim; J. Nasir; S. T. Yeo; J. K. Olsen 

Research and Practice in Technology Enhanced Learning. 2019. Vol. 14, p. 17. DOI : 10.1186/s41039-019-0111-6.

Learning By Collaborative Teaching: An Engaging Multi-Party CoWriter Activity

L. El-Hamamsy; W. Johal; T. L. C. Asselborn; J. Nasir; P. Dillenbourg 

2019. The 28th IEEE International Conference on Robot & Human Interactive Communication (RoMan 2019), New Delhi, India, October 14 – 18, 2019. DOI : 10.1109/RO-MAN46459.2019.8956358.