
ABOUT CONBOTS

Call identifier: H2020-ICT-2018-2020

Call topic: ICT-09-2019-2020

Grant number: 871803

Starting Date: January 1st, 2020

Duration: 48 months

Cost: € 4,810,796

Download the project brochure here.

Motivation & Background

From a parent coordinating movements to help a child learn to walk, to a violinist practicing a concerto, humans rely on physical interaction to learn from each other and from the environment.

CONBOTS builds on recent neuroscientific findings demonstrating the benefits of physical interaction between people performing and learning new motor tasks: the human central nervous system can infer a partner's motor control and use it to improve task performance and motor learning.

Based on this neuroscience-driven approach to motor learning and on a strongly multidisciplinary foundation, the project proposes to design and test a new class of robots, the CONBOTS, which physically couple people to facilitate the learning of new motor skills, augmenting handwriting and music training through robotics.

Moreover, the user experience will be enhanced by a multiparametric wearable sensor system and machine learning models capable of estimating the user's emotional state and adapting the platform's behavior to customize the learning process.

This will be implemented through innovative robotic technology, wearable sensors and machine learning techniques, giving rise to novel human-human and human-robot interaction paradigms applied in two different learning contexts: training graphomotor skills in children learning handwriting, and augmenting learning performance in beginner musicians.
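As a rough illustration of how such adaptation could work, the minimal Python sketch below trains a classifier on synthetic wearable-sensor features to estimate a user's affective state and maps that estimate to a difficulty adjustment. The feature set, labels, and adaptation rule are illustrative assumptions, not the project's actual models.

```python
# Hypothetical sketch: estimate a user's affective state from wearable-sensor
# features and adapt exercise difficulty accordingly. Features, labels and the
# adaptation rule are illustrative assumptions, not the CONBOTS models.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training data: [mean heart rate, skin conductance, movement jerk]
X = rng.normal(size=(300, 3))
# Synthetic labels: 0 = under-challenged, 1 = engaged, 2 = stressed
y = (X[:, 0] + 0.5 * X[:, 1] > 0.5).astype(int) + (X[:, 2] > 1.0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def adapt_difficulty(current_level: int, features: np.ndarray) -> int:
    """Raise difficulty when the user seems under-challenged,
    lower it when the estimated state suggests stress."""
    state = clf.predict(features.reshape(1, -1))[0]
    if state == 0:                      # under-challenged -> harder exercise
        return current_level + 1
    if state == 2:                      # stressed -> easier exercise
        return max(1, current_level - 1)
    return current_level                # engaged -> keep current level

# Example: one window of streaming sensor features
print(adapt_difficulty(3, np.array([0.2, -0.1, 0.4])))
```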

CONBOTS will expand the impact and application of robotics to Education 4.0, building on innovative learning environments characterized by three main factors: (i) the combined use of robotics, AI, mixed reality and human-machine interfaces; (ii) personalized, data-driven learning; (iii) peer-to-peer and interactive learning.

Objectives

CONBOTS aims to design a platform combining four enabling technologies:

  1. Compact robotic haptic devices to gently interact with upper limbs;

  2. An interactive controller enabling physical communication, integrating differential Game Theory (GT) with an algorithm to identify the partner's control (a minimal controller sketch follows this list);

  3. A bi-directional user interface encompassing AR-based application-driven serious games, and a set of wearable sensors and instrumented objects;

  4. Machine learning algorithms for tailoring learning exercises to the user's physical, emotional, and mental state.
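By way of illustration, the sketch below shows one possible form such an interactive controller could take: a discrete-time two-player linear-quadratic game whose feedback Nash gains are obtained by backward recursion of the coupled Riccati equations. The dynamics, cost weights and horizon are illustrative assumptions, not the project's actual control design.

```python
# Minimal sketch: feedback Nash gains for a two-player discrete-time LQ game,
# a stand-in for a game-theoretic physical-interaction controller.
# All matrices below are illustrative assumptions.
import numpy as np

def feedback_nash_gains(A, B1, B2, Q1, Q2, R1, R2, horizon=200):
    """Backward recursion of the coupled Riccati equations; returns the
    stage-0 feedback gains K1, K2 such that u_i = -K_i x."""
    m1 = B1.shape[1]
    P1, P2 = Q1.copy(), Q2.copy()
    for _ in range(horizon):
        # Coupled linear equations for the stacked gains [K1; K2]
        M = np.block([
            [R1 + B1.T @ P1 @ B1, B1.T @ P1 @ B2],
            [B2.T @ P2 @ B1,      R2 + B2.T @ P2 @ B2],
        ])
        rhs = np.vstack([B1.T @ P1 @ A, B2.T @ P2 @ A])
        K = np.linalg.solve(M, rhs)
        K1, K2 = K[:m1], K[m1:]
        # Closed-loop dynamics and cost-to-go updates for both players
        Acl = A - B1 @ K1 - B2 @ K2
        P1 = Q1 + K1.T @ R1 @ K1 + Acl.T @ P1 @ Acl
        P2 = Q2 + K2.T @ R2 @ K2 + Acl.T @ P2 @ Acl
    return K1, K2

# Toy example: two partners jointly steering a 1-D point mass (position, velocity)
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])
B1 = B2 = np.array([[0.0], [dt]])
Q1 = np.diag([10.0, 1.0])   # partner 1 weights position error more heavily
Q2 = np.diag([1.0, 1.0])    # partner 2 has weaker error sensitivity
R1 = R2 = np.array([[0.1]])
K1, K2 = feedback_nash_gains(A, B1, B2, Q1, Q2, R1, R2)
print("K1 =", K1, "\nK2 =", K2)
```

In this kind of formulation, each partner's cost weights encode how much they "care" about the task error, and the resulting Nash gains describe how each adapts its control to the other, which is the sense in which the controller mediates physical communication.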


The CONBOTS project is designed around 7 main objectives:

  • To define the user needs and platform requirements, and to deploy measures to raise awareness and take-up of CONBOTS' scientific and technological achievements

  • To design physically interacting robotic haptic devices in the form of an end-effector robotic workstation and a wearable exoskeleton

  • To exploit recent neuroscience insights on physical interaction and modern control theory to establish a game theory of physical communication for improving human-human and human-robot interactions

  • To design a bi-directional user interface with Augmented Reality serious games, wearable sensors and instrumented objects for maximising the impact of physical interaction in learning contexts

  • To develop machine learning algorithms for describing the physical, emotional, and mental state of the users, in order to tailor serious games and optimise the training process to the specific user needs

  • To integrate the different CONBOTS enabling technologies and optimise them for the two application scenarios

  • To test the efficacy of CONBOTS platforms for training sensorimotor skills of children during handwriting and music learning
