
EPEC - Engineering Predictability with Embodied Cognition (completed)

How can multimodal systems sense, learn, and predict future events?

EPEC: Engineering Predictability with Embodied Cognition. Logo.

Humans are superior to computers and robots at perceiving with eyes, ears, and other senses, and at combining perception with learned knowledge to choose the best actions. This project aims to develop human-inspired models of behaviour and perception and to show that these models can accurately predict future actions.

Our inspiration comes from embodied cognition, a concept from psychology proposing that our bodies, perceptions, abilities, and form influence how we think. Our goal is to exploit the form of various systems to develop predictive reasoning models as alternatives to traditional reactive systems. These models will be applied in the interdisciplinary fields of music technology and robotics. In music, we aim to give everyday people new ways to move within musical spaces: our models learn from users' interactions with their smartphones in order to proactively assist with future actions. In robotics, we are developing robots with dynamic forms that can change their thinking in response to new body shapes.
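As a rough illustration of what "predictive rather than reactive" can mean in the smartphone setting, the sketch below predicts the next touch position of a swipe from a short window of past positions. It is a minimal Python example using synthetic data and a plain least-squares predictor; none of the names, data, or model choices are taken from the project itself.

    # Hypothetical illustration (not EPEC's actual model): an autoregressive
    # predictor of the next touch position from the last k positions,
    # fitted with ordinary least squares on synthetic swipe data.
    import numpy as np

    K = 4  # number of past (x, y) samples used as context

    def make_windows(trajectory, k=K):
        """Turn one touch trajectory (N x 2) into (context, next-point) pairs."""
        X, y = [], []
        for t in range(k, len(trajectory)):
            X.append(trajectory[t - k:t].ravel())  # flatten k past points
            y.append(trajectory[t])
        return np.array(X), np.array(y)

    # Synthetic "swipe" gestures: noisy straight-line drags across the screen.
    rng = np.random.default_rng(0)
    trajs = []
    for _ in range(200):
        start = rng.uniform(0, 1, size=2)
        velocity = rng.uniform(-0.05, 0.05, size=2)
        steps = np.arange(30)[:, None]
        trajs.append(start + steps * velocity + rng.normal(0, 0.002, size=(30, 2)))

    X = np.vstack([make_windows(t)[0] for t in trajs])
    y = np.vstack([make_windows(t)[1] for t in trajs])

    # Fit a linear map from the flattened context to the next point.
    W, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Predict where a new swipe will go next, before the touch event arrives.
    test = trajs[0]
    context = test[:K].ravel()
    predicted_next = context @ W
    print("predicted:", predicted_next, "actual:", test[K])

A reactive interface would only respond after the next touch event arrives; a predictive one can act on the anticipated position, for example to pre-render or to suggest a likely continuation.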

A hand holding a smartphone and the legs of a robot. Photo.
Musical interaction on smartphones and robotic systems are EPEC's application areas for new predictive models.

EPEC is directed by Professor Jim Tørresen, who also leads the ROBIN research group in the Department of Informatics. The project employs two postdoctoral fellows, Kai Olav Ellefsen and Charles Martin, and PhD researcher Tønnes Nygaard. The project also includes Associate Professor Kyrre Glette, PhD researcher Jørgen Nordmoen, and a number of master's students in machine learning, robotics and music technology.

Objectives

Design, implement and evaluate multimodal systems that are able to sense, learn and predict future events.

Sub-projects

  • Internal Models: Predicting real-world effects through internal simulations (a minimal sketch follows this list)
  • DyRET: Dynamic Robot for Embodied Testing
  • Interactive music systems: Computer systems for extending and enhancing musical listening, performance and collaboration.
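
To make the internal-models idea concrete, here is a minimal, hypothetical sketch, not code from the project: an agent uses a simple forward model of its own body to simulate candidate actions internally, then executes the one whose imagined outcome lies closest to a goal. The forward model here is hand-written; in the sub-project such a model would be learned from sensor data.

    # Hypothetical sketch of the internal-model idea (not EPEC's code): before
    # acting, the agent simulates each candidate action with a forward model
    # of its own body and picks the one predicted to bring it closest to a goal.
    import numpy as np

    def forward_model(state, action, dt=0.1):
        """Internal simulation: predict the next 2D position given a velocity command."""
        return state + dt * action

    def choose_action(state, goal, candidate_actions):
        """Evaluate each action in imagination and return the most promising one."""
        best_action, best_error = None, np.inf
        for action in candidate_actions:
            predicted = forward_model(state, action)   # imagined outcome
            error = np.linalg.norm(predicted - goal)   # distance to goal
            if error < best_error:
                best_action, best_error = action, error
        return best_action

    state = np.array([0.0, 0.0])
    goal = np.array([1.0, 0.5])
    candidates = [np.array([vx, vy]) for vx in (-1, 0, 1) for vy in (-1, 0, 1)]

    # The agent acts in a closed loop, re-planning with its internal model each step.
    for _ in range(20):
        action = choose_action(state, goal, candidates)
        state = forward_model(state, action)  # here the "world" happens to match the model
        print(state)

In a real robot the internal model and the world diverge, which is exactly why predicting effects before acting, and correcting the model from the resulting prediction errors, is useful.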

Master Projects

Researchers from the EPEC group supervise master projects in robotics, music technology, and machine learning. Come work with us on predictive models, embodied interactive systems and new robotic interactions!

Funding

Supported by the Research Council of Norway under FRINATEK grant agreement 240862, from 2015 to 2019. The grant funds one PhD and two postdoctoral positions (10% of prop. funded).


Tags: machine learning, robotics, interactive music
Published May 24, 2016 3:38 PM - Last modified Sep. 28, 2022 12:53 PM

Participants

Detailed list of participants