Musical human-computer interaction

Duration:
01.10.2017–01.10.2027

The project investigates aspects of rhythm and motion through the design and construction of interfaces for musical human-computer interaction.

PhD fellows Qichao Lan, Tejaswinee Kelkar and Cagri Erdem are rehearsing on various new interfaces for musical expression in preparation for a MusicLab performance.

Central questions

  • What types of sensors can be used in musical human-computer interaction?
  • What types of mappings between actions and sounds work well?
  • What are the differences between acoustic and electroacoustic sound generation?

The core activity of the project is the investigation of aspects of rhythm and motion through the design and construction of interfaces for musical human-computer interaction. This includes the study and design of both acoustic instruments and completely digital systems. We are particularly interested in various types of electroacoustic instruments, through which we explore the complexity of human motion in musical experience and practice.
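
One of the central questions above is what kinds of mappings between actions and sounds work well. As a minimal illustrative sketch only, assuming a hand-held three-axis accelerometer and a MIDI-style pitch/amplitude output, the Python example below reduces a sensor reading to a single motion magnitude and scales it linearly to pitch and loudness. All function names, ranges and the 0–2 g scaling are assumptions made for this example; they do not describe any particular instrument built in the project.

    import math

    def motion_magnitude(ax: float, ay: float, az: float) -> float:
        """Overall acceleration magnitude in g, with gravity (1 g) removed."""
        return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)

    def map_to_pitch(magnitude: float, low_note: int = 48, high_note: int = 84) -> int:
        """Scale a motion magnitude (assumed 0..2 g) linearly to a MIDI note number."""
        clipped = max(0.0, min(magnitude, 2.0))
        return round(low_note + (clipped / 2.0) * (high_note - low_note))

    def map_to_amplitude(magnitude: float) -> float:
        """Scale a motion magnitude (assumed 0..2 g) linearly to a 0..1 amplitude."""
        return max(0.0, min(magnitude / 2.0, 1.0))

    if __name__ == "__main__":
        # Two hypothetical gestures as raw accelerometer samples (in g):
        # a small, gentle movement and a large, vigorous one.
        for sample in [(0.1, 0.2, 1.0), (1.2, -0.8, 1.9)]:
            m = motion_magnitude(*sample)
            print(f"magnitude={m:.2f}  pitch={map_to_pitch(m)}  amplitude={map_to_amplitude(m):.2f}")

Instruments in this design space typically use richer and less direct mappings (many-to-many, adaptive, or dependent on musical context); the one-to-one sketch is only meant to make the mapping question concrete.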

Methodology

We employ a range of methods, including theoretical modelling, empirical studies using motion capture and physiological measurements, rapid prototyping, and iterative, creative design processes.

This analysis-by-synthesis approach leads to a new understanding of rhythmic phenomena in general, and also to various types of artistic and creative results.

Funding

Funded by The Research Council of Norway

Project number: 262762
