
MIRAGE - A Comprehensive AI-Based System for Advanced Music Analysis (completed)

A main goal is to improve computers' ability to listen to and understand music, and to develop technologies that facilitate music understanding and appreciation. A key application is to make music more accessible and engaging.

About the MIRAGE project

 

Read the article: "Artificial intelligence can help you understand music better"

Short description of the MIRAGE project: "Advancing AI for music analysis and transcription"

 

The objective is to generate rich and detailed descriptions of music, encompassing a large range of dimensions, including low-level features, mid-level structures and high-level concepts. Significant effort will be dedicated to the design of applications of value for musicology, music cognition and the general public.

We further extend our leading computational framework for extracting a large set of information from music, such as timbre, notes, rhythm, tonality and structure. Yet music can easily become complex. To make sense of such a subtle language, refined musicological considerations need to be formalised and integrated into the framework. Music relies heavily on repetition: motives are repeated many times within a piece, and pieces of music imitate each other and cluster into styles. Revealing this repetition is both challenging and crucial. A large range of musical styles will be considered: traditional, classical and popular; acoustic and electronic; and from various cultures. The rich description of music provided by this new computer tool will also be used to investigate elaborate notions such as emotions, groove or mental images.
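
As a rough illustration of what extracting such dimensions from an audio recording can look like, the sketch below uses the open-source librosa library as a stand-in rather than the project's own tools (MIRtoolbox and the MiningSuite); the audio file name is hypothetical.

    # A rough sketch of multi-dimensional feature extraction, using the
    # open-source librosa library as a stand-in; the project itself builds on
    # MIRtoolbox and the MiningSuite. The file name is hypothetical.
    import librosa

    y, sr = librosa.load("tune.wav")                     # waveform and sample rate

    # Low-level timbre descriptors, computed frame by frame
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    brightness = librosa.feature.spectral_centroid(y=y, sr=sr)

    # Mid-level descriptors: note onsets, beats and a global tempo estimate
    onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
    tempo, beats = librosa.beat.beat_track(y=y, sr=sr)

    # Tonality-related descriptor: pitch-class energy over time
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)

    print("tempo estimate:", tempo, "BPM;", len(onsets), "onsets detected")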

The approach follows a transdisciplinary perspective, articulating traditional musicology, cognitive science, signal processing and artificial intelligence. 

This project is also oriented towards the development of groundbreaking technologies for the general public. Music videos have the potential to significantly increase music appreciation, and the effect is stronger when music and video are closely articulated. Our technologies will make it possible to generate videos on the fly for any music. One challenge in music listening is that everything depends on the listener's implicit ear training. Automated, immersive, interactive visualisations will help listeners (including hearing-impaired listeners) better understand and appreciate the music they like (or don't like yet). This will make music more accessible and engaging. It will also be possible to visually browse large music catalogues. Applications to music therapy will also be considered.
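
As a minimal illustration of the principle behind such music-driven visuals (and not of the project's actual rendering pipeline), the sketch below derives two visual parameters, brightness and colour hue, from frame-wise audio features; the mapping choices are arbitrary.

    # Minimal sketch of driving visual parameters from frame-wise audio features.
    # The mapping choices are arbitrary and purely illustrative; this is not the
    # project's rendering pipeline, and the file name is hypothetical.
    import librosa

    y, sr = librosa.load("tune.wav")
    rms = librosa.feature.rms(y=y)[0]                  # loudness per frame
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)    # 12 pitch classes per frame

    brightness = rms / (rms.max() + 1e-9)              # 0..1, could drive light intensity
    hue = chroma.argmax(axis=0) / 12.0                 # dominant pitch class -> colour hue

    for t, (b, h) in enumerate(zip(brightness, hue)):
        # A real system would hand (b, h) to a renderer synchronised with playback.
        if t % 100 == 0:
            print(f"frame {t}: brightness={b:.2f}, hue={h:.2f}")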

This project is carried out in collaboration with the National Library of Norway, a world leader in digitising cultural heritage. Check out the National Library of Norway’s Digital Humanities Laboratory.

Objectives

The fundamental research question of MIRAGE is:

  • How can a computational system be designed that generates rich and detailed descriptions of music along a large range of dimensions?

A number of sub-questions arise from this:

  • What kinds of music analysis can be undertaken with this new computational technology that would push musicology in new directions?
  • How can music perception be modelled in the form of a complex system composed of a large set of interdependent modules related to different music dimensions?
  • What can be discovered in terms of new predictive models describing listeners’ percepts and impressions, based on this new type of music analysis?

General methodology

The envisioned methodology consists of close interaction between automated music transcription and detailed musicological analysis of the transcriptions.

Methods based on mathematical analysis of audio recordings are not able to capture all the subtle transformations at work in the music. Most listeners, even if they are neither musicians nor musicologists, are able to follow and understand the logic of the music because they more or less consciously build a rather refined analysis of it. For these reasons, we advocate a different approach that attempts to construct a highly detailed analysis of the music in order to grasp some of its subtler aspects and to reach a higher degree of understanding. Besides, there is a high degree of interdependency between the different structural dimensions: no particular dimension of music can be fully understood without also taking the other dimensions into consideration. For instance, the “simple” task of beat tracking actually requires the ability to detect repeating motifs, because beat perception can emerge from successive repetitions. At the same time, motivic analysis requires a rich description of the music, which includes rhythm. We therefore see a circularity in the dependencies that can be addressed only by considering all these aspects together while progressively analysing the piece of music from beginning to end.
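
To make this circularity concrete, the toy sketch below lets a beat-tracking module and a motif-detection module scan the same stream of note events while reading each other's partial results. All class and method names are hypothetical illustrations of the idea, not the MIRAGE framework.

    # Conceptual sketch of interdependent analysis modules scanning a piece from
    # beginning to end. All class and method names are hypothetical illustrations
    # of the idea, not the MIRAGE implementation.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Note:
        onset: float   # seconds
        pitch: int     # MIDI pitch number

    @dataclass
    class MotifDetector:
        notes: list = field(default_factory=list)

        def update(self, note: Note) -> None:
            self.notes.append(note)

        def recurrence_period(self) -> Optional[float]:
            # Toy heuristic: time elapsed since the last repetition of the
            # current pitch, taken as evidence of a repeating pattern.
            earlier = [n.onset for n in self.notes[:-1] if n.pitch == self.notes[-1].pitch]
            return self.notes[-1].onset - earlier[-1] if earlier else None

    @dataclass
    class BeatTracker:
        motifs: MotifDetector
        beats: list = field(default_factory=list)

        def update(self, note: Note) -> None:
            # Beat hypotheses are reinforced by repetitions found so far:
            # a motif recurring at a regular period is evidence for the beat.
            period = self.motifs.recurrence_period()
            if period and (not self.beats or note.onset - self.beats[-1] >= period):
                self.beats.append(note.onset)

    motifs = MotifDetector()
    beats = BeatTracker(motifs)
    for note in [Note(0.0, 60), Note(0.5, 62), Note(1.0, 60), Note(1.5, 62), Note(2.0, 60)]:
        motifs.update(note)   # motivic analysis needs the evolving note stream ...
        beats.update(note)    # ... while beat tracking reads the motivic analysis
    print(beats.beats)        # [1.0, 2.0]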

Hence, whereas a traditional audio-based approach would typically apply signal processing, machine learning or statistical operators directly to the audio recording, the proposed approach relies instead on a transcription of the audio signal into a music score, followed by an AI-based detailed analysis of the score grounded in musicological and cognitive considerations.
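
To illustrate what the second stage gains from working on a symbolic representation, the toy sketch below searches a transcribed note sequence for repeated pitch-interval patterns, i.e. transposition-invariant motifs; the data and the search procedure are illustrative only, not the project's algorithms.

    # Toy sketch of score-level analysis on a transcription: find pitch-interval
    # patterns that recur, i.e. transposition-invariant motifs. Purely
    # illustrative; not the MIRAGE analysis algorithms.
    from collections import defaultdict

    # A transcription (e.g. from WP1), reduced here to a sequence of MIDI pitches.
    pitches = [60, 62, 64, 60, 67, 69, 71, 67, 60, 62, 64, 60]

    # Working on intervals rather than absolute pitches makes the comparison
    # independent of transposition.
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]

    def repeated_patterns(seq, length=3):
        """Return interval patterns of the given length that occur more than once."""
        positions = defaultdict(list)
        for i in range(len(seq) - length + 1):
            positions[tuple(seq[i:i + length])].append(i)
        return {p: pos for p, pos in positions.items() if len(pos) > 1}

    print(repeated_patterns(intervals))
    # {(2, 2, -4): [0, 4, 8]} -- the same motif, including a transposed statement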

Structure

The project is organised into five work packages:

  • WP1: New Methods for Automated Transcription: Detection of notes from audio and construction of the score based on higher-level musicological analysis (provided by WP2). Tests on a large range of music from diverse cultures and genres. Through close collaboration with musicologists, traditional music from Norway and many other cultures around the world will be considered. Popular and art music from the 20th and 21st centuries will also be studied, in particular music using a particular instrumentarium or particular performance techniques, as well as electro-acoustic music, which further challenges the task of transcription by questioning the basic definitions of “notes” or musical events and their parameters.
  • WP2: Comprehensive Model for Music Analysis: Modelling of the musicological analysis of music representations (transcriptions of audio recordings from WP1, MIDI files, etc.) along a large range of musical dimensions. Each musical dimension is modelled by a specific module, and the complex network of interdependencies between modules is also investigated. Musicological validation follows the same principles and plan as in WP1.
  • WP3: New Perspectives for Musicology. This WP aims at transcending musicology’s capabilities through the development of new computational technologies specifically tailored to its needs. Three topics:
    • Maximising the informativeness of music visualisation
    • Retrieval technology tailored to musicological queries
    • Unveiling music intertextuality
  • WP4: Theoretical and Practical Impacts on Music Cognition. We take advantage of the new analytical tool to enrich music cognition models: theoretically, because the computational models conceived in this project can suggest blueprints for cognitive models; practically, because a better description of music enables a richer understanding of its impact on listeners. Extending the momentum gathered by our previous software MIRtoolbox in the domain of music cognition, the new computational framework for music analysis will be fully integrated into our new open-source toolbox, the MiningSuite. One particular application consists in enriching predictive models that formalise the relationships between musical characteristics and their impact on listeners’ appreciation of music. Music shape and mental images, groove and emotions will be considered in particular.
  • WP5: Technological and Societal Repercussions. This WP examines the large range of possible impacts of the research, with a view to initiating further research and innovation projects and networks. Three axes:
    • Valorisation of online music catalogues: As a continuation of the SoundTracer project, we will prototype apps allowing the general public to browse the Norwegian folk music catalogue, understand the characteristics of the different music recordings, interactively search for particular musical characteristics of their choice, such as melodies or rhythmic patterns, and get personalised recommendations based on their appreciation of the tunes they have already listened to. Unlike Spotify or Apple Music, for instance, where songs are compared based on users’ consumption, here music will be compared based on actual musical content, as found by the analyses produced in WP2 (a minimal sketch of this idea follows the list below).
    • Impact on the general public: The objective of this task is to compile a detailed list of possible applications of the developed technologies for the general public. We will imagine, for instance, how new methods of music visualisation can help non-experts better understand how music works and appreciate its richness more deeply. We will also investigate how these new technologies might be used, for instance, as a complement to traditional music criticism or to music videos. We will prototype examples of visualisations to be published in mainstream music or technology magazines. The new capabilities of music retrieval, catalogue browsing and recommendation offered by these technologies will be studied on various music catalogues. The final objective is to initiate new research and innovation projects around these topics.

    • Music therapy tools: We will also prototype music therapy applications of our technologies. In particular, we will further extend our Music Therapy ToolBox (MTTB), dedicated to the analysis of free improvisations between therapists and clients, with the integration of visualisations related to higher-level music analysis. Here also, further applications will be envisaged for future research and innovation proposals.
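
The sketch below, referred to in the catalogue-valorisation task above, illustrates what content-based comparison of tunes could look like: each recording is summarised by a small feature vector derived from its musical analysis, and neighbours are ranked by similarity of musical content rather than by listening statistics. All feature values and tune names are made up for illustration; this is not the project's retrieval system.

    # Minimal sketch of content-based comparison of tunes, as opposed to
    # consumption-based recommendation. Feature values are invented; in practice
    # each vector would come from the WP2 analyses and be standardised so that
    # no single dimension (here, tempo) dominates the similarity.
    import numpy as np

    catalogue = {
        "tune_a": np.array([120.0, 4.2, 0.8]),   # [tempo (BPM), notes/sec, "majorness"]
        "tune_b": np.array([118.0, 4.0, 0.7]),
        "tune_c": np.array([70.0, 1.5, 0.2]),
    }

    def most_similar(query, n=2):
        """Rank the other tunes by cosine similarity of their feature vectors."""
        q = catalogue[query]

        def cosine(v):
            return float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))

        ranked = sorted((k for k in catalogue if k != query),
                        key=lambda k: cosine(catalogue[k]), reverse=True)
        return ranked[:n]

    print(most_similar("tune_a"))   # e.g. ['tune_b', 'tune_c']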

Publications

  • Bishop, Laura; Høffding, Simon; Lartillot, Olivier Serge Gabriel & Laeng, Bruno (2023). Mental Effort and Expressive Interaction in Expert and Student String Quartet Performance. Music & Science. ISSN 2059-2043. 6. doi: 10.1177/20592043231208000.
  • Maidhof, Clemens; Müller, Viktor; Lartillot, Olivier; Agres, Kat; Bloska, Jodie; Asano, Rie et al. (2023). Intra- and inter-brain coupling and activity dynamics during improvisational music therapy with a person with dementia: an explorative EEG-hyperscanning single case study. Frontiers in Psychology. ISSN 1664-1078. 14. doi: 10.3389/fpsyg.2023.1155732.
  • Szorkovszky, Alexander; Veenstra, Frank; Lartillot, Olivier Serge Gabriel; Jensenius, Alexander Refsum & Glette, Kyrre (2023). Embodied Tempo Tracking with a Virtual Quadruped, Proceedings of the Sound and Music Computing Conference 2023. SMC Network. ISSN 978-91-527-7372-7. doi: 10.5281/zenodo.10060970. Full text in Research Archive
  • Thedens, Hans-Hinrich & Lartillot, Olivier (2023). AudioSegmentor: A tool for disseminating archival recordings online. Studia Musicologica Norvegica. ISSN 0332-5024. 49(1), p. 92–101. doi: 10.18261/smn.49.1.7. Full text in Research Archive
  • Lartillot, Olivier; Johansson, Mats Sigvard; Elowsson, Anders; Monstad, Lars Løberg & Cyvin, Mattias Storås (2023). A Dataset of Norwegian Hardanger Fiddle Recordings with Precise Annotation of Note and Beat Onsets. Transactions of the International Society for Music Information Retrieval. ISSN 2514-3298. 6(1), p. 186–202. doi: 10.5334/TISMIR.139.
  • Juslin, Patrik N.; Sakka, Laura S.; Barradas, Gonçalo T. & Lartillot, Olivier (2022). Emotions, mechanisms, and individual differences in music listening: A stratified random sampling approach. Music Perception. ISSN 0730-7829. 40(1), p. 55–86. doi: 10.1525/mp.2022.40.1.55. Full text in Research Archive
  • Lartillot, Olivier; Elovsson, Anders; Johansson, Mats Sigvard; Thedens, Hans-Hinrich & Monstad, Lars Alfred Løberg (2022). Segmentation, Transcription, Analysis and Visualisation of the Norwegian Folk Music Archive. In Pugin, Laurent (Eds.), DLfM '22: 9th International Conference on Digital Libraries for Musicology. Association for Computing Machinery (ACM). ISSN 978-1-4503-9668-4. p. 1–9. doi: 10.1145/3543882.3543883. Full text in Research Archive
  • Haugen, Mari Romarheim (2021). Investigating Music-Dance Relationships. A Case Study of Norwegian Telespringar. Journal of music theory. ISSN 0022-2909. 65(1), p. 17–38. doi: 10.1215/00222909-9124714.
  • Lartillot, Olivier (2021). Computational Musicological Analysis of Notated Music: a Brief Overview. Nota Bene. ISSN 1891-4829. 15, p. 142–161. Full text in Research Archive
  • Weisser, Stéphanie; Lartillot, Olivier & Sechehaye, Hélène (2021). Investiguer la grésillance. Pour une approche ethno-acoustique du timbre musical. Cahiers d'ethnomusicologie. ISSN 2235-7688. 34, p. 37–58.
  • Elovsson, Anders & Lartillot, Olivier (2021). A Hardanger Fiddle Dataset with Performances Spanning Emotional Expressions and Annotations Aligned using Image Registration, Proceedings of the 22nd International Society for Music Information Retrieval Conference, Online, Nov 7-12, 2021. International Society for Music Information Retrieval. ISSN 978-1-7327299-0-2. p. 174–181. Full text in Research Archive
  • Lartillot, Olivier; Nymoen, Kristian; Câmara, Guilherme Schmidt & Danielsen, Anne (2021). Computational localization of attack regions through a direct observation of the audio waveform. Journal of the Acoustical Society of America. ISSN 0001-4966. 149(1), p. 723–736. doi: 10.1121/10.0003374.
  • Bruford, Fred & Lartillot, Olivier (2020). Multidimensional similarity modelling of complex drum loops using the GrooveToolbox, Proceedings of the 21st International Society for Music Information Retrieval (ISMIR) Conference. McGill-Queen's University Press. ISSN 978-0-9813537-0-8. p. 263–270. Full text in Research Archive
  • Lartillot, Olivier & Bruford, Fred (2020). Bistate reduction and comparison of drum patterns, Proceedings of the 21st International Society for Music Information Retrieval (ISMIR) Conference. McGill-Queen's University Press. ISSN 978-0-9813537-0-8. p. 318–324. Full text in Research Archive
  • Elovsson, Karl Anders (2020). Polyphonic pitch tracking with deep layered learning. Journal of the Acoustical Society of America. ISSN 0001-4966. 148(1), p. 446–468. doi: 10.1121/10.0001468.
  • Lartillot, Olivier; Cancino-Chacón, Carlos & Brazier, Charles (2020). Real-Time Visualisation Of Fugue Played By A String Quartet. In Spagnol, Simone & Valle, Andrea (Ed.), Proceedings of the 17th Sound and Music Computing Conference. Axea sas/SMC Network. ISSN 978-88-945415-0-2. p. 115–122. Full text in Research Archive

View all works in Cristin

  • Lartillot, Olivier (2024). Successes and challenges of computational approaches for audio and music analysis and for predicting music-evoked emotion.
  • Lartillot, Olivier (2024). KI-verktøy for håndtering, transkribering og analyse av musikkarkiver.
  • Ziegler, Michelle; Sudo, Marina; Akkermann, Miriam & Lartillot, Olivier (2024). Towards Collaborative Analysis: Kaija Saariaho’s IO.
  • Lartillot, Olivier (2024). Musicological and Technological Perspectives on Computational Analysis of Electroacoustic Music. In Jensenius, Alexander Refsum (Eds.), Sonic Design: Explorations Between Art and Science. Springer Nature. ISSN 978-3-031-57892-2. p. 271–297. doi: 10.1007/978-3-031-57892-2_15.
  • Lartillot, Olivier (2024). Harmonizing Tradition with Technology: Enhancing Norwegian Folk Music through Computational Innovation.
  • Monstad, Lars Løberg & Lartillot, Olivier (2024). muScribe: a new transcription service for music professionals.
  • Lartillot, Olivier (2024). MIRAGE Closing Seminar: Digitisation and computer-aided music analysis of folk music.
  • Johansson, Mats Sigvard & Lartillot, Olivier (2024). Automated transcription of Hardanger fiddle music: Tracking the beats.
  • Thedens, Hans-Hinrich & Lartillot, Olivier (2024). The Norwegian Catalogue of Folk Music Online.
  • Lartillot, Olivier (2024). Real-time MIRAGE visualisation of Bartók's first quartet, first movement.
  • Lartillot, Olivier (2024). Overview of the MIRAGE project.
  • Monstad, Lars Løberg & Lartillot, Olivier (2024). Automated transcription of Hardanger fiddle music: Detecting the notes.
  • Lartillot, Olivier & Monstad, Lars Løberg (2023). MIRAGE - A Comprehensive AI-Based System for Advanced Music Analysis.
  • Christodoulou, Anna-Maria; Lartillot, Olivier & Anagnostopoulou, Christina (2023). Computational Analysis of Greek Folk Music of the Aegean.
  • Lartillot, Olivier (2023). Towards a Comprehensive Modelling Framework for Computational Music Transcription/Analysis.
  • Lartillot, Olivier (2023). Music Therapy Toolbox, and prospects.
  • Lartillot, Olivier & Monstad, Lars Løberg (2023). Computational music analysis: Significance, challenges, and our proposed approach.
  • Lartillot, Olivier (2023). MIRAGE Symposium #2: Music, emotions, analysis, therapy ... and computer.
  • Wosch, Thomas; Vobig, Bastian; Lartillot, Olivier & Christodoulou, Anna-Maria (2023). HIGH-M (Human Interaction assessment and Generative segmentation in Health and Music).
  • Maidhof, Clemens; Agres, Kat; Fachner, Jörg & Lartillot, Olivier (2023). Intra- and inter-brain coupling during music therapy.
  • Monstad, Lars Løberg & Lartillot, Olivier (2023). Automatic Transcription Of Multi-Instrumental Songs: Integrating Demixing, Harmonic Dilated Convolution, And Joint Beat Tracking.
  • Christodoulou, Anna-Maria; Lartillot, Olivier & Anagnostopoulou, Christina (2023). Greek Folk Music Dataset.
  • Lartillot, Olivier; Swarbrick, Dana; Upham, Finn & Cancino-Chacón, Carlos Eduardo (2023). Video visualization of a string quartet performance of a Bach Fugue: Design and subjective evaluation.
  • Bishop, Laura; H?ffding, Simon; Laeng, Bruno & Lartillot, Olivier (2023). Mental effort and expressive interaction in expert and student string quartet performance.
  • Monstad, Lars Alfred Løberg (2023). KI kan demokratisere musikkbransjen. VG: Verdens gang. ISSN 0805-5203.
  • Lartillot, Olivier (2023). Computational audio and musical features extraction: from MIRtoolbox to the MiningSuite.
  • Lartillot, Olivier (2023). Dynamic Visualisation of Fugue Analysis, Demonstrated in a Live Concert by the Danish String Quartet.
  • Lartillot, Olivier (2023). Towards a comprehensive model for computational music transcription and analysis: a necessary dialog between machine learning and rule-based design?
  • Lartillot, Olivier; Thedens, Hans-Hinrich; Mjelva, Olav Luksengård; Elovsson, Anders; Monstad, Lars Løberg; Johansson, Mats Sigvard et al. (2023). Norwegian Folk Music & Computational Analysis.
  • Monstad, Lars Alfred Løberg; Baden, Peter & Wærstad, Bernt Isak Grave (2023). Kan kunstig intelligens brukes i låtskriverprosessen?
  • Monstad, Lars Løberg (2023). Kunstig Intelligens i kunst og kultur. [TV]. NRK Dagsrevyen.
  • Monstad, Lars Alfred Løberg (2023). Demonstrasjon av Kunstig Intelligens som verktøy for komponister.
  • Monstad, Lars Løberg; Borgan, Silje Larsen & Waske, Vegard (2023). AI i musikken: konsekvenser og muligheter.
  • Danielsen, Anne; Câmara, Guilherme Schmidt; Lartillot, Olivier; Leske, Sabine Liliana & Spiech, Connor (2022). Musical rhythm. Behavioural, computational and neurophysiological perspectives.
  • Lartillot, Olivier & Thedens, Hans-Hinrich (2022). Online Norwegian Folk Music Archive.
  • Lartillot, Olivier; Godøy, Rolf Inge & Christodoulou, Anna-Maria (2022). Computational detection and characterisation of sonic shapes: Towards a Toolbox des objets sonores.
  • Lartillot, Olivier; Elovsson, Anders; Johansson, Mats Sigvard; Thedens, Hans-Hinrich & Monstad, Lars Alfred Løberg (2022). Segmentation, Transcription, Analysis and Visualisation of the Norwegian Folk Music Archive.
  • Dalgard, Joachim; Lartillot, Olivier; Vuoskoski, Jonna Katariina & Guldbrandsen, Erling Eliseus (2021). Absorption - Somewhere between the heart and the brain.
  • Lartillot, Olivier & Johansson, Mats Sigvard (2021). Automated beat tracking of Norwegian Hardanger fiddle music.
  • Danielsen, Anne (2021). Opening remarks, presentation of RITMO.
  • Lartillot, Olivier; Guldbrandsen, Erling Eliseus & Cancino-Chacón, Carlos Eduardo (2021). Dynamics analysis, and application to a comparative study of Bruckner performances.
  • Lartillot, Olivier & Johansson, Mats Sigvard (2021). Tracking beats in Hardanger fiddle tunes.
  • Lartillot, Olivier; Elovsson, Anders & Mjelva, Olav Luksengård (2021). A new software for computer-assisted annotation of music recordings, with a focus on transcription.
  • Lartillot, Olivier (2021). Presentation of MIRAGE project.
  • Tidemann, Aleksander & Lartillot, Olivier (2021). Interactive tools for exploring performance patterns in hardanger fiddle music.
  • Elovsson, Anders & Lartillot, Olivier (2021). A Hardanger Fiddle Dataset with Performances Spanning Emotional Expressions and Annotations Aligned using Image Registration.
  • Lartillot, Olivier & Lilleslåtten, Mari (2021). Artificial intelligence can help you understand music better. [Internet]. RITMO News.
  • Lartillot, Olivier & Lilleslåtten, Mari (2021). Olivier Lartillot utvikler verktøy for å forstå musikk bedre. [Internet]. Det humanistiske fakultet UiO YouTube account.
  • Tidemann, Aleksander; Lartillot, Olivier & Johansson, Mats Sigvard (2021). Towards New Analysis And Visualization Software For Studying Performance Patterns in Hardanger Fiddle Music.
  • Elovsson, Anders & Lartillot, Olivier (2021). HF1: Hardanger fiddle dataset.
  • Lartillot, Olivier; Cancino-Chacón, Carlos & Brazier, Charles (2020). Real-Time Visualisation Of Fugue Played By A String Quartet.
  • Bruford, Fred & Lartillot, Olivier (2020). Multidimensional similarity modelling of complex drum loops using the GrooveToolbox.
  • Lartillot, Olivier & Toiviainen, Petri (2020). Read about the Matlab MIRtoolbox. Young Acousticians Network (YAN) Newsletter. p. 4–10.
  • Lartillot, Olivier & Bruford, Fred (2020). Bistate reduction and comparison of drum patterns.
  • Christodoulou, Anna-Maria; Anagnostopoulou, Christina & Lartillot, Olivier (2022). Computational Analysis of Greek folk music of the Aegean islands. National and Kapodistrian University of Athens.



Contact

Head of project:

Olivier Lartillot

Participants

Detailed list of participants