Musical instruments that play along: ML for music generation, interaction, and performance

Predictive interactions between a performer, an instrument, and sound.

Description

The use of AI/ML models to generate music and sound has expanded dramatically in recent years. However, despite the media attention given to musical AI research, music involving AI is rarely heard in concerts apart from a few special research events. This is partly due to a lack of musical ML systems designed for performing musicians.
 
In this project, you will help change this by developing a new ML model that can interact with a human in live performance. This model could connect directly to existing music technology such as digital audio workstations (DAWs), or it could be a self-contained computer musical instrument, a touchscreen app, or a custom sensor-based device for musical expression.
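
To make the idea concrete, here is a minimal sketch (in Python, assuming PyTorch is available) of one way such a live interaction could work: a small recurrent model listens to the pitches a performer has just played and samples a responding note. The class name, architecture, and interaction loop below are illustrative assumptions rather than a prescribed design, and the model would need training on real performance data before its replies become musical.

    import torch
    import torch.nn as nn

    class NextNotePredictor(nn.Module):
        """Toy LSTM that predicts the next MIDI pitch from recent performer
        input. (Hypothetical name and architecture, for illustration only.)"""
        def __init__(self, n_pitches=128, embed_dim=32, hidden_dim=64):
            super().__init__()
            self.embed = nn.Embedding(n_pitches, embed_dim)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, n_pitches)

        def forward(self, pitches, state=None):
            x = self.embed(pitches)           # (batch, seq, embed_dim)
            h, state = self.lstm(x, state)    # (batch, seq, hidden_dim)
            return self.out(h[:, -1]), state  # logits for the next pitch

    model = NextNotePredictor()  # untrained here; real use needs training

    # Interaction loop: the performer's recent notes go in, a reply comes out.
    performed = [60, 62, 64, 65]  # MIDI pitches just played (C4, D4, E4, F4)
    with torch.no_grad():
        logits, _ = model(torch.tensor([performed]))
        reply = torch.multinomial(torch.softmax(logits, dim=-1), 1).item()
    print(f"Model responds with MIDI pitch {reply}")

In a live setting, the sampled reply could be sent onward as a MIDI message, for example with a library such as mido, so that a DAW or hardware synthesizer actually sounds the note.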

Requirements

  • Completed coursework or experience in machine learning, AI, or data science. Knowledge/experience in deep learning would be a plus.
  • Experience with Python.
  • Strong interest in music, performance, and creativity.

Background Literature

Keywords

music, performance, creativity, machine learning, artificial intelligence
