Bio-Inspired Computing for Music - Charles Martin, Univ. Oslo, Dept. Informatics





SLIDE 1

Bio-Inspired Computing for Music

SLIDE 2

Charles Martin - Univ. Oslo, Dept. Informatics

https://folk.uio.no/charlepm https://charlesmartin.com.au

SLIDE 3

But why??

SLIDE 4

Predictive Musical Interaction

  • How can musical instruments be more “intelligent”?
  • What would that mean for musicians, music making, and music?

microjam.info

SLIDE 5

Music and Research

We make new musical instruments, measure musical experiences, and find new ways to express ourselves with technology.

Why?

  • Expression and creativity are important.
  • Music is everywhere; people care about it.
  • Music is hard: realtime constraints, high standards.
  • It’s fun.

You don’t have to be a professional musician to do a great musical project!

SLIDE 6

Space: the final frontier. These are the voyages

SLIDE 7

Models

Then... Now...

SLIDE 8

Music Generation: Predicting note-by-note

Mozer (1994); Eck & Schmidhuber (2003); Magenta (Google, 2016-2017)
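The note-by-note idea from these papers can be sketched in a few lines: a model assigns probabilities to the next note given the melody so far; we sample one, append it, and repeat. In the sketch below a simple bigram count table stands in for the recurrent networks of Mozer and Eck & Schmidhuber; the generation loop has the same shape either way. The function names and training melody are illustrative, not from the talk.

```python
import random

def train_bigram(melody):
    """Count note-to-note transitions in a training melody."""
    table = {}
    for prev, nxt in zip(melody, melody[1:]):
        table.setdefault(prev, []).append(nxt)
    return table

def generate(table, start, length, rng=None):
    """Generate a melody note-by-note: predict, sample, feed back in."""
    rng = rng or random.Random(0)
    melody = [start]
    for _ in range(length - 1):
        choices = table.get(melody[-1])
        if not choices:  # dead end: no continuation seen in training
            break
        melody.append(rng.choice(choices))
    return melody

training = ["C4", "E4", "G4", "E4", "C4", "E4", "G4", "C5", "G4", "E4", "C4"]
table = train_bigram(training)
print(generate(table, "C4", 8))
```

An RNN replaces the count table with a learned function of the whole history, but the sample-and-feed-back loop is identical.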

SLIDE 9

Representations of Music

Thin ↔ Medium ↔ Thick

folkRNN · Performance RNN · WaveNet
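The spectrum can be made concrete by writing the same two notes in each representation: symbolic text (the ABC-style notation folkRNN learns from), a timed event list (roughly the event stream Performance RNN models), and raw audio samples (what WaveNet predicts one sample at a time). The values below are illustrative, not taken from those systems.

```python
import math

# Thin: symbolic text, as in ABC notation.
thin = "C E"

# Medium: MIDI-style events (note, velocity, duration in seconds).
medium = [(60, 80, 0.5), (64, 80, 0.5)]  # MIDI 60 = C4, 64 = E4

# Thick: raw audio samples at a fixed sample rate.
rate = 16000

def tone(midi_note, seconds):
    """Render a note as a plain sine wave."""
    freq = 440.0 * 2 ** ((midi_note - 69) / 12)  # MIDI number -> Hz
    n = int(rate * seconds)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

thick = tone(60, 0.5) + tone(64, 0.5)
print(len(thick))  # -> 16000 samples for one second of audio
```

The thinner the representation, the less a model must learn about sound itself; the thicker, the more expressive detail it can capture.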

SLIDE 10

Theme: Creative Machine Learning

  • Using Deep Learning to represent and create music and art!
  • Compose music for video games?
  • Mobile apps that generate music endlessly!
  • MIREX Competition Tasks - music information retrieval.
  • Techniques: Deep Learning, Recurrent Networks, Convolutional Networks.

SLIDE 11

Theme: Intelligent Instruments

How can we integrate ML/AI into NIMEs?

  • Help users make better music.
  • “Guess” intentions (key, scale, harmony).
  • Generate extra sounds (ensemble experience) using ANNs or evolution.
  • Make sound/music in response to sensors/cameras (sonification).
  • Challenge/fun here is interacting with ML algorithms.

Data → ML → Interaction!
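The "Data → ML → Interaction" pipeline, with the sonification bullet above in mind, can be sketched minimally: a stream of sensor readings is mapped to pitches. Here the ML step is reduced to a simple linear mapping for illustration; the function name, ranges, and readings are all hypothetical.

```python
def sensor_to_midi(value, lo=0.0, hi=1.0, note_lo=48, note_hi=84):
    """Linearly map a sensor value in [lo, hi] to a MIDI note number."""
    value = min(max(value, lo), hi)  # clamp out-of-range readings
    frac = (value - lo) / (hi - lo)
    return round(note_lo + frac * (note_hi - note_lo))

readings = [0.0, 0.25, 0.5, 0.75, 1.0]  # e.g. accelerometer magnitudes
print([sensor_to_midi(v) for v in readings])  # -> [48, 57, 66, 75, 84]
```

A learned model would replace the linear map with something trained on musical data, but the interaction loop, sensor in and sound out, is the same.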

SLIDE 12

Theme: Embedded Instruments

  • Use microcontrollers / single board computers to make stand-alone NIMEs.
  • Includes prototyping, hardware creation, programming.
  • Could be for augmented instruments as well?
  • e.g.,
SLIDE 13

Theme: Social Music Making

How can music bring people together?

  • Make “ensemble” playing easier.
  • Instruments that change to support “parts” in a group.
  • Asynchronous music making: performing as a group via Facebook or the web.

SLIDE 14

SLIDE 15

SLIDE 16

Mixture Density RNN
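A Mixture Density RNN ends in a mixture density output layer: instead of predicting one value, the network emits the parameters of a Gaussian mixture (component weights, means, spreads) for the next value, and the instrument samples from that distribution. The sketch below shows only this sampling step, with hand-picked parameters standing in for the network's output; the names and numbers are illustrative, not from the talk.

```python
import math
import random

def softmax(logits):
    """Turn raw weight logits into mixture probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def sample_mdn(pi_logits, mus, log_sigmas, rng=None):
    """Pick a mixture component by weight, then sample its Gaussian."""
    rng = rng or random.Random(0)
    weights = softmax(pi_logits)
    r, acc, k = rng.random(), 0.0, 0
    for k, w in enumerate(weights):
        acc += w
        if r < acc:
            break
    return rng.gauss(mus[k], math.exp(log_sigmas[k]))

# Hypothetical network output: two equally weighted components,
# one centred near 0.2 and one near 0.8, both with small spread.
x = sample_mdn(pi_logits=[1.0, 1.0], mus=[0.2, 0.8], log_sigmas=[-3.0, -3.0])
print(x)
```

Sampling (rather than taking the most likely value) is what lets the instrument produce varied, non-repetitive continuations from the same input.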

SLIDE 17

SLIDE 18

SLIDE 19

Neural Touchscreen Ensemble

SLIDE 20

SLIDE 21

Neural iPad Ensemble

SLIDE 22

SLIDE 23

Embodied Musical Predictions

EMPI: Embodied Musical Predictive Instrument

SLIDE 24

RITMO: Centre for Interdisciplinary Studies in Rhythm, Time and Motion

SLIDE 25

MusicLab Vol. 3: Rhythm

MusicLab Vol. 3 explores the phenomenon of rhythm in music and in the body, all within MusicLab’s unique blend of research and edutainment: an intellectual warm-up with world-leading experts, music-dance performances, and data jockeying with our house DJ and anyone else interested.

Time and place: Nov. 15, 2018 7:00 PM, Escape, Ole Johan Dahls hus

https://www.hf.uio.no/ritmo/english/news-and-events/events/musiclab/2018/rhythm/index.html