R-VAE: Live latent space drum rhythm generation from minimal-size datasets

Abstract

In this article, we present R-VAE, a system designed for the modeling and exploration of latent spaces learned from rhythms encoded in MIDI clips. The system is based on a variational autoencoder neural network, uses a data structure capable of encoding rhythms in simple and compound meter, and can learn models from little training data. To facilitate the exploration of models, we implemented a visualizer that exploits the dynamic, pulsing nature of the rhythmic patterns. To test our system in real-life musical practice, we collected small-scale datasets of rhythms from contemporary music genres and trained models with them. We found that the non-linearities of the learned latent spaces, coupled with tactile interfaces for interacting with the models, were highly expressive and led to unexpected places in musical composition and live performance settings. A music album was recorded with the system and premiered at a major music festival, with the VAE latent space used live on stage.
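
The implementation itself is not reproduced on this page. As a rough illustration of two ideas the abstract names, the sketch below shows a rhythm grid whose per-beat resolution is divisible by both two and three (so simple- and compound-meter subdivisions share one data structure) and a small variational autoencoder with a two-dimensional latent space that could be driven from a tactile XY controller. This is a minimal sketch in PyTorch under those assumptions; all identifiers (RhythmVAE, STEPS_PER_BEAT, and so on) are hypothetical and are not taken from the R-VAE codebase.

    # Minimal sketch: (1) a rhythm grid whose per-beat resolution is divisible
    # by both 2 and 3, so simple- and compound-meter patterns share one tensor,
    # and (2) a small VAE whose 2-D latent space can be scrubbed live.
    # All names and sizes here are hypothetical, not the R-VAE implementation.
    import torch
    import torch.nn as nn

    STEPS_PER_BEAT = 6            # divisible by 2 (simple) and 3 (compound)
    BEATS, DRUMS = 4, 9           # one bar, 9 hypothetical drum channels
    PATTERN_DIM = BEATS * STEPS_PER_BEAT * DRUMS

    class RhythmVAE(nn.Module):
        def __init__(self, latent_dim=2):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(PATTERN_DIM, 128), nn.ReLU())
            self.to_mu = nn.Linear(128, latent_dim)
            self.to_logvar = nn.Linear(128, latent_dim)
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 128), nn.ReLU(),
                nn.Linear(128, PATTERN_DIM), nn.Sigmoid(),  # per-step onset prob.
            )

        def forward(self, x):
            h = self.encoder(x)
            mu, logvar = self.to_mu(h), self.to_logvar(h)
            # Reparameterization trick: sample z = mu + sigma * eps.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            return self.decoder(z), mu, logvar

    def vae_loss(recon, x, mu, logvar):
        # Binary cross-entropy reconstruction + KL divergence to N(0, I).
        bce = nn.functional.binary_cross_entropy(recon, x, reduction="sum")
        kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return bce + kld

    # Live use: decode a point picked on a 2-D pad into a drum pattern.
    model = RhythmVAE()
    z = torch.tensor([[0.3, -1.1]])               # e.g. from an XY controller
    pattern = model.decoder(z).view(DRUMS, BEATS * STEPS_PER_BEAT) > 0.5

The two-dimensional latent size in this sketch mirrors the tactile-interface use the abstract describes: a single point on a pad decodes to one drum pattern, and dragging across the pad traverses the non-linear latent space.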

Keywords

Rhythm, Meter, Latent space, Music visualization

How to Cite

Vigliensoni, G., McCallum, L., Maestre, E. & Fiebrink, R., (2022) “R-VAE: Live latent space drum rhythm generation from minimal-size datasets”, Journal of Creative Music Systems 1(1). doi: https://doi.org/10.5920/jcms.902

Authors

Gabriel Vigliensoni (Goldsmiths, University of London)
Louis McCallum (University of the Arts London)
Esteban Maestre (McGill University)
Rebecca Fiebrink (University of the Arts London)

Licence

Creative Commons Attribution 4.0

Peer Review

This article has been peer reviewed.

File Checksums (MD5)

  • Final post-copyedit camera-ready: f0f8f062da590e991474a7dec91248bf