The Premiere 2023 I/O Pre-Show – Curated by Dan Deacon (with Expert Assistance from MusicLM)



**Collaboration with Musician Dan Deacon: Exploring Google’s Music AI Tools**

**Working with Musicians at Google I/O**
Google I/O, Google's annual developer conference, offers a chance to put the company's latest creative machine learning tools in the hands of musicians. Past collaborations with artists such as YACHT and The Flaming Lips explored custom symbolic music generation models and audience-performer interaction. This year, electronic musician and composer Dan Deacon took the stage.

**Exploring Generative Models with Dan Deacon**
Dan Deacon worked with two of Google's new generative models: MusicLM and SingSong. MusicLM generates music from a text prompt, while SingSong generates an instrumental accompaniment for sung audio input. Both models belong to the AudioLM family and produce audio by predicting SoundStream codec tokens with Transformer language models.
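
To make that architecture concrete, here is a toy, runnable sketch of the AudioLM-style pipeline: a language model predicts discrete codec tokens, and a codec decoder turns those tokens into a waveform. Every name and the stand-in "model" below are illustrative assumptions, not the real systems; the actual models are described in the MusicLM, SingSong, and AudioLM papers and are not exposed through an API like this.

```python
# Toy sketch of the AudioLM-style pipeline behind MusicLM and SingSong.
# All names and values here are hypothetical stand-ins, not a real API.

import numpy as np

VOCAB_SIZE = 1024   # SoundStream codebook size (toy value)
FRAME_RATE = 50     # codec frames per second (toy value)


def toy_language_model(conditioning_tokens, num_new_tokens):
    """Stand-in for a Transformer LM that autoregressively predicts tokens.

    A real model would condition on text embeddings (MusicLM) or on
    tokens of the input audio (SingSong); this toy just samples randomly."""
    rng = np.random.default_rng(seed=len(conditioning_tokens))
    tokens = list(conditioning_tokens)
    for _ in range(num_new_tokens):
        tokens.append(int(rng.integers(VOCAB_SIZE)))  # "predict" next token
    return tokens[len(conditioning_tokens):]


def toy_codec_decode(tokens, sample_rate=16000):
    """Stand-in for the SoundStream decoder: tokens -> waveform."""
    samples_per_frame = sample_rate // FRAME_RATE
    t = np.arange(samples_per_frame) / sample_rate
    # Map each token to a short sine burst, just to produce audio samples.
    return np.concatenate(
        [np.sin(2 * np.pi * (200 + tok) * t) for tok in tokens]
    )


# "Generate" two seconds of audio from an empty conditioning sequence.
tokens = toy_language_model(conditioning_tokens=[], num_new_tokens=2 * FRAME_RATE)
waveform = toy_codec_decode(tokens)
print(waveform.shape)  # (32000,) samples at 16 kHz
```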

**Dan Deacon’s Performance at Google I/O 2023**
Dan Deacon used MusicLM and SingSong in his performance at Google I/O 2023. He used MusicLM to create a chill piano groove for his two meditations featuring the Duck with Lips, and he combined MusicLM and SingSong to create a chiptune song, stretching the models beyond their intended uses to fit his artistic vision.

**Embracing Technological Advancements in Music**
Throughout history, technology has played a pivotal role in the evolution of music. Instruments such as the flute and the violin were once groundbreaking technologies that expanded musicians' creative palettes. New technologies like Google's music AI models aim to do the same: empower artists and bring new creative capabilities to humanity. Collaborating with musicians like Dan Deacon helps the team understand how these tools fit into the artistic process.

**Pushing the Boundaries of Music AI**
During the workshop, Dan pushed the tools past their training distribution by playing his synthesizer into SingSong, which had been trained only on vocal input. Surprisingly, feeding SingSong's output back into itself created a feedback loop whose results complemented the input in key, tempo, and style. This interaction became central to composing the chiptune song.
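
Conceptually, the feedback loop is just repeated application of the accompaniment model to its own output. Here is a minimal sketch of that idea, where `singsong_accompany` is a hypothetical stand-in for the (unreleased) model rather than a real function:

```python
# Sketch of the SingSong feedback loop. `singsong_accompany` is a
# hypothetical callable that takes an input waveform and returns a
# generated accompaniment waveform.

def feedback_iterations(initial_audio, singsong_accompany, num_iterations):
    """Repeatedly feed SingSong's output back in as its next input,
    collecting every intermediate clip along the way."""
    clips = [initial_audio]
    current = initial_audio
    for _ in range(num_iterations):
        current = singsong_accompany(current)  # generate an accompaniment
        clips.append(current)                  # keep it for later curation
    return clips
```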

**Dan Deacon’s Creative Process**
Dan Deacon started with text prompts to MusicLM, used the generated audio as input to SingSong, and fed the output back through SingSong for several iterations. This process produced hundreds of audio clips that complemented one another. From these, Dan hand-picked his favorites, made light edits, and performed them during his I/O pre-show.
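
Put together, his workflow resembles the loop below, reusing the `feedback_iterations` helper sketched in the previous section. Again, `musiclm_generate` and `singsong_accompany` are hypothetical stand-ins for the models, assumed here only to illustrate the shape of the process:

```python
# Hypothetical end-to-end version of the workflow: prompt MusicLM, run
# the result through several SingSong iterations, and save every clip
# so the artist can hand-pick and edit favorites. Assumes the
# feedback_iterations helper from the sketch above.

import soundfile as sf  # pip install soundfile

def run_workflow(prompts, musiclm_generate, singsong_accompany,
                 iterations_per_prompt=5, sample_rate=16000):
    clip_paths = []
    for i, prompt in enumerate(prompts):
        seed_audio = musiclm_generate(prompt)  # text prompt -> audio
        clips = feedback_iterations(seed_audio, singsong_accompany,
                                    iterations_per_prompt)
        for j, clip in enumerate(clips):
            path = f"clip_{i:02d}_{j:02d}.wav"
            sf.write(path, clip, sample_rate)  # save for manual curation
            clip_paths.append(path)
    return clip_paths
```

The key design point is that nothing is discarded automatically: every intermediate clip is kept, and curation happens by ear afterward, which matches how Dan assembled the pre-show set.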

**The Future of Music Technology**
Google remains committed to empowering musicians and expanding their creative possibilities through collaborations and the development of music AI tools, and the underlying research points to promising future directions. Musicians can explore MusicLM by signing up via the AI Test Kitchen app.

**Acknowledgements**
The I/O pre-show performance was a collaborative effort, and the team is grateful to everyone involved for their contributions.
