What is Magenta.js?
Magenta.js is a JavaScript library developed by Google Research that brings machine learning models for music and art generation directly to the browser. It's built on TensorFlow.js and provides pre-trained models for various creative tasks.
How It Works
The music generator on the main page uses the MusicVAE (Variational Autoencoder) model, specifically its 'trio_4bar_lokl_small_q1' checkpoint. This model:
- Generates 4-bar sequences of melody, bass, and drums
- Creates music in real-time using browser-based machine learning
- Produces MIDI-like NoteSequence data that the player synthesizes with Tone.js (a sketch of this data is shown just below)
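For a sense of what that data looks like, here is a simplified sketch of a generated NoteSequence. The field names come from Magenta's NoteSequence format; the specific notes and values are made up for illustration:

// A generated sample is a NoteSequence: a plain JS object describing notes.
// (The notes and values below are illustrative, not actual model output.)
const exampleSequence = {
  notes: [
    { pitch: 60, quantizedStartStep: 0, quantizedEndStep: 4, program: 0 },   // melody
    { pitch: 36, quantizedStartStep: 0, quantizedEndStep: 8, program: 32 },  // bass
    { pitch: 42, quantizedStartStep: 0, quantizedEndStep: 1, isDrum: true }, // drums
  ],
  quantizationInfo: { stepsPerQuarter: 4 },
  totalQuantizedSteps: 64  // 4 bars of 16 steps each
};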
The Code
Here's the core logic behind the music generation:
const model = new mm.MusicVAE(
  'https://storage.googleapis.com/download.magenta.tensorflow.org/' +
  'tfjs_checkpoints/music_vae/trio_4bar_lokl_small_q1');
const player = new mm.Player();

// Download the checkpoint weights once; sample() can't run until this resolves.
const ready = model.initialize();

// Generate and play a sample
async function playNextSample() {
  await ready;
  const sample = await model.sample(1);  // array containing one NoteSequence
  player.start(sample[0]);
}
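One practical detail: browsers only allow audio to start after a user gesture, so a function like this is typically wired to a click handler rather than called on page load. A minimal sketch of that wiring (the button id here is hypothetical):

// Hypothetical wiring: kick off generation from a button click, since the
// Web Audio context can only start in response to a user gesture.
document.getElementById('play-button').addEventListener('click', () => {
  playNextSample();
});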
Technical Details
The model uses a variational autoencoder architecture that:
- Encodes musical sequences into a compressed latent space
- Generates new sequences by sampling from that space (or by interpolating between points in it, as sketched below)
- Maintains musical coherence through patterns learned during training
- Hands its output to the player, which synthesizes audio through the Web Audio API via Tone.js
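To make the latent-space idea concrete, MusicVAE can also interpolate between two sequences by walking a straight line between their latent vectors and decoding the points along the way. A minimal sketch, reusing the model, player, and ready promise from the code above, and assuming the two endpoints come straight from model.sample() so they are already in the quantized trio format interpolate() expects:

// Sketch: morph between two generated trios through the latent space.
async function playMorph() {
  await ready;                                         // checkpoint must be loaded first
  const [a, b] = await model.sample(2);                // two endpoint sequences
  const steps = await model.interpolate([a, b], 5);    // 5 sequences along the line
  for (const seq of steps) {
    await player.start(seq);                           // play each step back to back
  }
}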