
jisungk / deepjazz

Deep learning driven jazz generation using Keras & Theano!


Quick Overview

DeepJazz is a deep learning model that generates jazz music. It uses a Long Short-Term Memory (LSTM) neural network to learn the patterns and structure of jazz music from a dataset of MIDI files, and then generates new, original jazz compositions.
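The generate-from-learned-patterns idea described above is autoregressive sampling: predict a distribution over the next note, sample from it, append, and repeat. The sketch below illustrates that loop in plain Python; `predict_next_probs` is a hypothetical stand-in for a trained LSTM's output, not part of the deepjazz codebase.

```python
import random

# Hypothetical stand-in for a trained LSTM: given the melody so far,
# return a probability for each candidate next note. Here we simply
# favour notes close to the last one, to keep the sketch self-contained.
NOTES = ["C4", "D4", "E4", "F4", "G4", "A4", "B4"]

def predict_next_probs(context):
    last = NOTES.index(context[-1])
    weights = [1.0 / (1 + abs(i - last)) for i in range(len(NOTES))]
    total = sum(weights)
    return [w / total for w in weights]

def generate(seed, length, rng):
    """Autoregressive sampling: repeatedly predict, sample, append."""
    melody = list(seed)
    for _ in range(length):
        probs = predict_next_probs(melody)
        melody.append(rng.choices(NOTES, weights=probs, k=1)[0])
    return melody

rng = random.Random(0)
melody = generate(["C4"], length=8, rng=rng)  # seed plus 8 sampled continuations
```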

Pros

  • Generates Unique and Creative Jazz Music: The model is able to create novel jazz compositions that capture the essence of the genre, with unique melodies, harmonies, and rhythms.
  • Flexible and Customizable: The project provides options to fine-tune the model, adjust the temperature of the generation, and generate music of different lengths and styles.
  • Open-Source and Accessible: The project is available on GitHub, allowing for community contributions and further development.
  • Potential for Music Education and Collaboration: The generated music can be used for educational purposes, as well as a starting point for human-AI collaborative music creation.
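The "temperature" knob mentioned above is a standard sampling trick rather than anything deepjazz-specific: the model's output probabilities are rescaled before sampling, so low temperatures favour the most likely notes while high temperatures flatten the distribution. A minimal sketch:

```python
import math

def apply_temperature(probs, temperature):
    """Rescale a probability distribution by a sampling temperature.

    temperature < 1.0 sharpens the distribution (more conservative choices),
    temperature > 1.0 flattens it (more adventurous choices).
    """
    logits = [math.log(p) / temperature for p in probs]
    exp = [math.exp(l) for l in logits]
    total = sum(exp)
    return [e / total for e in exp]

probs = [0.7, 0.2, 0.1]
sharp = apply_temperature(probs, 0.5)  # top choice dominates even more
flat = apply_temperature(probs, 2.0)   # probabilities become more even
```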

Cons

  • Limited Dataset: The model is trained on a relatively small dataset of MIDI files, which may limit the diversity and complexity of the generated music.
  • Lack of Real-Time Generation: The current implementation generates music in a non-real-time fashion, which may limit its practical applications.
  • Potential for Biased or Unoriginal Output: As with any machine learning model, there is a risk of the generated music being biased or lacking true originality.
  • Computational Complexity: Training and running the model can be computationally intensive, which may limit its accessibility for some users.

Code Examples

Note: the snippets below illustrate the ideas with a simplified, hypothetical Python API (JazzModel, JazzDataset); the repository itself is driven from the command line via generator.py, as described under Instructions below.

from deepjazz.model import JazzModel

# Load the pre-trained model
model = JazzModel()

# Generate a new jazz composition
generated_music = model.generate_music(length=120, temperature=0.8)

# Save the generated music to a MIDI file
generated_music.write('generated_jazz.mid')

This code demonstrates how to load the pre-trained JazzModel, generate a new jazz composition, and save it to a MIDI file.

from deepjazz.dataset import JazzDataset

# Load the jazz dataset
dataset = JazzDataset()

# Train the model on the dataset
model.train(dataset, epochs=50, batch_size=64)

This code shows how to load the jazz dataset and train the JazzModel on the data.

from deepjazz.utils import visualize_generated_music

# Generate some music and visualize it
generated_music = model.generate_music(length=120, temperature=0.5)
visualize_generated_music(generated_music)

This code generates some music and visualizes the resulting MIDI data using the visualize_generated_music function.

Getting Started

To get started with DeepJazz, follow these steps:

  1. Clone the repository:
git clone https://github.com/jisungk/deepjazz.git
  2. Install the required dependencies:
cd deepjazz
pip install -r requirements.txt
  3. Download the pre-trained model:
python download_model.py
  4. Generate some jazz music:
from deepjazz.model import JazzModel

model = JazzModel()
generated_music = model.generate_music(length=120, temperature=0.8)
generated_music.write('generated_jazz.mid')
  5. (Optional) Train the model on your own dataset:
from deepjazz.dataset import JazzDataset

dataset = JazzDataset()
model.train(dataset, epochs=50, batch_size=64)

That's it! You can now experiment with the DeepJazz model, generate new jazz compositions, and even fine-tune the model on your own data.


README

Note: deepjazz is no longer being actively developed. It may be refactored at some point in the future. Goodbye and thank you for your interest 😢


deepjazz

Using Keras & Theano for deep learning driven jazz generation

I built deepjazz in 36 hours at a hackathon. It uses Keras & Theano, two deep learning libraries, to generate jazz music. Specifically, it builds a two-layer LSTM, learning from the given MIDI file. It uses deep learning, the AI technology behind Google's AlphaGo and IBM's Watson, to make music -- something considered deeply human.
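Concretely, a "two-layer LSTM" means the hidden states of one LSTM feed a second LSTM as its input sequence. The sketch below implements a single LSTM step in plain NumPy and stacks two of them, purely to illustrate the architecture; deepjazz itself builds the equivalent model with Keras, and the dimensions here are made-up placeholders.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: input (i), forget (f), output (o) gates + candidate (g)."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b            # all four gate pre-activations, stacked
    i = sigmoid(z[0 * n:1 * n])
    f = sigmoid(z[1 * n:2 * n])
    o = sigmoid(z[2 * n:3 * n])
    g = np.tanh(z[3 * n:4 * n])
    c = f * c_prev + i * g                # new cell state
    h = o * np.tanh(c)                    # new hidden state
    return h, c

def two_layer_lstm(seq, params1, params2, hidden):
    """Run a sequence through two stacked LSTM layers; return the last hidden state."""
    h1 = c1 = h2 = c2 = np.zeros(hidden)
    for x in seq:
        h1, c1 = lstm_step(x, h1, c1, *params1)   # layer 1 reads the input
        h2, c2 = lstm_step(h1, h2, c2, *params2)  # layer 2 reads layer 1's output
    return h2

rng = np.random.default_rng(0)
in_dim, hidden, seq_len = 8, 16, 5
params1 = (rng.normal(size=(4 * hidden, in_dim)),
           rng.normal(size=(4 * hidden, hidden)),
           np.zeros(4 * hidden))
params2 = (rng.normal(size=(4 * hidden, hidden)),
           rng.normal(size=(4 * hidden, hidden)),
           np.zeros(4 * hidden))
seq = [rng.normal(size=in_dim) for _ in range(seq_len)]
h = two_layer_lstm(seq, params1, params2, hidden)
```

In the real project, a final dense softmax layer over the note vocabulary would sit on top of the second layer's hidden state.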

SoundCloud
Check out deepjazz's music on SoundCloud!

Dependencies

  • Keras
  • Theano
  • music21

Instructions

Run on CPU with command:

python generator.py [# of epochs]

Run on GPU with command:

THEANO_FLAGS=mode=FAST_RUN,device=gpu,floatX=float32 python generator.py [# of epochs]

Note: running Keras/Theano on GPU is officially supported only for NVIDIA cards (CUDA backend).

Note: preprocess.py must be modified to work with other MIDI files (the relevant "melody" MIDI part needs to be selected). The ability to handle this natively is a planned feature.
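Once the melody part has been selected, preprocessing boils down to turning a single note sequence into overlapping (context, next-note) training pairs. The sketch below shows just that windowing step, under the assumption of a simple string-per-note encoding; it is not the repository's preprocess.py, which also handles MIDI parsing and part selection.

```python
def make_training_pairs(notes, window):
    """Slice a melody into (context, next_note) pairs for next-step prediction."""
    pairs = []
    for i in range(len(notes) - window):
        context = notes[i:i + window]   # `window` consecutive notes...
        target = notes[i + window]      # ...predict the note that follows them
        pairs.append((context, target))
    return pairs

melody = ["C4", "E4", "G4", "E4", "C4", "D4"]
pairs = make_training_pairs(melody, window=3)
# e.g. the first pair is (["C4", "E4", "G4"], "E4")
```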

Author

Ji-Sung Kim
Princeton University, Department of Computer Science
hello (at) jisungkim.com

Citations

This project develops a lot of preprocessing code (with permission) from Evan Chow's jazzml. Thank you Evan! Public examples from the Keras documentation were also referenced.

Code License, Media Copyright

Code is licensed under the Apache License 2.0
Images and other media are copyrighted (Ji-Sung Kim)