This repository contains code for training and evaluating a neural network model for drum humanization (see image below). The goal of the project is to generate MIDI performances from quantized drum patterns that mimic the nuanced timing and velocity variations of a professional Bossa Nova drummer.
- Preprocessing: Maps drum hits to canonical categories, partitions sequences, and applies fixed-size windows for pattern creation.
- Data Representation: Represents drum patterns with a binary matrix for hits and continuous matrices for timing offsets and velocities (see the data sketch after this list).
- Models:
  - MLP (Multilayer Perceptron): Takes the concatenated matrices of hits, timing offsets, and velocities as input (see the model sketch after this list).
  - Other models (to be added as developed).
- Training: Implemented in TensorFlow with the Adam optimizer.
- Evaluation: Assesses the generated MIDI performances for realism and fidelity to the input patterns.
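
The data representation above can be made concrete with a small sketch. Everything in it is illustrative: the window length, the number of drum categories, and the `(step, category, offset, velocity)` hit tuple are assumptions, not necessarily the repository's exact preprocessing schema.

```python
# Minimal sketch of the data representation: for each fixed-size window,
# a binary hit matrix plus continuous offset and velocity matrices.
# The hit tuple layout and the constants below are illustrative assumptions.
import numpy as np

NUM_CATEGORIES = 9   # e.g. kick, snare, hi-hat, ... (assumed)
WINDOW_STEPS = 32    # fixed window length in quantized steps (assumed)

def hits_to_matrices(hits, num_steps=WINDOW_STEPS, num_categories=NUM_CATEGORIES):
    """Convert a list of (step, category, offset, velocity) hits into
    a binary hit matrix and continuous offset/velocity matrices."""
    hit_mat = np.zeros((num_steps, num_categories), dtype=np.float32)
    offset_mat = np.zeros_like(hit_mat)    # timing deviation from the quantized grid
    velocity_mat = np.zeros_like(hit_mat)  # normalized MIDI velocity in [0, 1]
    for step, category, offset, velocity in hits:
        if 0 <= step < num_steps:
            hit_mat[step, category] = 1.0
            offset_mat[step, category] = offset
            velocity_mat[step, category] = velocity
    return hit_mat, offset_mat, velocity_mat

# Example: a kick (category 0) slightly ahead of the beat, a snare (category 2) on step 8.
example_hits = [(0, 0, -0.02, 0.85), (8, 2, 0.01, 0.70)]
h, o, v = hits_to_matrices(example_hits)
print(h.shape, o.shape, v.shape)  # (32, 9) each
```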
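A minimal sketch of the MLP and its training setup, assuming TensorFlow/Keras with the Adam optimizer as named above; the layer sizes, the mean-squared-error loss, and the choice of predicted offsets and velocities as targets are assumptions for illustration, not the project's exact configuration.

```python
# Hedged sketch of an MLP trained with Adam on the concatenated (flattened)
# hit/offset/velocity matrices. Shapes reuse the constants from the sketch above.
import numpy as np
import tensorflow as tf

WINDOW_STEPS, NUM_CATEGORIES = 32, 9
INPUT_DIM = WINDOW_STEPS * NUM_CATEGORIES * 3   # hits + offsets + velocities, flattened
OUTPUT_DIM = WINDOW_STEPS * NUM_CATEGORIES * 2  # predicted offsets + velocities (assumed targets)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(INPUT_DIM,)),
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(OUTPUT_DIM),
])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss="mse")

# Random arrays standing in for preprocessed windows (shape-checking only).
x = np.random.rand(64, INPUT_DIM).astype(np.float32)
y = np.random.rand(64, OUTPUT_DIM).astype(np.float32)
model.fit(x, y, epochs=2, batch_size=16)
```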
- Clone this repository:

  ```bash
  git clone https://github.com/your-username/drum-humanization.git
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
Repository structure:

- `data/`: Datasets and preprocessing scripts.
- `models/`: Trained neural network models.
- `outputs/`: Model predictions and MIDI outputs.
- `ml.py`: Script for training the model.
- `midi.py`: Script for generating MIDI files from model outputs (see the sketch below).
- `analysis.ipynb`: Exploration of model outputs.
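
For orientation, here is a hedged sketch of the kind of conversion `midi.py` performs, turning predicted matrices back into a MIDI file. It uses `pretty_midi`, which may not be the library the project actually uses; the General MIDI drum pitches, tempo, and grid resolution are assumptions.

```python
# Hedged sketch of matrices -> MIDI, in the spirit of midi.py.
# Library choice, pitch map, and timing constants are assumptions.
import numpy as np
import pretty_midi

GM_PITCHES = [36, 38, 42]   # kick, snare, closed hi-hat (assumed category subset)
SECONDS_PER_STEP = 0.125    # 16th notes at 120 BPM (assumed grid)

def matrices_to_midi(hit_mat, offset_mat, velocity_mat, path="humanized.mid"):
    pm = pretty_midi.PrettyMIDI()
    drums = pretty_midi.Instrument(program=0, is_drum=True)
    steps, categories = hit_mat.shape
    for step in range(steps):
        for cat in range(categories):
            if hit_mat[step, cat] > 0.5:  # threshold the binary hit prediction
                start = max(step * SECONDS_PER_STEP + offset_mat[step, cat], 0.0)
                velocity = int(np.clip(velocity_mat[step, cat], 0.0, 1.0) * 127)
                drums.notes.append(pretty_midi.Note(
                    velocity=velocity, pitch=GM_PITCHES[cat],
                    start=start, end=start + 0.1))
    pm.instruments.append(drums)
    pm.write(path)

# Example with a 3-category toy pattern: one hit per category on steps 0-2,
# each 10 ms late at 80% velocity.
hits = np.eye(3, dtype=np.float32)
offsets = np.full_like(hits, 0.01) * hits
velocities = np.full_like(hits, 0.8) * hits
matrices_to_midi(hits, offsets, velocities)
```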
Contributions are welcome! Feel free to open an issue for suggestions or bug reports, or submit pull requests for improvements.