Using the ControlNet architecture for video generation on a synthetic dataset.

faverogian/control-net-video

Control Net Video

[Sample video of a growing dot]

Overview

This repository contains a synthetic dataset and a ControlNet video pipeline for experimenting with video generation tasks. The dataset consists of 6-frame videos of a growing dot, where the dot grows at one of three rates, labeled class 0, 1, or 2.
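The dataset described above can be sketched in a few lines of NumPy. The frame size, base radius, and the three per-class growth rates below are illustrative assumptions; the actual values are defined in `synthetic_dataset.ipynb`.

```python
import numpy as np

def make_dot_video(growth_class, frames=6, size=32, base_radius=2.0):
    """Render a binary video of a centered dot growing at a class-dependent rate.

    growth_class: 0, 1, or 2 -- selects the per-frame radius increment.
    Returns an array of shape (frames, size, size) with values in {0, 1}.
    """
    rates = {0: 0.5, 1: 1.0, 2: 1.5}  # assumed per-class growth rates
    rate = rates[growth_class]
    yy, xx = np.mgrid[:size, :size]
    cy = cx = size // 2
    video = np.zeros((frames, size, size), dtype=np.float32)
    for t in range(frames):
        radius = base_radius + rate * t
        video[t] = ((yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2).astype(np.float32)
    return video

# Example: a class-2 (fastest-growing) video
video = make_dot_video(growth_class=2)
```

A classifier or generative model then only needs the growth rate to distinguish the three classes, which makes the dataset a convenient sanity check for temporal modeling.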

Contents

  • synthetic_dataset.ipynb: A Jupyter notebook for generating the synthetic dataset.
  • control_net_video_pipeline: A pipeline for running experiments on the synthetic dataset.
  • train.sh and run.sh: Convenience Bash scripts for launching experiments.

Usage

Steps to Run the Project

  1. Generate the synthetic dataset by running synthetic_dataset.ipynb.
  2. Configure the control net video pipeline according to your experiment needs.
  3. Run the experiment using the provided bash scripts.

Author

  • Gian Favero, Mila, 2025

License

This project is licensed under the MIT License.

Acknowledgments

  • This project was developed at Mila, a research institute in artificial intelligence.

Contributing

Contributions are welcome. Feel free to open issues or pull requests.
