Link to video recording of the talk

Repository for my talk at Fluttercon USA 2024 on training Convolutional Neural Networks (CNNs) and using them in Flutter apps.

## Structure

Look at the slides here

The Jupyter notebook `train_mlp_and_cnn.ipynb` shows how to train a very simple Multilayer Perceptron (MLP) and a CNN on images of handwritten digits.
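For orientation, here is a minimal sketch of the kind of training that notebook covers, assuming TensorFlow/Keras and the built-in MNIST digits dataset; the layer sizes and epoch count are illustrative, not the notebook's exact values:

```python
import tensorflow as tf

# MNIST: 28x28 grayscale images of handwritten digits, labels 0-9.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

# A very simple MLP: flatten each image into a 784-dimensional vector.
mlp = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# A small CNN: convolutions preserve the 2D structure of the image.
cnn = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Reshape((28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

for model in (mlp, cnn):
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
```

Comparing the two validation accuracies is a quick way to see why convolutions help on image data.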

Check out this repository by my classmate Bernardo Ribeiro, which goes into more depth on training different models for image classification of hand gestures; in particular, this notebook trains MobileNetV3.
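As a rough Keras sketch of that transfer-learning approach (the class count, input size, and datasets below are assumptions, not the exact setup from that notebook):

```python
import tensorflow as tf

NUM_CLASSES = 8        # assumption: one class per recognized hand gesture
IMG_SIZE = (224, 224)  # MobileNetV3's usual input resolution

# Pretrained backbone, frozen for the first training phase.
base = tf.keras.applications.MobileNetV3Small(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)  # datasets assumed
```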

You can find the code that runs inference on the Flutter side here; the rock-paper-scissors game logic is here.
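Assuming the trained model reaches the Flutter app as a TensorFlow Lite file (a common route for on-device inference; the file names here are hypothetical), the export step looks roughly like this:

```python
import tensorflow as tf

# Hypothetical path to a trained Keras model from one of the notebooks above.
model = tf.keras.models.load_model("gesture_model.keras")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # optional size/latency optimization
tflite_model = converter.convert()

# The resulting .tflite file can be bundled as a Flutter asset and loaded
# on-device, e.g. with a plugin such as tflite_flutter.
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)
```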

## Demo App APK for Android

Download the demo app APK for Android here.

The gestures are mapped as follows:

- Rock <- fist
- Paper <- stop, stop inverted, palm
- Scissors <- peace, peace inverted
- Lizard <- ok
- Spock <- call

To enable the Lizard and Spock moves, tap the flash IconButton in the top-right corner of the app bar; a sketch of the mapping as code follows below.
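For illustration, the table above can be read as a lookup with the mode toggle applied. The label strings and function here are hypothetical and may not match the classifier's actual output names or the app's Dart game logic:

```python
# Hypothetical sketch of the gesture-to-move table above; label strings are
# illustrative and may not match the classifier's real output names.
GESTURE_TO_MOVE = {
    "fist": "rock",
    "stop": "paper", "stop inverted": "paper", "palm": "paper",
    "peace": "scissors", "peace inverted": "scissors",
    "ok": "lizard",   # extended move
    "call": "spock",  # extended move
}

def to_move(label: str, lizard_spock_enabled: bool = False) -> str | None:
    """Translate a classifier label into a game move, or None if unmapped."""
    move = GESTURE_TO_MOVE.get(label)
    if move in ("lizard", "spock") and not lizard_spock_enabled:
        return None  # Lizard/Spock are ignored unless the flash toggle is on
    return move
```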

*Screenshot of the demo app*