Check out our article on Medium or test our best model using the neural-art web app.
Clone the project and install the neural-art package:
cd neural-art
pip install -e .
You can download a sample of the Wikiart dataset from the repository of Professor Chee Seng Chan.
Use this command to download the images (size: 25.4 GB):
wget http://web.fsktm.um.edu.my/~cschan/source/ICIP2017/wikiart.zip
Use this command to download the csv files:
wget http://web.fsktm.um.edu.my/~cschan/source/ICIP2017/wikiart_csv.zip
Once the downloads are complete, unzip the archives and move the extracted folders to the `raw_data` folder of the package.
Once you have downloaded the dataset and installed the package, follow the instructions in the notebook notebooks/data_preparation.ipynb to generate your train/val/test splits.
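To illustrate what the split step produces (the notebook handles this for you), here is a minimal, package-independent sketch of a stratified train/val/test split. The `stratified_split` helper and the 80/10/10 ratios are assumptions for illustration, not the notebook's actual code.

```python
import random
from collections import defaultdict

def stratified_split(labels, val_frac=0.1, test_frac=0.1, seed=42):
    """Split sample indices into train/val/test, preserving per-style proportions.

    `labels` is one style name per image; returns three lists of indices.
    """
    rng = random.Random(seed)
    by_style = defaultdict(list)
    for idx, style in enumerate(labels):
        by_style[style].append(idx)

    train, val, test = [], [], []
    for style, indices in by_style.items():
        rng.shuffle(indices)
        n_val = int(len(indices) * val_frac)
        n_test = int(len(indices) * test_frac)
        val.extend(indices[:n_val])
        test.extend(indices[n_val:n_val + n_test])
        train.extend(indices[n_val + n_test:])
    return train, val, test

labels = ["Impressionism"] * 50 + ["Cubism"] * 50
train, val, test = stratified_split(labels)
print(len(train), len(val), len(test))  # 80 10 10
```

Splitting per style (rather than globally) keeps rare styles represented in all three sets.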
The paintings in the Wikiart dataset provided by Professor Chan are categorized into 27 art styles. You can work with all 27 art styles, or merge/drop some of them via the parameters of the `create_dataset` function in the `data` module of the neural-art package.
On our side, we worked with only 8 distinct art styles. To replicate our dataset, follow the instructions in the section "Replicate our custom dataset of 8 art styles" in the notebook notebooks/data_preparation.ipynb.
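The merge/drop idea can be sketched as a simple style-remapping table. The style names and the `STYLE_MAP`/`remap` helpers below are hypothetical stand-ins; the actual behaviour is controlled by the parameters of `create_dataset`.

```python
# Hypothetical mapping from Wikiart styles down to a smaller set;
# styles mapped to None are dropped from the dataset.
STYLE_MAP = {
    "Impressionism": "Impressionism",
    "Post_Impressionism": "Impressionism",  # merged into Impressionism
    "Analytical_Cubism": "Cubism",          # merged into Cubism
    "Synthetic_Cubism": "Cubism",           # merged into Cubism
    "Action_painting": None,                # dropped
}

def remap(style):
    # Unlisted styles pass through unchanged.
    return STYLE_MAP.get(style, style)

samples = ["Impressionism", "Post_Impressionism", "Action_painting"]
kept = [remap(s) for s in samples if remap(s) is not None]
print(kept)  # ['Impressionism', 'Impressionism']
```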
In this section, we describe how to use the `trainer` module to train classification models on the Wikiart dataset.

- Follow the instructions in the notebook notebooks/style_prediction_using_neuralart_package.ipynb for examples of how to use the `trainer` module.
- Follow the instructions in the notebook notebooks/style_prediction_using_notebook.ipynb for examples of how to train models on the Wikiart dataset without using the `trainer` module.
trainer = Trainer(experiment_name='test_trainer')
trainer.create_dataset_from_directory(image_folder_path, batch_size, img_height, img_width)
To use this method, you must have created your image folder by setting `flat=False` in the `create_dataset` function of the `data` module.
trainer.create_dataset_from_csv(csv_path, image_folder_path, batch_size, img_height, img_width, shuffle_dataframe=True)
To use this method, you must have created your image folder by setting `flat=True` in the `create_dataset` function of the `data` module.
trainer.plot_train_batch()
trainer.plot_val_batch(make_prediction=False)
If a model has already been trained, you can set `make_prediction=True` to see the predictions for each painting of the validation dataset.
trainer.build_model("VGG16", trainable_layers=2, random_rotation=0.3, random_zoom=0.3, learning_rate=0.001)
Use the `trainable_layers` parameter to define the number of layers of the pre-trained VGG16 model to unfreeze.
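Conceptually, `trainable_layers=2` freezes the pre-trained backbone except its last two layers. Here is a minimal sketch of that freezing logic using a plain-Python stand-in for Keras layers; it is not the trainer's actual implementation.

```python
class Layer:
    """Tiny stand-in for a Keras layer: just a name and a trainable flag."""
    def __init__(self, name):
        self.name = name
        self.trainable = True

def freeze_all_but_last(layers, trainable_layers):
    """Freeze every layer, then re-enable only the last `trainable_layers`."""
    for layer in layers:
        layer.trainable = False
    if trainable_layers > 0:
        for layer in layers[-trainable_layers:]:
            layer.trainable = True

backbone = [Layer(f"block{i}_conv") for i in range(1, 6)]
freeze_all_but_last(backbone, trainable_layers=2)
print([layer.trainable for layer in backbone])  # [False, False, False, True, True]
```

With real Keras layers the same idea applies: setting `layer.trainable = False` excludes the layer's weights from gradient updates.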
The `custom_1` model is a small model composed of 5 convolutional layers.
trainer.build_model("custom_1", random_rotation=0.3, random_zoom=0.3, learning_rate=0.001)
The `custom_2` model has a similar architecture to the VGG16 model.
trainer.build_model("custom_2", random_rotation=0.3, random_zoom=0.3, learning_rate=0.001)
trainer.run(epochs=100)
By default, the model is trained with early stopping (`patience=20`): training stops once the monitored validation metric has not improved for 20 consecutive epochs.
trainer.plot_history()
trainer.load_model(model_path)
trainer.plot_confusion_matrix()
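`plot_confusion_matrix` summarizes how the loaded model's predictions on the validation set are distributed across the true styles. The underlying computation is just a per-class tally, sketched here in plain Python (the trainer presumably relies on a library implementation rather than this helper):

```python
def confusion_matrix(y_true, y_pred, classes):
    """Rows index the true class, columns the predicted class."""
    index = {c: i for i, c in enumerate(classes)}
    matrix = [[0] * len(classes) for _ in classes]
    for true, pred in zip(y_true, y_pred):
        matrix[index[true]][index[pred]] += 1
    return matrix

classes = ["Cubism", "Impressionism"]
y_true = ["Cubism", "Cubism", "Impressionism"]
y_pred = ["Cubism", "Impressionism", "Impressionism"]
print(confusion_matrix(y_true, y_pred, classes))  # [[1, 1], [0, 1]]
```

Off-diagonal cells reveal which art styles the model confuses with one another, which is especially informative for visually similar styles.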
trainer.load_model(model_path)
You can use the model available in models/20210831-135428-images_41472-unfreeze_2-batch_128. This is a `custom_2` model trained on our custom dataset of 41,472 images categorized into 8 art styles.
Follow the instructions in the notebooks notebooks/feature_visualization_conv_filters.ipynb and notebooks/feature_visualization_dense_layer.ipynb.