readme files, delete duplicate files
1 parent cae8d70 · commit 8be1a3e
Showing 3 changed files with 118 additions and 61 deletions.
@@ -1 +1,117 @@
--8<-- "../docker/README.md"

# OpenML-Tensorflow container

The Docker container has the latest version of OpenML-Tensorflow pre-installed. It can be used to run TensorFlow deep learning analyses on OpenML datasets.
This document contains information about:

[Usage](#usage): how to use the image\
[Using Locally Stored Datasets](#using-locally-stored-datasets): mounting datasets from the local cache\
[Environment Variables](#environment-variable): setting the cache directory path

## Usage

These are the steps to use the image:

1. Pull the Docker image:
```
docker pull taniyadas/openml-tensorflow:latest
```
2. If you want to run a local script, it needs to be mounted first. Mount it into the '/app' folder:
```text
docker run -it -v PATH/TO/CODE_FOLDER:/app taniyadas/openml-tensorflow /bin/bash
```
You can also mount multiple directories into the container (such as your code directory and dataset directory) using:
```text
docker run -t -i -v PATH/TO/CODE_FOLDER:/app -v PATH/TO/DATASET_FOLDER:/app/dataset taniyadas/openml-tensorflow /bin/bash
```
3. Make sure to give the correct path to the dataset. For example:
```text
openml_tensorflow.config.dir = 'dataset/Images'
```
4. Run your script, for example (a minimal sketch of such a script follows this list):
```text
python docs/Examples/tf_image_classification.py
```
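For orientation, the sketch below shows roughly what a mounted script might contain before you launch it as in step 4. This is a hypothetical minimal example, not the bundled `docs/Examples/tf_image_classification.py`: the file name, model architecture, and input shape are placeholder assumptions, and only the `openml_tensorflow.config.dir` setting comes from this README.

```python
# my_script.py -- hypothetical placeholder for a script mounted into /app.
import openml_tensorflow
import tensorflow as tf

# Point the OpenML-Tensorflow extension at the image folder mounted in step 2.
openml_tensorflow.config.dir = "dataset/Images"

# A small placeholder Keras model; the bundled example scripts define their own
# architectures and run them on OpenML tasks via the extension.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```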

## Using Locally Stored Datasets

If you don't want to download the dataset each time you run your script, you can mount a dataset from your local cache directory into the container.

### Example Usage

1. Mount the dataset to the '/app/dataset' folder:

```
docker run -t -i -v PATH/TO/CODE_FOLDER:/app -v PATH/TO/DATASET_FOLDER:/app/dataset taniyadas/openml-tensorflow /bin/bash
```

2. Set the correct path to the dataset (a quick sanity check is sketched after this list):
```text
openml_tensorflow.config.dir = '/app/dataset/Images'
```
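Before launching a long training run it can help to confirm that the mount actually worked. Below is a minimal, hypothetical sanity check, assuming the dataset was mounted at `/app/dataset` as in step 1; only the `openml_tensorflow.config.dir` setting is taken from this README.

```python
# check_mount.py -- hypothetical helper: verify the mounted dataset is visible
# inside the container before pointing the extension at it.
import os

import openml_tensorflow

DATASET_DIR = "/app/dataset/Images"  # path used in step 2 above

if not os.path.isdir(DATASET_DIR):
    raise FileNotFoundError(
        f"{DATASET_DIR} not found; check the -v PATH/TO/DATASET_FOLDER:/app/dataset mount"
    )

openml_tensorflow.config.dir = DATASET_DIR
print(f"Found {len(os.listdir(DATASET_DIR))} entries in {DATASET_DIR}")
```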

## Environment Variable

You can configure the cache directory to control where OpenML datasets are downloaded and cached:

```
cache_dir = "/app/.openml"
openml.config.set_root_cache_directory(cache_dir)
```
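Putting the two settings together, a script run inside the container might configure both the OpenML cache and the extension's image directory before touching any data. This is a minimal sketch using the paths from earlier in this README; the dataset ID is an arbitrary placeholder and not taken from this document.

```python
import openml
import openml_tensorflow

# Keep OpenML downloads inside the container at /app/.openml.
openml.config.set_root_cache_directory("/app/.openml")

# Read image files from the directory mounted at /app/dataset.
openml_tensorflow.config.dir = "/app/dataset/Images"

# Any subsequent download now lands in the cache directory set above.
# Dataset 40996 (Fashion-MNIST) is used purely as a placeholder ID.
dataset = openml.datasets.get_dataset(40996)
print(dataset.name)
```

If you also mount a host directory at '/app/.openml' (for example `-v PATH/TO/CACHE:/app/.openml`), the cache persists across container runs.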