readme files, delete duplicate files
Taniya-Das committed Nov 19, 2024
1 parent cae8d70 commit 8be1a3e
Showing 3 changed files with 118 additions and 61 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -59,7 +59,7 @@ run = openml_tensorflow.add_onnx_to_run(run)
#### Using docker image

The docker container has the latest version of [OpenML-Tensorflow](https://github.com/openml/openml-tensorflow) downloaded and pre-installed. It can be used to run TensorFlow Deep Learning analysis on OpenML datasets.
- See [docker](docker/README.md).
+ See [docker](docs/Docker%20reference/Docker.md).


This library is currently under development; please report any bugs or feature requests in the issues section.
59 changes: 0 additions & 59 deletions docker/README.md

This file was deleted.

118 changes: 117 additions & 1 deletion docs/Docker reference/Docker.md
@@ -1 +1,117 @@
--8<-- "../docker/README.md"
# OpenML-Tensorflow container

The docker container has the latest version of OpenML-Tensorflow downloaded and pre-installed. It can be used to run TensorFlow Deep Learning analysis on OpenML datasets.
This document contains information about:

[Usage](#usage): how to use the image\
[Using Locally Stored Datasets](#using-locally-stored-datasets): mounting datasets from the local cache\
[Environment Variables](#environment-variable): setting the cache directory path

## Usage

These are the steps to use the image:

1. Pull the docker image
```
docker pull taniyadas/openml-tensorflow:latest
```
2. If you want to run a local script, it needs to be mounted first. Mount it into the container's `/app` folder:
```text
docker run -it -v PATH/TO/CODE_FOLDER:/app taniyadas/openml-tensorflow /bin/bash
```
You can also mount multiple directories into the container (such as your code directory and dataset directory) using:
```text
docker run -t -i -v PATH/TO/CODE_FOLDER:/app -v PATH/TO/DATASET_FOLDER:/app/dataset taniyadas/openml-tensorflow /bin/bash
```
3. Make sure the dataset path points to the mounted location inside the container. For example:
```text
openml_tensorflow.config.dir = 'dataset/Images'
```
4. Run your code scripts using, for example:
```text
python docs/Examples/tf_image_classification.py
```
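
Taken together, steps 1–4 can be collapsed into a single invocation. The paths and script name below are placeholders for your own code and dataset folders, not fixed values:
```text
docker pull taniyadas/openml-tensorflow:latest
docker run -it \
    -v PATH/TO/CODE_FOLDER:/app \
    -v PATH/TO/DATASET_FOLDER:/app/dataset \
    taniyadas/openml-tensorflow \
    python /app/your_script.py
```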

## Using Locally Stored Datasets

If you don't want to download the dataset each time you run your script, you can mount your dataset saved in your local cache directory to the container.

### Example Usage

1. Mount the dataset to the container's `/app/dataset` folder:

```
docker run -t -i -v PATH/TO/CODE_FOLDER:/app -v PATH/TO/DATASET_FOLDER:/app/dataset taniyadas/openml-tensorflow /bin/bash
```

2. Set the correct path to the dataset:
```text
openml_tensorflow.config.dir = '/app/dataset/Images'
```
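
A forgotten `-v` flag fails silently: the container starts, but the dataset folder inside it is empty. A small guard at the top of your script can fail fast instead. The helper below is an illustration, not part of openml-tensorflow; the `/app/dataset` default matches the mount command above:

```python
import os

def resolve_dataset_dir(base="/app/dataset", subdir="Images"):
    """Return the dataset path if the volume is mounted, else None."""
    path = os.path.join(base, subdir)
    # If the -v flag was omitted or pointed at the wrong folder,
    # the directory will not exist inside the container.
    return path if os.path.isdir(path) else None

# Example guard before configuring openml-tensorflow:
# dataset_dir = resolve_dataset_dir()
# if dataset_dir is None:
#     raise SystemExit("Dataset not found; did you mount it with -v?")
# openml_tensorflow.config.dir = dataset_dir
```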

## Environment Variable

You can configure the cache directory to control where OpenML datasets are downloaded and cached.

```
import openml

cache_dir = "/app/.openml"
openml.config.set_root_cache_directory(cache_dir)
```
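
Rather than hard-coding the cache path, it can be made configurable per container. The sketch below reads a hypothetical `OPENML_CACHE_DIR` environment variable (a convention of this example, not an official OpenML variable) and falls back to `/app/.openml`; only the final `set_root_cache_directory` call is openml API:

```python
import os

def openml_cache_root(default="/app/.openml"):
    """Pick the cache root from the environment, falling back to a default."""
    # OPENML_CACHE_DIR is a convention of this sketch, not an official variable.
    root = os.environ.get("OPENML_CACHE_DIR", default)
    os.makedirs(root, exist_ok=True)  # ensure the directory exists before use
    return root

# Inside the container:
# import openml
# openml.config.set_root_cache_directory(openml_cache_root())
```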
