update readme
ngbountos committed Jun 18, 2023
1 parent fc7cd8f commit 45f4c95
README.md (39 additions, 30 deletions)
If you use this work, please cite:
pages = {1453-1462}
}
```

### Contents
- [Setup](#dependencies)
- [Data and pretrained models](#dataset-and-pretrained-models)
- [SSL Training from scratch](#train-ssl-from-scratch)
- [Description of Hephaestus annotation scheme](#annotation)
- [Recreating cropped dataset](#reproduce-cropped-patches)
- [Acknowledgment](#acknowledgments)
### Dependencies
This repo has been tested with Python 3.9. To install the necessary dependencies, run:
`pip install -r requirements.txt`
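If you prefer an isolated environment, a minimal sketch of one way to set it up (assuming `python3.9` and the built-in `venv` module are available; conda works just as well):
```
# Optional: create and activate an isolated Python 3.9 environment
python3.9 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```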

### Dataset and pretrained models

The annotation files can be downloaded [here](https://www.dropbox.com/s/i08mz5514gczksz/annotations_hephaestus.zip?dl=0).
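For a non-interactive download, something along these lines should work (this is only a sketch, not a script shipped with the repo; note the `dl=1` query parameter, which asks Dropbox for the file itself instead of the preview page):
```
# Download and extract the annotation files
wget -O annotations_hephaestus.zip "https://www.dropbox.com/s/i08mz5514gczksz/annotations_hephaestus.zip?dl=1"
unzip annotations_hephaestus.zip -d annotations_hephaestus
```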

### Train SSL from scratch

```
python main.py
```
The script will automatically create folders for the checkpoints and store the config file and the wandb run id.


### Multi-GPU / Multi-Node training
You can make use of torchrun or SLURM to launch distributed jobs.

#### torchrun:
Single-Node Multi-GPU:
```
torchrun --standalone --nnodes=1 --nproc_per_node=2 main.py
```

Multi-Node Multi-GPU:
```
# On XXX.XXX.XXX.62 (the master node)
torchrun \
--nproc_per_node=2 --nnodes=2 --node_rank=0 \
--master_addr=XXX.XXX.XXX.62 --master_port=1234 \
main.py
# On XXX.XXX.XXX.63 (the worker node)
torchrun \
--nproc_per_node=2 --nnodes=2 --node_rank=1 \
--master_addr=XXX.XXX.XXX.62 --master_port=1234 \
main.py
```
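If a multi-node launch hangs at startup, the usual culprit is that the worker cannot reach the master's rendezvous address and port. A quick sanity check, assuming the addresses and port from the example above and an NCCL backend (both are placeholders here):
```
# On the worker node: verify the master's rendezvous port is reachable
nc -zv XXX.XXX.XXX.62 1234
# Verbose NCCL logs help diagnose multi-node communication issues
export NCCL_DEBUG=INFO
```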

#### SLURM:
After setting the relevant parameters inside hephaestus.slurm:
```
sbatch hephaestus.slurm
```
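The authoritative script is the `hephaestus.slurm` shipped with the repo; purely as an illustration, a job script for a 2-node, 2-GPUs-per-node run might look roughly like the sketch below (partition name, time limit, and CPU counts are placeholders to adapt to your cluster):
```
#!/bin/bash
#SBATCH --job-name=hephaestus-ssl
#SBATCH --nodes=2                  # number of nodes
#SBATCH --ntasks-per-node=1        # one torchrun launcher per node
#SBATCH --gres=gpu:2               # GPUs per node
#SBATCH --cpus-per-task=8          # placeholder, adapt to your cluster
#SBATCH --time=24:00:00            # placeholder time limit
#SBATCH --partition=gpu            # placeholder partition name

# Use the first node of the allocation as the rendezvous endpoint
export MASTER_ADDR=$(scontrol show hostnames "$SLURM_JOB_NODELIST" | head -n 1)
export MASTER_PORT=1234

# srun starts one torchrun per node; torchrun spawns one worker per GPU
srun torchrun \
    --nnodes="$SLURM_NNODES" \
    --nproc_per_node=2 \
    --rdzv_id="$SLURM_JOB_ID" \
    --rdzv_backend=c10d \
    --rdzv_endpoint="$MASTER_ADDR:$MASTER_PORT" \
    main.py
```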

### Annotation

The dataset contains both labeled and unlabeled data. The labeled part covers 38 frames, summing up to 19,919 annotated InSAR samples.