Luis Oala committed on

Commit f44c756 · unverified · 1 Parent(s): 9d0fc44

Update README.md

Files changed (1):
  1. README.md +10 -7
README.md CHANGED
@@ -1,3 +1,4 @@
 
 # From Lens to Logit - Addressing Camera Hardware-Drift Using Raw Sensor Data

 *This repository hosts the code for the project ["From Lens to Logit: Addressing Camera Hardware-Drift Using Raw Sensor Data"](https://openreview.net/forum?id=DRAywM1BhU), submitted to the NeurIPS 2021 Datasets and Benchmarks Track.*
@@ -43,13 +44,15 @@ We also maintain a copy of the entire dataset with a permanent identifier at Zen
 ## Code
 ### Dependencies
 #### Conda environment and dependencies
- To make running this code easier you can install the latest conda environment for this project stored in `perturbed-environment.yml`.
- ##### Install environment from `perturbed-environment.yml`
- If you want to install the latest conda environment run
- `conda env create -f perturbed-environment.yml`
- ##### Install segmentation_models_pytorch newest version
- PyPi version is not up-to-date with github version and lacks features
- `python -m pip install git+https://github.com/qubvel/segmentation_models.pytorch`
 
 
 ### Recreate experiments
 ## Virtual lab log
 We maintain a collaborative virtual lab log at [this address](http://deplo-mlflo-1ssxo94f973sj-890390d809901dbf.elb.eu-central-1.amazonaws.com/#/). There you can browse experiment runs, analyze results through SQL queries and download trained processing and task models.
 
+ [![MIT License](https://img.shields.io/apm/l/atomic-design-ui.svg?)](https://github.com/tterb/atomic-design-ui/blob/master/LICENSEs)
 # From Lens to Logit - Addressing Camera Hardware-Drift Using Raw Sensor Data

 *This repository hosts the code for the project ["From Lens to Logit: Addressing Camera Hardware-Drift Using Raw Sensor Data"](https://openreview.net/forum?id=DRAywM1BhU), submitted to the NeurIPS 2021 Datasets and Benchmarks Track.*
 
 ## Code
 ### Dependencies
 #### Conda environment and dependencies
+ To run this code out-of-the-box you can install the latest project conda environment stored in `perturbed-environment.yml`:
+ ```bash
+ conda env create -f perturbed-environment.yml
+ ```
+ #### segmentation_models_pytorch newest version
+ We noticed that the PyPI package for `segmentation_models_pytorch` is sometimes behind the project's GitHub repository. If you encounter `smp`-related problems we recommend installing directly from the `smp` repository via
+ ```bash
+ python -m pip install git+https://github.com/qubvel/segmentation_models.pytorch
+ ```
 ### Recreate experiments
 ## Virtual lab log
 We maintain a collaborative virtual lab log at [this address](http://deplo-mlflo-1ssxo94f973sj-890390d809901dbf.elb.eu-central-1.amazonaws.com/#/). There you can browse experiment runs, analyze results through SQL queries and download trained processing and task models.
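For context on the environment file the diff refers to: a conda environment YAML like `perturbed-environment.yml` generally has the structure sketched below. The name, channels, and packages shown here are hypothetical placeholders, not the actual contents of the project's file.

```yaml
# Hypothetical sketch of a conda environment file's structure.
# The real perturbed-environment.yml defines the project's actual
# environment name, channels, and pinned dependencies.
name: perturbed            # environment name used by `conda activate`
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.8
  - pytorch
  - pip
  - pip:
      # pip-installed packages, e.g. installs straight from GitHub
      - git+https://github.com/qubvel/segmentation_models.pytorch
```

After `conda env create -f perturbed-environment.yml` completes, the environment is activated with `conda activate <name>`, where `<name>` is whatever the `name:` field in the YAML specifies.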