Metadata-Version: 2.4
Name: lightning-hydra-detection
Version: 0.1.0
Summary: An AutoML Python library for computer vision based on PyTorch Lightning and Hydra.
Author-email: Loic Tetrel <loic.tetrel@kitware.com>
Maintainer-email: kitware <kitware@kitware.com>
Project-URL: Homepage, https://gitlab.kitware.com/keu-computervision/ml/lightning-hydra-detection
Project-URL: Bug Tracker, https://gitlab.kitware.com/keu-computervision/ml/lightning-hydra-detection/-/boards
Classifier: Development Status :: 3 - Alpha
Classifier: Programming Language :: Python :: 3.10
Classifier: Topic :: Scientific/Engineering
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Healthcare Industry
Classifier: Intended Audience :: Information Technology
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Classifier: Topic :: Scientific/Engineering :: Image Processing
Classifier: Topic :: Scientific/Engineering :: Image Recognition
Requires-Python: >=3.10
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: torch<2.6,>=2.5
Requires-Dist: torchvision<0.21,>=0.20.0
Requires-Dist: lightning<2.6,>=2.5
Requires-Dist: torchmetrics<1.8,>=1.7
Requires-Dist: pycocotools>=2.0
Requires-Dist: matplotlib
Requires-Dist: hydra-core>=1.3
Requires-Dist: hydra-colorlog>=1.2
Requires-Dist: hydra-optuna-sweeper>=1.2
Requires-Dist: aim>=3.16
Requires-Dist: rich<14.1,>=14.0
Provides-Extra: opt
Requires-Dist: mlflow>=1.0.0; extra == "opt"
Requires-Dist: tensorboard; extra == "opt"
Requires-Dist: pyqt5; extra == "opt"
Provides-Extra: test
Requires-Dist: lightning_hydra_detection[opt]; extra == "test"
Requires-Dist: pytest; extra == "test"
Requires-Dist: pytest-cov; extra == "test"
Provides-Extra: dev
Requires-Dist: pre-commit<4; extra == "dev"
Requires-Dist: lightning_hydra_detection[test]; extra == "dev"
Provides-Extra: docs
Requires-Dist: sphinx>=7.0; extra == "docs"
Requires-Dist: myst_parser>=0.13; extra == "docs"
Requires-Dist: sphinx_copybutton; extra == "docs"
Requires-Dist: sphinx_autodoc_typehints; extra == "docs"
Requires-Dist: furo>=2023.08.17; extra == "docs"
Dynamic: license-file

# Lightning Hydra Detection

![Version](https://img.shields.io/badge/Version-v0.1.0-blue)
[![PyPI](https://img.shields.io/badge/PyPI-available-blue?logo=pypi&logoColor=white)](https://pypi.org/search/?q=litdet)
[![License](https://img.shields.io/badge/License-Apache_2.0-yellowgreen.svg)](https://opensource.org/licenses/Apache-2.0)

![Platform](https://img.shields.io/badge/Platform-linux--64_%7C_win--64_wsl2_%7C_aarch64-gray)
[![CUDA](https://img.shields.io/badge/CUDA-v11.8_%7C_v12.1_%7C_v12.4-%2376B900?logo=nvidia)](https://developer.nvidia.com/cuda-toolkit-archive)

[![Python](https://img.shields.io/badge/python-3.10+-blue?logo=python)](https://www.python.org/)
[![pytorch](https://img.shields.io/badge/PyTorch_2.5+-ee4c2c?logo=pytorch&logoColor=white)](https://pytorch.org/get-started/locally/)
[![lightning](https://img.shields.io/badge/-Lightning_2.5+-792ee5?logo=pytorchlightning&logoColor=white)](https://pytorchlightning.ai/)
[![hydra](https://img.shields.io/badge/Config-Hydra_1.3+-89b8cd)](https://hydra.cc/)
[![template](https://img.shields.io/badge/-Lightning--Hydra--Template-017F2F?style=flat&logo=github&labelColor=gray)](https://github.com/ashleve/lightning-hydra-template)

LitDet is a domain-agnostic AutoML framework to accelerate the development of 2D object detection models 🚀⚡🔥.
* **ML Workflows**: Handles model pre-training, fine-tuning, or re-training at scale.
* **Key Features**: Provides a high-level CLI and Python API for seamless hardware support (CPU/GPU), COCO-formatted dataset integration, and built-in experiment tracking.
* **Tech Stack**: Built on the Lightning-Hydra framework, it leverages PyTorch and Hydra for flexible, configuration-driven deep learning workflows.

<!-- SPHINX-END -->

For more information on usage, configuration, the API, and getting-started guides, refer to our [online documentation](https://gitlab.kitware.com/keu-computervision/ml/lightning-hydra-detection).

<img src="docs/litdet_no_title.svg" width="500" alt="LitDet overview">

## Installation

You will need at least `python 3.10` and a recent GPU driver (e.g., NVIDIA).

### Pip

Create a python venv and activate it:

```bash
python -m venv .lightning-hydra-detection
source .lightning-hydra-detection/bin/activate
```

You can now clone and install the project with its dependencies:

```bash
git clone https://gitlab.kitware.com/keu-computervision/ml/lightning-hydra-detection
cd lightning-hydra-detection
python -m pip install --upgrade pip
pip install .
```

### pipx

To avoid managing the virtual env yourself, simply install with `pipx`:

```bash
git clone https://gitlab.kitware.com/keu-computervision/ml/lightning-hydra-detection
cd lightning-hydra-detection
pipx install .
```

## Usage

We provide a CLI with three entry points: `light-train` for training, `light-eval` for evaluation, and `light-predict` for prediction.

> **Note**
> Although we don't recommend it, you can run the library as a module directly with Python (if you need a Python tool such as `pdb` or `cProfile`), e.g. `python src/lightning_hydra_detection/train.py ...`

### Training

##### Train a model from a COCO-formatted dataset

This assumes you have an existing COCO-formatted dataset in `/path/to/coco/my-data`.

```bash
light-train paths.data_dir=/path/to/coco data.data_name=my-data
```
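
For reference, a minimal COCO-style annotation file looks like the sketch below. The field names follow the standard COCO object-detection schema (`bbox` is `[x, y, width, height]` in pixels); the exact directory layout LitDet expects inside `my-data` (e.g. where the annotation JSON and images live) should be checked against the example configs.

```json
{
  "images": [
    {"id": 1, "file_name": "img_0001.jpg", "width": 640, "height": 480}
  ],
  "annotations": [
    {"id": 1, "image_id": 1, "category_id": 1,
     "bbox": [100, 120, 50, 80], "area": 4000, "iscrowd": 0}
  ],
  "categories": [
    {"id": 1, "name": "person"}
  ]
}
```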

##### Train a model with specific hardware

```bash
# train on CPU
light-train trainer=cpu

# train on GPU
light-train trainer=gpu
```

##### Train a model using a configuration file

Example configurations are available in [examples/](examples/). Please refer to [detect_example.yaml](examples/detect_example.yaml) or [classif_example.yaml](examples/classif_example.yaml) for generic detection or classification fine-tuning.

Create a folder and add the configuration file `my_experiment.yaml` inside.
You can now launch the command with:

```bash
light-train +my_folder=my_experiment.yaml
```
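
As a sketch, `my_experiment.yaml` could look like the fragment below. The structure follows the lightning-hydra-template experiment-config convention; the specific group names and keys are assumptions to verify against the files in [examples/](examples/).

```yaml
# @package _global_

# Hypothetical experiment config: key names follow the
# lightning-hydra-template convention and may differ in this project.
defaults:
  - override /trainer: gpu

trainer:
  max_epochs: 20

data:
  batch_size: 64

paths:
  data_dir: /path/to/coco
```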

If your folder is located elsewhere, add its location to the Hydra search path:

```bash
light-train +my_folder=my_experiment.yaml 'hydra.searchpath=[file:///path/to/my_folder/location]'
```

You can still override any parameter from the command line like this:

```bash
light-train +my_folder=my_experiment.yaml trainer.max_epochs=20 data.batch_size=64
```

##### Add optional callbacks while training

In the following example, [batch_visualizer](configs/callbacks/batch_visualizer.yaml) and [batch_writer](configs/callbacks/batch_writer.yaml) (to visualize and write training images) are added to the default callbacks from [default.yaml](configs/callbacks/default.yaml):

```bash
light-train 'callbacks=[default,batch_visualizer,batch_writer]'
```
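
Callback config files in Hydra projects typically instantiate a class through a `_target_` key. As a purely illustrative sketch of what a file like `configs/callbacks/batch_visualizer.yaml` might contain (the class path and parameter below are hypothetical, not the project's actual values, so check the files in `configs/callbacks/`):

```yaml
batch_visualizer:
  # Hypothetical target: see configs/callbacks/ for the real class path.
  _target_: lightning_hydra_detection.callbacks.BatchVisualizer
  num_samples: 8
```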

Other callbacks, such as automatic class balancing and selective sampling, are planned.

### Prediction

##### Predict with an existing checkpoint and configuration file, changing the input/output dirs

```bash
light-predict ckpt_path=/path/to/checkpoint +my_folder=my_experiment.yaml paths.data_dir=path/to/data/dir paths.output_dir=/path/to/output
```

##### Visualize predictions by batch

```bash
light-predict ckpt_path=/path/to/checkpoint '+callbacks=[prediction_visualizer]'
```

##### Stop saving predictions

```bash
light-predict ckpt_path=/path/to/checkpoint '~callbacks.prediction_writer'
```

##### Inference on GPU

Use the `trainer` hydra configuration to change the accelerator.

```bash
light-predict ckpt_path=/path/to/checkpoint trainer=gpu
```

### Other customization

For more customization and configurations, check the [lightning hydra template documentation](https://github.com/ashleve/lightning-hydra-template?tab=readme-ov-file#your-superpowers).

## Developers

See [CONTRIBUTING.md](CONTRIBUTING.md) for developer instructions such as code practices and the review process.

## Useful tools

- COCO dataset explorer: https://cocodataset.org/#explore
- coco-viewer: https://github.com/trsvchn/coco-viewer
