Commit 6b4e940

Backport PR #3333 on branch 1.3.x (docs: update docs use cases with correct installations) (#3334)
Backport PR #3333: docs: update docs use cases with correct installations Co-authored-by: Ori Kronfeld <[email protected]>
1 parent 7cae6c9 commit 6b4e940

File tree

4 files changed (+15 −0 lines)


docs/user_guide/use_case/custom_dataloaders.md

Lines changed: 4 additions & 0 deletions
@@ -4,6 +4,10 @@
 This page is under construction.
 :::

+:::{note}
+In order to run scvi-tools with custom dataloader support, use: `pip install scvi-tools[dataloaders]`
+:::
+
 In SCVI, custom dataloaders allow you to create a tailored data pipeline that can handle unique formats or complex datasets not covered by the default loaders. A custom dataloader is useful when your data has a specific structure or needs particular preprocessing before being fed into the model.

 For example, we offer custom dataloaders that do not necessarily keep the data in memory during training, enabling us to train models on larger datasets without being limited by the amount of memory available.
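The lazy-loading idea behind such dataloaders can be illustrated with a minimal sketch: instead of materializing the full matrix, a sampler yields index batches that a backing store (such as an on-disk array) resolves on demand. The class and names below are purely illustrative, not part of the scvi-tools API.

```python
# Illustrative sketch only: a batch sampler that yields index ranges, so a
# backing store (e.g. an on-disk array) can load each batch lazily instead
# of holding the whole dataset in memory. Not the scvi-tools implementation.
import math


class LazyBatchSampler:
    def __init__(self, n_cells, batch_size):
        self.n_cells = n_cells
        self.batch_size = batch_size

    def __len__(self):
        # Number of batches, counting a final partial batch.
        return math.ceil(self.n_cells / self.batch_size)

    def __iter__(self):
        # Yield index ranges; the data itself is never loaded here.
        for start in range(0, self.n_cells, self.batch_size):
            yield range(start, min(start + self.batch_size, self.n_cells))


sampler = LazyBatchSampler(n_cells=10, batch_size=4)
batches = [list(b) for b in sampler]
```

A real custom dataloader would resolve each yielded range against the on-disk store; the memory footprint stays bounded by the batch size rather than the dataset size.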

docs/user_guide/use_case/downstream_analysis_tasks.md

Lines changed: 4 additions & 0 deletions
@@ -1,5 +1,9 @@
 # Perform downstream analysis tasks of SCVI models

+:::{note}
+In order to run scvi-tools with scanpy support, use: `pip install scvi-tools[scanpy]`
+:::
+
 SCVI provides useful tools for exploring and understanding the learned latent representations, as well as for interpreting various aspects of your single-cell dataset.

 1. Latent Space Exploration
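As a toy illustration of latent-space exploration, the snippet below finds each cell's nearest neighbour in a made-up latent representation; this is the kind of distance computation that neighbourhood graphs and embeddings in scanpy build on. The latent coordinates and cell names are invented for the example.

```python
# Toy latent-space exploration: nearest neighbour by Euclidean distance.
# The latent matrix here is fabricated; a real workflow would use the
# model's learned latent representation instead.
import math

latent = {
    "cell_a": (0.0, 0.0),
    "cell_b": (0.1, 0.0),
    "cell_c": (5.0, 5.0),
}


def nearest_neighbor(name):
    # Compare the query cell against every other cell in latent space.
    x = latent[name]
    others = [(math.dist(x, v), k) for k, v in latent.items() if k != name]
    return min(others)[1]


nn_a = nearest_neighbor("cell_a")
```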

docs/user_guide/use_case/hyper_parameters_tuning.md

Lines changed: 4 additions & 0 deletions
@@ -1,5 +1,9 @@
 # Optimize SCVI model with hyperparameter tuning

+:::{note}
+In order to run scvi-tools with hyperparameter tuning support, use: `pip install scvi-tools[autotune]`
+:::
+
 Hyperparameter tuning is the process of adjusting the parameters that control the training of a machine learning model to find the configuration that yields the best performance. These hyperparameters can include the learning rate, batch size, number of layers, and more. When tuning a PyTorch Lightning model with Ray, you can leverage [Ray Tune](https://docs.ray.io/en/latest/tune/index.html), a scalable library for distributed hyperparameter optimization. You first define a search space for the hyperparameters you want to tune (such as learning rate or batch size), then set up a TuneReportCallback to track the performance of each training run and report the results back to Ray Tune. Ray then automatically runs multiple trials with different hyperparameter combinations and helps you find the best-performing set.

 There are several common parameters that need to be provided when running hyperparameter tuning with Ray:
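The tuning loop described above can be sketched in plain Python as a random search: sample configurations from a search space, score each with an objective, and keep the best. This stands in for Ray Tune's trial scheduling; the search space and objective are illustrative, not the scvi-tools autotune API.

```python
# Minimal random-search sketch of hyperparameter tuning. Plain Python stand-in
# for Ray Tune's trial loop; the objective is a fabricated validation loss.
import random

search_space = {
    "lr": [1e-4, 1e-3, 1e-2],
    "n_hidden": [64, 128, 256],
}


def objective(config):
    # Fabricated stand-in for a validation loss returned by a training run.
    return config["lr"] * 1000 + 100 / config["n_hidden"]


random.seed(0)
trials = []
for _ in range(5):
    # Sample one configuration per trial and record its score.
    config = {k: random.choice(v) for k, v in search_space.items()}
    trials.append((objective(config), config))

best_loss, best_config = min(trials, key=lambda t: t[0])
```

A real run would replace `objective` with model training and delegate trial scheduling to Ray Tune, which can also prune poorly performing trials early.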

docs/user_guide/use_case/multi_gpu_training.md

Lines changed: 3 additions & 0 deletions
@@ -1,5 +1,8 @@
 # Train SCVI model with multi GPU support

+:::{note}
+In order to run scvi-tools with multi-GPU support, use: `pip install scvi-tools[cuda]`
+:::

 SCVI-Tools v1.3.0 now supports training on a **multi GPU system**, which can significantly speed up training and allow you to handle larger datasets. It is supported only on Nvidia GPUs, using DDP with the CUDA backend.
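The core idea behind DDP's data parallelism can be sketched without any GPU: each of N workers sees a disjoint, near-equal shard of the dataset, trains on its shard, and gradients are averaged across workers. The round-robin sharding rule below mimics what a typical distributed sampler does; it is an illustration, not the actual CUDA/DDP implementation.

```python
# Data-parallel sharding sketch: round-robin assignment of sample indices
# to workers (one per GPU), as a typical distributed sampler would do.
# Illustrative only; real DDP training also synchronizes gradients.
def shard_indices(n_samples, world_size, rank):
    # Worker `rank` takes every `world_size`-th sample starting at `rank`.
    return list(range(rank, n_samples, world_size))


# Four workers splitting a ten-sample dataset into disjoint shards.
shards = [shard_indices(10, world_size=4, rank=r) for r in range(4)]
```

Because the shards are disjoint and cover the whole dataset, each epoch still visits every sample exactly once while the per-worker workload shrinks by roughly the world size.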