
Commit 2b87d86

Recommend the scaled fp8 t5xxl instead of the regular fp8 one.

1 parent: 03d6430

File tree

1 file changed (+1 addition, -1 deletion)


flux/README.md

Lines changed: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ For the easy to use single file versions that you can easily use in [ComfyUI](ht

  ### Files to download for the regular version

- If you don't have t5xxl_fp16.safetensors or clip_l.safetensors already in your ComfyUI/models/text_encoders/ directory you can find them on: [this link.](https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main) You can use t5xxl_fp8_e4m3fn.safetensors instead for lower memory usage but the fp16 one is recommended if you have more than 32GB ram.
+ If you don't have t5xxl_fp16.safetensors or clip_l.safetensors already in your ComfyUI/models/text_encoders/ directory you can find them on: [this link.](https://huggingface.co/comfyanonymous/flux_text_encoders/tree/main) You can use t5xxl_fp8_e4m3fn_scaled.safetensors instead for lower memory usage but the fp16 one is recommended if you have more than 32GB ram.

  The VAE can be found [here](https://huggingface.co/Comfy-Org/Lumina_Image_2.0_Repackaged/blob/main/split_files/vae/ae.safetensors) and should go in your ComfyUI/models/vae/ folder.
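The changed README line tells users which text encoder file to fetch and where to place it. The downloads it describes can be scripted; this is only a sketch, assuming the `huggingface_hub` CLI is installed (`pip install huggingface_hub`) and that it is run from the parent directory of your ComfyUI checkout:

```shell
# Fetch the recommended text encoders from the repo linked in the README.
# t5xxl_fp16 is preferred with more than 32GB RAM; the scaled fp8 file
# (t5xxl_fp8_e4m3fn_scaled.safetensors) trades precision for lower memory use.
huggingface-cli download comfyanonymous/flux_text_encoders \
  clip_l.safetensors t5xxl_fp8_e4m3fn_scaled.safetensors \
  --local-dir ComfyUI/models/text_encoders

# Fetch the VAE file named in the README.
huggingface-cli download Comfy-Org/Lumina_Image_2.0_Repackaged \
  split_files/vae/ae.safetensors \
  --local-dir ComfyUI/models/vae
```

Note that `--local-dir` preserves the repository's subdirectory layout, so the VAE lands under `ComfyUI/models/vae/split_files/vae/` and would need to be moved into `ComfyUI/models/vae/` itself.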

0 commit comments
