You can find the workflow and samples in the workflow_assets folder.

The workflow contains the information and links needed to get started with this model.

When used with the loader node in the workflow, this fp8_scaled version of FLUX.2-dev is faster than the official one released by ComfyOrg.

The custom node that loads the model in the workflow is required to obtain the fastest inference on lower-VRAM GPUs.
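The idea behind an "fp8_scaled" checkpoint is per-tensor scaled quantization: each weight tensor is divided by a scale so its largest magnitude fits the float8 e4m3 range, stored at reduced precision, and multiplied back by the scale at load time. The sketch below illustrates that round trip in plain Python; the function names and the integer rounding (standing in for real 8-bit floats) are illustrative assumptions, not the actual ComfyUI loader implementation.

```python
E4M3_MAX = 448.0  # largest finite value representable in float8 e4m3

def quantize_fp8_scaled(weights):
    """Pick a per-tensor scale so the largest magnitude maps onto the
    fp8 e4m3 range, then round to mimic fp8 precision loss."""
    amax = max(abs(w) for w in weights)
    scale = amax / E4M3_MAX if amax > 0 else 1.0
    # Real fp8 keeps 3 mantissa bits; rounding to the nearest integer
    # here is only a stand-in for that coarser grid.
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate full-precision weights at load time."""
    return [v * scale for v in q]

weights = [0.5, -1.25, 3.0, -4.48]
q, scale = quantize_fp8_scaled(weights)
restored = dequantize(q, scale)
```

Storing one scale per tensor is what keeps the checkpoint small while preserving dynamic range; the loader's job is just the `dequantize` step (or running matmuls directly in fp8 on GPUs that support it).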

Workflow

Sample

Model tree for silveroxides/FLUX.2-dev-fp8_scaled