
Error during inference: diffusers.configuration_utils.ConfigMixin.load_config() got multiple values for keyword argument 'cache_dir' #255

@nitinmukesh

Description

I want to store the models in a specific folder instead of the default Hugging Face cache, but I get an error with the following code (Windows 11):

from nunchaku import NunchakuFluxTransformer2dModel

transformer_repo = "mit-han-lab/svdq-int4-flux.1-dev"
base_model_path = "models/Flux.1-dev"
print("Here0")
transformer = NunchakuFluxTransformer2dModel.from_pretrained(
    transformer_repo,
    offload=True,
    cache_dir=base_model_path
)
print("Here01")
print("Here01")

----Flux-dev mode: FLUX.1-dev Extremely Low VRAM None 1.0 False apply_cache_on_pipe
CUDA allocated: 0.0 MB
CUDA reserved: 0.0 MB
Here0
Error during inference: diffusers.configuration_utils.ConfigMixin.load_config() got multiple values for keyword argument 'cache_dir'
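The error suggests that `from_pretrained` already passes `cache_dir` internally when it calls `ConfigMixin.load_config()`, so supplying it again as a keyword collides. A possible workaround (a sketch, not a confirmed fix: it assumes the standard `huggingface_hub` environment variable `HF_HUB_CACHE`, which is independent of nunchaku) is to redirect the cache before any Hugging Face libraries are imported, and drop the `cache_dir` argument entirely:

```python
import os

# Workaround sketch: point the Hugging Face model cache at a custom folder
# via an environment variable instead of passing cache_dir explicitly.
# HF_HUB_CACHE must be set *before* huggingface_hub/diffusers are imported.
os.environ["HF_HUB_CACHE"] = os.path.abspath("models/Flux.1-dev")

# With the cache redirected, from_pretrained can be called without cache_dir
# (hypothetical usage, assuming the nunchaku import path from its examples):
# from nunchaku import NunchakuFluxTransformer2dModel
# transformer = NunchakuFluxTransformer2dModel.from_pretrained(
#     "mit-han-lab/svdq-int4-flux.1-dev",
#     offload=True,
# )
```

Setting the variable in the shell (`set HF_HUB_CACHE=...` on Windows) before launching Python achieves the same effect without code changes.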

Labels

bug (Something isn't working)
