We need offline support. Proper offline support. Internet access is not only unnecessary, it doesn't even work as intended when it connects. #750
Description
Did you already ask in the Discord? -- No, I do not use Discord.
You verified that this is a bug and not a feature request or question by asking in the Discord? -- No
Running 1 process
Loading Flux2 model
Loading transformer
Moving transformer to CPU
Loading Qwen3
torch_dtype is deprecated! Use dtype instead!
config.json: 100%|##########| 728/728 [00:00<?, ?B/s]
model.safetensors.index.json: 32.9kB [00:00, 155kB/s]
WARNING:huggingface_hub.file_download:Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: pip install huggingface_hub[hf_xet] or pip install hf_xet
model-00001-of-00005.safetensors: 1%|1 | 52.4M/4.00G [01:22<1:43:27, 635kB/s]
Error running job: An error occurred while downloading using hf_transfer. Consider disabling HF_HUB_ENABLE_HF_TRANSFER for better error handling.
Result:
- 0 completed jobs
- 1 failure
========================================
Traceback (most recent call last):
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\file_download.py", line 474, in http_get
    hf_transfer.download(
Exception: Error while removing corrupted file: The process cannot access the file because it is being used by another process. (os error 32)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\run.py", line 120, in <module>
    main()
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\run.py", line 108, in main
    raise e
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\run.py", line 96, in main
    job.run()
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\jobs\ExtensionJob.py", line 22, in run
    process.run()
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\jobs\process\BaseSDTrainProcess.py", line 1604, in run
    self.sd.load_model()
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\extensions_built_in\diffusion_models\flux2\flux2_model.py", line 184, in load_model
    text_encoder, tokenizer = self.load_te()
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\extensions_built_in\diffusion_models\flux2\flux2_klein_model.py", line 43, in load_te
    text_encoder: Qwen3ForCausalLM = Qwen3ForCausalLM.from_pretrained(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\transformers\modeling_utils.py", line 277, in _wrapper
    return func(*args, **kwargs)
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\transformers\modeling_utils.py", line 4900, in from_pretrained
    checkpoint_files, sharded_metadata = _get_resolved_checkpoint_files(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\transformers\modeling_utils.py", line 1200, in _get_resolved_checkpoint_files
    checkpoint_files, sharded_metadata = get_checkpoint_shard_files(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\transformers\utils\hub.py", line 1084, in get_checkpoint_shard_files
    cached_filenames = cached_files(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\transformers\utils\hub.py", line 567, in cached_files
    raise e
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\transformers\utils\hub.py", line 494, in cached_files
    snapshot_download(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\_snapshot_download.py", line 330, in snapshot_download
    _inner_hf_hub_download(file)
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\_snapshot_download.py", line 306, in _inner_hf_hub_download
    return hf_hub_download(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\file_download.py", line 1014, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\file_download.py", line 1175, in _hf_hub_download_to_cache_dir
    _download_to_tmp_and_move(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\file_download.py", line 1742, in _download_to_tmp_and_move
    http_get(
  File "F:\AI-Toolkit-Easy-Install\AI-Toolkit\venv\lib\site-packages\huggingface_hub\file_download.py", line 485, in http_get
    raise RuntimeError(
RuntimeError: An error occurred while downloading using hf_transfer. Consider disabling HF_HUB_ENABLE_HF_TRANSFER for better error handling.
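The RuntimeError itself points at a first mitigation: turning off the hf_transfer fast path so failed downloads fall back to the plain, resumable HTTP client. A minimal sketch, assuming the variable is set before huggingface_hub starts any download (e.g. at the top of run.py or in the launching shell):

```python
import os

# hf_transfer trades error handling for speed; on a flaky connection the
# plain-HTTP path gives resumable downloads and clearer error messages.
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "0"
```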
I use an offline PC to train. It trains perfectly fine with Z-Image Turbo and Base using the Hugging Face repos that I cloned with git clone.
Thinking it would work the same way, I git cloned the Flux.2 base 9B repo. It didn't work: it said "no such file" even though I used the path to the repo.
So I did it the hard way: I connected to the internet.
I knew I needed a read token, so I set that up.
On the first run, it downloaded 20% of only model-00001-of-00005.safetensors for the Qwen3 cache.
Then it stopped downloading and failed. It always does this with hf_hub downloads, which is why I prefer git cloning and using the repos offline.
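The stalled-shard failures described above can often be worked around by retrying, because huggingface_hub resumes a partially downloaded file rather than starting over. A hedged sketch of a retry wrapper (both the helper and the `fetch` callable are hypothetical; in practice `fetch` would wrap something like a `snapshot_download(...)` call):

```python
import time

def retry_download(fetch, attempts=5, backoff=2.0):
    """Call a flaky download function up to `attempts` times with a
    growing pause between tries; each retry resumes the partial file."""
    last_error = None
    for i in range(attempts):
        try:
            return fetch()
        except Exception as e:  # the exact error type varies by backend
            last_error = e
            time.sleep(backoff * (i + 1))
    raise last_error
```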
Anyway, since the cache already had a folder for Qwen3, I thought I could git clone the repo into it: no blobs, refs, or snapshots folders, just a straight clean git clone.
That did not work either. So I connected again; this time it created the blobs, refs, and snapshots folders and then failed without downloading anything.
How do I get the models I need?
The only way that seems to make sense is to download the files one by one and save each in its proper folder, but the toolkit wants snapshots and blobs named by content hashes.
And I can do it: I can save and symlink everything. But there is absolutely no reason to download everything AGAIN after the third time.