Bug
When using TI2VidTwoStagesPipeline as a Python API (not through the CLI), the distilled_lora parameter requires an sd_ops renaming map to strip the diffusion_model. prefix from LoRA weight keys. Without it, the LoRA silently fails to apply -- no warning, no error -- producing severely degraded output.
This affects anyone using the pipeline programmatically, since the renaming map is only wired up inside the CLI's LoraAction argparse class and is not documented anywhere.
Reproduction
```python
from ltx_core.loader.primitives import LoraPathStrengthAndSDOps
from ltx_pipelines.ti2vid_two_stages import TI2VidTwoStagesPipeline

# This looks correct but silently breaks the distilled LoRA:
pipeline = TI2VidTwoStagesPipeline(
    checkpoint_path="ltx-2.3-22b-dev.safetensors",
    distilled_lora=[LoraPathStrengthAndSDOps("ltx-2.3-22b-distilled-lora-384.safetensors", 1.0, None)],
    # ...
)
```
The LoRA file keys have a diffusion_model. prefix (e.g. diffusion_model.adaln_single.linear.lora_A.weight), but the model parameters don't. In fuse_loras.py, _prepare_deltas silently skips every key because key_a not in lsd.sd:
```python
# fuse_loras.py, line ~47
if key_a not in lsd.sd or key_b not in lsd.sd:
    continue  # silently skips -- no warning
```
All 3320 LoRA keys are skipped. The model runs stage 2 with its distilled sigma schedule but without the LoRA that was trained for it, producing noisy/ghosting artifacts.
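To make the mismatch concrete, here is a minimal sketch of the renaming an sd_ops map has to perform (the helper name is illustrative, not the actual ltx_core API):

```python
# Sketch: strip the "diffusion_model." prefix so LoRA keys line up
# with the model's parameter names. rename_lora_keys is illustrative,
# not the real ltx_core API.
PREFIX = "diffusion_model."

def rename_lora_keys(lora_sd):
    return {
        (k[len(PREFIX):] if k.startswith(PREFIX) else k): v
        for k, v in lora_sd.items()
    }

lora_sd = {"diffusion_model.adaln_single.linear.lora_A.weight": "tensor"}
model_keys = {"adaln_single.linear.lora_A.weight"}

print(set(lora_sd) & model_keys)                    # empty: every key gets skipped
print(set(rename_lora_keys(lora_sd)) & model_keys)  # full overlap after renaming
```

Without the renaming step, the intersection is empty, which is exactly why the `key_a not in lsd.sd` check skips all 3320 keys.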
Fix
Pass LTXV_LORA_COMFY_RENAMING_MAP as the sd_ops argument:
```python
from ltx_core.loader import LTXV_LORA_COMFY_RENAMING_MAP

pipeline = TI2VidTwoStagesPipeline(
    distilled_lora=[LoraPathStrengthAndSDOps("...", 1.0, LTXV_LORA_COMFY_RENAMING_MAP)],
    # ...
)
```
Suggested improvements
- Warn on zero-match LoRAs: in _prepare_deltas / apply_loras, emit a warning if a LoRA state dict matches zero model keys. This would have caught the bug instantly.
- Default sd_ops: since Lightricks' own LoRA files ship with the diffusion_model. prefix, consider making LTXV_LORA_COMFY_RENAMING_MAP the default in LoraPathStrengthAndSDOps instead of requiring it to be passed explicitly.
- Document programmatic usage: the README only shows CLI usage. A Python API example for TI2VidTwoStagesPipeline showing the correct LoraPathStrengthAndSDOps construction would prevent this.
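The zero-match warning could be as small as the following sketch (function and argument names are hypothetical, not the existing ltx_core API):

```python
import logging

logger = logging.getLogger(__name__)

def check_lora_coverage(lora_keys, model_keys, lora_name="LoRA"):
    """Sketch of the suggested guard: warn when a LoRA state dict
    matches zero model keys. Names are illustrative."""
    matched = sum(1 for k in lora_keys if k in model_keys)
    if matched == 0 and lora_keys:
        logger.warning(
            "%s matched 0 of %d keys; check sd_ops renaming "
            "(e.g. a 'diffusion_model.' prefix mismatch).",
            lora_name, len(lora_keys),
        )
    return matched
```

Calling something like this once per LoRA inside apply_loras would turn today's silent degradation into an immediate, actionable log line.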
Environment
- LTX-2.3 (ltx-core + ltx-pipelines packages from pip, March 2026)
- ltx-2.3-22b-distilled-lora-384.safetensors from HuggingFace
- H100 80GB, PyTorch 2.6, CUDA 12.6