
Commit 461d2bd

todo: 2.
1 parent eb52469 commit 461d2bd

File tree

1 file changed (+6 −2 lines)


src/diffusers/loaders/lora_pipeline.py

Lines changed: 6 additions & 2 deletions
@@ -374,7 +374,9 @@ def load_lora_into_unet(
             adapter_name (`str`, *optional*):
                 Adapter name to be used for referencing the loaded adapter model. If not specified, it will use
                 `default_{i}` where i is the total number of adapters being loaded.
-            metadata: TODO
+            metadata (`dict`):
+                Optional LoRA adapter metadata. When supplied, the `LoraConfig` arguments of `peft` won't be derived
+                from the state dict.
             low_cpu_mem_usage (`bool`, *optional*):
                 Speed up model loading by only loading the pretrained LoRA weights and not initializing the random
                 weights.
@@ -856,7 +858,9 @@ def load_lora_into_unet(
             adapter_name (`str`, *optional*):
                 Adapter name to be used for referencing the loaded adapter model. If not specified, it will use
                 `default_{i}` where i is the total number of adapters being loaded.
-            metadata: TODO
+            metadata (`dict`):
+                Optional LoRA adapter metadata. When supplied, the `LoraConfig` arguments of `peft` won't be derived
+                from the state dict.
             low_cpu_mem_usage (`bool`, *optional*):
                 Speed up model loading by only loading the pretrained LoRA weights and not initializing the random
                 weights.

0 commit comments