Helper methods

A collection of helper functions for PEFT.

Checking if a model is a PEFT model

peft.helpers.check_if_peft_model


( model_name_or_path: str ) → bool

Parameters

  • model_name_or_path (str) — Model id to check, can be local or on the Hugging Face Hub.

Returns

bool

True if the model is a PEFT model, False otherwise.

Check if the model is a PEFT model.
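The real helper resolves the model id either locally or on the Hugging Face Hub. As a minimal sketch of the underlying idea, assuming a local-only check and using the hypothetical name `looks_like_peft_model`, one could test for the adapter configuration file that PEFT checkpoints ship with:

```python
import json
import os
import tempfile


def looks_like_peft_model(path: str) -> bool:
    # A PEFT adapter checkpoint stores its configuration in adapter_config.json,
    # so a local directory without that file is not a PEFT checkpoint.
    # Assumption: local-only check; the real helper can also query the Hub.
    return os.path.isfile(os.path.join(path, "adapter_config.json"))


# Demo: an empty directory is not a PEFT checkpoint; one containing an
# adapter_config.json is.
with tempfile.TemporaryDirectory() as tmp:
    print(looks_like_peft_model(tmp))  # False
    with open(os.path.join(tmp, "adapter_config.json"), "w") as f:
        json.dump({"peft_type": "LORA"}, f)
    print(looks_like_peft_model(tmp))  # True
```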

Temporarily setting adapter scale in LoraLayer modules

peft.helpers.set_adapter_scale


( model, alpha )

Parameters

  • model — The model containing LoraLayer modules whose scaling is to be adjusted.
  • alpha (float or int) — The scaling factor to be applied.

Raises

ValueError

  • ValueError — If the model does not contain any LoraLayer instances, indicating that the model does not support scaling.

Context manager to temporarily set the scaling of the LoRA adapter in a model.

The original scaling values are restored when the context manager exits. This context manager works with transformers and diffusers models that have directly loaded LoRA adapters.

Example:

>>> from peft.helpers import set_adapter_scale

>>> model = ModelWithLoraLayer()  # any model containing LoraLayer modules
>>> alpha = 0.5
>>> with set_adapter_scale(model, alpha):
...     outputs = model(**inputs)  # operations use the scaled adapter
>>> outputs = model(**inputs)  # the original scaling values are restored here
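The save/set/restore pattern behind such a context manager can be sketched with `contextlib`. This is a simplified, self-contained illustration, not PEFT's implementation: `DummyLoraLayer` and `set_scale` are hypothetical names, and the sketch assumes each LoRA layer keeps a per-adapter `scaling` dict:

```python
from contextlib import contextmanager


class DummyLoraLayer:
    # Hypothetical stand-in for a LoRA layer; assumption: the real class
    # tracks a per-adapter scaling factor in a `scaling` dict.
    def __init__(self, scaling):
        self.scaling = dict(scaling)


@contextmanager
def set_scale(layers, alpha):
    # Raise if there is nothing to scale, mirroring the documented ValueError.
    if not layers:
        raise ValueError("Model does not contain any LoraLayer instances.")
    # Save the original scaling values so they can be restored on exit.
    originals = [dict(layer.scaling) for layer in layers]
    try:
        for layer in layers:
            for name in layer.scaling:
                layer.scaling[name] = alpha
        yield
    finally:
        # Restore even if the body raised an exception.
        for layer, orig in zip(layers, originals):
            layer.scaling = orig


layers = [DummyLoraLayer({"default": 2.0})]
with set_scale(layers, 0.5):
    print(layers[0].scaling["default"])  # 0.5 inside the context
print(layers[0].scaling["default"])  # 2.0 restored afterwards
```

The `finally` clause is the key design choice: it guarantees the original scaling values come back even when the code inside the `with` block raises.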