Single-HPU Training

Training on a single HPU is almost as straightforward as training with Transformers alone; only two changes are needed:

  • Replace the Transformers’ Trainer class with the GaudiTrainer class,
  • Replace the Transformers’ TrainingArguments class with the GaudiTrainingArguments class and set the following arguments (see the sketch after this list):
    • use_habana to execute your script on an HPU,
    • use_lazy_mode to enable lazy mode (recommended) or disable it (i.e. use eager mode),
    • gaudi_config_name to give the Hub name of, or the local path to, your Gaudi configuration file.
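Here is a minimal sketch of what those two substitutions look like in a fine-tuning script. It assumes the optimum-habana package is installed; the checkpoint (bert-base-uncased), dataset (GLUE MRPC), Gaudi configuration (Habana/bert-base-uncased), and hyperparameter values are illustrative choices, not prescribed by this guide.

```python
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

# Illustrative model and dataset; substitute your own.
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

raw_dataset = load_dataset("glue", "mrpc", split="train")
train_dataset = raw_dataset.map(
    lambda batch: tokenizer(
        batch["sentence1"], batch["sentence2"], truncation=True, padding="max_length"
    ),
    batched=True,
)

# GaudiTrainingArguments replaces TrainingArguments and adds the HPU-specific flags.
training_args = GaudiTrainingArguments(
    output_dir="./output",
    use_habana=True,      # execute on an HPU
    use_lazy_mode=True,   # lazy mode (recommended); set False for eager mode
    gaudi_config_name="Habana/bert-base-uncased",  # Hub name or local path
    num_train_epochs=3,
    per_device_train_batch_size=8,
)

# GaudiTrainer replaces the Transformers Trainer; the rest of the script is unchanged.
trainer = GaudiTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```

Everything else (datasets, metrics, callbacks) works as it does with the standard Trainer, so an existing Transformers training script typically only needs these two classes swapped in.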

To go further, we invite you to read our guides about accelerating training and pretraining.
