<!--
Copyright 2022 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
# Installation[[installation]]
Install 🤗 Transformers for whichever deep learning library you're working with, set up your cache, and optionally configure 🤗 Transformers to run offline.

🤗 Transformers is tested on Python 3.6+, PyTorch 1.1.0+, TensorFlow 2.0+, and Flax. Follow the installation instructions below for the deep learning library you are using:
* [PyTorch](https://pytorch.org/get-started/locally/) installation instructions
* [TensorFlow 2.0](https://www.tensorflow.org/install/pip) installation instructions
* [Flax](https://flax.readthedocs.io/en/latest/) installation instructions
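Before installing, it can help to check which of these backends are already importable in your environment. The following is a minimal standard-library sketch (the `available_backends` helper is an illustration, not part of 🤗 Transformers):

```python
import importlib.util

def available_backends():
    """Return which deep learning backends can be imported in this environment."""
    candidates = ["torch", "tensorflow", "flax"]
    return [name for name in candidates if importlib.util.find_spec(name) is not None]

print(available_backends())
```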
## Install with pip[[install-with-pip]]
You should install 🤗 Transformers in a [virtual environment](https://docs.python.org/3/library/venv.html). If you're unfamiliar with Python virtual environments, take a look at this [guide](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/). A virtual environment makes it easier to manage different projects and avoid compatibility issues between dependencies.

Start by creating a virtual environment in your project directory:
```bash
python -m venv .env
```
Activate the virtual environment. On Linux and MacOS:
```bash
source .env/bin/activate
```
On Windows:
```bash
.env/Scripts/activate
```
Now you're ready to install 🤗 Transformers with the following command:
```bash
pip install transformers
```
For CPU-support only, you can conveniently install 🤗 Transformers and a deep learning library in one line. For example, install 🤗 Transformers and PyTorch with:
```bash
pip install transformers[torch]
```
🤗 Transformers and TensorFlow 2.0:
```bash
pip install transformers[tf-cpu]
```
🤗 Transformers and Flax:
```bash
pip install transformers[flax]
```
Finally, check if 🤗 Transformers has been properly installed by running the following command. It will download a pretrained model:
```bash
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('we love you'))"
```
If a label and score are printed, the installation was successful:
```bash
[{'label': 'POSITIVE', 'score': 0.9998704791069031}]
```
## Install from source[[install-from-source]]
Install 🤗 Transformers from source with the following command:
```bash
pip install git+https://github.com/huggingface/transformers
```
This command installs the bleeding edge `main` version rather than the latest `stable` version. The `main` version is useful for staying up-to-date with the latest developments. For instance, a bug may have been fixed since the last official release, but a new release hasn't been rolled out yet. However, this also means the `main` version may not always be stable. We strive to keep the `main` version operational, and most issues are usually resolved within a few hours or a day. If you run into a problem, please open an [Issue](https://github.com/huggingface/transformers/issues) so we can fix it even sooner!
As before, check if 🤗 Transformers has been properly installed by running the following command:
```bash
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I love you'))"
```
## Editable install[[editable-install]]
You will need an editable install if you'd like to:

* Use the `main` version of the source code.
* Contribute to 🤗 Transformers and need to test changes in the code.
Clone the repository and install 🤗 Transformers with the following commands:
```bash
git clone https://github.com/huggingface/transformers.git
cd transformers
pip install -e .
```
These commands link the folder you cloned the repository to with your Python library paths. Python will now search inside the cloned folder in addition to the normal library paths. For example, if your Python packages are typically installed in `~/anaconda3/envs/main/lib/python3.7/site-packages/`, Python will also search the folder you cloned to: `~/transformers/`.
<Tip warning={true}>

You must keep the `transformers` folder if you want to keep using the library.

</Tip>
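To confirm which copy of a package Python will actually import after an editable install, you can inspect where it resolves from. This is a small standard-library sketch (the `package_location` helper is an illustration, not part of the docs):

```python
import importlib.util

def package_location(name):
    """Return the filesystem path a package resolves to, or None if it is absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# After `pip install -e .`, this should point inside your cloned transformers folder:
print(package_location("transformers"))
```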
Now you can easily update your clone to the latest version of 🤗 Transformers with the following commands:
```bash
cd ~/transformers/
git pull
```
Your Python environment will find the `main` version of 🤗 Transformers on the next run.
## Install with conda[[install-with-conda]]

Install from the conda channel `huggingface`:
```bash
conda install -c huggingface transformers
```
## Cache setup[[cache-setup]]

Pretrained models are downloaded and locally cached at `~/.cache/huggingface/hub`. This is the default directory given by the shell environment variable `TRANSFORMERS_CACHE`. On Windows, the default directory is `C:\Users\username\.cache\huggingface\hub`. You can change the shell environment variables shown below, in order of priority, to specify a different cache directory:

1. Shell environment variable (default): `HUGGINGFACE_HUB_CACHE` or `TRANSFORMERS_CACHE`
2. Shell environment variable: `HF_HOME`
3. Shell environment variable: `XDG_CACHE_HOME` + `/huggingface`
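The priority order above can be sketched as a small lookup function. This is only an illustration of the documented order, not the library's actual implementation, and the exact subpaths appended under `HF_HOME` are an assumption:

```python
import os
from pathlib import Path

def resolve_cache_dir(env=None):
    """Illustrative mirror of the cache-directory priority described above."""
    env = os.environ if env is None else env
    # 1. Default shell environment variables
    for var in ("HUGGINGFACE_HUB_CACHE", "TRANSFORMERS_CACHE"):
        if var in env:
            return Path(env[var])
    # 2. HF_HOME (assumed to contain a hub/ subfolder)
    if "HF_HOME" in env:
        return Path(env["HF_HOME"]) / "hub"
    # 3. XDG_CACHE_HOME + /huggingface
    if "XDG_CACHE_HOME" in env:
        return Path(env["XDG_CACHE_HOME"]) / "huggingface" / "hub"
    # Fallback: the default location
    return Path.home() / ".cache" / "huggingface" / "hub"

print(resolve_cache_dir())
```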
<Tip>

🤗 Transformers will use the shell environment variables `PYTORCH_TRANSFORMERS_CACHE` or `PYTORCH_PRETRAINED_BERT_CACHE` if you are coming from an earlier iteration of this library and have set those environment variables, unless you specify the shell environment variable `TRANSFORMERS_CACHE`.

</Tip>
## Offline mode[[offline-mode]]

🤗 Transformers can run in a firewalled or offline environment by only using local files. Set the environment variable `TRANSFORMERS_OFFLINE=1` to enable this behavior.
<Tip>

Add [🤗 Datasets](https://huggingface.co/docs/datasets/) to your offline training workflow by setting the environment variable `HF_DATASETS_OFFLINE=1`.

</Tip>
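If exporting variables in the shell is inconvenient, the same flags can be set from Python, as long as this happens before `transformers` (or `datasets`) is imported; a minimal sketch:

```python
import os

# Must be set before importing transformers / datasets, or they take no effect.
os.environ["TRANSFORMERS_OFFLINE"] = "1"
os.environ["HF_DATASETS_OFFLINE"] = "1"
```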
For example, you would typically run a program on a normal network, firewalled to external instances, with the following command:
```bash
python examples/pytorch/translation/run_translation.py --model_name_or_path t5-small --dataset_name wmt16 --dataset_config ro-en ...
```
Run this same program in an offline instance with:
```bash
HF_DATASETS_OFFLINE=1 TRANSFORMERS_OFFLINE=1 \
python examples/pytorch/translation/run_translation.py --model_name_or_path t5-small --dataset_name wmt16 --dataset_config ro-en ...
```
The script should now run without hanging or waiting to time out, because it knows it should only look for local files.
### Fetch models and tokenizers to use offline[[fetch-models-and-tokenizers-to-use-offline]]
Another option for using 🤗 Transformers offline is to download the files ahead of time, and then point to their local path when you need to use them offline. There are three ways to do this:
* Download a file through the user interface on the [Model Hub](https://huggingface.co/models) by clicking on the ↓ icon.

![download-icon](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/download-icon.png)
* Use the [`PreTrainedModel.from_pretrained`] and [`PreTrainedModel.save_pretrained`] workflow:

1. Download your files ahead of time with [`PreTrainedModel.from_pretrained`]:
```py
>>> from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
>>> tokenizer = AutoTokenizer.from_pretrained("bigscience/T0_3B")
>>> model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0_3B")
```
2. Save your files to a specified directory with [`PreTrainedModel.save_pretrained`]:
```py
>>> tokenizer.save_pretrained("./your/path/bigscience_t0")
>>> model.save_pretrained("./your/path/bigscience_t0")
```
3. Now when you're offline, reload your files with [`PreTrainedModel.from_pretrained`] from the directory you saved to:
```py
>>> tokenizer = AutoTokenizer.from_pretrained("./your/path/bigscience_t0")
>>> model = AutoModelForSeq2SeqLM.from_pretrained("./your/path/bigscience_t0")
```
* Programmatically download files with the [huggingface_hub](https://github.com/huggingface/huggingface_hub/tree/main/src/huggingface_hub) library:

1. Install the `huggingface_hub` library in your virtual environment:
```bash
python -m pip install huggingface_hub
```
2. Use the [`hf_hub_download`](https://huggingface.co/docs/hub/adding-a-library#download-files-from-the-hub) function to download a file to a specific path. For example, the following command downloads the `config.json` file from the [T0](https://huggingface.co/bigscience/T0_3B) model to your desired path:
```py
>>> from huggingface_hub import hf_hub_download
>>> hf_hub_download(repo_id="bigscience/T0_3B", filename="config.json", cache_dir="./your/path/bigscience_t0")
```
Once your file is downloaded and locally cached, specify its local path to load and use it:
```py
>>> from transformers import AutoConfig
>>> config = AutoConfig.from_pretrained("./your/path/bigscience_t0/config.json")
```
<Tip>

See the [How to download files from the Hub](https://huggingface.co/docs/hub/how-to-downstream) section for more details on downloading files stored on the Hub.

</Tip>