---
license: apache-2.0
datasets:
- mnli
metrics:
- accuracy
tags:
- sequence-classification
- int8
---
# Quantized BERT-base MNLI model with 90% unstructured sparsity
A pruned and quantized model in OpenVINO IR format. The pruned model was taken from this [source](https://huggingface.co/neuralmagic/oBERT-12-downstream-pruned-unstructured-90-mnli) and quantized with the code below using Hugging Face Optimum for OpenVINO:
```python
from functools import partial
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.intel.openvino import OVConfig, OVQuantizer
model_id = "neuralmagic/oBERT-12-downstream-pruned-unstructured-90-mnli"
model = AutoModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
save_dir = "./nm_mnli_90"
# Tokenize premise/hypothesis pairs for MNLI
def preprocess_function(examples, tokenizer):
    return tokenizer(examples["premise"], examples["hypothesis"], padding="max_length", max_length=128, truncation=True)
# Load the default quantization configuration detailing the quantization we wish to apply
quantization_config = OVConfig()
# Instantiate our OVQuantizer using the desired configuration
quantizer = OVQuantizer.from_pretrained(model, feature="sequence-classification")
# Create the calibration dataset used to perform static quantization
calibration_dataset = quantizer.get_calibration_dataset(
    "glue",
    dataset_config_name="mnli",
    preprocess_function=partial(preprocess_function, tokenizer=tokenizer),
    num_samples=100,
    dataset_split="train",
)
# Apply static quantization and export the resulting quantized model to OpenVINO IR format
quantizer.quantize(
    quantization_config=quantization_config,
    calibration_dataset=calibration_dataset,
    save_directory=save_dir,
)
# Save the tokenizer
tokenizer.save_pretrained(save_dir)
```