---
model-index:
- name: opt-1b3
  results:
  - task:
      type: text-generation
    dataset:
      name: Wikitext
      type: wikitext
    metrics:
    - type: perplexity (BASELINE)
      value: 16.41521097191486
    - type: perplexity (BASIC)
      value: 16.675812735788504
---

This is a d-Matrix functional reference of the OPT-1B3 model.
The reference provides the following functional *configurations*:

Configuration | Explanation
:-- | :--
**`BASELINE`** | a reference functionally equivalent to the original model
**`BASIC`** | all linear algebraic operands quantized to `MXINT8-64`, and all other operations transformed to approximated kernel simulations

### Usage

Install d-Matrix [Dmx_Compressor](https://github.com/d-matrix-ai/dmx-compressor) first.

```sh
pip install dmx_compressor
```

The following is an example of loading the model and evaluating it with the [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness):

```sh
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```

```python
from dmx.compressor.modeling import DmxModel
import lm_eval

model_args = "pretrained='d-matrix/opt-1b3',trust_remote_code=True"
task = "wikitext"  # assign the desired task, e.g. "wikitext"

lm = lm_eval.api.registry.get_model("hf").create_from_arg_string(model_args, {"batch_size": 1})

# Transform the model with DMX, using the BASIC configuration
lm._model = DmxModel.from_torch(lm._model).to_basic_model()

eval_results = lm_eval.evaluate(lm, lm_eval.tasks.get_task_dict([task]))
```
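To give an intuition for what the `BASIC` configuration's `MXINT8-64` operand format means, the sketch below quantizes a tensor blockwise: each block of 64 elements shares one power-of-two scale, and every element is stored as an 8-bit signed integer mantissa. This is a minimal illustrative NumPy sketch of shared-scale block quantization, not d-Matrix's actual kernel implementation; the function name and block-size parameter are assumptions for illustration only.

```python
import numpy as np

def mxint8_quantize(x, block_size=64):
    """Illustrative blockwise quantization: each block of `block_size`
    elements shares one power-of-two scale, and each element becomes an
    8-bit signed integer mantissa in [-128, 127]. Returns the
    dequantized (fake-quantized) values."""
    x = np.asarray(x, dtype=np.float64)
    pad = (-len(x)) % block_size            # pad so length divides evenly
    blocks = np.pad(x, (0, pad)).reshape(-1, block_size)
    out = np.empty_like(blocks)
    for i, b in enumerate(blocks):
        amax = np.abs(b).max()
        if amax == 0.0:
            out[i] = 0.0
            continue
        # shared power-of-two scale chosen so the largest element maps
        # to a mantissa of magnitude <= 127
        scale = 2.0 ** np.ceil(np.log2(amax / 127.0))
        mantissa = np.clip(np.round(b / scale), -128, 127)
        out[i] = mantissa * scale           # dequantize for comparison
    return out.reshape(-1)[: len(x)]

# The per-element error is bounded by half the block's scale, so tensors
# with moderate dynamic range are reproduced closely.
x = np.random.randn(256)
xq = mxint8_quantize(x)
print("max abs error:", np.max(np.abs(x - xq)))
```

Because the scale is shared across a whole block, one outlier element coarsens the quantization of the other 63 elements in its block; this trade-off between storage cost and accuracy is what the `BASELINE` vs `BASIC` perplexity gap in the metrics above reflects.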