---
pipeline_tag: feature-extraction
---
## Overview
This is a bare model with no output layer or classification head. It has been quantized so that it can be used for feature extraction tasks.
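As a rough illustration, the snippet below loads the bare model in 4-bit precision with 🤗 Transformers and mean-pools the last hidden states into sentence-level features. The repository id, the bitsandbytes loading settings, and the pooling strategy are assumptions for the sake of the example, not part of this card.

```python
# Minimal sketch: extract hidden-state features from the bare (headless) model.
# "your-org/your-model" is a placeholder; substitute the actual repository id.
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig

model_id = "your-org/your-model"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # assumed loading choice
    device_map="auto",
)

# Encode a sentence and take the last hidden states as token-level features.
inputs = tokenizer("An example sentence to embed.", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into a single fixed-size feature vector.
features = outputs.last_hidden_state.mean(dim=1)
print(features.shape)
```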
## Usage
This model is intended as a base for training on downstream tasks. To use it for predictions and inference, fine-tune it on a specific task after adding an appropriate output layer or classification head.
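As one possible way to do this, the sketch below attaches a sequence-classification head on top of the bare encoder. The repository id, the label count, and the choice of classification as the downstream task are illustrative assumptions; any task-appropriate head can be used instead.

```python
# Hedged sketch: add a classification head to the bare model for fine-tuning.
# The repository id and num_labels are placeholders for a concrete downstream task.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "your-org/your-model"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)

# A randomly initialized classification head is placed on top of the encoder;
# its weights are learned during fine-tuning on the downstream dataset.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)
```

The resulting model can then be trained with any standard fine-tuning loop, for example the 🤗 Transformers `Trainer`.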
## Quantization
The model has been quantized using the following parameters (a configuration sketch follows the list):

- LoRA alpha: 16
- LoRA rank: 32
- LoRA target modules: all-linear
- Bits: 4
- LoftQ iterations: 5
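For reference, a PEFT configuration reproducing these parameters might look like the sketch below. The base model id is a placeholder and the exact script used to produce this checkpoint is not documented here, so treat this as an approximation rather than the recipe that was actually run.

```python
# Sketch of a PEFT LoftQ/LoRA configuration matching the parameters listed above.
# The base model id is a placeholder; the actual quantization script may differ.
from transformers import AutoModel
from peft import LoftQConfig, LoraConfig, get_peft_model

base_model = AutoModel.from_pretrained("your-org/your-base-model")  # placeholder

# LoftQ: 4-bit quantization with 5 alternating initialization iterations.
loftq_config = LoftQConfig(loftq_bits=4, loftq_iter=5)

lora_config = LoraConfig(
    r=32,                          # LoRA rank
    lora_alpha=16,                 # LoRA alpha
    target_modules="all-linear",   # apply LoRA to every linear layer
    init_lora_weights="loftq",     # initialize adapter weights with LoftQ
    loftq_config=loftq_config,
)

model = get_peft_model(base_model, lora_config)
```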