---
tags:
  - roberta
  - adapter-transformers
datasets:
  - glue
language:
  - en
---

# Adapter `SALT-NLP/pfadapter-roberta-base-sst2-combined-value` for roberta-base

An adapter for the `roberta-base` model that was trained on the glue dataset and includes a prediction head for classification.

This adapter was created for usage with the [adapter-transformers](https://github.com/adapter-hub/adapter-transformers) library.

## Usage

First, install `adapter-transformers`:

```bash
pip install -U adapter-transformers
```

_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml)_

Now, the adapter can be loaded and activated like this:

```python
from transformers import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-base")
adapter_name = model.load_adapter("SALT-NLP/pfadapter-roberta-base-sst2-combined-value", source="hf", set_active=True)
```
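
With the adapter and its classification head active, the model can be used like an ordinary sequence classifier. Below is a minimal sketch continuing from the snippet above; the example sentence and the `[negative, positive]` label order are assumptions, since the card does not list the head's label mapping:

```python
import torch
from transformers import RobertaTokenizer

# Tokenize an example sentence (hypothetical input, not from the card).
tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
inputs = tokenizer("This movie was a delight from start to finish.", return_tensors="pt")

# Forward pass through the base model with the adapter and head active.
with torch.no_grad():
    logits = model(**inputs).logits

# SST-2 is binary sentiment; label order [negative, positive] is assumed here.
predicted = logits.argmax(dim=-1).item()
print("positive" if predicted == 1 else "negative")
```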

## Architecture & Training

## Evaluation results

## Citation