
BatteryOnlyBERT-cased for Battery Abstract Classification

Language model: batteryonlybert-cased
Language: English
Downstream-task: Text Classification
Training data: training_data.csv
Eval data: val_data.csv
Code: See example
Infrastructure: 8x DGX A100

Hyperparameters

batch_size = 32
n_epochs = 14
base_LM_model = "batteryonlybert-cased"
learning_rate = 2e-5
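
As an illustration only (this is not the authors' training script), a minimal fine-tuning sketch with these hyperparameters using the Hugging Face Trainer could look as follows; the hub path batterydata/batteryonlybert-cased for the base model, the 'text'/'label' column names in training_data.csv and val_data.csv, and the binary label count are assumptions not stated in this card.

# Sketch only: base-model hub path, column names, and label count are assumptions.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_model = "batterydata/batteryonlybert-cased"
data = load_dataset("csv", data_files={"train": "training_data.csv",
                                       "validation": "val_data.csv"})

tokenizer = AutoTokenizer.from_pretrained(base_model)
# Tokenize the abstract text; padding is handled later by the default data collator
data = data.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

args = TrainingArguments(
    output_dir="batteryonlybert-cased-abstract",
    per_device_train_batch_size=32,
    num_train_epochs=14,
    learning_rate=2e-5,
)
trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
                  train_dataset=data["train"], eval_dataset=data["validation"])
trainer.train()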

Performance

"Validation accuracy": 97.33,
"Test accuracy": 97.34,

Usage

In Transformers

from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
model_name = "batterydata/batteryonlybert-cased-abstract"

# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)

# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
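
The loaded model and tokenizer can also be used without the pipeline. A short sketch, continuing from the snippet above (it reuses text, model, and tokenizer; label names are read from the model config):

# c) Run the loaded model directly on the example sentence from a)
import torch

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])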

Authors

Shu Huang: sh2009 [at] cam.ac.uk

Jacqueline Cole: jmc61 [at] cam.ac.uk

Citation

BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement

