# BERT-base-cased for Battery Abstract Classification

**Language model:** bert-base-cased
**Language:** English
**Downstream task:** Text Classification
**Training data:** training_data.csv
**Eval data:** val_data.csv
**Code:** See example
**Infrastructure:** 8x DGX A100
## Hyperparameters

```python
batch_size = 32
n_epochs = 15
base_LM_model = "bert-base-cased"
learning_rate = 2e-5
```
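These hyperparameters map directly onto a standard Transformers fine-tuning run. The sketch below shows one hypothetical way to reproduce such a run with the `datasets` and `Trainer` APIs; the CSV column names (`text`, `label`) and `num_labels=2` are assumptions, not details taken from the original training code.

```python
# Hypothetical fine-tuning sketch using the hyperparameters listed above.
# Assumptions: training_data.csv / val_data.csv contain "text" and "label"
# columns, and the task is binary classification (num_labels=2).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base_LM_model = "bert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(base_LM_model)
model = AutoModelForSequenceClassification.from_pretrained(base_LM_model, num_labels=2)

data = load_dataset("csv", data_files={"train": "training_data.csv",
                                       "validation": "val_data.csv"})

def tokenize(batch):
    # Truncate abstracts to the model's maximum sequence length
    return tokenizer(batch["text"], truncation=True)

data = data.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-base-cased-abstract",
    per_device_train_batch_size=32,   # batch_size = 32
    num_train_epochs=15,              # n_epochs = 15
    learning_rate=2e-5,               # learning_rate = 2e-5
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    eval_dataset=data["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```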
## Performance

- Validation accuracy: 96.84%
- Test accuracy: 96.83%
## Usage

### In Transformers

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

model_name = "batterydata/bert-base-cased-abstract"

# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)

# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
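Building on b), the following minimal sketch shows manual inference with the loaded model and tokenizer, returning the predicted label and its probability. This is standard Transformers usage rather than code from the original repository.

```python
# Manual inference with the objects loaded in b); label names come from the
# model config, everything else is generic Transformers inference code.
import torch

text = ("The typical non-aqueous electrolyte for commercial Li-ion cells is "
        "a solution of LiPF6 in linear and cyclic carbonates.")
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)[0]
pred_id = int(probs.argmax())
print(model.config.id2label[pred_id], float(probs[pred_id]))
```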
## Authors
Shu Huang: sh2009 [at] cam.ac.uk
Jacqueline Cole: jmc61 [at] cam.ac.uk
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement