
SecBERT

This is the pretrained model presented in SecBERT: A Pretrained Language Model for Cyber Security Text, a BERT model trained on cyber security text.

The training corpus was papers taken from:

- APTnotes
- Stucco-Data: Cyber security data sources
- CASIE: Extracting Cybersecurity Event Information from Text
- SemEval-2018 Task 8: Semantic Extraction from CybersecUrity REports using Natural Language Processing (SecureNLP)

SecBERT has its own WordPiece vocabulary (secvocab), built to best match the training corpus.

We trained SecBERT and SecRoBERTa versions.

Available models include:

- SecBERT (jackaduma/SecBERT)
- SecRoBERTa (jackaduma/SecRoBERTa)

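Both checkpoints can be loaded with the Hugging Face transformers library. The snippet below is a minimal sketch using the SecBERT model ID from this card; the input sentence is illustrative only, not from the original card.

```python
from transformers import AutoTokenizer, AutoModel

# Load SecBERT and its secvocab-based tokenizer from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("jackaduma/SecBERT")
model = AutoModel.from_pretrained("jackaduma/SecBERT")

# Encode an illustrative cyber security sentence and run a forward pass.
inputs = tokenizer(
    "A new ransomware campaign targets unpatched VPN servers.",
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```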

Fill Mask

We propose a language model that works on cyber security text; as a result, it can improve downstream tasks (NER, text classification, semantic understanding, Q&A) in the cyber security domain.
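For instance, the sketch below (a hypothetical setup, not the card's own training code) loads SecBERT as the backbone of a binary text classifier; the classification head is freshly initialized and only becomes useful after fine-tuning.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("jackaduma/SecBERT")
# num_labels=2 is a hypothetical binary label set (e.g. malicious vs. benign).
model = AutoModelForSequenceClassification.from_pretrained(
    "jackaduma/SecBERT",
    num_labels=2,
)

batch = tokenizer(
    ["Phishing email observed with a credential-stealing attachment."],
    return_tensors="pt",
    padding=True,
    truncation=True,
)
with torch.no_grad():
    logits = model(**batch).logits
# The head is untrained here, so these scores are meaningless until fine-tuned.
print(logits.softmax(dim=-1))
```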

First, the figure below shows the fill-mask pipeline for Google's BERT, AllenAI's SciBERT, and our SecBERT.

[Figure: fill-mask results for Google BERT, AllenAI SciBERT, and SecBERT]
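The same comparison can be reproduced with the transformers fill-mask pipeline. The sketch below assumes the public checkpoints bert-base-uncased and allenai/scibert_scivocab_uncased alongside SecBERT; the masked sentence is illustrative.

```python
from transformers import pipeline

sentence = "The attacker exploited a [MASK] in the web server."

for model_id in (
    "bert-base-uncased",                 # Google BERT
    "allenai/scibert_scivocab_uncased",  # AllenAI SciBERT
    "jackaduma/SecBERT",                 # this model
):
    fill_mask = pipeline("fill-mask", model=model_id)
    top = fill_mask(sentence)[0]  # highest-scoring prediction for [MASK]
    print(f"{model_id}: {top['token_str']} ({top['score']:.4f})")
```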


The original repo can be found at https://github.com/jackaduma/SecBERT.
