LeonardPuettmann committed
Commit 21faeae • 1 Parent(s): e814488
Update README.md

README.md CHANGED

# Fine-tuned DistilBERT model for stock news classification

This DistilBERT model was fine-tuned on 50,000 stock news articles using the Hugging Face adapter from Kern AI refinery. Each training example consisted of the article's headline plus its abstract.
For the fine-tuning, a single NVIDIA K80 GPU was used for about four hours.

DistilBERT is a smaller, faster, and lighter version of BERT. It was trained by distilling BERT base and has 40% fewer parameters than bert-base-uncased.
It runs 60% faster while preserving over 95% of BERT's performance, as measured on the GLUE language understanding benchmark [1].
DistilBERT has no token-type embeddings or pooler, and it retains only half of the layers of Google's BERT [2].
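Since the training examples were built from each article's headline plus abstract, inference inputs can presumably be prepared the same way. A minimal sketch (the exact concatenation scheme is an assumption, and the strings are made up):

```python
# Hypothetical input preparation: join headline and abstract into a single string,
# mirroring the "headline plus abstract" training format described above.
headline = "Tech stocks rally after upbeat earnings"
abstract = "Shares of major technology companies rose as quarterly results beat analyst expectations."
text = f"{headline} {abstract}"  # the separator (a single space) is an assumption
```
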
## Features

- The model can handle various text classification tasks, especially when it comes to stock and finance news sentiment classification.
- The model outputs one of the three classes "positive", "neutral", or "negative", along with its confidence score for the predicted class.
- The model was fine-tuned on a custom dataset that was curated by Kern AI and labeled in our tool refinery.
- The model is currently supported by the PyTorch framework and can be easily deployed on various platforms using the Hugging Face Pipeline API, as sketched below.

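As a rough illustration of the points above, here is a minimal sketch using the Hugging Face pipeline API. The model identifier is a placeholder for this repository's id, and the printed label and score only illustrate the shape of the output:

```python
from transformers import pipeline

# Load the fine-tuned model via the text-classification pipeline.
# NOTE: "LeonardPuettmann/stock-news-distilbert" is a placeholder model id;
# replace it with the actual id of this repository.
classifier = pipeline("text-classification", model="LeonardPuettmann/stock-news-distilbert")

# Classify one stock news snippet (headline plus abstract, as in training).
result = classifier(
    "Tech stocks rally after upbeat earnings. "
    "Shares of major technology companies rose as quarterly results beat expectations."
)

# The pipeline returns a list with one dict per input, e.g.
# [{'label': 'positive', 'score': 0.97}] (values are illustrative).
print(result)
```
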
## Usage