bert_uncased_fake_news
This model is a fine-tuned version of distilbert-base-uncased on the Kaggle fake news detection (English) dataset. It achieves the following results:
- Train Loss: 0.0015
- Train Accuracy: 0.9997
- Validation Loss: 0.0048
- Validation Accuracy: 0.9983
- Test F1 Score (macro): 0.9989
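For context, the macro F1 score is the unweighted mean of the per-class F1 scores, so the fake and real classes contribute equally regardless of class frequency. A minimal sketch of the computation with scikit-learn (the labels below are illustrative, not from the actual test set):

```python
from sklearn.metrics import f1_score

# Hypothetical gold and predicted labels (0 = real, 1 = fake)
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 1, 0, 0]

# Macro F1: average of per-class F1 scores, equally weighted
print(f1_score(y_true, y_pred, average="macro"))
```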
How to use
You can use this model directly with a pipeline for text classification:
```python
>>> from transformers import pipeline

>>> classifier = pipeline("text-classification", model="rasyosef/bert_uncased_fake_news")
>>> classifier(["Wow! Talk about clueless! Austen Fletcher approaches anti-Trump protesters and gets clueless answers on why they re against Trump:Thought you might enjoy this @PrisonPlanet @allidoisowen @JackPosobiec pic.twitter.com/kdYm2WlfdB austen fletcher (@fleccas) July 17, 2017"])
[{'label': 'Fake News', 'score': 0.9999557733535767}]
```
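If you prefer to work with the model and tokenizer directly rather than through the pipeline, a minimal sketch follows (assuming the checkpoint ships TensorFlow weights, consistent with the TensorFlow 2.15 framework version listed below; the example text is illustrative):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

model_id = "rasyosef/bert_uncased_fake_news"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSequenceClassification.from_pretrained(model_id)

# Tokenize a batch of texts and run a forward pass
inputs = tokenizer(
    ["BREAKING: an illustrative headline to classify"],
    return_tensors="tf",
    truncation=True,
    padding=True,
)
logits = model(**inputs).logits

# Convert logits to a predicted label and its probability
probs = tf.nn.softmax(logits, axis=-1)
pred = int(tf.argmax(logits, axis=-1)[0])
print(model.config.id2label[pred], float(probs[0, pred]))
```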
Model description
More information needed
Intended uses & limitations
More information needed
Training hyperparameters
The following hyperparameters were used during training:
- optimizer: Adam (beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False; weight decay, gradient clipping, and EMA disabled; jit_compile: True)
- learning_rate: PolynomialDecay schedule with initial_learning_rate 5e-05, end_learning_rate 0, decay_steps 2814, power 1.0, cycle False, i.e. linear decay from 5e-05 to 0 over 2814 steps (see the sketch after this list)
- training_precision: float32
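The optimizer configuration above can be reconstructed in plain Keras. A minimal sketch, assuming the standard TensorFlow 2.15 API listed under framework versions:

```python
import tensorflow as tf

# Linear decay (power=1.0) from 5e-05 to 0 over 2814 steps,
# matching the PolynomialDecay config above
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=5e-05,
    decay_steps=2814,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam with the hyperparameters from the list above
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    jit_compile=True,
)
```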
Training results
Framework versions
- Transformers 4.35.2
- TensorFlow 2.15.0
- Datasets 2.16.1
- Tokenizers 0.15.0