---
language: en
tags:
  - summarization
license: apache-2.0
datasets:
  - cnn_dailymail
thumbnail: https://huggingface.co/front/thumbnails/distilbart_medium.png
---

## Service Deployment Code

https://github.com/manojpreveen/Summarization-Service

## Usage

This checkpoint should be loaded with `BartForConditionalGeneration.from_pretrained`. See the BART docs for more information.
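
A minimal loading and generation sketch is shown below. It assumes the checkpoint is available on the Hugging Face Hub under `manojpreveen/distilbart-cnn-v2`; the generation settings (beam size, length limits) are illustrative defaults rather than values prescribed by this card.

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Assumed Hub model id for this checkpoint; adjust if the weights live elsewhere.
model_name = "manojpreveen/distilbart-cnn-v2"

tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

article = "..."  # long news article text to summarize

# Truncate to BART's 1024-token encoder limit and summarize with beam search.
inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(
    inputs["input_ids"],
    num_beams=4,
    max_length=142,
    min_length=56,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```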

## Metrics for DistilBART models

| Model Name | Params (M) | Inference Time (ms) | Speedup | Rouge-2 | Rouge-L |
|---|---|---|---|---|---|
| facebook/bart-large-cnn (baseline) | 406 | 381 | 1.00 | 21.06 | 30.63 |
| manojpreveen/distilbart-cnn-v3 | 306 | 307 | 1.24 | 21.26 | 30.59 |
| manojpreveen/distilbart-cnn-v2 | 255 | 214 | 1.78 | 20.57 | 30.00 |
| manojpreveen/distilbart-cnn-v1 | 230 | 182 | 2.09 | 20.17 | 29.70 |
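
The inference-time and speedup numbers above depend on hardware and generation settings, so they can only be reproduced approximately. The sketch below shows one way to time per-article generation; the `articles` sample, device, and beam-search settings are assumptions, not the exact benchmarking setup used for this table.

```python
import time

import torch
from transformers import BartForConditionalGeneration, BartTokenizer


def mean_generation_ms(model_name, articles, device="cpu"):
    """Rough benchmark: average per-article generation time in milliseconds."""
    tokenizer = BartTokenizer.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name).to(device).eval()
    total = 0.0
    with torch.no_grad():
        for text in articles:
            inputs = tokenizer(
                text, max_length=1024, truncation=True, return_tensors="pt"
            ).to(device)
            start = time.perf_counter()
            model.generate(inputs["input_ids"], num_beams=4, max_length=142, min_length=56)
            total += time.perf_counter() - start
    return 1000.0 * total / len(articles)


# Speedup is the baseline's mean time divided by the distilled model's mean time,
# measured over the same articles on the same device, e.g.:
# speedup = mean_generation_ms("facebook/bart-large-cnn", sample) / \
#           mean_generation_ms("manojpreveen/distilbart-cnn-v2", sample)
```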