Gabriel committed on
Commit
08b0470
1 Parent(s): 5aaa406

Update app.py

Files changed (1)
  app.py +3 -3
app.py CHANGED
@@ -76,12 +76,12 @@ with gr.Blocks() as demo:
 
     with gr.TabItem("The Summarization Engine"):
         gr.Markdown("""
-        <h3>Abstractive vs Extractive.</h3>
+        <h3>Abstractive vs Extractive</h3>
         <p>
         Abstractive
         The underlying engine for the abstractive part is BART, a transformer-based sequence-to-sequence model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. The BART model was pre-trained as KBLab/bart-base-swedish-cased (link) to learn general knowledge about language. Afterwards, the model was further fine-tuned on two labelled datasets that have been open-sourced:
-        Gabriel/cnn_daily_swe (link)
-        Gabriel/xsum_swe (link)
+        - Gabriel/cnn_daily_swe (link)
+        - Gabriel/xsum_swe (link)
 
         For more depth regarding the training, go to link.
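Since the tab text contrasts abstractive summarization (BART generates new text) with extractive summarization (sentences are copied out of the source), a minimal stdlib-only sketch of the extractive side may make the distinction concrete. This toy frequency-based scorer is purely illustrative and is not the app's actual extractive engine:

```python
import re
from collections import Counter

def extractive_summary(text: str, n_sentences: int = 1) -> str:
    """Toy extractive summarizer: select the sentences whose words are
    most frequent in the whole document (illustrative only)."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(re.findall(r"\w+", text.lower()))

    # Score a sentence by the average document-frequency of its words.
    def score(sentence: str) -> float:
        toks = re.findall(r"\w+", sentence.lower())
        return sum(freq[t] for t in toks) / max(len(toks), 1)

    top = sorted(sentences, key=score, reverse=True)[:n_sentences]
    # Emit the selected sentences in their original order.
    return " ".join(s for s in sentences if s in top)

doc = ("BART is a sequence-to-sequence model. "
       "The model has a bidirectional encoder. "
       "The model has an autoregressive decoder. "
       "Weather was nice today.")
print(extractive_summary(doc, n_sentences=1))
```

An abstractive engine such as the fine-tuned BART model above would instead generate a new sentence that may not appear verbatim anywhere in the input.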