There's a typo in the README
#32 by Kurapika993 · opened
Model description
In "BART is a transformer encoder-encoder (seq2seq)", "encoder-encoder" should read "encoder-decoder":
"BART is a transformer encoder-encoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder."
Why hasn't this been fixed?