dataset,prompt,metric,value
e2e_nlg_cleaned,generate_text_restaurant,rouge2_fmeasure,0.10538118293432128
gem_xsum,article_DOC_summary,rouge2_fmeasure,0.030418541159472234
web_nlg_en,PALM_prompt,rouge2_fmeasure,0.05042435840541935
wiki_lingua_en,tldr_en,rouge2_fmeasure,0.04655649352957454