Falcon 2: An 11B parameter pretrained language model and VLM, trained on over 5,000B tokens and 11 languages • May 24
TEXT_Datasets Collection: Datasets for fine-tuning, instruction, and evaluation of text models, from projecte-aina • 46 items • Updated May 8