---
license: other
license_name: qwen2
license_link: https://huggingface.co/Qwen/Qwen2-72B/blob/main/LICENSE
---

![Tess-v2.5](https://huggingface.co/migtissera/Tess-v2.5-Qwen2-72B/resolve/main/Tess-v2.5.png)

# Tess-v2.5-Qwen2-72B

We've created Tess-v2.5-Qwen2-72B, the latest state-of-the-art model in the Tess series of Large Language Models (LLMs). Tess, short for Tesoro (Treasure in Italian), is the flagship LLM series created by Migel Tissera.

Tess-v2.5 brings significant improvements in reasoning, coding, and mathematics. It is currently the #1-ranked open-weight model when evaluated on MMLU (Massive Multitask Language Understanding), scoring higher than all other open-weight models, including Qwen2-72B-Instruct, Llama3-70B-Instruct, Mixtral-8x22B-Instruct, and DBRX-Instruct. Further, when evaluated on MMLU, Tess-v2.5-Qwen2-72B outperforms even the frontier closed models Gemini-1.0-Ultra, Gemini-1.5-Pro, Mistral-Large, and Claude-3-Sonnet.

Tess-v2.5-Qwen2-72B was fine-tuned over the Qwen2-72B base, using the Tess-v2.5 dataset of 300K samples spanning multiple topics, including business and management, marketing, history, social sciences, arts, STEM subjects, and computer programming. The dataset was synthetically generated with the [Sensei](https://github.com/migtissera/Sensei) framework, using multiple frontier models such as GPT-4-Turbo, Claude-Opus, and Mistral-Large.

# Evaluation

## MMLU (Massive Multitask Language Understanding)

![MMLU_open](https://huggingface.co/migtissera/Tess-v2.5-Qwen2-72B/resolve/main/Figures/mmlu_open_models.png)

![MMLU_closed](https://huggingface.co/migtissera/Tess-v2.5-Qwen2-72B/resolve/main/Figures/mmlu_closed_models.png)

## AGIEval

![AGIEval](https://huggingface.co/migtissera/Tess-v2.5-Qwen2-72B/resolve/main/Figures/AGIEval.png)
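
# Sample Inference

Since the model is a fine-tune of the Qwen2-72B base, it can be loaded with the standard Hugging Face `transformers` API. The snippet below is a minimal sketch, assuming the checkpoint ships a chat template inherited from the Qwen2 base; the system prompt and sampling parameters shown here are illustrative placeholders, not settings specified by this card.

```python
# Minimal inference sketch (assumes the standard transformers API and a
# chat template inherited from the Qwen2 base; adjust the prompt format
# if the checkpoint documents a different one).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "migtissera/Tess-v2.5-Qwen2-72B"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # 72B parameters; multiple GPUs are typically required
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are Tess, a helpful AI assistant."},  # placeholder system prompt
    {"role": "user", "content": "Explain the difference between a list and a tuple in Python."},
]

# Build the prompt from the tokenizer's chat template and tokenize it
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

# Decode only the newly generated tokens, dropping the prompt
response = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(response)
```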