nunonmg committed
Commit
c859eda
1 Parent(s): 76c1288

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -23,7 +23,7 @@ pipeline_tag: translation
 
 TowerInstruct-7B is a language model that results from fine-tuning TowerBase on the TowerBlocks supervised fine-tuning dataset. TowerInstruct-7B-v0.2 is the first model in the series.
 The model is trained to handle several translation-related tasks, such as general machine translation (e.g., sentence- and paragraph/document-level translation, terminology-aware translation, context-aware translation), automatic post-editing, named-entity recognition, grammatical error correction, and paraphrase generation.
-We will release more details in the upcoming technical report.
+We will release more details in the upcoming technical report. For now, you can check results obtained with the model [here](https://unbabel.com/announcing-tower-an-open-multilingual-llm-for-translation-related-tasks/).
 
 - **Developed by:** Unbabel, Instituto Superior Técnico, CentraleSupélec University of Paris-Saclay
 - **Model type:** A 7B parameter model fine-tuned on a mix of publicly available, synthetic datasets on translation-related tasks, as well as conversational datasets and code instructions.
@@ -33,6 +33,8 @@ We will release more details in the upcoming technical report.
 
 **Update**: TowerInstruct-7B-v0.2 has more reliable document-level translation capabilities in comparison with TowerInstruct-7B-v0.1. The new version of TowerBlocks used to train v0.2 is also available in the Tower collection.
 
+
+
 ## Intended uses & limitations
 
 The model was initially fine-tuned on a filtered and preprocessed supervised fine-tuning dataset ([TowerBlocks](https://huggingface.co/datasets/Unbabel/TowerBlocks-v0.1)), which contains a diverse range of data sources:
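For readers who want to try the translation tasks described above, here is a minimal, hypothetical usage sketch (not part of this commit): it assumes the updated checkpoint is published on the Hugging Face Hub as `Unbabel/TowerInstruct-7B-v0.2` and exposes a standard chat template, as is typical for instruction-tuned models.

```python
# Hypothetical usage sketch; model id and chat-template behaviour are assumptions,
# not something stated in this diff.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Unbabel/TowerInstruct-7B-v0.2",  # assumed Hub id for the v0.2 release
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Sentence-level translation, one of the tasks listed in the model card.
messages = [
    {
        "role": "user",
        "content": (
            "Translate the following text from Portuguese into English.\n"
            "Portuguese: Um grupo de investigadores lançou um novo modelo.\n"
            "English:"
        ),
    }
]

# Build the prompt with the model's chat template, then generate deterministically.
prompt = generator.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
outputs = generator(prompt, max_new_tokens=128, do_sample=False)
print(outputs[0]["generated_text"])
```

The same pattern would cover the other listed tasks (document-level translation, automatic post-editing, grammatical error correction, and so on) by changing only the user message.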