mlabonne committed
Commit 95baf36
1 Parent(s): 44c48b2

Update README.md

Files changed (1)
  1. README.md +5 -1
README.md CHANGED
@@ -171,4 +171,8 @@ messages = [{"role": "user", "content": "Explain what a Mixture of Experts is in
 prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
 outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
 print(outputs[0]["generated_text"])
-```
+```
+
+Output:
+
+> A Mixture of Experts (ME) is a machine learning technique that combines multiple expert models to make predictions or decisions. Each expert model is specialized in a different aspect of the problem, and their outputs are combined to produce a more accurate and robust solution. This approach allows the model to leverage the strengths of individual experts and compensate for their weaknesses, improving overall performance.
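
For context, the hunk above is the tail of the README's `transformers` usage example; the `pipeline` and `messages` objects it uses are defined earlier in that file. Below is a minimal sketch of the setup those lines assume. The model id and the exact prompt are placeholders, not taken from this commit.

```python
# Sketch of the setup assumed by the diffed snippet (not part of this commit).
import torch
import transformers
from transformers import AutoTokenizer

model = "mlabonne/your-model"  # placeholder model id, not this repository's actual id

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,
    device_map="auto",
)

# The README's prompt is truncated in the hunk header above; this is a stand-in.
messages = [{"role": "user", "content": "Explain what a Mixture of Experts is"}]
```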