Update README.md
README.md CHANGED
@@ -69,7 +69,6 @@ processor = AutoProcessor.from_pretrained(model_name_moe, trust_remote_code=True
 moe_model = AutoModelForCausalLM.from_pretrained(
     model_name_moe,config=config,
     trust_remote_code=True, torch_dtype=torch.bfloat16,
-    # quantization_config=quantization_config,
 ).to(device)
 
 count_parameters(moe_model)
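The `count_parameters` helper used above is defined elsewhere in the README and is not touched by this change; a minimal sketch of what such a helper typically does (an assumption, not the repository's exact implementation):

```python
def count_parameters(model):
    # Print total and trainable parameter counts (assumed behavior of the helper referenced above).
    total = sum(p.numel() for p in model.parameters())
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    print(f"Total parameters: {total/1e9:.2f}B (trainable: {trainable/1e9:.2f}B)")
```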
@@ -108,8 +107,10 @@ Output:
 
 <pre style="white-space: pre-wrap;">
 The image shows a group of ants climbing over a vertical surface. The ants are using their legs and antennae to navigate the surface, demonstrating their ability to adapt to different environments and overcome obstacles. This behavior is relevant for materials design because it highlights the ants' ability to optimize their movements and interactions with their surroundings, which can inspire the development of advanced materials that mimic these natural adaptations.
-
-Multi-agent AI refers to the use of artificial intelligence algorithms to simulate and analyze the behavior of multiple agents, such as ants, in a system. This approach allows for the study of complex interactions and emergent properties that arise from the collective actions of individual agents. By understanding how ants navigate and interact with their environment, researchers can gain insights into the design of materials that exhibit similar properties, such as self-healing, adaptive behavior, and enhanced functionality.
+
+Multi-agent AI refers to the use of artificial intelligence algorithms to simulate and analyze the behavior of multiple agents, such as ants, in a system. This approach allows for the study of complex interactions and emergent properties that arise from the collective actions of individual agents. By understanding how ants navigate and interact with their environment, researchers can gain insights into the design of materials that exhibit similar properties, such as self-healing, adaptive behavior, and enhanced functionality.
+
+The image of ants climbing over a vertical surface highlights their ability to adapt and optimize their movements, which can inspire the development of advanced materials that mimic these natural adaptations. Multi-agent AI provides a framework for analyzing and understanding the behavior of these agents, enabling the design of materials that exhibit similar properties.
 </pre>
 
 ## Make a Idefics-2-MoE model from scratch using several pre-trained models
@@ -162,20 +163,12 @@ from transformers import BitsAndBytesConfig
 
 DEVICE='cuda'
 
-quantization_config = BitsAndBytesConfig(
-    load_in_4bit=True,
-    bnb_4bit_quant_type="nf4",
-    bnb_4bit_use_double_quant=True,
-    bnb_4bit_compute_dtype=torch.bfloat16
-)
-
 model_id_1='lamm-mit/Cephalo-Idefics-2-vision-8b-beta'
 
 model_1 = Idefics2ForConditionalGeneration.from_pretrained( model_id_1,
     torch_dtype=torch.bfloat16, #if your GPU allows
     _attn_implementation="flash_attention_2", #make sure Flash Attention 2 is installed
     trust_remote_code=True,
-    #quantization_config=quantization_config,
 )
 processor = AutoProcessor.from_pretrained(
     f"{model_id_1}",
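This hunk drops the 4-bit `BitsAndBytesConfig` block and loads the expert models in bf16 instead. If GPU memory is limited, the deleted configuration can still be re-enabled; a minimal sketch adapted from the removed lines (assumes `bitsandbytes` is installed; when loading in 4-bit, skip the later `.to(DEVICE)` step and let `from_pretrained` handle placement):

```python
import torch
from transformers import BitsAndBytesConfig, Idefics2ForConditionalGeneration

# 4-bit NF4 quantization, as in the block removed above
quantization_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model_1 = Idefics2ForConditionalGeneration.from_pretrained(
    "lamm-mit/Cephalo-Idefics-2-vision-8b-beta",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    quantization_config=quantization_config,  # pass it instead of leaving it commented out
)
```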
@@ -196,7 +189,6 @@ model_2 = Idefics2ForConditionalGeneration.from_pretrained( model_id_2,
     torch_dtype=torch.bfloat16, #if your GPU allows
     _attn_implementation="flash_attention_2", #make sure Flash Attention 2 is installed
     trust_remote_code=True,
-    #quantization_config=quantization_config,
 )
 
 model_id_3='HuggingFaceM4/idefics2-8b'
@@ -205,7 +197,6 @@ model_3 = Idefics2ForConditionalGeneration.from_pretrained( model_id_3,
     torch_dtype=torch.bfloat16, #if your GPU allows
     _attn_implementation="flash_attention_2", #make sure Flash Attention 2 is installed
     trust_remote_code=True,
-    #quantization_config=quantization_config,
 )
 ```
 Put on device:
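The "Put on device:" step that follows in the README is outside this diff; presumably it moves the three expert models to the `DEVICE` defined above, e.g. (a hypothetical reconstruction, not the README's exact code):

```python
# Move the three expert models to the GPU declared earlier (DEVICE='cuda')
model_1 = model_1.to(DEVICE)
model_2 = model_2.to(DEVICE)
model_3 = model_3.to(DEVICE)
```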
@@ -363,7 +354,6 @@ processor = AutoProcessor.from_pretrained(model_name_moe, trust_remote_code=True
 moe_model = AutoModelForCausalLM.from_pretrained(
     model_name_moe,config=config,
     trust_remote_code=True, torch_dtype=torch.bfloat16,
-    # quantization_config=quantization_config,
 ).to(device)
 
 count_parameters(moe_model)
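Once the merged MoE checkpoint is loaded, it can be prompted through the `processor` created above like any Idefics-2 model; a short usage sketch following the standard Idefics-2 chat-template flow (the image path and prompt are placeholders, not taken from the README):

```python
from PIL import Image

# Placeholder image path; any PIL image works
image = Image.open("ants.png")

messages = [{
    "role": "user",
    "content": [
        {"type": "image"},
        {"type": "text", "text": "What does this image show, and why is it relevant for materials design?"},
    ],
}]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(text=prompt, images=[image], return_tensors="pt").to(device)

generated_ids = moe_model.generate(**inputs, max_new_tokens=256)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```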