model.resize_token_embeddings() method is broken - resizes embedding table but not lm_head
#21 by alexpeys - opened
Minimal code to reproduce:
import torch
from transformers import MllamaForConditionalGeneration, AutoProcessor

model_id = "meta-llama/Llama-3.2-11B-Vision"
model = MllamaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
)
processor = AutoProcessor.from_pretrained(model_id)

# shapes before resizing
print(f"{model.language_model.model.embed_tokens}")
print(f"{model.language_model.lm_head}")

processor.tokenizer.add_tokens(['|atoken|', '|alsoatoken|'])
model.resize_token_embeddings(len(processor.tokenizer))

# embed_tokens has grown to the new vocab size, but lm_head still has the old output size
print(f"{model.language_model.model.embed_tokens}")
print(f"{model.language_model.lm_head}")
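Until this is fixed upstream, one possible workaround (a sketch, not the library's fix — the helper `resize_lm_head` below is a name I made up) is to rebuild the head manually with the new vocab size, copying the old weights into the first rows:

```python
import torch
import torch.nn as nn

def resize_lm_head(lm_head: nn.Linear, new_vocab_size: int) -> nn.Linear:
    """Return a new Linear head with new_vocab_size output rows.

    Old rows are copied over; any extra rows keep the default
    random init (you may want to re-init them from the embedding
    mean instead, as resize_token_embeddings does for embeddings).
    """
    new_head = nn.Linear(
        lm_head.in_features,
        new_vocab_size,
        bias=lm_head.bias is not None,
        dtype=lm_head.weight.dtype,
    )
    n = min(lm_head.out_features, new_vocab_size)
    with torch.no_grad():
        new_head.weight[:n] = lm_head.weight[:n]
        if lm_head.bias is not None:
            new_head.bias[:n] = lm_head.bias[:n]
    return new_head

# toy demonstration on a small, bias-free head (like the model's)
head = nn.Linear(16, 100, bias=False)
head = resize_lm_head(head, 102)
```

On the actual model this would be something like `model.language_model.lm_head = resize_lm_head(model.language_model.lm_head, model.language_model.model.embed_tokens.num_embeddings)`, run after `resize_token_embeddings`, so both tables agree on the vocab size.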