Accessing the embedding layer and generating embeddings step by step

#9
by francescopatane - opened

Hi! I'm trying to use Captum for integrated gradients. I'm currently using a fine-tuned model:

model = EsmForSequenceClassification.from_pretrained("francescopatane/esm2_t6_8M_UR50D-xAI")

To use IntegratedGradients, Captum needs the model output (class 0 or 1) and the model input (embeddings):

lig = LayerIntegratedGradients(model_output, model_input)

model_input can't be a simple tensor, because Captum needs to compute a new embedding for every interpolation step between the baseline and the input, so I need a way to call the embedding layer directly (like model.bert.embeddings). I tried model.esm.embeddings, but that only gives me the layer module from the architecture. How can I capture embedding generation in a variable? Thank you very much.
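For context, the step-by-step embedding computation described above can be sketched without Captum at all: integrated gradients interpolates between a baseline embedding and the input embedding, averages the gradients along that path, and scales by the difference. The snippet below is a minimal NumPy sketch with a hypothetical toy embedding table and a toy linear scorer (none of these values come from the ESM model); it only illustrates the mechanism Captum's LayerIntegratedGradients automates.

```python
import numpy as np

# Hypothetical toy embedding table: 4 tokens, 2-dim embeddings.
# Token 0 plays the role of the [PAD]/baseline token.
E = np.array([[0.0, 0.0],
              [1.0, 2.0],
              [3.0, -1.0],
              [0.5, 0.5]])

# Toy linear "classifier" weights (stand-in for the real model head).
w = np.array([2.0, -1.0])

def forward(emb):
    # emb: (seq_len, dim) -> scalar score
    return float((emb @ w).sum())

def grad_forward(emb):
    # Gradient of forward w.r.t. emb; constant because the model is linear.
    return np.tile(w, (emb.shape[0], 1))

def integrated_gradients(input_ids, baseline_ids, steps=50):
    x = E[input_ids]      # embed the input, as the embedding layer would
    b = E[baseline_ids]   # embed the baseline (e.g. an all-PAD sequence)
    total = np.zeros_like(x)
    for k in range(1, steps + 1):
        alpha = k / steps
        # "a new embedding for every step": a point on the baseline->input path
        total += grad_forward(b + alpha * (x - b))
    # Scale averaged gradients by (input - baseline), per token and dimension.
    return (x - b) * total / steps

attr = integrated_gradients([1, 2, 3], [0, 0, 0])
```

By the completeness axiom, attr.sum() equals forward on the input minus forward on the baseline, which is a handy sanity check for any integrated-gradients setup.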
Francesco, Ms