Update of modeling_cogvlm.py for Transformers newer version

#15

Compatible with Transformers > 4.41.2

Details:

The current implementation raises an error on the following lines:

if past_key_values is not None:
    past_key_values_length = past_key_values[0][0].shape[2]
    seq_length_with_past = seq_length_with_past + past_key_values_length

This happens when the installed Transformers version is greater than 4.41.2.
The issue is caused by a change, introduced in v4.42.0, to the output of the _extract_past_from_model_output function defined in src/transformers/generation/utils.py. I have tested that this fix also works with transformers 4.44.0.
[Screenshot of the error: Screenshot 2024-08-15 at 11.45.29 AM.png]
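To make the failure mode concrete, here is a small, self-contained illustration. It reflects my reading of the v4.42.0 change (the helper now returns a (cache_name, cache) pair instead of the cache alone) and is not taken from this repository or from the screenshot above:

# Toy illustration of why the old indexing breaks once the helper returns a
# (cache_name, cache) pair instead of the cache itself.
import torch

# Legacy-style cache: one (key, value) tensor pair per layer.
cache = ((torch.zeros(1, 2, 5, 4), torch.zeros(1, 2, 5, 4)),)

# Pre-v4.42.0 behaviour: past_key_values is the cache itself.
past_key_values = cache
print(past_key_values[0][0].shape[2])  # 5, the cached sequence length

# v4.42.0+ behaviour (my reading): the same variable ends up holding the pair,
# so past_key_values[0] is a string and the old indexing raises an error.
past_key_values = ("past_key_values", cache)
try:
    past_key_values[0][0].shape[2]
except AttributeError as err:
    print(err)  # 'str' object has no attribute 'shape'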

Therefore, my PR checks the installed Transformers version and adjusts how the output of _extract_past_from_model_output is processed, so that cogvlm2 works both with newer Transformers releases (e.g., 4.44.0) and with versions below 4.42.0.
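For reference, a minimal sketch of such a version check, assuming the only difference is whether the private helper returns the cache alone or a (cache_name, cache) pair; the helper name below is hypothetical and not the exact code in this PR:

# Minimal sketch, not the exact code from this PR: normalise the output of the
# private _extract_past_from_model_output helper across Transformers versions.
import transformers
from packaging import version

TRANSFORMERS_GE_4_42 = version.parse(transformers.__version__) >= version.parse("4.42.0")

def get_past_key_values(model, outputs):
    # Hypothetical helper so the calling code in modeling_cogvlm.py sees a
    # single return convention regardless of the installed version.
    extracted = model._extract_past_from_model_output(outputs)
    if TRANSFORMERS_GE_4_42:
        # Newer releases return a (cache_name, cache) pair; keep only the cache.
        _, past_key_values = extracted
    else:
        # Older releases return the cache directly.
        past_key_values = extracted
    return past_key_values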

Qishuai changed pull request title from Upload modeling_cogvlm.py to Update of modeling_cogvlm.py for Transformers newer version

This solved my issue!

Knowledge Engineering Group (KEG) & Data Mining at Tsinghua University org

cogvlm-videos has already been changed to support transformers 4.44.0; I will copy over that change as soon as possible.

This PR works with transformers 4.44.0 as well. I think it could be an option to just merge this PR πŸ˜ƒ

Hi, can you test and merge this PR?

zRzRzRzRzRzRzR changed pull request status to merged
