Update transformer

#14
opened by dn6 (diffusers-internal-dev org), edited Jun 7
Update the transformer config to use joint_attention_dim, and make pos_embed_max_size configurable so that a single transformer class can handle multiple model sizes.
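A minimal sketch of the config change described above. The field names joint_attention_dim and pos_embed_max_size come from the PR description; the numeric values and the fallback default are assumptions for illustration, not taken from this PR:

```python
# Illustrative config dict (values are assumed, not from this PR):
# pos_embed_max_size is a configurable field alongside joint_attention_dim,
# so one transformer class can cover multiple model sizes.
config = {
    "joint_attention_dim": 4096,  # assumed value for illustration
    "pos_embed_max_size": 192,    # configurable instead of hard-coded
}

def pos_embed_max_size_for(config, default=96):
    # Fall back to a default when loading older configs that predate the field.
    return config.get("pos_embed_max_size", default)

print(pos_embed_max_size_for(config))  # 192
print(pos_embed_max_size_for({}))      # 96 (older config without the field)
```

Reading the field with a fallback keeps older checkpoints loadable while letting larger variants override the positional-embedding grid size.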

dn6 changed pull request status to open
dn6 changed pull request status to merged
