I noticed that the checkpoints from your CVPR'24 results contain no in_proj_weight, only in_proj_bias, among the nn.MultiheadAttention parameters, which blocks me from loading and evaluating them.
Also, which config yaml file corresponds to these checkpoints? The README does not seem to match the code and checkpoints.
You need to check model_vpt.py: attn.in_proj_weight has been chunked into q_proj_weight, k_proj_weight, and v_proj_weight so that only the query and value projections are finetuned, while in_proj_bias was left unmodified. This might help you.
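If you want to load such a checkpoint into a standard nn.MultiheadAttention module instead, a minimal sketch is to concatenate the three chunks back into in_proj_weight (nn.MultiheadAttention stacks them as [q; k; v] along dim 0). The key suffixes below are assumptions based on the description above — adjust them to whatever model_vpt.py actually emits.

```python
import torch

def merge_qkv_weights(state_dict):
    """Rebuild in_proj_weight from split q/k/v weights.

    Assumes keys end in 'q_proj_weight', 'k_proj_weight',
    'v_proj_weight' (hypothetical names; verify against the
    actual checkpoint before use).
    """
    merged = dict(state_dict)
    # find every attention prefix whose weight was chunked
    prefixes = {k[: -len("q_proj_weight")]
                for k in state_dict if k.endswith("q_proj_weight")}
    for p in prefixes:
        q = merged.pop(p + "q_proj_weight")
        k = merged.pop(p + "k_proj_weight")
        v = merged.pop(p + "v_proj_weight")
        # nn.MultiheadAttention stores in_proj_weight as q, k, v
        # stacked along dim 0, shape (3 * embed_dim, embed_dim)
        merged[p + "in_proj_weight"] = torch.cat([q, k, v], dim=0)
    return merged
```

The merged state dict should then load with load_state_dict on a model built with an unmodified nn.MultiheadAttention.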