I am trying to quantize the 7B model, but I keep running into this error:

ValueError: The checkpoint you are trying to load has model type `multi_modality` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`

Is there a way to quantize the 7B model?
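A minimal sketch of one possible workaround, assuming the checkpoint bundles its own modeling code (an `auto_map` entry in its `config.json`): passing `trust_remote_code=True` lets Transformers load architectures such as `multi_modality` that it does not register natively, and a `BitsAndBytesConfig` applies 4-bit quantization at load time. The checkpoint path below is a placeholder, not the actual repo id; if the checkpoint does not ship remote code, you would need the model author's own package to load it instead.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Placeholder: substitute the actual 7B checkpoint you are trying to load.
model_id = "path/to/7B-checkpoint"

# 4-bit quantization via bitsandbytes (`pip install bitsandbytes`).
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

# trust_remote_code=True lets Transformers execute custom modeling code
# shipped with the checkpoint, which is how unregistered model types are
# usually loaded. This assumes the repo actually provides such code.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    trust_remote_code=True,
    device_map="auto",
)
```

Note that bitsandbytes 4-bit loading generally requires a CUDA GPU, and it quantizes in memory rather than producing a quantized checkpoint on disk; methods like GPTQ or AWQ are alternatives if you need a persisted quantized model.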