ValueError: The checkpoint you are trying to load has model type `qwen3` but Transformers does not recognize this architecture.
Hey,
As the title says, I am trying to download the model via Hugging Face, but I got this error. The process was interrupted before the weights could be downloaded.
I have the latest versions of Transformers and Sentence-Transformers:
transformers 4.55.2
sentence-transformers 5.1.0
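For reference, this is roughly how I checked the installed versions (just a quick sketch):
import transformers, sentence_transformers
print(transformers.__version__)          # 4.55.2 on my machine
print(sentence_transformers.__version__) # 5.1.0 on my machine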
So I was wondering, what is the issue here? Should I downgrade them? Your readme says:
Requires transformers>=4.51.0
Requires sentence-transformers>=2.7.0
Thanks for any help!
Hello @DaniDubi
This repository contains a script which works.
If you're not using a venv (python3.11 -m venv .venv, then .\.venv\Scripts\activate on Windows or source .venv/bin/activate on macOS/Linux), you should do so.
If you just want to download the model, you can try hf download Qwen/Qwen3-Embedding-0.6B.
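If you'd rather do the download from Python, something along these lines should also work (a sketch using huggingface_hub, which transformers depends on anyway):
from huggingface_hub import snapshot_download

local_path = snapshot_download("Qwen/Qwen3-Embedding-0.6B")  # downloads (or reuses) the cached snapshot
print(local_path)  # you can pass this path to SentenceTransformer below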
Afterwards, you should be able to use the downloaded directory using something similar to
from sentence_transformers import SentenceTransformer
MODEL_PATH: str = "Qwen/Qwen3-Embedding-0.6B"  # the repo id, or the local directory the hf command downloaded the model to
model: SentenceTransformer = SentenceTransformer(MODEL_PATH)
Assuming you're using a virtual environment, this shouldn't throw/raise any error.
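As a quick sanity check, you can then encode a couple of sentences (just a sketch; the exact query prompt handling is described in the model card):
sentences = ["The weather is lovely today.", "It's so sunny outside!"]
embeddings = model.encode(sentences)  # numpy array of shape (2, embedding_dim)
print(embeddings.shape)
similarity = model.similarity(embeddings, embeddings)  # similarity matrix; available in recent sentence-transformers versions
print(similarity)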
Hope you have a great day.
Thanks @Hamzah-Asadullah!
Now (after actually repeating the same steps as before) I managed to download it and load the model.
But once I tried model.encode() on a simple short sentence, as shown in the examples in the model card, I got a kernel crash with the following error:
loc("mps_matmul"("(mpsFileLoc): /AppleInternal/Library/BuildRoots/4~B5E4ugDCh2RsPWAjMEoPu8LC5w1yXEwd7XweDhg/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShadersGraph/mpsgraph/MetalPerformanceShadersGraph/Core/Files/MPSGraphUtilities.mm":43:0)): error: incompatible dimensions
loc("mps_matmul"("(mpsFileLoc): /AppleInternal/Library/BuildRoots/4~B5E4ugDCh2RsPWAjMEoPu8LC5w1yXEwd7XweDhg/Library/Caches/com.apple.xbs/Sources/MetalPerformanceShadersGraph/mpsgraph/MetalPerformanceShadersGraph/Core/Files/MPSGraphUtilities.mm":43:0)): error: invalid shape
LLVM ERROR: Failed to infer result type(s).
[1] 63443 abort python
/opt/homebrew/Cellar/[email protected]/3.12.9/Frameworks/Python.framework/Versions/3.12/lib/python3.12/multiprocessing/resource_tracker.py:255: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
My hardware is an Apple Silicon M4 Mac; looking at the error, it might be related to my chip (the "MPS" backend). Have you encountered something similar?
Thank you
Vadim
Hey @DaniDubi
I'm currently not at home, but I will be investigating this as soon as possible.
Please note that I don't have an Apple device so my comments will likely not be of much help.
I will be commenting here shortly.
Hope you have a great day.
Hello again @DaniDubi
Looking at the error you provided, I can see an LLVM ERROR (line 3), so maybe you used LLVM? I can't really prove that, but from my personal experience it only really runs on Linux, and even there, there are major bugs.
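If the crash is really coming from the MPS backend, one thing that might be worth trying (just a sketch, I can't test it myself since I don't have Apple hardware) is forcing the model onto the CPU:
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B", device="cpu")  # bypass the MPS backend entirely
embeddings = model.encode(["a simple short sentence"])
print(embeddings.shape)
There is also the PYTORCH_ENABLE_MPS_FALLBACK=1 environment variable on the PyTorch side, though I'm not sure it helps with this particular matmul error.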
I won't really try to convince you to use sentence_transformers over LLVM, but if you're generally aiming for something more efficient than transformers-based libraries, you might want to try llama.cpp (since its main focus is Apple Silicon). However, getting sentence/text-similarity running there is quite a headache, as far as I've read its documentation.
If you really just want a quick server to connect to, you can either use Hamzah-Asadullah/TSE, which is quite straightforward to set up, or maybe ask ChatGPT/Claude/Deepseek to generate a small script for you. Other than that, I can't really help you much, since I'm only some years into Python and I've never dealt with an Apple device.
Hope you have a great day.
It is OK, do not worry, there are other embedding models that are good as well. E.g. mixedbread-ai/mxbai-embed-large-v1 works out of the box with sentence_transformers, no problems so far.
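For illustration, something along the lines of the earlier snippet (just with the model name swapped):
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("mixedbread-ai/mxbai-embed-large-v1")
embeddings = model.encode(["A simple test sentence."])
print(embeddings.shape)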
I appreciate your help!
Have a great week as well
Dani