robolamp/Ministral-3-8B-Base-2512-GGUF
Tags: GGUF, conversational
License: apache-2.0
Branch: main
Ministral-3-8B-Base-2512-GGUF: 66.6 GB, 1 contributor, 5 commits
Latest commit: fda9b52 (verified), "Update README.md" by robolamp, 6 days ago
File                                   Size       Last commit message                   Updated
.gitattributes                         2.24 kB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-BF16.gguf     17 GB      Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q2_K.gguf     3.35 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q3_K_M.gguf   4.24 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q3_K_S.gguf   3.87 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q4_K_M.gguf   5.2 GB     Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q4_K_S.gguf   4.95 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q5_K_M.gguf   6.06 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q5_K_S.gguf   5.92 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q6_K.gguf     6.97 GB    Upload folder using huggingface_hub   7 days ago
Ministral-3-8B-Base-2512-Q8_0.gguf     9.03 GB    Upload folder using huggingface_hub   7 days ago
README.md                              178 Bytes  Update README.md                      6 days ago
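
Each quantization in the listing is a standalone .gguf file, so a specific variant can be fetched without downloading all 66.6 GB. Below is a minimal sketch using huggingface_hub's hf_hub_download; the choice of the Q4_K_M file is only an example, and the variable names are illustrative:

    from huggingface_hub import hf_hub_download

    # Download a single quantized GGUF file (about 5.2 GB) instead of the whole repository.
    gguf_path = hf_hub_download(
        repo_id="robolamp/Ministral-3-8B-Base-2512-GGUF",
        filename="Ministral-3-8B-Base-2512-Q4_K_M.gguf",
    )
    print(gguf_path)  # local cache path of the downloaded file

The returned path points into the local Hugging Face cache and can then be handed to a GGUF-compatible runtime such as llama.cpp.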