Original model: Stellar-Umbra-12B by Vortex5
Available ExLlamaV3 0.0.16 quants
| Type | Size |
|---|---|
| H8-4.0BPW | 7.49 GB |
| H8-6.0BPW | 10.22 GB |
| H8-8.0BPW | 12.95 GB |
Requirements: a Python installation with the `huggingface-hub` package to download via the CLI.
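As a minimal sketch of a CLI download, assuming the quants are published as branches named after the quant type (a common convention for EXL3 repos — check the repository's branch list for the actual revision names):

```shell
# Install or update the huggingface-hub package, which provides the CLI.
pip install -U huggingface_hub

# Download one quant into a local directory.
# The repo id is taken from the model tree below; the --revision value
# (H8-6.0BPW) is an assumption matching the quant names in the table above.
huggingface-cli download DeathGodlike/Stellar-Umbra-12B_EXL3 \
  --revision H8-6.0BPW \
  --local-dir Stellar-Umbra-12B_EXL3-6.0bpw
```

Pick the `--revision` that matches the quant you want from the table.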
Licensing: the license for these quantized models is inherited from the original model (see the link above).
Model tree for DeathGodlike/Stellar-Umbra-12B_EXL3
Base model
Vortex5/Stellar-Umbra-12B