---
library_name: transformers
license: apache-2.0
datasets:
- schneewolflabs/Athanorlite-DPO
base_model:
- nbeerbower/Schreiber-mistral-nemo-12B
---

# A0l-12B

Same training run as [schneewolflabs/A0-12B](https://huggingface.co/schneewolflabs/A0-12B), but using [schneewolflabs/Athanorlite-DPO](https://huggingface.co/schneewolflabs/Athanorlite-DPO) as the dataset. Preliminary tests indicate that this model has superior writing capabilities compared to A0-12B.

## Configuration

![image/png](https://raw.githubusercontent.com/Schneewolf-Labs/Merlina/refs/heads/main/frontend/madewithmerlina_smol.png)
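
## Usage

A minimal loading sketch with the `transformers` library declared in the card metadata. The repository id `schneewolflabs/A0l-12B` is an assumption inferred from the card title; adjust it if the actual repo name differs.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id, inferred from the card title; adjust if it differs.
MODEL_ID = "schneewolflabs/A0l-12B"


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Lazily load the model and return a completion for `prompt`.

    Loading happens inside the function so importing this module
    does not trigger a multi-gigabyte download.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Call `generate("Write a short story about a lighthouse keeper.")` to sample from the model; `device_map="auto"` places the weights on an available GPU when one is present.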