# Wind Arc 1.5
Wind Arc 1.5 is the most capable model in the family (198M parameters, 24 layers), fine-tuned for deeper reasoning, structured answers, and richer theological knowledge. It supports chain-of-thought reasoning for complex questions.
> "Trust in the Lord with all your heart, and do not lean on your own understanding. In all your ways acknowledge him, and he will make straight your paths." — Proverbs 3:5-6
## About
Wind Arc 1.5 is built on the same 24-layer architecture as North Star 1 but trained with an additional fine-tuning pass emphasizing reasoning quality, theological depth, and structured responses. It is the best model for hard questions, long-form answers, and chain-of-thought prompting.
This model was trained entirely from scratch — no foundation model was used.
## Christian Worldview
We built these models because we believe truth matters, and the source of all truth is God. Scripture is not one perspective among many — it is the revealed Word of the living God, given to us through the Holy Spirit.

> "In the beginning was the Word, and the Word was with God, and the Word was God." — John 1:1

Jesus Christ is not merely a historical figure or moral teacher. He is the eternal Son of God, who took on flesh, lived a sinless life, died as an atoning sacrifice for sin, and rose bodily from the dead on the third day. This is the Gospel — the best news in history.

> "For I delivered to you as of first importance what I also received: that Christ died for our sins in accordance with the Scriptures, that he was buried, that he was raised on the third day in accordance with the Scriptures." — 1 Corinthians 15:3-4
We invite anyone reading this to pick up a Bible and read the Gospel of John. Ask God sincerely if He is real. He answers.
## Model Details
| Property | Value |
|---|---|
| Parameters | 198M |
| Layers | 24 |
| d_model | 768 |
| Heads | 12 query heads, 3 KV heads (GQA) |
| Context | 512 tokens |
| Vocabulary | 32,000 (SentencePiece) |
| Architecture | GQA Transformer + SwiGLU + RoPE + RMSNorm |
| Special | Chain-of-thought fine-tuned |
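To make the architecture row concrete, here is a minimal sketch of two of the listed components, RMSNorm and a SwiGLU feed-forward, at the table's dimensions. This is an illustration, not the repo's actual model code; the FFN hidden size of 2048 is an assumption (a common choice near 8/3 × d_model), not a published detail.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RMSNorm(nn.Module):
    """Root-mean-square layer norm: scales by 1/RMS(x), no mean subtraction, no bias."""
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + self.eps)
        return self.weight * (x * rms)

class SwiGLU(nn.Module):
    """SwiGLU feed-forward: down(SiLU(gate(x)) * up(x)), all projections bias-free."""
    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.gate = nn.Linear(dim, hidden, bias=False)
        self.up = nn.Linear(dim, hidden, bias=False)
        self.down = nn.Linear(hidden, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down(F.silu(self.gate(x)) * self.up(x))

# (batch, context, d_model) matching the table: 512-token context, d_model 768.
x = torch.randn(2, 512, 768)
y = SwiGLU(768, 2048)(RMSNorm(768)(x))   # hidden=2048 is an assumed FFN width
print(y.shape)                           # torch.Size([2, 512, 768])
```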
## Usage
```python
import torch

# Load the checkpoint on CPU. weights_only=False is required because the file
# stores a plain config dict alongside the weights, not just tensors.
ckpt = torch.load("windarc15.pt", map_location="cpu", weights_only=False)
cfg = ckpt["cfg"]
# Build model from cfg, load ckpt["model"] state dict
```
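Since the Wind Arc model class itself lives in the training repo, here is a self-contained sketch of the checkpoint round-trip using a stand-in module. The only part taken from the snippet above is the assumed file layout: a dict with a `"cfg"` config dict and a `"model"` state dict; everything else is illustrative.

```python
import torch
import torch.nn as nn

# Stand-in for the real model; the actual class is built from cfg in the repo.
net = nn.Linear(768, 768, bias=False)

# Write a checkpoint in the same shape the usage snippet reads:
# {"cfg": <config dict>, "model": <state dict>}.
torch.save(
    {"cfg": {"d_model": 768, "n_layers": 24}, "model": net.state_dict()},
    "demo_ckpt.pt",
)

ckpt = torch.load("demo_ckpt.pt", map_location="cpu", weights_only=False)
restored = nn.Linear(768, 768, bias=False)
# strict=True (the default) raises if checkpoint and module keys disagree.
restored.load_state_dict(ckpt["model"])
print(ckpt["cfg"]["d_model"])   # 768
```

The same pattern applies to the real checkpoint: construct the model from `cfg`, then call `load_state_dict(ckpt["model"])`.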
## Tokenizer
Uses SentencePiece (`tokenizer.model`), available in the shared tokenizer repo.