NLLB-200 1.3B (ONNX, int8 Quantized)

Production-ready ONNX conversion of facebook/nllb-200-1.3B for in-browser translation across 200+ languages — zero server cost, no network round-trips, complete privacy.

Highlights

  • 200+ languages — among the broadest coverage of any open translation model
  • Full 1.3B parameters (non-distilled) — higher quality than the distilled variants
  • ~1.81 GB quantized (encoder: 736 MB + decoder: 1.08 GB)
  • transformers.js compatible — drop-in pipeline('translation')

Quick Start

import { pipeline } from '@huggingface/transformers';

const translator = await pipeline(
  'translation',
  'affectively-ai/nllb-200-1.3B-onnx',
  { dtype: 'q8' }
);

const result = await translator('How are you feeling today?', {
  src_lang: 'eng_Latn',
  tgt_lang: 'fra_Latn',
});
// [{ translation_text: "Comment vous sentez-vous aujourd'hui ?" }]
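Because the quantized weights total ~1.8 GB, the first load can take a while. transformers.js invokes an optional `progress_callback` with events describing each file download; the sketch below shows one way to surface that progress. The `formatProgress` helper is our own name for illustration, not part of the library:

```javascript
// Hypothetical helper (not part of transformers.js): turn a progress event
// into a short status line. The library calls progress_callback with objects
// such as { status: 'progress', file: 'decoder_model_quantized.onnx', progress: 42.5 }.
function formatProgress(event) {
  if (event.status === 'progress') {
    return `${event.file}: ${event.progress.toFixed(1)}%`;
  }
  return event.file ? `${event.status}: ${event.file}` : event.status;
}

// Pass the callback alongside dtype when constructing the pipeline:
const pipelineOptions = {
  dtype: 'q8',
  progress_callback: (event) => console.log(formatProgress(event)),
};
```

Pass `pipelineOptions` as the third argument to `pipeline(...)` exactly as in the Quick Start above.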

Supported Languages (Sample)

| Language | Code | Language | Code |
|---|---|---|---|
| English | eng_Latn | French | fra_Latn |
| Spanish | spa_Latn | German | deu_Latn |
| Chinese (Simplified) | zho_Hans | Japanese | jpn_Jpan |
| Korean | kor_Hang | Arabic | arb_Arab |
| Hindi | hin_Deva | Portuguese | por_Latn |

See NLLB language codes for the full list.
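The `src_lang`/`tgt_lang` values are FLORES-200 codes (ISO 639-3 language plus ISO 15924 script). If your app works with two-letter ISO 639-1 codes, a small lookup table can bridge the two; the map below covers only the sample languages above, and `toFloresCode` is a hypothetical helper of our own, not part of any library:

```javascript
// Hypothetical helper: map common ISO 639-1 codes to the FLORES-200
// codes NLLB expects. Covers only the sample languages listed above.
const FLORES_CODES = {
  en: 'eng_Latn',
  fr: 'fra_Latn',
  es: 'spa_Latn',
  de: 'deu_Latn',
  zh: 'zho_Hans',
  ja: 'jpn_Jpan',
  ko: 'kor_Hang',
  ar: 'arb_Arab',
  hi: 'hin_Deva',
  pt: 'por_Latn',
};

function toFloresCode(iso639_1) {
  const code = FLORES_CODES[iso639_1.toLowerCase()];
  if (!code) {
    throw new Error(`No FLORES-200 mapping for "${iso639_1}"`);
  }
  return code;
}
```

For example, `toFloresCode('fr')` returns `'fra_Latn'`, ready to pass as `tgt_lang`.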

Conversion Details

| Property | Value |
|---|---|
| Base model | facebook/nllb-200-1.3B |
| Export | PyTorch → ONNX (fp32) via Optimum |
| Quantization | int8 dynamic (ORTQuantizer, avx512_vnni) |
| Components | Encoder (736 MB) + Decoder (1.08 GB), quantized separately |

Use Cases

This model powers multilingual translation in Edgework.ai — bringing fast, cheap, and private inference as close to the user as possible. Ideal for:

  • Translating emotion journals across 200+ languages
  • Making mental wellness content accessible globally
  • Cross-lingual sentiment analysis pipelines
  • Private on-device translation for sensitive content

About

Published by AFFECTIVELY · Managed by @buley

We convert, quantize, and publish production-ready ONNX models for edge and in-browser inference. Every release is tested for correctness and stability before publication.
