---
license: mit
datasets:
- wikimedia/wikipedia
- bookcorpus/bookcorpus
- SetFit/mnli
- sentence-transformers/all-nli
language:
- en
new_version: v1.1
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
- nlp
- tiny-bert
- edge-ai
- transformers
- low-resource
- micro-nlp
- quantized
- iot
- wearable-ai
- offline-assistant
- intent-detection
- real-time
- smart-home
- embedded-systems
- command-classification
- toy-robotics
- voice-ai
- eco-ai
- english
- lightweight
- mobile-nlp
metrics:
- accuracy
- f1
- inference
- recall
library_name: transformers
---
# bert-lite: A Lightweight BERT for Efficient NLP

## Overview
Meet bert-lite, a streamlined BERT for efficient NLP. Designed with efficiency in mind, this model features a compact architecture tailored for tasks like MNLI and NLI while excelling in low-resource environments. With its lightweight footprint, bert-lite is well suited to edge devices, IoT applications, and real-time NLP needs.
## Why bert-lite? The Lightweight Edge
- Compact Power: Optimized for speed and size
- Fast Inference: Blazing quick on constrained hardware
- Small Footprint: Minimal storage demands
- Eco-Friendly: Low energy consumption
- Versatile: IoT, wearables, smart homes, and more!
## Model Details
| Property | Value |
|---|---|
| Layers | Custom lightweight design |
| Hidden Size | Optimized for efficiency |
| Attention Heads | Minimal yet effective |
| Parameters | Ultra-low parameter count |
| Size | Quantized for minimal storage |
| Base Model | google-bert/bert-base-uncased |
| Version | v1.1 (April 04, 2025) |
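The table above keeps the exact configuration abstract. As a rough guide, the parameter count of any BERT-style encoder follows directly from its vocabulary size, hidden size, layer count, and feed-forward size. The sketch below estimates it in plain Python; the "lite" configuration shown is an illustrative assumption, not bert-lite's published configuration:

```python
def bert_param_count(vocab, hidden, layers, intermediate, max_pos=512, type_vocab=2):
    """Estimate the parameter count of a BERT-style encoder (weights + biases)."""
    # Embeddings: word + position + token-type tables, plus one LayerNorm (gamma, beta)
    embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden
    # Self-attention: Q, K, V, and output projections (each hidden x hidden + bias)
    attention = 4 * (hidden * hidden + hidden)
    # Feed-forward: hidden -> intermediate -> hidden, with biases
    ffn = (hidden * intermediate + intermediate) + (intermediate * hidden + hidden)
    # Two LayerNorms per encoder layer
    per_layer = attention + ffn + 2 * (2 * hidden)
    # Pooler: one dense hidden x hidden layer applied to [CLS]
    pooler = hidden * hidden + hidden
    return embeddings + layers * per_layer + pooler

# bert-base-uncased: V=30522, H=768, L=12, I=3072
print(bert_param_count(30522, 768, 12, 3072))  # 109482240

# A hypothetical "lite" configuration (illustrative numbers only)
print(bert_param_count(30522, 256, 4, 1024))
```

Shrinking the hidden size matters most here: the embedding table alone scales linearly with it, which is why even a 4-layer model keeps most of its parameters in the embeddings.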
## License
MIT License: free to use, modify, and share.
## Usage Example: Masked Language Modeling (MLM)
```python
from transformers import pipeline

# Build a fill-mask pipeline from the bert-lite checkpoint
mlm_pipeline = pipeline("fill-mask", model="boltuix/bert-lite")

masked_sentences = [
    "The robot can [MASK] the room in minutes.",
    "He decided to [MASK] the project early.",
    "This device is [MASK] for small tasks.",
    "The weather will [MASK] by tomorrow.",
    "She loves to [MASK] in the garden.",
    "Please [MASK] the door before leaving.",
]

# Print the top 3 candidates for each masked position
for sentence in masked_sentences:
    print(f"Input: {sentence}")
    predictions = mlm_pipeline(sentence)
    for pred in predictions[:3]:
        print(f"→ {pred['sequence']} (score: {pred['score']:.4f})")
```
### Masked Language Model (MLM) Output
```
Input: The robot can [MASK] the room in minutes.
→ the robot can leave the room in minutes. (score: 0.1608)
→ the robot can enter the room in minutes. (score: 0.1067)
→ the robot can open the room in minutes. (score: 0.0498)
Input: He decided to [MASK] the project early.
→ he decided to start the project early. (score: 0.1503)
→ he decided to continue the project early. (score: 0.0812)
→ he decided to leave the project early. (score: 0.0412)
Input: This device is [MASK] for small tasks.
→ this device is used for small tasks. (score: 0.4118)
→ this device is useful for small tasks. (score: 0.0615)
→ this device is required for small tasks. (score: 0.0427)
Input: The weather will [MASK] by tomorrow.
→ the weather will be by tomorrow. (score: 0.0980)
→ the weather will begin by tomorrow. (score: 0.0868)
→ the weather will come by tomorrow. (score: 0.0657)
Input: She loves to [MASK] in the garden.
→ she loves to live in the garden. (score: 0.3112)
→ she loves to stay in the garden. (score: 0.0823)
→ she loves to be in the garden. (score: 0.0796)
Input: Please [MASK] the door before leaving.
→ please open the door before leaving. (score: 0.3421)
→ please shut the door before leaving. (score: 0.3208)
→ please closed the door before leaving. (score: 0.0599)
```
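Predictions like those above can drive the intent-detection and command-classification use cases this card tags. Below is a minimal, dependency-free sketch that maps fill-mask candidates (dicts shaped like the pipeline output, with `token_str` and `score` fields) onto a small command vocabulary; the command set and threshold are illustrative assumptions, not part of the model:

```python
# Illustrative command vocabulary for a smart-home style assistant
KNOWN_COMMANDS = {"open", "close", "shut", "lock", "unlock"}

def detect_command(predictions, threshold=0.05):
    """Return the highest-scoring known command, or None if nothing clears the bar."""
    for pred in sorted(predictions, key=lambda p: p["score"], reverse=True):
        word = pred["token_str"].strip().lower()
        if word in KNOWN_COMMANDS and pred["score"] >= threshold:
            return word
    return None

# Candidates as produced for "Please [MASK] the door before leaving." above
candidates = [
    {"token_str": "open", "score": 0.3421},
    {"token_str": "shut", "score": 0.3208},
    {"token_str": "closed", "score": 0.0599},
]
print(detect_command(candidates))  # open
```

The threshold guards against acting on low-confidence fills; on-device, tuning it trades false triggers against missed commands.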
## Who's It For?
- Developers: Lightweight NLP apps for mobile or IoT
- Innovators: Power wearables, smart homes, or robots
- Enthusiasts: Experiment on a budget
- Eco-Warriors: Reduce AI's carbon footprint
## Metrics That Matter
- Accuracy: Competitive with larger models
- F1 Score: Balanced precision and recall
- Inference Time: Optimized for real-time use
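For readers who want to reproduce accuracy and F1 on their own label sets, both metrics reduce to a few lines of arithmetic. This is a plain-Python sketch for binary labels (positive class = 1), not the card's evaluation script:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the gold labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_score(y_true, y_pred, positive=1):
    """F1 for one positive class: harmonic mean of precision and recall."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy labels, purely for illustration
y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]
print(accuracy(y_true, y_pred))  # 0.8333...
print(f1_score(y_true, y_pred))
```

F1 is the metric reported above precisely because, unlike raw accuracy, it penalizes a model that buys precision by sacrificing recall (or vice versa).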
## Trained On
- Wikipedia (wikimedia/wikipedia)
- BookCorpus (bookcorpus/bookcorpus)
- MNLI (Multi-Genre NLI)
- sentence-transformers/all-nli
## Tags
#tiny-bert #iot #wearable-ai #intent-detection #smart-home #offline-assistant #nlp #transformers
