---
license: mit
datasets:
- wikimedia/wikipedia
- bookcorpus/bookcorpus
- SetFit/mnli
- sentence-transformers/all-nli
language:
- en
new_version: v1.1
base_model:
- google-bert/bert-base-uncased
pipeline_tag: text-classification
tags:
- BERT
- MNLI
- NLI
- transformer
- pre-training
- nlp
- tiny-bert
- edge-ai
- transformers
- low-resource
- micro-nlp
- quantized
- iot
- wearable-ai
- offline-assistant
- intent-detection
- real-time
- smart-home
- embedded-systems
- command-classification
- toy-robotics
- voice-ai
- eco-ai
- english
- lightweight
- mobile-nlp
metrics:
- accuracy
- f1
- inference
- recall
library_name: transformers
---

![Banner](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWsG0Nmwt7QDnCpZuNrWGRaDGURIV9QWifhhaDbBDaCb0wPEeGQidUl-jgE-GC21QDa-3WXgpM6y9OTWjvhnpho9nDmDNf3MiHqhs-sfhwn-Rphj3FtASbbQMxyPx9agHSib-GPj18nAxkYonB6hOqCDAj0zGis2qICirmYI8waqxTo7xNtZ6Ju3yLQM8/s1920/bert-%20lite.png)

# 🌟 bert-lite: A Lightweight BERT for Efficient NLP 🌟

## 🚀 Overview
Meet **bert-lite**, a streamlined BERT built with efficiency in mind! 🎉 Its compact architecture is tailored for natural language inference (NLI) tasks such as **MNLI**, while still holding up in low-resource environments. With a lightweight footprint, `bert-lite` is a strong fit for edge devices, IoT applications, and real-time NLP needs. 🌐

---

## 🌟 Why bert-lite? The Lightweight Edge
- 🔍 **Compact Power**: Optimized for speed and size
- ⚡ **Fast Inference**: Quick even on constrained hardware
- 💾 **Small Footprint**: Minimal storage demands
- 🌱 **Eco-Friendly**: Low energy consumption
- 🎯 **Versatile**: IoT, wearables, smart homes, and more!

---

## 🧠 Model Details

| Property            | Value                          |
|---------------------|--------------------------------|
| 🧱 Layers           | Custom lightweight design      |
| 🧠 Hidden Size      | Optimized for efficiency       |
| 👁️ Attention Heads | Minimal yet effective          |
| ⚙️ Parameters       | Ultra-low parameter count      |
| 💽 Size             | Quantized for minimal storage  |
| 🌐 Base Model       | google-bert/bert-base-uncased  |
| 🆙 Version          | v1.1 (April 04, 2025)          |

For the concrete layer, hidden-size, and head counts, see the config-inspection sketch at the end of this card.

---

## 📜 License
MIT License: free to use, modify, and share.

## 🔤 Usage Example – Masked Language Modeling (MLM)

```python
from transformers import pipeline

# Load the fill-mask pipeline with the bert-lite checkpoint
mlm_pipeline = pipeline("fill-mask", model="boltuix/bert-lite")

# Each sentence contains exactly one [MASK] token to fill
masked_sentences = [
    "The robot can [MASK] the room in minutes.",
    "He decided to [MASK] the project early.",
    "This device is [MASK] for small tasks.",
    "The weather will [MASK] by tomorrow.",
    "She loves to [MASK] in the garden.",
    "Please [MASK] the door before leaving.",
]

# Print the top-3 predictions and their scores for each sentence
for sentence in masked_sentences:
    print(f"Input: {sentence}")
    predictions = mlm_pipeline(sentence)
    for pred in predictions[:3]:
        print(f"✨ → {pred['sequence']} (score: {pred['score']:.4f})")
```

---

## 🔤 Masked Language Modeling (MLM) Output

```text
Input: The robot can [MASK] the room in minutes.
✨ → the robot can leave the room in minutes. (score: 0.1608)
✨ → the robot can enter the room in minutes. (score: 0.1067)
✨ → the robot can open the room in minutes. (score: 0.0498)

Input: He decided to [MASK] the project early.
✨ → he decided to start the project early. (score: 0.1503)
✨ → he decided to continue the project early. (score: 0.0812)
✨ → he decided to leave the project early. (score: 0.0412)

Input: This device is [MASK] for small tasks.
✨ → this device is used for small tasks. (score: 0.4118)
✨ → this device is useful for small tasks. (score: 0.0615)
✨ → this device is required for small tasks. (score: 0.0427)

Input: The weather will [MASK] by tomorrow.
✨ → the weather will be by tomorrow. (score: 0.0980)
✨ → the weather will begin by tomorrow. (score: 0.0868)
✨ → the weather will come by tomorrow. (score: 0.0657)

Input: She loves to [MASK] in the garden.
✨ → she loves to live in the garden. (score: 0.3112)
✨ → she loves to stay in the garden. (score: 0.0823)
✨ → she loves to be in the garden. (score: 0.0796)

Input: Please [MASK] the door before leaving.
✨ → please open the door before leaving. (score: 0.3421)
✨ → please shut the door before leaving. (score: 0.3208)
✨ → please closed the door before leaving. (score: 0.0599)
```

---
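## 🧩 Usage Example – Text Classification (NLI)

This card's `pipeline_tag` is text-classification, so the checkpoint is also intended for NLI-style classification. Below is a minimal sketch, assuming the published checkpoint ships a sequence-classification head fine-tuned on MNLI; the label names and their mapping to entailment/neutral/contradiction come from the model's `id2label` config and are not guaranteed here.

```python
from transformers import pipeline

# Assumption: the checkpoint includes a sequence-classification head
# fine-tuned on MNLI. If it ships only pretrained encoder weights,
# this head will be randomly initialized and the labels meaningless.
nli_pipeline = pipeline("text-classification", model="boltuix/bert-lite")

premise = "A man is playing a guitar on stage."
hypothesis = "Someone is performing music."

# The text-classification pipeline accepts premise/hypothesis pairs
# as a dict with "text" and "text_pair" keys
result = nli_pipeline({"text": premise, "text_pair": hypothesis})
print(result)  # e.g. [{'label': ..., 'score': ...}] per the model's id2label
```

---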
## 💡 Who's It For?
- 👨‍💻 **Developers**: Lightweight NLP apps for mobile or IoT
- 🤖 **Innovators**: Power wearables, smart homes, or robots
- 🧪 **Enthusiasts**: Experiment on a budget
- 🌿 **Eco-Warriors**: Reduce AI's carbon footprint

## 📈 Metrics That Matter
- ✅ **Accuracy**: Competitive with larger models
- 🎯 **F1 Score**: Balanced precision and recall
- ⚡ **Inference Time**: Optimized for real-time use

## 🧪 Trained On
- 📘 Wikipedia
- 📚 BookCorpus
- 🧾 MNLI (Multi-Genre NLI)
- 🔗 sentence-transformers/all-nli

## 🔖 Tags
#tiny-bert #iot #wearable-ai #intent-detection #smart-home #offline-assistant #nlp #transformers
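## 🧱 Inspecting the Architecture

As noted under Model Details, the table above is qualitative. The exact layer count, hidden size, and attention-head count are defined by the checkpoint's published config, so the simplest way to pin them down is to read it directly. A minimal sketch follows; the printed values depend on whatever config ships with `boltuix/bert-lite`.

```python
from transformers import AutoConfig

# Fetch the configuration that ships with the checkpoint
config = AutoConfig.from_pretrained("boltuix/bert-lite")

# Standard fields on BERT-style configs
print("Layers:         ", config.num_hidden_layers)
print("Hidden size:    ", config.hidden_size)
print("Attention heads:", config.num_attention_heads)
print("Vocab size:     ", config.vocab_size)
```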
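## ⚡ Quantization Sketch

The tags mention quantization, but this card does not say how the published weights were shrunk. As one generic approach (not necessarily the author's method), PyTorch dynamic quantization stores the weights of `Linear` layers as int8 and quantizes activations on the fly, which typically cuts model size and CPU latency.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("boltuix/bert-lite")
model = AutoModel.from_pretrained("boltuix/bert-lite")  # bare encoder
model.eval()

# Dynamic quantization: int8 weights for Linear layers, CPU-only,
# no calibration data required
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

# Run a quick forward pass to confirm the quantized encoder works
inputs = tokenizer("Turn off the kitchen lights.", return_tensors="pt")
with torch.no_grad():
    outputs = quantized(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```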