Gemma 3n E4B IT LiteRT (Mirror Copy)

Model Summary

This repository contains an unaltered mirror copy of the original model
google/gemma-3n-E4B-it-litert-lm, created for backup and redistribution purposes only.

The author of this repository does not claim authorship, contribution, or credit for the model.

  • Source model: google/gemma-3n-E4B-it-litert-lm
  • Copy date: 2026-02-01
  • Original file name: gemma-3n-E4B-it-int4-Web.litertlm
  • File size: 4,275,044,352 bytes
  • Model hash: 63730ba3225a23a90d3292d89fdeff1a7537cedeb72aa687de9a35732d057e52 (SHA-256, single-file bundle)
  • Modification status: No changes have been made to the model weights, configuration, or tokenizer files.

This repository does not introduce any additional training, fine-tuning, pruning, quantization, or structural changes.
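Since the model ships as a single ~4 GB bundle, it is worth checking the downloaded file against the SHA-256 hash listed above before use. The sketch below is one way to do that with Python's standard library; the file name matches the "Original file name" field, and the path is an assumption you should adjust to wherever you saved the file.

```python
# Sketch: verify the downloaded bundle against the SHA-256 listed above.
# The expected digest is copied from the "Model hash" field of this card.
import hashlib

EXPECTED_SHA256 = "63730ba3225a23a90d3292d89fdeff1a7537cedeb72aa687de9a35732d057e52"

def sha256_of_file(path, chunk_size=1024 * 1024):
    """Hash a large file in 1 MiB chunks so it is never loaded fully into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (assuming the bundle sits in the current directory):
#   digest = sha256_of_file("gemma-3n-E4B-it-int4-Web.litertlm")
#   assert digest == EXPECTED_SHA256, f"hash mismatch: {digest}"
```

A mismatch indicates a corrupted or truncated download; re-fetch the file rather than attempting to use it.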


Original Model Information

All architectural details, training procedures, and evaluation results are identical to the original model and should be referenced directly from the source repository.


License and Terms of Use

This model is distributed under the Gemma Terms of Use, as published by Google:

https://ai.google.dev/gemma/terms

Users of this repository must comply with all applicable terms and restrictions, including but not limited to the usage limitations described in Section 3.2 of the Gemma Terms of Use.

A copy of the Gemma Terms of Use is included in this repository for reference.


Redistribution Notice

This repository is provided as a mirror copy of the original Gemma model.
Redistribution is performed in accordance with the Gemma Terms of Use.

Notice:

Gemma is provided under and subject to the Gemma Terms of Use found at
https://ai.google.dev/gemma/terms

If you redistribute this model or any derivative work, you are responsible for ensuring that all required notices and terms are preserved.


Intended Use

The intended use of this model is the same as the original Gemma 3n E4B IT LiteRT model, including:

  • Research and experimentation
  • Prototyping and evaluation of language model applications
  • Educational and non-production analysis

This repository does not define any new intended use beyond what is specified by the original authors.


Limitations and Risks

This model inherits all known limitations, biases, and risks of the original Gemma model, including but not limited to:

  • Potential generation of incorrect or misleading information
  • Sensitivity to prompt phrasing
  • Biases present in the training data

Users should apply appropriate safeguards, evaluation, and monitoring when using the model in any downstream application.


Acknowledgements

Gemma is developed and released by Google.

All credit for model design, training, and evaluation belongs to the original authors.
This repository exists solely to provide a faithful copy for backup and compliant redistribution.
