# SigLIP 2 So400m Patch16 384 (MambaMia Compatible)

This is a modified version of google/siglip2-so400m-patch16-384 whose preprocessing configuration has been adjusted for compatibility with the MambaMia framework.
## Modifications

The only change relative to the original model is to preprocessor_config.json, made to ensure compatibility with the MambaMia framework. The model weights are unchanged.
## Original Model
- Base Model: google/siglip2-so400m-patch16-384
- Developed by: Google
- Model type: Vision Encoder (SigLIP 2)
- License: Apache 2.0
## Usage

This model is intended to serve as the vision encoder within the MambaMia framework. Refer to the MambaMia repository for detailed usage instructions.
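For quick inspection outside MambaMia, the checkpoint can also be loaded directly with Hugging Face `transformers`. The sketch below is an assumption, not part of this repository: it loads the model via `AutoModel`/`AutoImageProcessor` and extracts patch-level features from the vision tower; within MambaMia itself, the framework's own loading code should be used.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModel

REPO_ID = "gwkrsrch/siglip2-so400m-patch16-384"

def encode_image(image: Image.Image) -> torch.Tensor:
    """Return patch-level features from the SigLIP 2 vision tower."""
    processor = AutoImageProcessor.from_pretrained(REPO_ID)
    model = AutoModel.from_pretrained(REPO_ID)
    model.eval()
    # The checkpoint may load as a full SiglipModel or as a vision-only
    # encoder; use the vision tower in either case.
    vision = getattr(model, "vision_model", model)
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = vision(pixel_values=inputs["pixel_values"])
    # A 384x384 input with 16x16 patches yields 24 * 24 = 576 patch tokens.
    return outputs.last_hidden_state

if __name__ == "__main__":
    features = encode_image(Image.new("RGB", (512, 512)))
    print(features.shape)  # (batch, num_patches, hidden_dim)
```

Note that the processor resizes inputs to 384x384 according to preprocessor_config.json, so arbitrary input resolutions are handled automatically.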
## Citation

If you use this model, please cite the original SigLIP 2 paper and MambaMia:

    @article{tschannen2025siglip,
      title={SigLIP 2: Multilingual Vision-Language Encoders with Improved Semantic Understanding, Localization, and Dense Features},
      author={Tschannen, Michael and others},
      journal={arXiv preprint},
      year={2025}
    }
## License
This model is released under the Apache 2.0 License, following the original model's license.