MuMo-pin1

This model was trained with the MuMo (Multi-Modal Molecular) framework.

Model Description

  • Model Type: MuMo Pretrained Model
  • Model Size: 0.4B parameters (F32 tensors, safetensors format)
  • Training Data: Molecular structures and properties
  • Framework: PyTorch + Transformers

Usage

Loading the Model

MuMo uses a custom loading function. First, clone the MuMo repository so that its `model.load_model` module is on your Python path:

```bash
git clone https://github.com/selmiss/MuMo.git
cd MuMo  # run the Python code below from the repository root
```
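The card lists PyTorch and Transformers as the framework. A minimal environment setup might look like the following; the unpinned package list is an assumption, and the repository's requirements file is authoritative:

```bash
# Assumed dependencies, based on the "PyTorch + Transformers" note above;
# check the MuMo repository for exact pinned versions.
pip install torch transformers
```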

Then load the pretrained model:

```python
from dataclasses import dataclass

from transformers import AutoConfig, AutoTokenizer

from model.load_model import load_model  # custom loader from the MuMo repository

# Load configuration and tokenizer
repo = "zihaojing/MuMo-pin1"
config = AutoConfig.from_pretrained(repo, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(repo)

# Set up model arguments
@dataclass
class ModelArgs:
    model_name_or_path: str = repo
    model_class: str = "MuMoFinetunePairwise"  # or "MuMoPretrain" for pretraining
    cache_dir: str = None
    model_revision: str = "main"
    use_auth_token: bool = False
    task_type: str = None  # e.g., "classification" or "regression" for finetuning

model_args = ModelArgs()

# Load the model
model = load_model(config, tokenizer=tokenizer, model_args=model_args)
```
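Once loaded, the model can be exercised like any other PyTorch module. The sketch below is illustrative only: it assumes the tokenizer accepts SMILES strings and that the model follows the standard Transformers `forward(**inputs)` convention; consult the MuMo repository for the actual inference interface.

```python
import torch

# ASSUMPTION: the tokenizer accepts SMILES strings and the model exposes the
# usual Transformers forward signature; verify against the MuMo repository.
smiles = "CC(=O)Oc1ccccc1C(=O)O"  # aspirin, used purely as an example input
inputs = tokenizer(smiles, return_tensors="pt")

model.eval()
with torch.no_grad():
    outputs = model(**inputs)
print(outputs)
```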

Notes:

  • Use model_class="MuMoPretrain" for pretraining or inference.
  • Use model_class="MuMoFinetune" or "MuMoFinetunePairwise" for finetuning tasks.
  • Set task_type to "classification" or "regression" when using MuMoFinetune (see the sketch after these notes).
  • The model can be loaded from both the Hugging Face Hub (e.g., "zihaojing/MuMo-pin1") and local paths (e.g., "/path/to/model").
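As a concrete illustration of the notes above, a classification finetuning setup might look like the following. It continues from the loading snippet, so `config`, `tokenizer`, and `load_model` are already in scope; the accepted values for `model_class` and `task_type` are those listed in the notes.

```python
from dataclasses import dataclass

@dataclass
class FinetuneArgs:
    model_name_or_path: str = "zihaojing/MuMo-pin1"
    model_class: str = "MuMoFinetune"    # finetuning variant, per the notes above
    cache_dir: str = None
    model_revision: str = "main"
    use_auth_token: bool = False
    task_type: str = "classification"    # required when using MuMoFinetune

finetune_model = load_model(config, tokenizer=tokenizer, model_args=FinetuneArgs())
```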

Training Details

  • Training script: see the MuMo repository for the full training pipeline
  • Framework: Transformers + DeepSpeed (a launch sketch follows below)
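As a rough sketch of what a DeepSpeed launch can look like: the script name `finetune.py` and the config path are hypothetical, and the actual entry points and flags are defined in the MuMo repository.

```bash
# Hypothetical launch command; the real training script and its arguments
# live in the MuMo repository.
deepspeed --num_gpus 4 finetune.py --deepspeed ds_config.json
```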

Citation

If you use this model, please cite the original MuMo paper.
