# emo-detector
emo-detector is a multi-label emotion detection model for text. It predicts one or more emotions from the following labels:
- anger
- fear
- joy
- sadness
- surprise
## Model Details
- Architecture: Pretrained DeBERTa + custom FFNN classifier
- Task: Multi-label text classification
- Tokenizer: DeBERTa tokenizer (`microsoft/deberta-v3-base`)
- Output: Per-label probabilities, thresholded to 0/1
## Custom Model Class

This model uses a custom architecture defined inside the `emo_detector/` module:

- `emo_detector/configuration_bert_ffnn.py` → `BertFFNNConfig`
- `emo_detector/modeling_bert_ffnn.py` → `BERT_FFNN`
To load or fine-tune this model, you must download the full repository (including the `emo_detector/` folder). The recommended way is to use `snapshot_download()` from the Hugging Face Hub.
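For orientation, the two custom classes follow the usual Transformers pattern: a `PretrainedConfig` subclass plus a model that wraps a pretrained DeBERTa backbone and feeds its `[CLS]` representation through a small feed-forward classifier. The sketch below is illustrative only; the hidden size, dropout, and attribute names are assumptions, and the authoritative definitions are the files in `emo_detector/`.

```python
# Illustrative sketch only -- not the shipped implementation.
# Hidden size, dropout, and attribute names are assumptions; see
# emo_detector/configuration_bert_ffnn.py and emo_detector/modeling_bert_ffnn.py
# for the real definitions.
import torch.nn as nn
from transformers import AutoModel, PretrainedConfig


class BertFFNNConfig(PretrainedConfig):
    model_type = "bert_ffnn"

    def __init__(self, backbone="microsoft/deberta-v3-base", num_labels=5,
                 hidden_dim=256, dropout=0.1, **kwargs):
        super().__init__(**kwargs)
        self.backbone = backbone
        self.num_labels = num_labels
        self.hidden_dim = hidden_dim
        self.dropout = dropout


class BERT_FFNN(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(config.backbone)
        self.classifier = nn.Sequential(
            nn.Linear(self.encoder.config.hidden_size, config.hidden_dim),
            nn.ReLU(),
            nn.Dropout(config.dropout),
            nn.Linear(config.hidden_dim, config.num_labels),
        )

    def forward(self, input_ids, attention_mask=None, **kwargs):
        # Encode the batch and use the first ([CLS]) token as the sentence vector
        hidden = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        return self.classifier(hidden[:, 0])  # one logit per emotion label
```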
## Installation

```bash
pip install torch transformers huggingface_hub
```
## Usage

```python
import sys

import torch
from transformers import AutoTokenizer
from huggingface_hub import snapshot_download

# Download the entire repository (weights + custom code) and make it importable
repo_dir = snapshot_download("NeuralNest05/emo-detector")
sys.path.append(repo_dir)

# Import the custom architecture and config
from emo_detector.configuration_bert_ffnn import BertFFNNConfig
from emo_detector.modeling_bert_ffnn import BERT_FFNN

DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained("NeuralNest05/emo-detector")

# Load model config and build the architecture
config = BertFFNNConfig.from_pretrained("NeuralNest05/emo-detector")
model = BERT_FFNN(config)

# Load weights
model.load_state_dict(torch.load(f"{repo_dir}/pytorch_model.bin", map_location=DEVICE))
model.to(DEVICE)
model.eval()

# Example prediction
texts = ["I am very happy today!", "This is scary..."]
encodings = tokenizer(texts, truncation=True, padding=True, return_tensors="pt").to(DEVICE)

with torch.no_grad():
    logits = model(**encodings)

# Independent sigmoid per label, then threshold to get a multi-hot prediction
probs = torch.sigmoid(logits)
threshold = 0.5
preds = (probs > threshold).int()

print(preds)
```
## Output Format

Each prediction corresponds to the five emotion labels in this order:

`["anger", "fear", "joy", "sadness", "surprise"]`

The output is a multi-hot vector, e.g. `[0, 0, 1, 0, 0]` → joy.
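As a small follow-up (reusing `texts` and `preds` from the Usage snippet above), you can map each multi-hot row back to label names like this:

```python
LABELS = ["anger", "fear", "joy", "sadness", "surprise"]

# Decode each multi-hot row from `preds` into the corresponding emotion names
for text, row in zip(texts, preds):
    emotions = [LABELS[i] for i, flag in enumerate(row.tolist()) if flag]
    print(f"{text!r} -> {emotions}")
```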
## License

MIT License
## Acknowledgements
- Microsoft DeBERTa-v3
- Hugging Face Transformers
- PyTorch