Direct Use

  from transformers import AutoTokenizer, AutoModelForSequenceClassification
  import torch

  model_name = "SkyAsl/Bert-Emotion_classifier"
  tokenizer = AutoTokenizer.from_pretrained(model_name)
  model = AutoModelForSequenceClassification.from_pretrained(model_name)
  model.eval()

  text = "I am so happy to see you!"
  inputs = tokenizer(text, return_tensors="pt")

  # Inference only: disable gradient tracking
  with torch.no_grad():
      outputs = model(**inputs)
  predicted_class = torch.argmax(outputs.logits, dim=-1).item()

  # Label mapping from the dair-ai/emotion dataset
  id2label = {
      0: "sadness", 1: "joy", 2: "love",
      3: "anger", 4: "fear", 5: "surprise"
  }
  print("Predicted emotion:", id2label[predicted_class])

Training Details

Training Data

https://huggingface.co/datasets/dair-ai/emotion

Training Hyperparameters

  lr = 2e-4
  batch_size = 128
  epochs = 5
  weight_decay = 0.01
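
The card does not include a training script; the sketch below shows one plausible way to reproduce the run with the Trainer API and the hyperparameters above. The base checkpoint (bert-base-uncased) and the tokenization settings (max_length=128) are assumptions, not stated on the card:

  from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                            TrainingArguments, Trainer)
  from datasets import load_dataset

  base = "bert-base-uncased"  # assumed base checkpoint
  tokenizer = AutoTokenizer.from_pretrained(base)
  model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=6)

  dataset = load_dataset("dair-ai/emotion")
  tokenized = dataset.map(
      lambda batch: tokenizer(batch["text"], truncation=True,
                              padding="max_length", max_length=128),
      batched=True,
  )

  # Hyperparameters from the card
  args = TrainingArguments(
      output_dir="bert-emotion",
      learning_rate=2e-4,
      per_device_train_batch_size=128,
      num_train_epochs=5,
      weight_decay=0.01,
  )

  trainer = Trainer(
      model=model,
      args=args,
      train_dataset=tokenized["train"],
      eval_dataset=tokenized["validation"],
  )
  trainer.train()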

Metrics

  training_loss: 0.106100
  validation_loss: 0.143851
  accuracy: 0.940000
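
These figures are taken from the card. A minimal sketch for recomputing accuracy on the validation split, reusing the tokenizer and model objects from the Direct Use snippet above (the batch size and padding settings are assumptions):

  import torch
  from datasets import load_dataset

  val = load_dataset("dair-ai/emotion", split="validation")
  model.eval()
  correct = 0
  for i in range(0, len(val), 64):
      batch = val[i : i + 64]   # dict of lists: {'text': [...], 'label': [...]}
      inputs = tokenizer(batch["text"], return_tensors="pt",
                         padding=True, truncation=True)
      with torch.no_grad():
          preds = model(**inputs).logits.argmax(dim=-1)
      correct += (preds == torch.tensor(batch["label"])).sum().item()
  print("accuracy:", correct / len(val))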
