
# BERT + LSTM Sentiment Analysis Model

This repository contains a custom fine-tuned BERT + LSTM model for sentiment classification, trained on journal-style emotional writing. The model is designed to detect nuanced emotional states and works well on reflective, personal text such as journal and diary entries or introspective social media posts.


## 🧠 Model Architecture

- **Base Encoder:** `bert-base-uncased` (Hugging Face Transformers)
- **Sequence Modeling:** one-layer LSTM to capture sentence-level emotional flow
- **Classification Layer:** fully connected layer with softmax
- **Loss Function:** `CrossEntropyLoss`
- **Optimizer:** AdamW
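
The checkpoint is a fully pickled module, so the exact class definition ships with it. For orientation only, here is a minimal sketch of how the pieces above could fit together; the class name, LSTM hidden size, and forward signature are assumptions, not the shipped definition:

```python
from types import SimpleNamespace

import torch.nn as nn
from transformers import BertModel

class BertLSTMClassifier(nn.Module):
    # Hypothetical reconstruction: per-token BERT embeddings feed a one-layer
    # LSTM; the final LSTM hidden state goes through a linear classification head.
    def __init__(self, num_classes=3, lstm_hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.lstm = nn.LSTM(self.bert.config.hidden_size, lstm_hidden, batch_first=True)
        self.classifier = nn.Linear(lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask=None, token_type_ids=None):
        encoded = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask,
                            token_type_ids=token_type_ids).last_hidden_state
        _, (h_n, _) = self.lstm(encoded)      # h_n: (num_layers, batch, hidden)
        logits = self.classifier(h_n[-1])     # raw logits; CrossEntropyLoss
                                              # applies softmax internally
        return SimpleNamespace(logits=logits) # wrapped so callers can read .logits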

## 📦 Files Included

| File | Description |
| --- | --- |
| `bert_lstm_sentiment_model.pt` | Trained PyTorch model (BERT + LSTM + classifier), saved as a full pickled module |
| `label_encoder.pkl` | `sklearn.preprocessing.LabelEncoder` fitted during training |

## 🧾 Classes

The model was trained to classify the following emotional labels:

- Positive
- Negative
- Neutral
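
The index-to-label mapping lives in `label_encoder.pkl`, and the shipped encoder is authoritative. For illustration only, this is how sklearn's `LabelEncoder` behaves on these three labels (it sorts classes alphabetically when fit):

```python
from sklearn.preprocessing import LabelEncoder

# Illustrative only; load label_encoder.pkl for the actual mapping.
le = LabelEncoder().fit(["Positive", "Negative", "Neutral"])
print(list(le.classes_))          # ['Negative', 'Neutral', 'Positive']
print(le.inverse_transform([2]))  # ['Positive']
```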

## 🚀 How to Use

Install the dependencies if they are not already available:

```bash
pip install torch transformers scikit-learn
```

```python
import pickle

import torch
from transformers import BertTokenizer

# Load the model. The checkpoint is a fully pickled module, so the original
# model class must be importable here; on PyTorch >= 2.6, also pass
# weights_only=False, since weights-only loading is the new default.
model = torch.load(
    "bert_lstm_sentiment_model.pt",
    map_location=torch.device("cpu"),
    weights_only=False,
)
model.eval()

# Load tokenizer and label encoder
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
with open("label_encoder.pkl", "rb") as f:
    label_encoder = pickle.load(f)

# Sample input
text = "I feel empty and overwhelmed today."

# Tokenize and predict
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)
with torch.no_grad():
    outputs = model(**inputs)
    predicted_class = torch.argmax(outputs.logits, dim=1).item()
    emotion = label_encoder.inverse_transform([predicted_class])[0]

print(f"Predicted Emotion: {emotion}")
```