# Bertha ↔ English Translator
Bertha-Translation is a bilingual sequence-to-sequence (Seq2Seq) neural machine translation model that translates text between English and Bertha.
It uses an Encoder–Decoder GRU architecture with attention.
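The encoder–decoder-with-attention idea can be sketched in a few lines of PyTorch. This is an illustrative snippet, not the repository's actual `model.py`: the class name `TinyAttnDecoder` and all dimension choices are hypothetical, and it implements Bahdanau-style additive attention as in the PyTorch Seq2Seq tutorial this model follows.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyAttnDecoder(nn.Module):
    """One step of a GRU decoder with additive attention (illustrative only)."""
    def __init__(self, hidden_size, output_size):
        super().__init__()
        self.embedding = nn.Embedding(output_size, hidden_size)
        self.Wa = nn.Linear(hidden_size, hidden_size)  # scores the decoder state
        self.Ua = nn.Linear(hidden_size, hidden_size)  # scores the encoder outputs
        self.Va = nn.Linear(hidden_size, 1)            # collapses to one score per source step
        self.gru = nn.GRU(2 * hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, output_size)

    def forward(self, token, hidden, encoder_outputs):
        # token: (batch, 1) ids; hidden: (1, batch, hidden);
        # encoder_outputs: (batch, src_len, hidden)
        embedded = self.embedding(token)                              # (batch, 1, hidden)
        scores = self.Va(torch.tanh(self.Wa(hidden.transpose(0, 1))
                                    + self.Ua(encoder_outputs)))      # (batch, src_len, 1)
        weights = F.softmax(scores.squeeze(-1), dim=-1).unsqueeze(1)  # (batch, 1, src_len)
        context = torch.bmm(weights, encoder_outputs)                 # (batch, 1, hidden)
        output, hidden = self.gru(torch.cat([embedded, context], dim=-1), hidden)
        return self.out(output), hidden, weights                      # logits, state, attention
```

At each target step the decoder attends over all encoder states, so the context vector changes per step instead of squeezing the whole source sentence into one fixed vector.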
## Features
- Translate English → Bertha
- Translate Bertha → English
- Bidirectional Seq2Seq model with attention
- Built with PyTorch following the official Seq2Seq tutorial
- Lightweight and suitable for low-latency translation
## Model Details
- Author: Mikiyas Zenebe
- Model type: Seq2Seq (Encoder + Decoder with Attention)
- Framework: PyTorch
- Source / Target languages: English, Bertha
- License: Apache 2.0
## Usage Example
```python
import torch
from huggingface_hub import hf_hub_download
from model import EncoderRNN, DecoderRNN, tensorFromSentence

# Download the trained weights from the Hub
encoder_path = hf_hub_download("Mikile/Bertha-translation", "encoder.pth")
decoder_path = hf_hub_download("Mikile/Bertha-translation", "decoder.pth")

# Instantiate the encoder and decoder.
# input_size / output_size are the source and target vocabulary sizes and
# hidden_size is the hidden dimension used during training; input_lang is
# the source-language vocabulary object built at training time.
encoder = EncoderRNN(input_size, hidden_size)
decoder = DecoderRNN(hidden_size, output_size)
encoder.load_state_dict(torch.load(encoder_path, map_location="cpu"))
decoder.load_state_dict(torch.load(decoder_path, map_location="cpu"))
encoder.eval()
decoder.eval()

# Translate a sentence
sentence = "hello my friend"
with torch.no_grad():
    input_tensor = tensorFromSentence(input_lang, sentence)
    encoder_outputs, encoder_hidden = encoder(input_tensor)
    decoder_outputs, _, _ = decoder(encoder_outputs, encoder_hidden)
```
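The decoder returns per-step vocabulary logits rather than words. A greedy readout could look like the sketch below; `index2word` stands in for the tutorial-style `output_lang.index2word` mapping, and the `eos_token` id of 1 is an assumption matching the PyTorch tutorial's conventions.

```python
import torch

def greedy_decode(decoder_outputs, index2word, eos_token=1):
    """Pick the highest-scoring token at each step and stop at EOS.

    decoder_outputs: (1, max_len, vocab_size) logits from the decoder.
    index2word: id -> word mapping (output_lang.index2word in the tutorial).
    """
    _, topi = decoder_outputs.topk(1)  # (1, max_len, 1) best token ids
    words = []
    for idx in topi.squeeze().tolist():
        if idx == eos_token:
            break
        words.append(index2word[idx])
    return " ".join(words)
```

Greedy decoding is the simplest readout; beam search would trade latency for potentially better translations.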
## Citation

```bibtex
@misc{bertha_translation,
  author       = {Mikiyas Zenebe},
  title        = {Bertha ↔ English Translation Model},
  year         = {2025},
  howpublished = {Hugging Face Model Hub},
  url          = {https://huggingface.co/Mikile/Bertha-translation}
}
```