textrm1.5-25M-bizmail

A 25.95M-parameter Transformer-based model that generates surprisingly coherent business-style emails.

GitHub: https://github.com/kamisori-daijin/textrm

Overview

This project explores how far a small (~25M parameter) model can go in generating structured business email text.

The model does not reliably follow instructions and may produce inconsistent or mixed outputs, but it can often generate realistic email-like text.

Features

  • Small size (~25M parameters)
  • Generates business-style email text
  • Works best with simple prompts
  • Occasionally produces surprisingly coherent outputs

Limitations

  • Weak instruction following
  • May mix multiple prompts or contexts
  • Inconsistent tone and intent
  • Not suitable for production use

Example

Prompt: Write a professional business email

Output:

Dear [Competitor Name],

Thank you for reaching out and suggesting the [Feature Name] feature. We appreciate you bringing this to our attention. However, due to the current industry situation and a partial system outage at [Company Name], we are unable to proceed at this time. We would be happy to revisit this discussion in the future. Please let us know your availability for a brief call next week.

Sincerely,
[Name]
Security Researcher

Training

Usage

  1. Clone this repository:
    git clone https://github.com/kamisori-daijin/textrm
    cd textrm
    
  2. Install dependencies:
    pip install torch transformers tqdm
    
  3. Run inference:
    python inference.py
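For reference, the inference step can be sketched directly in Python. This is a minimal, hypothetical example: it assumes the checkpoint loads through the standard transformers auto classes under the Hub id Kamisori-daijin/textrm1.5-25M-bizmail, and the sampling parameters are illustrative defaults, not the project's settings. The repository's inference.py remains the canonical entry point.

```python
# Hypothetical inference sketch. Assumes the checkpoint is loadable via the
# standard transformers auto classes under this Hub id (an assumption);
# the repository's own inference.py is the supported way to run the model.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Kamisori-daijin/textrm1.5-25M-bizmail"

def generate_email(prompt: str, max_new_tokens: int = 200) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,    # sampling tends to help small models sound natural
        temperature=0.8,   # illustrative values, not tuned for this model
        top_p=0.9,
    )
    # Decode only the newly generated tokens, dropping the echoed prompt.
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_email("Write a professional business email"))
```

Given the model's weak instruction following, short and simple prompts (as noted in Features) are likely to work best.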

Notes

This model is intended for research and experimentation purposes only.

License

Apache 2.0

Disclaimer

This model was trained on synthetic data generated using Gemma3-4B (Google). This project is independent and does not replicate or fine-tune Gemma3-4B.

