Quantization by Richard Erkhov.
LLaMmlein_120M - bnb 8bits
- Model creator: https://huggingface.co/LSX-UniWue/
- Original model: https://huggingface.co/LSX-UniWue/LLaMmlein_120M/
Original model description:
datasets:
  - togethercomputer/RedPajama-Data-V2
language:
  - de
pipeline_tag: text-generation
library_name: transformers
license: other
LLäMmlein 120M
This is a German TinyLlama 120M language model trained from scratch using the TinyLlama codebase on the German portion of RedPajama V2. Find more details on our page and in our preprint!
Usage
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the original full-precision checkpoint and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("LSX-UniWue/LLaMmlein_120M")
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_120M")
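Since this card describes a bitsandbytes 8-bit quantization, below is a minimal sketch of loading the model in 8 bits and running generation. It assumes the bitsandbytes and accelerate packages are installed and a CUDA GPU is available; because the card does not state the quantized repo's own id, the sketch quantizes the original checkpoint at load time instead, and the German prompt is purely illustrative.

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Ask transformers to load the weights in 8-bit via bitsandbytes
# (assumption: bitsandbytes + accelerate installed, CUDA device present)
quant_config = BitsAndBytesConfig(load_in_8bit=True)

model = AutoModelForCausalLM.from_pretrained(
    "LSX-UniWue/LLaMmlein_120M",
    quantization_config=quant_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("LSX-UniWue/LLaMmlein_120M")

# Illustrative German prompt for text generation
inputs = tokenizer("Die Würzburger Residenz ist", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))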
Performance
We evaluated our model on the SuperGLEBer benchmark.