This is a DialoGPT-small fine-tune: GPT-Nano. It uses the DialoGPT-small tokenizer. WARNING: the model was trained on a small amount of data and produces nonsense. Please do not rely on it for factual information or for organizational purposes.

How to use: download the model from the main branch, install the Transformers library (`pip install transformers torch`), then copy this code into your editor and run it:

# Load model directly
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")  # do not change: GPT-Nano is a fine-tuned DialoGPT and needs this tokenizer
model = AutoModelForCausalLM.from_pretrained("path/to/ai")  # AutoModelForCausalLM (not AutoModel), so that .generate() is available

# ask ai
input_text = "What is the capital of France?"
input_ids = tokenizer.encode(input_text, return_tensors="pt")

output = model.generate(
    input_ids,
    max_length=50,  # adjust the maximum output length as needed
    num_return_sequences=1,
    no_repeat_ngram_size=2,  # block repeated bigrams
    do_sample=True,
    top_k=50,
    top_p=0.95,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2-style models have no pad token; this silences a warning
)

generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated_text)
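The sampling options passed to `generate` above (`temperature`, `top_k`, `top_p`) all filter and reshape the model's next-token distribution before a token is drawn. A minimal sketch of that filtering on a hand-made toy distribution (this is an illustration of the idea, not Transformers' actual internals; the token scores are made up):

```python
import math
import random

def sample_next_token(logits, temperature=0.7, top_k=50, top_p=0.95, rng=None):
    """Toy sketch of temperature / top-k / top-p (nucleus) sampling.

    `logits` maps token -> raw score; values here are illustrative only.
    """
    rng = rng or random.Random(0)
    # 1. Temperature: divide logits by T, then softmax. T < 1 sharpens
    #    the distribution, T > 1 flattens it.
    scaled = {t: l / temperature for t, l in logits.items()}
    m = max(scaled.values())
    exps = {t: math.exp(s - m) for t, s in scaled.items()}
    z = sum(exps.values())
    probs = {t: e / z for t, e in exps.items()}
    # 2. Top-k: keep only the k most probable tokens.
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    # 3. Top-p (nucleus): keep the smallest prefix whose mass reaches top_p.
    kept, mass = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        mass += p
        if mass >= top_p:
            break
    # 4. Renormalize over the survivors and draw one token.
    total = sum(p for _, p in kept)
    r = rng.random() * total
    for tok, p in kept:
        r -= p
        if r <= 0:
            return tok
    return kept[-1][0]

logits = {"Paris": 5.0, "London": 2.0, "Berlin": 1.0, "banana": -3.0}
print(sample_next_token(logits, top_k=3, top_p=0.9))  # prints "Paris"
```

With this toy distribution, "Paris" dominates so strongly after temperature scaling that the nucleus (`top_p=0.9`) contains only that one token, which is exactly how these settings prune unlikely continuations in real generation.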