LURE 5.3

This is the LURE model for Lua 5.3.

Usage:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "lurepaper/LURE_5.3"

model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

prompt = "You are a Lua programming language expert. Please generate concise Lua code that produces the Lua 5.3 opcode OP_ADD. Use a print function call at the end to show the execution result of the opcode."

messages = [
    {"role": "user", "content": prompt},
]

text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)

model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=32768,
)

output_ids = generated_ids[0][len(model_inputs.input_ids[0]):].tolist()

# Parse out the thinking content: search for the last occurrence of
# token id 151668 (</think>) and split the output there.
try:
    index = len(output_ids) - output_ids[::-1].index(151668)
except ValueError:
    index = 0

thinking_content = tokenizer.decode(output_ids[:index], skip_special_tokens=True).strip("\n")
content = tokenizer.decode(output_ids[index:], skip_special_tokens=True).strip("\n")

print("Thinking content:")
print(thinking_content)

print("Generated LuaGadget:")
print(content)
```
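The `</think>` split above relies on a reverse search for token id 151668. Its behavior can be checked on a toy token sequence; this is a minimal sketch where 151668 stands in for `</think>` and the other ids are arbitrary placeholders, not real Qwen3 vocabulary entries:

```python
def split_thinking(output_ids, think_end_id=151668):
    """Split generated token ids at the last </think> marker.

    Returns (thinking_ids, content_ids). The marker itself stays with
    the thinking part; if the marker is absent, everything is content.
    """
    try:
        # Reversing the list makes .index() find the LAST marker.
        index = len(output_ids) - output_ids[::-1].index(think_end_id)
    except ValueError:
        index = 0
    return output_ids[:index], output_ids[index:]

# Toy sequence: [reasoning..., </think>, answer...]
ids = [11, 22, 33, 151668, 44, 55]
thinking, content = split_thinking(ids)
print(thinking)  # [11, 22, 33, 151668]
print(content)   # [44, 55]
```

Note that the `except ValueError` branch sets `index = 0`, so when no `</think>` token is present the whole output is treated as the final answer and the thinking part is empty.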
Format: Safetensors
Model size: 33B params
Tensor type: BF16

Model tree for lurepaper/LURE_5.3

Base model: Qwen/Qwen3-32B (this model is a finetune)
Quantizations: 2 models
