# decompiler-v4
This repository contains merged fine‑tuned weights for the base model deepseek-ai/DeepSeek-R1-0528-Qwen3-8B.
- Task: idiomatic decompilation (assembly → high-level code)
- Training: LoRA/DoRA adapters trained with TRL's SFT trainer on a custom dataset of assembly → Dart/Swift pairs, then merged into the base weights
- How to load (merged):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "raafatabualazm/decompiler-v4"
tok = AutoTokenizer.from_pretrained(repo_id, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
```
Replace the repo id with your own if you fork or rename this repository.
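The exact prompt format used during fine-tuning is not documented here, so the following is a minimal sketch of how a decompilation request might be assembled for the model's chat template; the instruction wording and the sample AArch64 listing are assumptions, not the training format:

```python
# Hypothetical prompt assembly for the assembly -> high-level-code task.
# The role/content wording is an assumption, not the documented format.
asm_listing = """\
_add:
    add w0, w0, w1
    ret
"""

messages = [
    {"role": "user",
     "content": "Decompile the following AArch64 assembly into idiomatic Swift:\n\n"
                + asm_listing},
]

# With tok and model loaded as shown above, generation would look like:
#   inputs = tok.apply_chat_template(messages, add_generation_prompt=True,
#                                    return_tensors="pt").to(model.device)
#   out = model.generate(inputs, max_new_tokens=512)
#   print(tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```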
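The training bullet above can be sketched roughly as follows. Every hyperparameter, file name, and schema in this sketch is an assumption; only the base model, the LoRA/DoRA adapters, and the use of TRL SFT come from this card:

```python
# Hypothetical reconstruction of the adapter training setup; the actual
# hyperparameters and dataset used for decompiler-v4 are not published.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# Placeholder file of chat-formatted examples pairing assembly inputs with
# Dart/Swift outputs -- the name and schema are assumptions.
dataset = load_dataset("json", data_files="asm_pairs.jsonl", split="train")

peft_config = LoraConfig(
    r=16,                         # rank: assumed, not documented
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules="all-linear",
    use_dora=True,                # DoRA variant, as noted above
    task_type="CAUSAL_LM",
)

trainer = SFTTrainer(
    model="deepseek-ai/DeepSeek-R1-0528-Qwen3-8B",
    train_dataset=dataset,
    args=SFTConfig(output_dir="decompiler-v4-adapters"),
    peft_config=peft_config,
)
trainer.train()

# Merging the adapters back into the base weights would yield a merged
# checkpoint like the one stored in this repo:
#   merged = trainer.model.merge_and_unload()
#   merged.save_pretrained("decompiler-v4")
```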
## Model tree for raafatabualazm/decompiler-v4

Base model: [deepseek-ai/DeepSeek-R1-0528-Qwen3-8B](https://huggingface.co/deepseek-ai/DeepSeek-R1-0528-Qwen3-8B)