Improve model card: Add pipeline tag, library name, update license, and correct paper link

#1
opened by nielsr

This PR significantly enhances the model card for LLM4Binary/llm4decompile-6.7b-v2 by:

  • Adding pipeline_tag: text-generation to improve discoverability on the Hugging Face Hub for code generation tasks.
  • Adding library_name: transformers to enable the automated "How to use" widget, as the model explicitly uses the transformers library for inference.
  • Updating the metadata license to other and the content's license section to "MIT and DeepSeek License" to accurately reflect the dual licensing described in the GitHub repository (a programmatic sketch of these metadata edits follows this list).
  • Ensuring the primary paper link, "Decompile-Bench: Million-Scale Binary-Source Function Pairs for Real-World Binary Decompilation", is prominently displayed at the top.
  • Integrating a more detailed "About" section from the GitHub README to provide better context for the model.
  • Adding a comprehensive "Citation" section for both the Decompile-Bench and LLM4Decompile papers.
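
For reference, the metadata bullets above amount to a few YAML front-matter fields on the model card. The snippet below is only a minimal, hypothetical sketch of how the same fields could be set programmatically with huggingface_hub's ModelCard utilities; in this PR the YAML front matter is edited directly, and only the repo id is taken from the description above.

    # Hypothetical sketch, not part of this PR: applying the same metadata
    # changes programmatically via huggingface_hub's ModelCard helpers.
    from huggingface_hub import ModelCard

    card = ModelCard.load("LLM4Binary/llm4decompile-6.7b-v2")

    # Fields this PR sets in the card's YAML front matter.
    card.data.pipeline_tag = "text-generation"  # Hub discoverability for code generation
    card.data.library_name = "transformers"     # enables the automated "How to use" widget
    card.data.license = "other"                 # content section spells out "MIT and DeepSeek License"

    print(card.data.to_yaml())
    # card.push_to_hub("LLM4Binary/llm4decompile-6.7b-v2")  # would require write access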

The existing detailed "How to Use" section, which covers Ghidra setup and model inference, is preserved because it directly reflects the official usage instructions. All code snippets and literal newline characters (\n) in the widget text are kept exactly as in the original.
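
For context, once library_name: transformers is set, the "How to use" widget points at a standard text-generation workflow. The snippet below is an illustrative sketch only, with a placeholder prompt string; the authoritative Ghidra preprocessing steps and prompt template are the ones in the model card's "How to Use" section.

    # Illustrative sketch only -- defer to the model card's "How to Use" section
    # for the official Ghidra preprocessing and prompt format.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "LLM4Binary/llm4decompile-6.7b-v2"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # Assumes a CUDA GPU with enough memory for the 6.7B model in bfloat16.
    model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16).cuda()

    # `pseudo_code` is a placeholder for the Ghidra-decompiled function the model refines.
    pseudo_code = "undefined4 func0(float param_1, long param_2, int param_3) { /* ... */ }"
    inputs = tokenizer(pseudo_code, return_tensors="pt").to(model.device)

    with torch.no_grad():
        outputs = model.generate(**inputs, max_new_tokens=512)
    print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))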

Please review and merge this PR.

