Enhance model card for VeriCoder: Add paper, GitHub link, pipeline tag, and library name
This PR enhances the model card for **VeriCoder**, aligning it with Hugging Face best practices and providing crucial information for users.
Key updates include:
- **Paper Details**: Explicitly links the model to its paper, "[VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation](https://huggingface.co/papers/2504.15659)", along with an overview of its core functionality.
- **GitHub Repository**: Includes a direct link to the official GitHub repository for easy access to the codebase and additional resources.
- **Pipeline Tag**: Adds the `pipeline_tag: text-generation` metadata, which correctly categorizes the model and improves its discoverability on the Hugging Face Hub (e.g., at https://huggingface.co/models?pipeline_tag=text-generation).
- **Library Name**: Sets `library_name: transformers`, which enables the "how to use" widget on the model page, providing automated code snippets for loading the model.
These changes aim to provide a more informative and user-friendly experience for anyone interacting with the VeriCoder model on the Hugging Face Hub.
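The metadata changes described above can be sanity-checked programmatically. The sketch below is illustrative, not part of this PR: it uses an abbreviated copy of the card's YAML frontmatter and a minimal hand-rolled parser (a real pipeline would use a YAML library) to confirm the new `pipeline_tag` and `library_name` keys are present.

```python
# Sanity-check that a model card's YAML frontmatter carries the metadata
# keys this PR adds. CARD is an abbreviated stand-in for the real README.
CARD = """\
---
base_model:
- Qwen/Qwen2.5-14B-Instruct
datasets:
- LLM4Code/expanded_origen_126k
license: apache-2.0
tags:
- Verilog
- CodeGen
pipeline_tag: text-generation
library_name: transformers
---

# VeriCoder
"""

def parse_frontmatter(card: str) -> dict:
    """Extract top-level scalar keys from the ----delimited YAML header."""
    lines = card.splitlines()
    assert lines[0] == "---", "card must start with a frontmatter block"
    end = lines[1:].index("---") + 1  # index of the closing delimiter
    meta = {}
    for line in lines[1:end]:
        # Skip list items and indented continuations; keep `key: value` pairs.
        if ":" in line and not line.startswith(("-", " ")):
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()  # "" for keys holding lists
    return meta

meta = parse_frontmatter(CARD)
for key in ("pipeline_tag", "library_name", "license"):
    assert meta.get(key), f"missing metadata key: {key}"
print(meta["pipeline_tag"], meta["library_name"])
# prints: text-generation transformers
```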
````diff
@@ -1,10 +1,40 @@
 ---
-license: apache-2.0
-datasets:
-- LLM4Code/expanded_origen_126k
 base_model:
 - Qwen/Qwen2.5-14B-Instruct
+datasets:
+- LLM4Code/expanded_origen_126k
+license: apache-2.0
 tags:
 - Verilog
 - CodeGen
----
+pipeline_tag: text-generation
+library_name: transformers
+---
+
+# VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation
+
+This repository hosts **VeriCoder**, a model presented in the paper [VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation](https://huggingface.co/papers/2504.15659).
+
+VeriCoder is a model for Register Transfer Level (RTL) code generation fine-tuned on a dataset validated for functional correctness. This fine-tuning dataset is constructed using a novel methodology that combines unit test generation with feedback-directed refinement. Given a natural language specification and an initial RTL design, a teacher model iteratively revises the RTL design based on simulation results using generated tests. Every example in the dataset is functionally validated, consisting of a natural language description, an RTL implementation, and passing tests.
+
+For more details and code, visit the [GitHub Repository](https://github.com/Anjiang-Wei/VeriCoder).
+
+## Key Highlights
+
+- **Functionally Validated Dataset**: 125,000+ examples with simulation-passing RTL designs.
+- **Feedback-Driven Construction**: Iteratively refine designs and tests based on test results.
+- **Superior Performance**: Achieves up to +71.7% relative improvement on VerilogEval benchmarks.
+- **Comprehensive Resources**: Includes dataset, model weights, inference scripts, and training pipeline.
+
+## Citation
+
+If you find VeriCoder helpful in your research, please consider citing:
+
+```plaintext
+@article{wei2025vericoder,
+  title={VeriCoder: Enhancing LLM-Based RTL Code Generation through Functional Correctness Validation},
+  author={Wei, Anjiang and Tan, Huanmi and Suresh, Tarun and Mendoza, Daniel and Teixeira, Thiago SFX and Wang, Ke and Trippel, Caroline and Aiken, Alex},
+  journal={arXiv preprint arXiv:2504.15659},
+  year={2025}
+}
+```
````