# Phi-3 Prompt Engineer (GGUF)
This repository contains GGUF quantized versions of the Phi-3 Prompt Engineer model, fine-tuned to excel at refining user requests into detailed, structured prompts for LLMs.
## Model Description
- Base Model: microsoft/Phi-3-mini-4k-instruct
- Fine-tuning: LoRA adapters trained on a custom dataset of prompt refinement examples.
- Purpose: To act as a specialized "Prompt Engineer" agent, converting vague user ideas into high-quality, actionable system instructions and prompts.
## Available Quantizations
The following GGUF files are available for download. Q4_K_M is recommended for most users, as it offers a good balance of file size, speed, and output quality. SHA-256 checksums are listed so you can verify each download (a verification sketch follows the table).
| Filename | Quantization | Size | SHA256 Checksum |
|---|---|---|---|
| phi3-prompt-engineer-f16.gguf | F16 | 7.12 GB | 1bc8a41027c400eda38d5dc9cf97fcf0a4617072d2c3c132b922dd2589783c16 |
| phi3-prompt-engineer-q4_k_m.gguf | Q4_K_M | 2.23 GB | cc840e37e0f21c97bf158211a2f0dda1096294ee47ab3f199da854cc841c39f7 |
| phi3-prompt-engineer-q5_k_m.gguf | Q5_K_M | 2.62 GB | 218b8021b5f1b1efb2e0914d939f5464d8cb790ef9d9e35eaa05e1b8b3cc560f |
| phi3-prompt-engineer-q6_k.gguf | Q6_K | 2.92 GB | 4cf033bfe14c43ebf84e47f7184f2f0f5286a30a2c87927d59aa95d9b4a455d5 |
| phi3-prompt-engineer-q8_0.gguf | Q8_0 | 3.78 GB | 2aa4f14b038c01a50ad8e3306832a25b8c6f704aa1d9efa29a88856dee924fce |
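As an illustration, the sketch below checks a downloaded file against the table above using Python's standard `hashlib` module; the filename and expected digest are taken from the Q4_K_M row and assume the file sits in the working directory.

```python
import hashlib

# Expected SHA-256 digest for phi3-prompt-engineer-q4_k_m.gguf (from the table above)
EXPECTED = "cc840e37e0f21c97bf158211a2f0dda1096294ee47ab3f199da854cc841c39f7"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so multi-GB GGUF files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

digest = sha256_of("phi3-prompt-engineer-q4_k_m.gguf")
print("OK" if digest == EXPECTED else f"MISMATCH: {digest}")
```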
## Usage
### Ollama
You can use these files directly with Ollama.
Create a `Modelfile`:

```
FROM ./phi3-prompt-engineer-q4_k_m.gguf
SYSTEM "You are an Expert Prompt Engineer. Your goal is to refine the following user request into a clear, structured, and detailed prompt."
```

Then create and run the model:

```
ollama create phi3-pe -f Modelfile
ollama run phi3-pe
```
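Once the model is created, the local Ollama server (listening on `localhost:11434` by default) can also be queried programmatically. The snippet below is a minimal sketch using the `requests` library against Ollama's `/api/generate` endpoint; adjust the model name if you created it under a different tag.

```python
import requests

# Assumes a local Ollama server and the "phi3-pe" model created above.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3-pe",
        "prompt": "make a snake game in python",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])  # the refined, structured prompt
```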
### llama.cpp
```
./main -m phi3-prompt-engineer-q4_k_m.gguf -p "You are an Expert Prompt Engineer. Refine this: make a snake game in python"
```

(In recent llama.cpp builds the `main` binary has been renamed to `llama-cli`; substitute accordingly.)
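For programmatic use, the same GGUF file can be loaded through the `llama-cpp-python` bindings. The sketch below assumes the package is installed (`pip install llama-cpp-python`) and the quantized file is in the working directory; `n_ctx=4096` matches the base model's 4k context window.

```python
from llama_cpp import Llama

# Load the quantized model; n_ctx=4096 matches Phi-3-mini-4k's context window.
llm = Llama(model_path="phi3-prompt-engineer-q4_k_m.gguf", n_ctx=4096)

messages = [
    {"role": "system", "content": "You are an Expert Prompt Engineer. Your goal is to "
                                  "refine the following user request into a clear, "
                                  "structured, and detailed prompt."},
    {"role": "user", "content": "make a snake game in python"},
]

out = llm.create_chat_completion(messages=messages, max_tokens=512, temperature=0.2)
print(out["choices"][0]["message"]["content"])
```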
## Training Data
The model was trained on a proprietary dataset (raw_dataset.jsonl); a hypothetical record illustrating its shape is sketched after this list. The dataset consists of:
- Input: Raw, often vague user requests (e.g., "make a login page").
- Output: Detailed, structured Prompt Engineering responses including personas, constraints, and specific requirements.
- Focus: Web development, Python scripting, creative writing, and system design tasks.
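The sketch below is purely illustrative and is not drawn from the actual dataset, which is not distributed with this repository; it only shows how one input/output pair could be laid out as a single JSONL line, with hypothetical field names.

```python
import json

# Hypothetical record; field names and content are illustrative only.
record = {
    "input": "make a login page",
    "output": (
        "You are a senior front-end developer. Build a responsive login page with "
        "email/password fields, client-side validation, and clear error states. ..."
    ),
}

# Each record would be serialized as one line of raw_dataset.jsonl.
print(json.dumps(record))
```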
## License
MIT