Update README.md
README.md CHANGED
@@ -24,7 +24,7 @@ For more information on quantization, check the [OpenVINO model optimization gui
 
 The provided OpenVINO™ IR model is compatible with:
 
-* OpenVINO version **2024.
+* OpenVINO version **2024.5** and higher
 * Optimum Intel **1.20.0** and higher
 
 ## Running Model Inference with OpenVINO GenAI
@@ -32,7 +32,7 @@ The provided OpenVINO™ IR model is compatible with:
 
 1. Install packages required for using [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai) with Speculative decoding:
 
 ```
-pip install openvino-genai huggingface_hub
+pip install -U openvino-genai>=2024.5 huggingface_hub
 ```
 
 2. Download and convert main model and tokenizer
@@ -62,7 +62,7 @@ hf_hub.snapshot_download(draft_model_id, local_dir=draft_model_path)
 
 ```python
 import openvino_genai
 
-prompt =
+prompt = "What is OpenVINO?"
 
 config = openvino_genai.GenerationConfig()
 config.num_assistant_tokens = 3
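Note: the README's step 2 ("Download and convert main model and tokenizer") lies outside the changed hunks; only `hf_hub.snapshot_download(draft_model_id, local_dir=draft_model_path)` is visible, in the third hunk's context header. A minimal sketch of what that step plausibly looks like, assuming `huggingface_hub` is imported as `hf_hub` and using placeholder repository IDs and local directories (the real IDs are not shown in this diff):

```python
import huggingface_hub as hf_hub

# Placeholder repository IDs and local directories; the actual main and draft
# model repos are not visible in this diff.
main_model_id = "OpenVINO/main-model-int4-ov"
main_model_path = "main_model"
draft_model_id = "OpenVINO/draft-model-int4-ov"
draft_model_path = "draft_model"

# Download the pre-converted OpenVINO IR weights and tokenizer for both models.
hf_hub.snapshot_download(main_model_id, local_dir=main_model_path)
hf_hub.snapshot_download(draft_model_id, local_dir=draft_model_path)
```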
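Note: the third hunk shows only a fragment of the inference snippet (the import, the prompt, and a `GenerationConfig` with `num_assistant_tokens = 3`); the pipeline construction and `generate` call sit outside the hunk. A hedged sketch of how these pieces are typically wired together for speculative decoding with OpenVINO GenAI 2024.5 and higher, assuming the local `main_model` and `draft_model` directories from the download step and CPU as the target device:

```python
import openvino_genai

prompt = "What is OpenVINO?"

config = openvino_genai.GenerationConfig()
config.num_assistant_tokens = 3  # candidate tokens proposed by the draft model per step
config.max_new_tokens = 128      # assumed cap for this sketch

# The draft model proposes candidate tokens; the main model verifies them,
# which is what speeds up generation in speculative decoding.
pipe = openvino_genai.LLMPipeline(
    "main_model",   # assumed local directory of the main model
    "CPU",
    draft_model=openvino_genai.draft_model("draft_model", "CPU"),  # assumed draft model directory
)

print(pipe.generate(prompt, config))
```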