Update README.md
**_Installation_**

Make sure you install [`vLLM >= 0.6.4`](https://github.com/vllm-project/vllm/releases/tag/v0.6.4):

```
pip install --upgrade vllm
```

Also make sure you have [`mistral_common >= 1.5.2`](https://github.com/mistralai/mistral-common/releases/tag/v1.5.2) installed:

```
pip install --upgrade mistral_common
```
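
If you want to check that the installed versions meet these minimums programmatically, here is a small stdlib-only sketch (the `meets_minimum` helper is ours for illustration, not part of either package, and it ignores pre-release suffixes):

```python
from importlib.metadata import version, PackageNotFoundError


def meets_minimum(required: str, installed: str) -> bool:
    """Compare dotted version strings numerically (build/pre-release tags ignored)."""
    parse = lambda v: [int(p) for p in v.split("+")[0].split(".") if p.isdigit()]
    return parse(installed) >= parse(required)


for package, minimum in [("vllm", "0.6.4"), ("mistral_common", "1.5.2")]:
    try:
        found = version(package)
        status = "OK" if meets_minimum(minimum, found) else f"too old (need >= {minimum})"
        print(f"{package} {found}: {status}")
    except PackageNotFoundError:
        print(f"{package}: not installed")
```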
You can also use the ready-to-go [docker image](https://github.com/vllm-project/vllm/blob/main/Dockerfile) or pull one from [Docker Hub](https://hub.docker.com/layers/vllm/vllm-openai/latest/images/sha256-de9032a92ffea7b5c007dad80b38fd44aac11eddc31c435f8e52f3b7404bbf39).

#### Server

We recommend that you use Mistral-Small-Instruct-2501 in a server/client setting.

1. Spin up a server:
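
Once a server is up, it exposes vLLM's OpenAI-compatible API, so any client can POST a chat-completion request to it. As a sketch of what such a request body looks like (the host, port, and model id below are illustrative assumptions; use the values from your own deployment):

```python
import json

# Hypothetical values: adjust to your deployment.
BASE_URL = "http://localhost:8000/v1"  # vLLM's OpenAI-compatible endpoint
MODEL = "mistralai/Mistral-Small-Instruct-2501"  # illustrative model id

# Standard chat-completions request body.
payload = {
    "model": MODEL,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize vLLM in one sentence."},
    ],
    "temperature": 0.2,
    "max_tokens": 128,
}

# POST this body to f"{BASE_URL}/chat/completions" with the HTTP client of
# your choice; the JSON printed below is what goes over the wire.
print(json.dumps(payload, indent=2))
```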