Update README.md
```

### Installation Requirements

```bash
pip install llama-cpp-python huggingface_hub
```

For MacBooks with Apple Silicon, install llama-cpp-python with Metal enabled instead:

```bash
CMAKE_ARGS="-DCMAKE_OSX_ARCHITECTURES=arm64 -DLLAMA_METAL=on" pip install llama-cpp-python
```

### Key Parameters

- `n_ctx`: Context window size (default: 2048)
- `n_threads`: Number of CPU threads to use (adjust based on your hardware)