Update README.md
README.md CHANGED

````diff
@@ -15,7 +15,7 @@ using the LLM-Pruner from [`LLM-Pruner: On the Structural Pruning of Large Langu
 Done to test the viability of LLM-Pruner for task-agnostic, low-resource Generative AI for Commercial and Personal Use,
 compared to using out-of-the-box models like [`meta-llama/Llama-3.2-3B-Instruct`](https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct)

-Our presentation slides may be found
+[Our presentation slides may be found here](https://drive.google.com/file/d/1k-2afwk5YyQ-YEH76j-JBDfgO5_90XPF/view?usp=sharing)

 # To replicate,

@@ -33,7 +33,7 @@ python llama3.py --pruning_ratio 0.25 \
 ```
 to get the pruned model.

-**NOTE**:
+**NOTE**: To fit the commercial and personal use settings:
 - We removed `'ptb'` from the datasets in `llama3.py` since it requires foreign code to load.
 - We changed `get_examples` in `llama3.py` to use `'c4'`, since bookcorpus requires foreign code to load.
````
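The dataset notes above swap the calibration corpus to C4. As a rough illustration of the shape such calibration sampling takes, here is a minimal sketch of the chunking step a `get_examples`-style helper performs; the function name, lengths, and plain-list output are assumptions for illustration, not LLM-Pruner's actual code:

```python
# Hypothetical sketch of calibration sampling, as in a `get_examples`-style
# helper. Names and shapes are assumptions, not LLM-Pruner's implementation.
from itertools import islice

def make_calibration_examples(token_stream, n_samples=10, seq_len=64):
    """Cut a flat stream of token ids into fixed-length calibration examples."""
    it = iter(token_stream)
    examples = []
    while len(examples) < n_samples:
        chunk = list(islice(it, seq_len))
        if len(chunk) < seq_len:
            break  # stream exhausted before enough samples
        examples.append(chunk)
    return examples

# Toy stand-in for a tokenized C4 text stream:
batch = make_calibration_examples(range(1000), n_samples=3, seq_len=5)
print(batch[0])  # [0, 1, 2, 3, 4]
```

In the actual script the token stream would come from C4 text loaded via the `datasets` library and a tokenizer, which avoids the remote-code requirement that ruled out `ptb` and bookcorpus.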