Add Ollama Example for Function Calling
README.md CHANGED
@@ -25,6 +25,8 @@ Authors: [Ashvini Kumar Jindal](https://www.linkedin.com/in/ashvini-jindal-26653
 
 **🤗 Hugging Face Announcement Blog**: https://huggingface.co/blog/akjindal53244/llama31-storm8b
 
+**🚀Ollama:** `ollama run ajindal/llama3.1-storm:8b`
+
 <br>
 
 # Llama-3.1-Storm-8B-FP8-Dynamic
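The hunk above adds an Ollama quick-start command near the top of the model card. As a minimal illustration (not part of the diff itself), the same model can be called from Python through the official `ollama` client, assuming the Ollama server is running and the model has already been fetched with `ollama run ajindal/llama3.1-storm:8b`; the prompt is illustrative:

```
import ollama

# Plain chat with the Ollama build referenced above.
# Assumes the Ollama server is up and the model has been pulled locally.
response = ollama.chat(
    model="ajindal/llama3.1-storm:8b",
    messages=[{"role": "user", "content": "What is 2+2?"}],
)
print(response["message"]["content"])
```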
@@ -99,7 +101,7 @@ Llama-3.1-Storm-8B is a powerful generalist model useful for diverse application
 1. `BF16`: [Llama-3.1-Storm-8B](https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B)
 2. ⚡ `FP8`: [Llama-3.1-Storm-8B-FP8-Dynamic](https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B-FP8-Dynamic)
 3. ⚡ `GGUF`: [Llama-3.1-Storm-8B-GGUF](https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B-GGUF)
-4. Ollama: `ollama run ajindal/llama3.1-storm:8b`
+4. 🚀 Ollama: `ollama run ajindal/llama3.1-storm:8b`
 
 
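This hunk only adds a 🚀 to the Ollama entry in the list of available model formats. For orientation, here is a sketch of loading the FP8 variant with vLLM, following the same `apply_chat_template` + `llm.generate` pattern visible in the next hunk's context; the sampling settings are illustrative assumptions, and a recent vLLM build with FP8 support is assumed:

```
from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

# Model id taken from the list above; sampling values are illustrative, not the README's exact settings.
model_id = "akjindal53244/Llama-3.1-Storm-8B-FP8-Dynamic"

tokenizer = AutoTokenizer.from_pretrained(model_id)
llm = LLM(model=model_id)  # recent vLLM builds pick up the FP8 quantization from the model config
sampling_params = SamplingParams(max_tokens=128, temperature=0.01)

messages = [{"role": "user", "content": "What is 2+2?"}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)
print(llm.generate([prompt], sampling_params)[0].outputs[0].text.strip())
```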
@@ -218,7 +220,7 @@ prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tok
 print(llm.generate([prompt], sampling_params)[0].outputs[0].text.strip()) # Expected Output: <tool_call>{'tool_name': 'web_chain_details', 'tool_arguments': {'chain_slug': 'ethereum'}}</tool_call>
 ```
 
-#### Use with [
+#### Use with [Ollama](https://ollama.com/)
 ```
 import ollama
 
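The diff is cut off just after `import ollama`, so the body of the new Ollama function-calling example is not shown here. Below is a rough sketch of what such an example could look like, assuming the model is given its tools in the system prompt and answers in the same `<tool_call>{...}</tool_call>` format as the vLLM snippet above; the tool schema and prompt wording are hypothetical, not the README's exact text:

```
import json
import ollama

# Hypothetical tool schema; the real README defines its own wording,
# which is not visible in this truncated diff.
tools = [{
    "name": "web_chain_details",
    "description": "Fetch details about a blockchain.",
    "parameters": {"chain_slug": "Blockchain slug, e.g. 'ethereum'"},
}]

system_prompt = (
    "You are a function-calling assistant with access to these tools:\n"
    + json.dumps(tools)
    + "\nWhen a tool is needed, answer only with "
    "<tool_call>{'tool_name': ..., 'tool_arguments': {...}}</tool_call>."
)

response = ollama.chat(
    model="ajindal/llama3.1-storm:8b",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": "I need details for the Ethereum blockchain."},
    ],
)

# Expected shape, mirroring the vLLM snippet in the diff:
# <tool_call>{'tool_name': 'web_chain_details', 'tool_arguments': {'chain_slug': 'ethereum'}}</tool_call>
print(response["message"]["content"])
```

The returned string would then be parsed by the caller and dispatched to the matching tool.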