Marcjoni committed
Commit bd3d21d · verified · 1 Parent(s): 67d98cc

Update README.md

Files changed (1): README.md (+27 -1)
README.md CHANGED
@@ -3,10 +3,36 @@ tags:
 - merge
 - mergekit
 - lazymergekit
+language:
+- en
+base_model:
+- Marcjoni/SuperNovaSynth-12B
+- yamatazen/LorablatedStock-12B
 ---
 
+<img src="./HyperNova.png" alt="Model Image"/>
+
 # HyperNovaSynth-12B
 
+<b><i>From the void where giants fall, a deeper silence erupts. Darker, heavier, stranger.
+<br> What follows is not light but gravity itself made song.
+<br> This is no ordinary flare, but the whisper of something vast unraveling.</i></b>
+
+## 🔧 Recommended Sampling Settings
+```yaml
+Temperature: 0.75 to 1.25
+Min P: 0.035
+Context Length: Stable at 12k tokens, with possible support for extended contexts
+```
+## 💬 Prompt Format
+Supports ChatML-style messages. Example:
+```
+<|im_start|>user
+Your question here.
+<|im_end|>
+<|im_start|>assistant
+```
+
 HyperNovaSynth-12B is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 
 ## 🧩 Configuration
@@ -47,6 +73,6 @@ pipeline = transformers.pipeline(
     device_map="auto",
 )
 
-outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
+outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=1, top_k=0, top_p=1)
 print(outputs[0]["generated_text"])
 ```
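
For reference, the ChatML prompt format and the recommended sampling settings added in this commit can be exercised together with the Transformers library roughly as sketched below. This is a minimal sketch, not part of the commit: the repo id `Marcjoni/HyperNovaSynth-12B` is assumed from the model card name, the tokenizer is assumed to ship a ChatML chat template, and `min_p` sampling requires a reasonably recent transformers release.

```python
# Minimal sketch (not from the commit): ChatML prompt + recommended sampling settings.
import torch
import transformers
from transformers import AutoTokenizer

model_id = "Marcjoni/HyperNovaSynth-12B"  # assumed repo id; adjust if it differs

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a ChatML-style prompt; if the tokenizer bundles a ChatML chat template,
# this yields the <|im_start|>user ... <|im_end|> <|im_start|>assistant framing
# shown in the README.
messages = [{"role": "user", "content": "Your question here."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

pipe = transformers.pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Recommended settings from the README: temperature in 0.75-1.25, Min P 0.035.
outputs = pipe(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=1.0,
    min_p=0.035,
)
print(outputs[0]["generated_text"])
```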