---
license: other
license_name: tongyi-qianwen
license_link: https://huggingface.co/Qwen/Qwen1.5-72B/raw/main/LICENSE
language:
- th
- en
pipeline_tag: text-generation
---
**Typhoon-1.5-72B: Thai Large Language Model (Pretrained)**

**Typhoon-1.5-72B** is a *pretrained* Thai 🇹🇭 large language model with 72 billion parameters, and it is based on Qwen1.5-72B.

For the release post, please see our [blog](https://blog.opentyphoon.ai/typhoon-1-5-release-a9364cb8e8d7).

## **Model Description**

- **Model type**: A 72B pretrained decoder-only model based on the Qwen1.5 architecture.
- **Requirement**: transformers 4.38.0 or newer (see the usage sketch after this list).
- **Primary Language(s)**: Thai 🇹🇭 and English 🇬🇧
- **License**: [Qwen License](https://huggingface.co/Qwen/Qwen1.5-72B/raw/main/LICENSE)
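
Below is a minimal usage sketch, not taken from official documentation: the repository ID `scb10x/typhoon-v1.5-72b` is an assumption (check the model page for the exact name), and loading 72B parameters in bfloat16 needs on the order of 144 GB of accelerator memory, so multi-GPU sharding or quantization is usually required.

```python
# Minimal sketch: load the model with transformers (>= 4.38.0) and run
# plain text completion. The repo ID below is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "scb10x/typhoon-v1.5-72b"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # 72B weights: shard across GPUs or quantize
    device_map="auto",
)

# This is a base model, so there is no chat template: give it text to continue.
prompt = "ประเทศไทยมีจังหวัดทั้งหมด"  # "Thailand has a total of ... provinces"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```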

## **Intended Uses & Limitations**

This model is a pretrained base model. As such, it may not follow human instructions without one-shot or few-shot prompting, or without instruction fine-tuning. The model does not have any moderation mechanisms and may generate harmful or inappropriate responses.
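
As an illustration of few-shot prompting with a base model, the sketch below (reusing `tokenizer` and `model` from the loading sketch above) supplies a pattern of examples for the model to continue instead of a direct instruction; the prompt contents are illustrative only.

```python
# Few-shot prompting sketch: the base model is shown a translation pattern
# and completes the final line, rather than being asked to "translate".
few_shot_prompt = (
    "English: Hello\nThai: สวัสดี\n"
    "English: Thank you\nThai: ขอบคุณ\n"
    "English: Good morning\nThai:"
)
inputs = tokenizer(few_shot_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=16, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```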

## **Follow us**

**https://twitter.com/opentyphoon**

## **Support / Ask any question**

**https://discord.gg/us5gAYmrxw**

## **SCB10X AI Team**

- Kunat Pipatanakul, Potsawee Manakul, Sittipong Sripaisarnmongkol, Natapong Nitarach, Pathomporn Chokchainant, Kasima Tharnpipitchai
- If you find Typhoon-1.5-72B useful for your work, please cite it using:

```
@article{pipatanakul2023typhoon,
    title={Typhoon: Thai Large Language Models}, 
    author={Kunat Pipatanakul and Phatrasek Jirabovonvisut and Potsawee Manakul and Sittipong Sripaisarnmongkol and Ruangsak Patomwong and Pathomporn Chokchainant and Kasima Tharnpipitchai},
    year={2023},
    journal={arXiv preprint arXiv:2312.13951},
    url={https://arxiv.org/abs/2312.13951}
}
```

## **Contact Us**

- General & Collaboration: **[[email protected]](mailto:[email protected])**, **[[email protected]](mailto:[email protected])**
- Technical: **[[email protected]](mailto:[email protected])**