---
library_name: transformers
license: cc-by-nc-4.0
language:
- ko
---
<p align="left">
  <img src="https://huggingface.co/algograp-Inc/algograpV4/resolve/main/[email protected]" width="50%"/>
</p>

# algograp-Inc/algograpV4


## Model Details

- **Developed by:** algograp-Inc
- **License:** cc-by-nc-4.0

## Hardware and Software

* **Hardware**: 1 node with 4× NVIDIA H100 GPUs
* **Training Factors**: We fine-tuned this model using the [DeepSpeed library](https://github.com/microsoft/DeepSpeed) together with the [HuggingFace TRL Trainer](https://huggingface.co/docs/trl/trainer) and [HuggingFace Accelerate](https://huggingface.co/docs/accelerate/index).

## Method
- This model was trained using the depth up-scaling (DUS) method introduced in the [SOLAR paper](https://arxiv.org/pdf/2312.15166.pdf).
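
The core idea of SOLAR-style depth up-scaling is to duplicate the base model's transformer layer stack, trim a fixed number of layers at the seam (the top of the first copy and the bottom of the second), and concatenate the two copies before continued training. A minimal sketch of the layer-index arithmetic, assuming illustrative layer counts (not necessarily the exact configuration used for this model):

```python
def depth_up_scale(n_layers: int, trim: int) -> list[int]:
    """Return the layer indices of a depth up-scaled stack.

    Duplicates a stack of `n_layers` layers, drops `trim` layers from
    the top of the first copy and the bottom of the second copy, then
    concatenates, giving 2 * (n_layers - trim) layers total.
    """
    first = list(range(n_layers - trim))    # layers 0 .. n_layers-trim-1
    second = list(range(trim, n_layers))    # layers trim .. n_layers-1
    return first + second

# Example: a 32-layer base trimmed by 8 yields a 48-layer model,
# matching the scaling described in the SOLAR paper.
scaled = depth_up_scale(32, 8)
print(len(scaled))  # → 48
```

The trimmed seam reduces the discontinuity between the two copies, so continued pretraining can recover performance quickly.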

## Base Model 
- [yanolja/EEVE-Korean-Instruct-10.8B-v1.0](https://huggingface.co/yanolja/EEVE-Korean-Instruct-10.8B-v1.0)