RHEL AI Model Training Scenario: A Fictional Hotel Group

A fictional example for the Training Large Language Models with Red{nbsp}Hat Enterprise Linux AI (AI0005L) and Deploying Models with Red{nbsp}Hat Enterprise Linux AI (AI0006L) Red Hat Training lessons. These lessons present a scenario in which a hotel group must use RHEL AI to train its own LLM, aligned with its business needs.

NOTE: This model has been trained with a reduced version of the RHEL AI default training process. In this reduced version, the model has been trained for only four hours, instead of four to five days. Additionally, the number of training samples has been reduced from ~330,000 to only 10,000.

As a result, although the model is useful for learning purposes, it is far from optimally tuned.
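To illustrate what such a reduced run might look like, the sketch below uses the InstructLab CLI (`ilab`) that RHEL AI provides. The dataset path and flag values are assumptions for illustration only, and the exact flags vary between RHEL AI versions; check `ilab model train --help` on your system before running anything.

```shell
# Illustrative sketch of a reduced training run on RHEL AI.
# The dataset path and flag values below are hypothetical; verify the
# supported options with `ilab model train --help` for your version.
ilab model train \
  --data-path ~/.local/share/instructlab/datasets/hotel-knowledge.jsonl \
  --num-epochs 1

# After training, the tuned model can be served and queried locally:
ilab model serve
ilab model chat
```

A full production run would instead use the default multi-phase training process over the complete synthetic dataset, which is what takes four to five days.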

Model details:

* Format: Safetensors
* Model size: 8B params
* Tensor type: BF16