# Model Card for BEMGPT
- Training data: https://github.com/fuArizona/BEMGPT/tree/main/data/dataset/training/Quadruple
- Example code: https://github.com/fuArizona/BEMGPT/tree/main/code
## Model Details

### Model Description
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** Xiaoqin Fu @ TensorBuild Lab, University of Arizona
- **Finetuned from model:** Llama 3.2 11B
## Uses

Question answering for building energy.

### Direct Use

Answering questions related to building energy.
## How to Get Started with the Model
Use the code below to get started with the model.
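A minimal sketch for querying the model with 🤗 `transformers`, assuming a text-only causal-LM interface. The repository id and the prompt format below are placeholders (assumptions), not the actual values for this model:

```python
def build_prompt(question: str) -> str:
    """Wrap a building-energy question in a plain instruction prompt
    (illustrative format, not the model's documented prompt template)."""
    return (
        "Answer the following question about building energy.\n"
        f"Question: {question}\nAnswer:"
    )


def ask(question: str, model_id: str = "your-username/BEMGPT") -> str:
    """Generate an answer. model_id is a placeholder Hub id (assumption)."""
    # Imports deferred so build_prompt stays usable without torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


# Example (requires downloading the model weights):
# print(ask("What factors drive heating energy use in office buildings?"))
```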
## Training Details

### Training Data
https://github.com/fuArizona/BEMGPT/tree/main/data/dataset/training/Quadruple
### Training Methods

- LoRA (Low-Rank Adaptation)
- QLoRA (Quantized Low-Rank Adaptation)
- DoRA (Weight-Decomposed Low-Rank Adaptation)
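A hedged sketch of how these three adapter methods map onto Hugging Face `peft` configuration. All hyperparameters below (rank, alpha, target modules) are illustrative assumptions, not the values used to train this model:

```python
# Illustrative LoRA adapter configuration (all values are assumptions).
LORA_KWARGS = dict(
    r=16,                                 # adapter rank (assumption)
    lora_alpha=32,                        # scaling factor (assumption)
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections (assumption)
    task_type="CAUSAL_LM",
)
# DoRA uses the same adapter shape with weight decomposition switched on.
DORA_KWARGS = {**LORA_KWARGS, "use_dora": True}

# With peft installed, these kwargs configure the adapters directly:
#   from peft import LoraConfig
#   lora_cfg = LoraConfig(**LORA_KWARGS)   # LoRA
#   dora_cfg = LoraConfig(**DORA_KWARGS)   # DoRA
# QLoRA additionally quantizes the frozen base model to 4-bit:
#   from transformers import BitsAndBytesConfig
#   bnb_cfg = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4")
```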
## Evaluation

### Testing Data, Factors & Metrics

#### Testing Data

https://github.com/fuArizona/BEMGPT/tree/main/data/dataset/testing/Quadruple

#### Metrics
- BLEU (Bilingual Evaluation Understudy)
- ROUGE (Recall-Oriented Understudy for Gisting Evaluation): ROUGE-1, ROUGE-2, and ROUGE-L
- BERTScore
- LLM-judge score from GPT-4o (2024-11-20 version)
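To make the n-gram metrics concrete, here is a self-contained ROUGE-1 F1 computation (unigram overlap). It mirrors the standard ROUGE-1 definition; the actual evaluation uses the library implementations listed in the Software section:

```python
from collections import Counter


def rouge1_f(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall,
    with overlap counts clipped per token (standard ROUGE-1 definition)."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


# BLEU and BERTScore come from the listed packages, e.g.:
#   from nltk.translate.bleu_score import sentence_bleu
#   from bert_score import score
```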
## Technical Specifications

### Hardware

NVIDIA H100 GPUs
### Software

- Python environment with dependencies including torch, transformers, huggingface_hub, datasets, peft, nltk, trl, numpy, rouge, and bert_score
- Conda environment (for Jupyter notebooks)
## Model Card Authors
Xiaoqin Fu @ TensorBuild Lab, University of Arizona