---
model-index:
- name: TimeSeriesTransformerClassifier
  results:
  - task:
      name: Time Series Classification
      type: time-series-classification
    dataset:
      name: Yahoo Finance GC=F 1h
      type: financial-time-series
      split: validation
      config: gc1h
      revision: '1.0'
    metrics:
    - name: Final Equity (Raw Signal)
      type: final_equity
      value: 206.24
    - name: Sharpe Ratio (Raw Signal)
      type: sharpe_ratio
      value: 0.0318
    - name: Winrate (Raw Signal)
      type: winrate
      value: 0.522
  - task:
      name: Time Series Classification
      type: time-series-classification
    dataset:
      name: Yahoo Finance GC=F 1h
      type: financial-time-series
      split: validation
      config: gc1h
      revision: '1.0'
    metrics:
    - name: Final Equity (Confidence Filtered)
      type: final_equity
      value: 100
    - name: Sharpe Ratio (Confidence Filtered)
      type: sharpe_ratio
      value: 0
    - name: Winrate (Confidence Filtered)
      type: winrate
      value: 0
library_name: pytorch
tags:
- financial-forecasting
- time-series
- transformer
- algorithmic-trading
- gold
model-architecture: Transformer Encoder with Positional Encoding and Classification Head
inference:
  parameters: {}
repo:
  name: transformer-classifier-gc1h
  url: https://huggingface.co/JonusNattapong/transformer-classifier-gc1h
model-creator: Jonus Nattapong
model-card-authors: Jonus Nattapong
---
# Time Series Transformer Classifier for Algorithmic Trading

This repository provides a Transformer-based time series classifier trained on Gold Futures (GC=F, 1-hour timeframe) to predict short-term price direction. The model outputs one of three classes (Up, Flat, or Down), which can be used to generate trading signals.
## Model Architecture
- Base: Transformer Encoder with Positional Encoding
- Head: Classification Layer
- Framework: PyTorch
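The repository does not include the model definition inline, so here is a minimal sketch of such an architecture in PyTorch, assuming sinusoidal positional encoding, mean-pooling over time, and the hyperparameters listed in the Usage section (`d_model=128`, 4 heads, 4 layers, `d_ff=256`); the repo's actual `TimeSeriesTransformerCLS` may differ in details:

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Standard sinusoidal positional encoding added to the projected inputs."""
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(max_len).unsqueeze(1).float()
        div_term = torch.exp(torch.arange(0, d_model, 2).float()
                             * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe.unsqueeze(0))  # [1, max_len, d_model]

    def forward(self, x):  # x: [B, T, d_model]
        return x + self.pe[:, : x.size(1)]

class TimeSeriesTransformerCLS(nn.Module):
    """Transformer encoder + classification head (illustrative reconstruction)."""
    def __init__(self, n_features, n_classes=3, d_model=128, n_heads=4,
                 n_layers=4, d_ff=256, dropout=0.1):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        self.pos_enc = PositionalEncoding(d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=d_ff,
            dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):  # x: [B, T, n_features]
        h = self.encoder(self.pos_enc(self.input_proj(x)))
        return self.head(h.mean(dim=1))  # mean-pool over time -> [B, n_classes]

# Smoke test with random data: batch of 2 windows, 64 steps, 16 features
model = TimeSeriesTransformerCLS(n_features=16)
out = model(torch.randn(2, 64, 16))
```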
## Training Setup
- Dataset: GC=F (Gold Futures), 1-hour interval, 2 years (Yahoo Finance)
- Features: EMA, RSI, ATR, MACD, Bollinger Bands, lagged returns, realized volatility, and cyclical time features (hour/day).
- Target: 3-class (Up, Flat, Down), horizon = 6h, threshold = 0.0005.
- Split: Walk-forward (80% train, 20% validation).
- Loss: CrossEntropyLoss (weighted).
- Optimizer: AdamW
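The label construction implied by the horizon and threshold above can be sketched as follows; `make_labels` is a hypothetical helper (not from the repo) that maps the 6-bar forward return into the three classes:

```python
import pandas as pd

HORIZON = 6         # predict direction 6 bars (hours) ahead
THRESHOLD = 0.0005  # dead zone around zero return -> "Flat"

def make_labels(close: pd.Series) -> pd.Series:
    """3-class target: 0=Down, 1=Flat, 2=Up, from the forward return."""
    fwd_ret = close.shift(-HORIZON) / close - 1.0
    labels = pd.Series(1, index=close.index)   # default: Flat
    labels[fwd_ret > THRESHOLD] = 2            # Up
    labels[fwd_ret < -THRESHOLD] = 0           # Down
    return labels.iloc[:-HORIZON]              # drop rows with no future price

# Toy price series: 9 bars, so only the first 3 have a 6-bar-ahead price
prices = pd.Series([100.0, 100.2, 100.1, 99.8, 100.5, 100.6, 101.0, 99.0, 100.0])
labels = make_labels(prices)
```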
## Backtest Results
The model was evaluated via a walk-forward backtest with continuous position sizing.
### Raw Signal Performance
- Final Equity: 206.24
- Sharpe Ratio: 0.032
- Profit Factor: 1.099
- Winrate: 52.2%
- Max Drawdown: -10.6%
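The backtest code itself is not shown in this card. A minimal sketch of continuous position sizing, under the assumption that the position each bar is `P(Up) - P(Down)` in [-1, 1] (one common choice, not necessarily the repo's exact rule):

```python
import numpy as np

def backtest_continuous(prob_up, prob_down, returns, start_equity=100.0):
    """Equity curve with continuous position sizing: pos = P(Up) - P(Down)."""
    position = prob_up - prob_down            # in [-1, 1] each bar
    equity = start_equity * np.cumprod(1.0 + position * returns)
    return equity

# Toy example: three bars of class probabilities and realized returns
prob_up = np.array([0.6, 0.5, 0.2])
prob_down = np.array([0.2, 0.3, 0.6])
returns = np.array([0.01, -0.005, -0.01])     # realized bar returns
equity = backtest_continuous(prob_up, prob_down, returns)
```

A short bet against the third bar's negative return is profitable, so the final equity ends above the start despite the losing middle bar.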
### Confidence-Filtered Signal (CONF_THR = 0.45)
- Final Equity: 100.0
- Sharpe Ratio: 0.0
- Winrate: 0% (the filter was too strict: no prediction cleared the 0.45 confidence threshold, so no trades were taken and equity stayed at its starting value)
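A sketch of how such a confidence gate typically works (an assumption about the repo's logic, not its exact code): trade only when the top class probability exceeds `CONF_THR`, otherwise stay flat.

```python
import numpy as np

CONF_THR = 0.45  # trade only when the model is confident enough

def filter_signals(probs: np.ndarray) -> np.ndarray:
    """Map class probabilities [N, 3] to positions; 0 below CONF_THR."""
    conf = probs.max(axis=1)
    cls = probs.argmax(axis=1)  # 0=Down, 1=Flat, 2=Up
    position = np.where(cls == 2, 1.0, np.where(cls == 0, -1.0, 0.0))
    position[conf < CONF_THR] = 0.0  # confidence gate: no trade
    return position

probs = np.array([
    [0.50, 0.30, 0.20],  # confident Down -> short
    [0.35, 0.35, 0.30],  # low confidence  -> flat
    [0.20, 0.30, 0.50],  # confident Up    -> long
])
positions = filter_signals(probs)
```

Note that with three classes, softmax outputs often hover near 1/3, so a 0.45 gate can easily reject every bar, which is consistent with the zero-trade result reported above.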
## Usage
```python
from huggingface_hub import hf_hub_download
import numpy as np
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Download model weights
repo_id = "JonusNattapong/transformer-classifier-gc1h"
filename = "transformer_cls_gc1h.pt"
state_dict_path = hf_hub_download(repo_id=repo_id, filename=filename)

# Define the model. TimeSeriesTransformerCLS and n_features must match the
# training configuration.
model_inf = TimeSeriesTransformerCLS(
    n_features=n_features,
    n_classes=3,
    d_model=128,
    n_heads=4,
    n_layers=4,
    d_ff=256,
    dropout=0.1,
).to(device)

# Load weights
model_inf.load_state_dict(torch.load(state_dict_path, map_location=device))
model_inf.eval()

# Example inference: `scaled` is the scaled feature DataFrame and WINDOW is
# the lookback length used during training.
example_input = scaled.iloc[-WINDOW:].values.astype(np.float32)  # shape [T, F]
example_input_tensor = torch.tensor(example_input).unsqueeze(0).to(device)

with torch.no_grad():
    logits = model_inf(example_input_tensor)
    probabilities = F.softmax(logits, dim=1).squeeze(0).cpu().numpy()
    predicted_class = int(torch.argmax(logits, dim=1).item())

print("Probabilities:", probabilities)
print("Predicted Class:", predicted_class)  # 0=Down, 1=Flat, 2=Up
```
