Inference examples

Transformers

You can use AI-Job-Classifier with Transformers.

Once you have transformers and torch installed, you can classify job descriptions by running the snippet below:

# load the model and tokenizer
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "FrankieShih/qwen3-0.6b-ai-jobs-classifier"
model = AutoModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# run inference
text = """this is your test jd"""  # replace with the job description to classify
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class_id = logits.argmax().item()

# you may want to map the binary output to labels
new_id2label = {0: 'NON-AI JOB', 1: 'AI JOB'}
new_label2id = {v: k for k, v in new_id2label.items()}
model.config.id2label = new_id2label
model.config.label2id = new_label2id
print(model.config.id2label[predicted_class_id])
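
For convenience, the same steps can be wrapped in the Transformers text-classification pipeline. This is a minimal sketch assuming the id2label mapping above has been written into the model config, so the pipeline returns a readable label and score directly:

# optional: wrap the model in a text-classification pipeline
# (assumes model.config.id2label was updated as shown above)
from transformers import pipeline

classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(classifier("this is your test jd"))
# output format: [{'label': 'AI JOB', 'score': ...}]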

Model tree for FrankieShih/qwen3-0.6b-ai-jobs-classifier

Finetuned from Qwen/Qwen3-0.6B