transformers

Get started

  • Quick tour
  • Installation
  • Philosophy
  • Glossary

Using 🤗 Transformers

  • Summary of the tasks
  • Summary of the models
  • Preprocessing data
  • Training and fine-tuning
  • Model sharing and uploading
  • Tokenizer summary
  • Multilingual models

Advanced guides

  • Pretrained models
  • Examples
  • Fine-tuning with custom datasets
  • 🤗 Transformers Notebooks
  • Converting TensorFlow Checkpoints
  • Migrating from previous packages
  • How to contribute to transformers?
  • Testing
  • Exporting transformers models

Research

  • BERTology
  • Perplexity of fixed-length models
  • Benchmarks

Package Reference

  • Configuration
  • Model outputs
  • Models
  • Tokenizer
  • Pipelines
  • Trainer
  • Optimization
  • Processors
  • Logging
  • AutoClasses
  • Encoder-Decoder Models
  • BERT
  • OpenAI GPT
  • Transformer-XL
  • OpenAI GPT-2
  • XLM
  • XLNet
  • RoBERTa
  • DistilBERT
  • CTRL
  • CamemBERT
  • ALBERT
  • XLM-RoBERTa
  • FlauBERT
  • BART
  • T5
  • ELECTRA
  • DialoGPT
  • Reformer
  • MarianMT
  • Longformer
  • RetriBERT
  • MobileBERT
  • DPR
  • Pegasus
  • MBart
  • FSMT
  • Funnel Transformer
  • LXMERT
  • BertGeneration
  • LayoutLM
  • Custom Layers and Utilities
  • Utilities for Tokenizers
  • Utilities for pipelines