Tags: Feature Extraction · Transformers · PyTorch · e2d2 · custom_code

Quick start guide

To use this model, run the snippet below:

from transformers import AutoModelForMaskedLM

# model_config_overrides = {}  # Use this to optionally override config parameters
model = AutoModelForMaskedLM.from_pretrained(
    "kuleshov-group/e2d2-wmt",
    trust_remote_code=True,
    # **model_config_overrides,
)
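The loading step above can be wrapped together with a tokenizer for convenience. This is a minimal sketch, not part of the official quick start: it assumes the repo ships a tokenizer that `AutoTokenizer` can resolve and that the custom model accepts standard `input_ids` tensors — check the repository files before relying on either. The helper names (`build_kwargs`, `load_e2d2`) are hypothetical.

```python
# Sketch: load E2D2 together with its tokenizer.
# Assumptions (not confirmed by this card): AutoTokenizer resolves for
# this repo, and the custom model follows the usual transformers API.

def build_kwargs(overrides=None):
    """Collect from_pretrained keyword arguments; the custom model code
    requires trust_remote_code=True, and overrides lets you optionally
    replace config parameters, as in the quick start snippet."""
    return {"trust_remote_code": True, **(overrides or {})}


def load_e2d2(repo="kuleshov-group/e2d2-wmt", overrides=None):
    """Download (or reuse the local cache of) the model and tokenizer."""
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
    model = AutoModelForMaskedLM.from_pretrained(repo, **build_kwargs(overrides))
    return tokenizer, model
```

Calling `load_e2d2()` downloads the weights on first use; pass `overrides={...}` to change config parameters without editing the model files.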

Model details

See the project site for more details and links to the paper and code: https://m-arriola.com/e2d2/

Citation

@inproceedings{arriola2025e2d2,
  title={Encoder-Decoder Diffusion Language Models for Efficient Training and Inference},
  author={Marianne Arriola and Yair Schiff and Hao Phung and Aaron Gokaslan and Volodymyr Kuleshov},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year={2025},
  url={https://arxiv.org/abs/2510.22852}
}