ENACT: Evaluating Embodied Cognition with World Modeling of Egocentric Interaction
ENACT is a benchmark dataset for evaluating embodied cognition in vision–language models via egocentric world modeling. It probes whether models can reason about how the world changes under sequences of actions, using long-horizon household activities in a mobile manipulation setting.
- Project page: https://enact-embodied-cognition.github.io/
- Code & evaluation: https://github.com/mll-lab-nu/ENACT
Dataset Summary
Each ENACT example is a multi-image, multi-step reasoning problem built from robot trajectories:
Forward world modeling
- Input: one current state image, several future state images (shuffled), and a list of actions in correct order.
- Task: output a Python list of integers giving the correct chronological order of the future images (e.g., [1, 3, 2]).
Inverse world modeling
- Input: an ordered sequence of images showing state changes, plus actions in shuffled order.
- Task: output a Python list of integers giving the correct chronological order of the actions (e.g., [2, 3, 1]).
All images are egocentric RGB observations rendered from long-horizon household tasks (e.g., assembling gift baskets, bringing water, preparing lunch boxes, cleaning up a desk).
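Because both settings require the answer as a Python list of integers, predictions can be scored mechanically. The sketch below is not the official evaluation script (see the code repository for that); it shows one way to extract the last bracketed list from a model reply and score it by exact match against the ground-truth ordering. The function names are illustrative, not part of the benchmark.

```python
import ast

def parse_ordering(model_output: str):
    """Extract a Python list of integers from a model's text reply.

    Convention of this sketch: take the last bracketed expression in
    the reply; return None if no well-formed list of ints is found.
    """
    start = model_output.rfind("[")
    end = model_output.rfind("]")
    if start == -1 or end == -1 or end < start:
        return None
    try:
        parsed = ast.literal_eval(model_output[start:end + 1])
    except (ValueError, SyntaxError):
        return None
    if isinstance(parsed, list) and all(isinstance(x, int) for x in parsed):
        return parsed
    return None

def exact_match(pred, gt):
    """Score 1 if the predicted ordering equals the ground truth, else 0."""
    return int(pred == gt)

print(parse_ordering("The correct order is [1, 3, 2]."))  # [1, 3, 2]
print(exact_match([1, 3, 2], [1, 3, 2]))  # 1
```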
File Structure
After unpacking, the dataset has the following structure:
.
├── enact_ordering.jsonl # All QA examples (one JSON per line)
└── images/
├── forward_world_modeling_3_steps/
├── forward_world_modeling_4_steps/
├── ...
├── forward_world_modeling_10_steps/
├── inverse_world_modeling_3_steps/
├── ...
└── inverse_world_modeling_10_steps/
Each task folder (e.g., forward_world_modeling_3_steps/) contains one subfolder per activity, such as:
images/forward_world_modeling_3_steps/
├── assembling_gift_baskets_1749468508582193/
├── bringing_water_1750844141719178/
├── ...
Inside each activity folder are the PNGs for that trajectory (current state and future states, or ordered states in the inverse setting).
JSONL Format
Each line in enact_ordering.jsonl is a JSON object:
{
  "id": "assembling_gift_baskets_1749468508582193_forward_world_modeling_3_steps_cfbcc15c",
  "type": "forward_world_modeling_3_steps",
  "task_name": "assembling_gift_baskets_1749468508582193",
  "key_frame_ids": ["4150", "11360", "11834"],
  "images": [
    "QA/images/forward_world_modeling_3_steps/..._cur_state.png",
    "QA/images/forward_world_modeling_3_steps/..._next_state_1.png",
    "QA/images/forward_world_modeling_3_steps/..._next_state_2.png"
  ],
  "question": "...natural language instructions and actions...",
  "options": [],
  "gt_answer": [1, 2]
}
- id – unique identifier for this QA instance.
- type – question type and horizon, e.g. forward_world_modeling_3_steps or inverse_world_modeling_4_steps.
- task_name – underlying household task instance.
- key_frame_ids – frame indices of the selected key frames in the trajectory.
- images – relative paths to PNG images: index 0 is the current state; subsequent entries are future states (forward) or later states (inverse).
- question – natural language prompt specifying the task setup, the actions, and the required output as a Python list of integers.
- gt_answer – ground-truth ordering of image or action labels (list of integers, e.g. [1, 3, 2]).
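Records in this schema can be read with the standard json module, one object per line. The sketch below writes a single abridged record (paths truncated as in the example above) so it is self-contained; in practice, point the loader at enact_ordering.jsonl in the dataset root.

```python
import json

def load_enact(path):
    """Yield ENACT QA examples from a JSONL file, one JSON object per line."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.strip():
                yield json.loads(line)

# Self-contained demo: write one abridged record, then read it back.
record = {
    "id": "assembling_gift_baskets_1749468508582193_forward_world_modeling_3_steps_cfbcc15c",
    "type": "forward_world_modeling_3_steps",
    "task_name": "assembling_gift_baskets_1749468508582193",
    "key_frame_ids": ["4150", "11360", "11834"],
    "images": ["QA/images/forward_world_modeling_3_steps/..._cur_state.png"],
    "question": "...",
    "options": [],
    "gt_answer": [1, 2],
}
with open("sample.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps(record) + "\n")

for ex in load_enact("sample.jsonl"):
    print(ex["type"], ex["gt_answer"])  # forward_world_modeling_3_steps [1, 2]
```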
Usage
To run the evaluation, use the scripts in the code repository: https://github.com/mll-lab-nu/ENACT
Citation
If you use ENACT, please cite the paper:
@article{wang2025enact,
  title={ENACT: Evaluating Embodied Cognition with World Modeling of Egocentric Interaction},
  author={Wang, Qineng and Huang, Wenlong and Zhou, Yu and Yin, Hang
          and Bao, Tianwei and Lyu, Jianwen and Liu, Weiyu and Zhang, Ruohan
          and Wu, Jiajun and Li, Fei-Fei and Li, Manling},
  year={2025}
}