🦾 EMG Hand Gesture Dataset — 3-Channel MyoWare Signals
This dataset was collected as part of the “AI Prosthetic Hand Control via the Peripheral Nervous System” project.
📦 Dataset Summary
This dataset contains 3-channel surface EMG (sEMG) signals recorded from the forearms of human participants while they performed the hand gestures "Rest", "Fist", "Paper", and "Okay". The data was collected using MyoWare 2.0 sensors placed over three forearm muscles:
Brachioradialis
Flexor Carpi Ulnaris
Flexor Carpi Radialis
The signals were sampled at 1070 Hz and segmented into 300 ms overlapping windows (stride: 30 ms), following best practices for real-time gesture recognition.
Each episode contains:
A full window of 3-channel EMG signals
The corresponding gesture label
Timestamp metadata for the window
Subject and trial metadata
The data is stored in the LeRobotDataset format for compatibility with robotics and imitation learning workflows.
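The windowing scheme above (1070 Hz sampling, 300 ms windows, 30 ms stride) can be sketched as follows. This is an illustrative reconstruction, not the dataset's actual preprocessing code; the `segment` helper and the synthetic input are assumptions.

```python
import numpy as np

FS = 1070                    # sampling rate (Hz), as stated in this card
WIN_MS, STRIDE_MS = 300, 30  # window length and stride (ms)

win = int(FS * WIN_MS / 1000)        # 321 samples per window
stride = int(FS * STRIDE_MS / 1000)  # 32 samples between window starts

def segment(emg: np.ndarray) -> np.ndarray:
    """Slice a (n_samples, 3) sEMG recording into overlapping windows.

    Returns an array of shape (n_windows, win, 3).
    """
    starts = range(0, emg.shape[0] - win + 1, stride)
    return np.stack([emg[s:s + win] for s in starts])

# Example: 2 seconds of synthetic 3-channel signal
dummy = np.random.randn(2 * FS, 3)
windows = segment(dummy)
print(windows.shape)  # (57, 321, 3)
```

With a 30 ms stride, consecutive windows overlap by 90%, which is what makes real-time prediction at ~33 updates per second possible.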
🧠 Project Context
This dataset supports the development of affordable AI-based prosthetic hands using pattern recognition. The prosthetic hand is 3D-printed and controlled by a Raspberry Pi, servo motors, and a trained artificial neural network (ANN).
The EMG signals enable gesture classification, and the predicted output is used to control the fingers of the prosthetic hand in real time.
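The classification step can be sketched as a small feed-forward network over per-channel features. This is a hedged illustration only: the RMS feature, layer sizes, and untrained placeholder weights are assumptions, not the project's actual ANN architecture.

```python
import numpy as np

GESTURES = ["Rest", "Fist", "Paper", "Okay"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel for a (n_samples, 3) window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def predict(window: np.ndarray, W1, b1, W2, b2) -> str:
    """Forward pass of a one-hidden-layer ANN over RMS features."""
    x = rms_features(window)   # shape (3,): one feature per EMG channel
    h = np.tanh(x @ W1 + b1)   # hidden layer
    logits = h @ W2 + b2       # one logit per gesture
    return GESTURES[int(np.argmax(logits))]

# Untrained placeholder weights, just to show the shapes involved
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 4)), np.zeros(4)
pred = predict(rng.normal(size=(321, 3)), W1, b1, W2, b2)
print(pred)
```

In deployment, the predicted label would be mapped to servo positions for the corresponding finger configuration.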
👥 Participants
8 participants: 4 male and 4 female
Each participant performed 4 gestures: Rest, Fist, Paper, Okay
Each gesture was repeated across 4 rounds, with 5 repetitions per round (8 × 4 × 4 × 5 = 640 gesture trials in total)
🏷 Labels
Rest
Fist
Paper
Okay
Each episode is labeled with the most frequent gesture in its 300 ms window.
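The majority-vote labeling rule above can be sketched as follows. This is an assumed implementation of the stated rule, not the dataset's actual labeling script.

```python
from collections import Counter

def window_label(sample_labels):
    """Return the most frequent per-sample gesture within a window."""
    return Counter(sample_labels).most_common(1)[0][0]

# A 321-sample window captured mid-transition between two gestures
labels = ["Rest"] * 100 + ["Fist"] * 221
print(window_label(labels))  # Fist
```

Majority voting matters mainly for windows that straddle a gesture transition; windows fully inside one repetition are labeled unambiguously.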