Overview

Name
hoi-attention
Source
anom-tmu
Episodes
0
Robot count
0
Format
other
Description
This repository shows how to extract hand activity information on objects based on visual attention in the task-specific reach-to-grasp cycle. We used perception-based egocentric vision to observe hand-object interactions (HOI) in grasping tasks. Our approach combines object detection with hand skeleton …
Robots used
null
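The description mentions combining object detection with hand skeleton cues to identify which object a hand is interacting with. As a minimal sketch of that idea (not code from this repository — the detection format, field names, and nearest-object heuristic are all assumptions), one could associate a hand with the object whose bounding-box center lies closest to it:

```python
# Hypothetical sketch: given a hand bounding box (e.g. derived from hand
# skeleton keypoints) and a list of detected objects, label the HOI target
# as the object whose box center is nearest the hand center.
# Box format assumed here: (x1, y1, x2, y2) in pixels.

def bbox_center(box):
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def nearest_object(hand_box, objects):
    """Return the label of the detection closest to the hand center."""
    hx, hy = bbox_center(hand_box)

    def sq_dist(obj):
        ox, oy = bbox_center(obj["box"])
        return (hx - ox) ** 2 + (hy - oy) ** 2

    return min(objects, key=sq_dist)["label"]

if __name__ == "__main__":
    hand = (100, 100, 140, 140)
    detections = [
        {"label": "cup", "box": (110, 110, 150, 150)},
        {"label": "laptop", "box": (300, 80, 500, 250)},
    ]
    print(nearest_object(hand, detections))  # cup
```

A real pipeline would likely weight this by gaze or visual-attention scores rather than distance alone, but the association step has this general shape.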

Links

HuggingFace dataset
null