CVPR 2023 – ACR: Attention Collaboration-based Regressor for Arbitrary Two-Hand Reconstruction


In this episode we discuss ACR: Attention Collaboration-based Regressor for Arbitrary Two-Hand Reconstruction
by Zhengdi Yu, Shaoli Huang, Chen Fang, Toby P. Breckon, Jue Wang. The paper presents ACR, a new method for reconstructing two hands from monocular RGB images in arbitrary scenarios, addressing the challenges posed by occlusion and mutual confusion between hands. Unlike existing methods, ACR leverages center- and part-based attention for feature extraction to explicitly mitigate interdependencies between hands and between their parts, and it learns a cross-hand prior that better handles interacting hands. The method outperforms the best interacting-hand approaches on the InterHand2.6M dataset and performs comparably to state-of-the-art single-hand methods on the FreiHAND dataset. Qualitative results on various in-the-wild datasets further demonstrate the effectiveness of the approach for arbitrary hand reconstruction.
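
To give a concrete flavor of the attention-based feature extraction discussed in the episode, here is a minimal PyTorch sketch of per-hand attention-weighted feature pooling. This is an illustrative assumption, not the authors' implementation: the class name `HandAttentionAggregator`, the parameters `feat_dim` and `num_maps`, and the pooling scheme are all hypothetical choices made for this example.

```python
# Illustrative sketch only: attention-weighted pooling of backbone features,
# loosely inspired by the paper's description of center/part-based attention.
# Names and design choices here are assumptions, not the ACR code.
import torch
import torch.nn as nn

class HandAttentionAggregator(nn.Module):
    """Predicts per-hand attention maps over a backbone feature map and
    pools features under those maps for a downstream hand-parameter regressor."""
    def __init__(self, feat_dim=256, num_maps=2):
        super().__init__()
        # One attention map per hand (or per part, if num_maps is larger).
        self.att_head = nn.Conv2d(feat_dim, num_maps, kernel_size=1)

    def forward(self, feat):                 # feat: (B, C, H, W) backbone features
        att = self.att_head(feat)            # (B, num_maps, H, W) attention logits
        att = torch.softmax(att.flatten(2), dim=-1)       # normalize over spatial dims
        feat_flat = feat.flatten(2)          # (B, C, H*W)
        # Attention-weighted pooling: one feature vector per attention map.
        pooled = torch.einsum('bmn,bcn->bmc', att, feat_flat)  # (B, num_maps, C)
        return pooled                        # each hand's vector feeds its regressor

# Usage with a dummy backbone output:
backbone_feat = torch.randn(1, 256, 64, 64)
agg = HandAttentionAggregator(feat_dim=256, num_maps=2)
per_hand_feat = agg(backbone_feat)           # shape (1, 2, 256)
```

The idea this sketch tries to convey is that separate, normalized attention maps let each hand (or part) pull its own feature vector from a shared feature map, which is one way to reduce interference between the two hands before regression.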

