Non-prehensile manipulation offers a robust alternative to traditional pick-and-place methods for object repositioning.
However, learning such skills with dexterous, multi-fingered hands remains largely unexplored,
leaving their potential for stable, efficient manipulation untapped.
Progress has been limited by the lack of large-scale, contact-aware non-prehensile datasets for dexterous hands
and the absence of wrist-finger control policies.
To bridge these gaps, we present DexMove, a tactile-guided non-prehensile manipulation
framework for dexterous hands. DexMove combines a scalable simulation pipeline, which generates physically
plausible wrist-finger trajectories, with a wearable device that captures multi-finger contact data from human demonstrations using vision-based tactile sensors.
Using these data, we train a flow-based policy that performs real-time, synergistic wrist-finger control for robust non-prehensile manipulation of diverse tabletop objects.
In real-world experiments, DexMove successfully manipulated six objects of varying shapes and materials, achieving a 77.8% success rate.
Our method outperforms ablated baselines by 36.6% in success rate and improves efficiency by nearly 300%. Furthermore, the learned policy generalizes to language-conditioned, long-horizon tasks such as object sorting and desktop tidying.