no code implementations • 2 May 2024 • Tifanny Portela, Gabriel B. Margolis, Yandong Ji, Pulkit Agrawal
We propose a method for training RL policies for direct force control without requiring access to force sensing.
no code implementations • 25 Mar 2024 • Minghuan Liu, Zixuan Chen, Xuxin Cheng, Yandong Ji, Ri-Zhao Qiu, Ruihan Yang, Xiaolong Wang
We propose a framework that performs whole-body control autonomously from visual observations.
no code implementations • 26 Feb 2024 • Xuxin Cheng, Yandong Ji, Junming Chen, Ruihan Yang, Ge Yang, Xiaolong Wang
Can we enable humanoid robots to generate rich, diverse, and expressive motions in the real world?
no code implementations • 2 Nov 2023 • Gabriel B. Margolis, Xiang Fu, Yandong Ji, Pulkit Agrawal
We show that a visual system trained on a small amount of real-world traversal data accurately predicts physical parameters.
no code implementations • 3 Apr 2023 • Yandong Ji, Gabriel B. Margolis, Pulkit Agrawal
DribbleBot (Dexterous Ball Manipulation with a Legged Robot) is a legged robotic system that can dribble a soccer ball under the same real-world conditions as humans (i.e., in the wild).
no code implementations • 1 Aug 2022 • Yandong Ji, Zhongyu Li, Yinan Sun, Xue Bin Peng, Sergey Levine, Glen Berseth, Koushil Sreenath
Developing algorithms that enable a legged robot to shoot a soccer ball to a given target is a challenging problem, combining robot motion control and planning into one task.