no code implementations • 22 Feb 2024 • Nishanth Kumar, Tom Silver, Willie McClinton, Linfeng Zhao, Stephen Proulx, Tomás Lozano-Pérez, Leslie Pack Kaelbling, Jennifer Barry
We consider a setting where a robot is initially equipped with (1) a library of parameterized skills, (2) an AI planner for sequencing together the skills given a goal, and (3) a very general prior distribution for selecting skill parameters.
no code implementations • 22 Sep 2023 • Linfeng Zhao, Hongyu Li, Taskin Padir, Huaizu Jiang, Lawson L. S. Wong
Learning-based robot navigation is a critical and challenging task.
no code implementations • 17 Jul 2023 • Linfeng Zhao, Owen Howell, Jung Yeon Park, Xupeng Zhu, Robin Walters, Lawson L. S. Wong
In robotic tasks, changes of reference frame typically do not influence the underlying physical properties of the system, a property known as the invariance of physical laws. These distance-preserving changes are the isometric transformations (translations, rotations, and reflections), known collectively as the Euclidean group.
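As a minimal sketch of the isometries mentioned above (our own illustration, not code from the paper): a rotation followed by a translation leaves all pairwise distances unchanged.

```python
import numpy as np

# Hypothetical example: an isometry of the plane (rotation R by `theta`,
# then translation `t`) preserves pairwise distances between points.
def isometry(points, theta, t):
    """Apply a planar rotation and translation to an (N, 2) array of points."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return points @ R.T + t

points = np.array([[0.0, 0.0], [3.0, 4.0]])
moved = isometry(points, theta=1.2, t=np.array([5.0, -2.0]))

d_before = np.linalg.norm(points[0] - points[1])   # 5.0
d_after = np.linalg.norm(moved[0] - moved[1])
assert np.isclose(d_before, d_after)  # distance is invariant under the isometry
```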
no code implementations • 7 Jul 2023 • Owen Howell, David Klee, Ondrej Biza, Linfeng Zhao, Robin Walters
We show that an algorithm that learns a three-dimensional representation of the world from two-dimensional images must satisfy certain geometric consistency properties, which we formulate as SO(2)-equivariance constraints.
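To give a concrete (and deliberately simple) instance of SO(2)-equivariance, independent of the paper's model: the intensity centroid of an image, measured relative to the image center, rotates with the image.

```python
import numpy as np

# Hypothetical example (not the paper's algorithm): the intensity centroid
# of an image is rotation-equivariant -- rotating the input image rotates
# the output vector by the same angle.
def centroid(img):
    n = img.shape[0]
    c = (n - 1) / 2.0
    i, j = np.indices(img.shape)
    x, y = (j - c).ravel(), (c - i).ravel()   # standard math coordinates
    w = img.ravel() / img.sum()
    return np.array([w @ x, w @ y])

rng = np.random.default_rng(0)
img = rng.random((8, 8))

v = centroid(img)
v_rot = centroid(np.rot90(img))        # rotate the image 90 deg counterclockwise
expected = np.array([-v[1], v[0]])     # rotate the vector 90 deg counterclockwise
assert np.allclose(v_rot, expected)    # equivariance constraint holds
```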
no code implementations • 24 Oct 2022 • Linfeng Zhao, Huazhe Xu, Lawson L. S. Wong
To alleviate this issue, we propose differentiating through the Bellman fixed-point equation to decouple the forward and backward passes of the Value Iteration Network and its variants. This enables constant backward cost (in planning horizon) and a flexible forward budget, and helps scale up to large tasks.
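The core idea, sketched on a scalar toy problem (an assumption-laden stand-in for the Bellman backup, not the paper's VIN code): once the forward pass has converged to a fixed point v = f(v, θ), the gradient follows from the implicit function theorem, so the backward cost does not depend on how many forward iterations were run.

```python
# Illustrative sketch: implicit differentiation through a fixed point
# v = f(v, theta), where f is a simple contraction standing in for the
# Bellman backup. No computation tape is kept during the forward loop.
gamma, theta = 0.9, 2.0

def f(v, theta):
    return gamma * v + theta

# Forward pass: iterate to convergence (any budget; nothing is recorded).
v = 0.0
for _ in range(200):
    v = f(v, theta)

# Backward pass: solve the implicit-function-theorem equation once,
# dv*/dtheta = (df/dtheta) / (1 - df/dv) -- constant cost in iteration count.
df_dv, df_dtheta = gamma, 1.0
grad = df_dtheta / (1.0 - df_dv)

assert abs(v - theta / (1 - gamma)) < 1e-6   # fixed point v* = 20
assert abs(grad - 10.0) < 1e-12              # analytic dv*/dtheta = 1/(1-gamma)
```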
no code implementations • 8 Jun 2022 • Linfeng Zhao, Xupeng Zhu, Lingzhi Kong, Robin Walters, Lawson L. S. Wong
Our implementation builds on Value Iteration Networks (VINs) and uses steerable convolution networks to incorporate symmetry.
no code implementations • 28 Apr 2022 • Linfeng Zhao, Lingzhi Kong, Robin Walters, Lawson L. S. Wong
Compositional generalization is a critical ability in learning and decision-making.
1 code implementation • 24 Apr 2022 • Jung Yeon Park, Ondrej Biza, Linfeng Zhao, Jan Willem van de Meent, Robin Walters
Incorporating symmetries can lead to highly data-efficient and generalizable models by defining equivalence classes of data samples related by transformations.
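One simple way to see how a symmetry group induces equivalence classes (a toy sketch of our own, not the paper's method): averaging any function over the group makes it invariant, so every transformed copy of a sample maps to the same value.

```python
import numpy as np

# Toy example: averaging a function f over the C4 group of 90-degree
# rotations yields a function that is constant on each equivalence class
# {img, rot90(img), rot180(img), rot270(img)}.
def group_average(f, img):
    """Average f over all four 90-degree rotations of the input."""
    return np.mean([f(np.rot90(img, k)) for k in range(4)])

# An arbitrary, non-invariant base function.
f = lambda img: float((img * np.arange(img.size).reshape(img.shape)).sum())

rng = np.random.default_rng(1)
img = rng.random((5, 5))

# The averaged function takes the same value on every rotation of the image.
vals = [group_average(f, np.rot90(img, k)) for k in range(4)]
assert np.allclose(vals, vals[0])
```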
no code implementations • 29 Sep 2021 • Jung Yeon Park, Ondrej Biza, Linfeng Zhao, Jan-Willem van de Meent, Robin Walters
In this paper, we use equivariant transition models as an inductive bias to learn symmetric latent representations in a self-supervised manner.
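The defining property of an equivariant transition model T is that it commutes with the group action g on the latent space: T(g·z) = g·T(z). A minimal sketch (our illustration, not the paper's architecture) on a 2D latent space with rotations:

```python
import numpy as np

# Minimal sketch: a transition model on 2D latents that is equivariant to
# planar rotations g, i.e. transition(g @ z) == g @ transition(z).
def rot(phi):
    return np.array([[np.cos(phi), -np.sin(phi)],
                     [np.sin(phi),  np.cos(phi)]])

def transition(z):
    # Applying a fixed "action" rotation is equivariant, because planar
    # rotations commute with one another.
    return rot(0.7) @ z

g = rot(1.3)
z = np.array([2.0, -1.0])
assert np.allclose(transition(g @ z), g @ transition(z))  # equivariance holds
```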
no code implementations • 1 Jan 2021 • Linfeng Zhao, Lawson L. S. Wong
To learn this ability, we need to efficiently train an agent using only a small proportion of training maps, and to share knowledge effectively across environments.
1 code implementation • NeurIPS 2020 • Fan Xie, Alexander Chowdhury, M. Clara De Paolis Kaluza, Linfeng Zhao, Lawson L. S. Wong, Rose Yu
Compared to baselines, our model generalizes better and achieves higher success rates on several simulated bimanual robotic manipulation tasks.