Extreme Parkour with Legged Robots

Xuxin Cheng*,1             Kexin Shi*,1,2             Ananye Agarwal1             Deepak Pathak1

ICRA 2024 · CoRL 2023 Generalist / Roboletics / Deployable Workshop (Oral)


Humans can perform parkour by traversing obstacles in a highly dynamic fashion requiring precise eye-muscle coordination and movement. Getting robots to do the same task requires overcoming similar challenges. Classically, this is done by independently engineering perception, actuation, and control systems to very low tolerances. This restricts them to tightly controlled settings such as a predetermined obstacle course in a lab. In contrast, humans are able to learn parkour through practice without significantly changing their underlying biology. In this paper, we take a similar approach to developing robot parkour on a small low-cost robot with imprecise actuation and a single front-facing depth camera for perception which is low-frequency, jittery, and prone to artifacts. We show how a single neural net policy operating directly from a camera image, trained in simulation with large-scale RL, can overcome imprecise sensing and actuation to output highly precise control behavior end-to-end. We show our robot can perform a high jump onto obstacles 2x its height, a long jump across gaps 2x its length, do a handstand and run across tilted ramps, and generalize to novel obstacle courses with different physical properties.

Extreme Cases


Tilted Ramp

Fully autonomous without any human teleoperation.

Parkour Course





Demo at CoRL 2023


Uncut 2-minute clip of our robot climbing nonstop, leaping across gaps, and jumping down from boxes.


Failure case from the Vision Locomotion project: the robot falls down the stairs.

Without the clearance reward, the robot easily clips the edge and fails on the large-gap terrain.
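A clearance-style reward term can be sketched as penalizing any foot that passes too close to an obstacle edge. The snippet below is only an illustrative sketch: the function name, margin value, and edge representation are our assumptions, not the paper's exact formulation.

```python
import numpy as np

def clearance_reward(foot_positions, edge_positions, margin=0.1):
    """Illustrative sketch (not the paper's exact reward): penalize feet
    that come within `margin` meters of an obstacle edge."""
    penalty = 0.0
    for foot in foot_positions:
        # distance from this foot to the nearest obstacle edge
        d = min(np.linalg.norm(foot - edge) for edge in edge_positions)
        penalty += max(0.0, margin - d)  # linear penalty inside the margin
    return -penalty
```

A term like this pushes the policy to place feet well inside platforms rather than right at the lip of a gap, which matches the failure mode shown above.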

Without direction distillation, the robot is hard to steer with a joystick on the ramp terrain.




@article{cheng2023extreme,
  title={Extreme Parkour with Legged Robots},
  author={Cheng, Xuxin and Shi, Kexin and Agarwal, Ananye and Pathak, Deepak},
  journal={arXiv preprint arXiv:2309.14341},
  year={2023}
}