Stanford University's HumanPlus project has introduced a groundbreaking system enabling humanoid robots to learn and perform tasks autonomously by mimicking human actions using RGB camera data. The system, which is open-sourced, allows humanoids to shadow human movements in real time, performing activities such as boxing, playing the piano, and typing. The robots can also learn autonomous skills like folding sweatshirts, unloading objects from warehouse racks, and greeting another robot, along with diverse locomotion skills including squatting, jumping, and standing. The research highlights the use of a single third-person-perspective camera for teleoperation, eliminating the need for VR rigs. The project has been praised for its impressive demonstrations and potential applications in various fields, with the Unitree H1 serving as the robot used in the demonstrations.
Imitating humans - 🤖 training humanoids. The only paper that is fully 𝗢𝗣𝗘𝗡-𝗦𝗢𝗨𝗥𝗖𝗘𝗗 ❗ SimXR is an innovative approach to controlling simulated avatars using data from AR/VR headsets, seamlessly merging physical movement with virtual reality. ✅ Enhanced Control:… https://t.co/WqQNLN0Yy2
#AI-powered simulation training improves human performance in robotic exoskeletons. #Robotics https://t.co/1gwwpEFhRt
The autonomous humanoids from Stanford University’s HumanPlus project are capable of learning a variety of skills by imitating human data. Amazing. https://t.co/nHLvEO8alU
HumanPlus - Humanoid Shadowing and Imitation from Humans ◼ 🤖 New research introduces a groundbreaking system enabling humanoid robots to learn & perform tasks autonomously by mimicking human actions, using just RGB camera data. Demonstrated skills include walking, typing &… https://t.co/HQYhzNDd0A
How can we train full-size humanoid robots? New paper introducing: - learned controller for shadowing humans - imitation learning of demos collected via shadowing Website with code & videos: https://t.co/uX2aEahPCL https://t.co/MhbBJ6V3p5
Impressive paper from Stanford researchers: No VR rig for teleoperation. A humanoid robot [Unitree H1] is teleoperated by shadowing a human observed from a single third-person-perspective camera. Data captured from the bot's perspective is then used to make the bot autonomous. https://t.co/eq4vwbJAsN https://t.co/NKdFnhxYDt
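The teleoperation-by-shadowing idea described in that tweet can be sketched in a few lines: estimate the human's body pose from a single RGB frame, retarget it onto the robot's joints, and hand the targets to a whole-body controller. The pose estimator, joint names, and joint limits below are hypothetical stand-ins for illustration, not the HumanPlus code.

```python
# Minimal sketch of single-camera shadowing: RGB frame -> human pose ->
# retargeted robot joint targets. Joint names/limits are assumptions.
import numpy as np

# Hypothetical robot joint limits in radians for a few arm joints.
JOINT_LIMITS = {
    "shoulder_pitch": (-2.0, 2.0),
    "shoulder_roll": (-1.5, 1.5),
    "elbow": (0.0, 2.6),
}

def estimate_human_pose(frame: np.ndarray) -> dict:
    """Stand-in for a learned monocular pose estimator (returns radians)."""
    # A real system would run a learned body-pose model on the RGB frame.
    return {"shoulder_pitch": 0.8, "shoulder_roll": 2.1, "elbow": 1.2}

def retarget(human_pose: dict) -> dict:
    """Map human joint angles onto the robot, clamping to its joint limits."""
    targets = {}
    for joint, angle in human_pose.items():
        lo, hi = JOINT_LIMITS[joint]
        targets[joint] = float(np.clip(angle, lo, hi))
    return targets

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # dummy camera frame
targets = retarget(estimate_human_pose(frame))
print(targets)  # {'shoulder_pitch': 0.8, 'shoulder_roll': 1.5, 'elbow': 1.2}
```

In the real system a learned whole-body policy tracks these targets; the clamp step above stands in for the retargeting that keeps human motion feasible on the robot.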
Really exciting progress towards fully autonomous humanoids! Fantastic work, team HumanPlus! https://t.co/5OvnKKMoC1
Remotely operate your robot using a single RGB camera and a learned whole-body policy. The typing demo in particular is pretty impressive, very hard to get that to work well. https://t.co/TgyaHvVaay
Introducing HumanPlus - Autonomous Skills part Humanoids are born for using human data. Imitating humans, our humanoid learns: - fold sweatshirts - unload objects from warehouse racks - diverse locomotion skills (squatting, jumping, standing) - greet another robot Open-sourced! https://t.co/jFzfES6mMf
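The "learn autonomous skills by imitating humans" step amounts to supervised learning on demonstrations collected via shadowing. As a toy illustration of behavior cloning (not the HumanPlus training code), here is a linear policy fit by least squares to synthetic observation/action pairs; the data and dimensions are invented for the example.

```python
# Toy behavior cloning: fit a policy to (observation, action) demo pairs.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demos: pretend the expert's mapping is linear for this toy.
W_true = rng.normal(size=(4, 2))      # unknown expert weights
obs = rng.normal(size=(200, 4))       # 200 demo timesteps, 4-dim observations
actions = obs @ W_true                # 2-dim expert actions

# Behavior cloning: least-squares fit of a policy to the demonstrations.
W_fit, *_ = np.linalg.lstsq(obs, actions, rcond=None)

# The cloned policy reproduces the expert on held-out observations.
new_obs = rng.normal(size=(5, 4))
err = float(np.abs(new_obs @ W_fit - new_obs @ W_true).max())
print(f"max action error: {err:.2e}")
```

Real systems use neural policies and richer observations (e.g. onboard camera images), but the structure is the same: regress actions onto observations from shadowed demos.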
Introducing HumanPlus - Shadowing part Humanoids are born for using human data. We build a real-time shadowing system using a single RGB camera and a whole-body policy for cloning human motion. Examples: - boxing🥊 - playing the piano🎹/ping pong - tossing - typing Open-sourced! https://t.co/DQgVDPiNnS