Imitation learning is not merely about collecting large-scale demonstration data; it requires effective data collection and curation. FSC is a great example of this! Join Lihan’s session and chat with him to learn how to make your policy more general from a data-centric perspective!
The robot neck is COOL! Active perception could be the next big step—by learning where to see, the robot can then learn how to act, unlocking even more impressive capabilities! Congrats!
Will be presenting KUDA at #ICRA2025 today! Looking forward to chatting with old and new friends!
📍 Room 404 (Regular Session WeET16)
📷 May 21 (Wednesday) 5:00 pm–5:05 pm
Thank you @janusch_patas for highlighting our work! We are advancing visual representations such as Gaussian Splatting to empower robotics! Through building structured world models for deformable objects, our approach creates a neural-based real-to-sim digital twin from…
Learning from videos of humans performing tasks provides valuable semantic and motion data for scaling robot generalists. Translating human actions into robotic capabilities remains an exciting challenge—Humanoid-X and UH1 demonstrate impressive advancements!
What a day! The community has successfully reproduced this highly accessible tactile sensor developed by @binghao_huang. Step into a new era of multi-modal sensing!
Huge congratulations to @JiaweiYang118 for winning the NVIDIA Fellowship! Jiawei has a long-term vision and deep, thoughtful insight in his research. Truly well-deserved! 🙌
Congratulations to @gan_chuang and the team on this phenomenal project! I am very lucky to have witnessed its journey and to have had insightful discussions and received invaluable mentorship from so many of you!
Congratulations to the team on this outstanding achievement! Thinking back to my sweet early days in Boston as a newcomer to the domain, I am incredibly grateful to many people in this team for shaping my research journey and teaching me how to approach meaningful questions, make…
Congratulations to @binghao_huang for making tactile sensing more accessible! Build a high-res tactile sensor in just 30 minutes! Give it a try and make your robot capable of multimodal perception!
17K Followers · 6K Following. Neurodivergent physics student with a keen interest in multisensory integration and emergent perception. Exploring research on a proposed ‘sixth sense’. Δ
110 Followers · 435 Following. Interested in 3D scene representations and their applications.
Ph.D. Student at UCSD | Sr. DL Engineer at Akasha Imaging | UCLA’21 | IIT Bombay’19
6K Followers · 5K Following. Welcome to my world 🌎
There is nothing special about my bio 🤷🏻‍♀️
Fun 🥳, Business 🕴, Travel ✈️, News 📰, Sports 🏌... I will share the things I am interested in 👌🏻💯
2K Followers 911 FollowingHiring: resume to [email protected]
to love math is to see the face of God
Morgan Prize, Rhodes Scholar
Math PhD@Stanford; Neuro@Oxford; Math+Physics@MIT
21K Followers · 267 Following. Pioneering the future of robotics since 1979. We’re transforming industries and everyday life through cutting-edge innovation and world-class education.
14K Followers · 519 Following. Your guide to radiance fields | Host of the podcast @ViewDependent | DM open for business inquiries | https://t.co/llYGWliKUv | discord: https://t.co/lrl64WGvlD
8K Followers · 815 Following. Assistant Professor MIT @medialab @MITEECS @nlp_mit || PhD from CMU @mldcmu @LTIatCMU || Foundations of multisensory AI to enhance the human experience.