
Machine Learning Series - Bi-ACT Whitepaper Review



We dive into the paper "Bi-ACT: Bilateral Control-Based Imitation Learning via Action Chunking with Transformer" by Thanpimon Buamanee, Masato Kobayashi, and colleagues at Osaka University in Japan. Building on Action Chunking with Transformers (ACT), Bi-ACT incorporates both position and force information, enabling robots to adapt to object properties such as hardness and weight.


We'll explore how Bi-ACT uses joint angles, velocities, torques, and camera images to generate fast, robust motion at 100 Hz. Bilateral control, supported by a disturbance observer (DOB) and a reaction force observer (RFOB), enables precise, real-time force adjustments during operation. The paper's experiments demonstrate the system's effectiveness on challenging tasks, including manipulating deformable objects and objects containing liquids.
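To make that pipeline concrete, here is a minimal, hypothetical sketch of what a Bi-ACT-style inference loop could look like in Python. The Robot and Camera stubs, the chunk size, the joint count, and the policy placeholder are assumptions for illustration only; they are not taken from the authors' implementation.

```python
import time
import numpy as np

CONTROL_HZ = 100     # control frequency reported for Bi-ACT
CHUNK_STEPS = 20     # future steps predicted per inference (assumed value)
NUM_JOINTS = 7       # manipulator degrees of freedom (assumed value)

class Robot:
    """Stub follower-robot interface (hypothetical, for illustration only)."""
    def read_state(self):
        # joint angles, velocities, torques
        return np.zeros(NUM_JOINTS), np.zeros(NUM_JOINTS), np.zeros(NUM_JOINTS)

    def send_command(self, target_q, target_tau):
        # A real bilateral controller (with its disturbance / reaction force
        # observers) would track both the position and force targets here.
        pass

class Camera:
    """Stub camera interface returning a single RGB frame."""
    def read(self):
        return np.zeros((480, 640, 3), dtype=np.uint8)

def policy(q, dq, tau, image):
    """Placeholder for the learned Transformer policy.

    Bi-ACT conditions on position, velocity, torque, and images, and predicts
    a chunk of future commands. Output shape (CHUNK_STEPS, NUM_JOINTS, 2):
    [:, :, 0] is the position target, [:, :, 1] is the force (torque) target.
    """
    return np.zeros((CHUNK_STEPS, NUM_JOINTS, 2))

def control_loop(robot, camera, num_chunks=5):
    for _ in range(num_chunks):
        q, dq, tau = robot.read_state()
        image = camera.read()

        # One forward pass yields a whole action chunk, so the heavy model
        # runs far less often than the 100 Hz low-level controller.
        chunk = policy(q, dq, tau, image)

        for step in chunk:
            target_q, target_tau = step[:, 0], step[:, 1]
            robot.send_command(target_q, target_tau)
            time.sleep(1.0 / CONTROL_HZ)

if __name__ == "__main__":
    control_loop(Robot(), Camera())
```

Predicting a chunk of position-and-force commands at a time lets the Transformer run well below the 100 Hz control rate, while the bilateral controller handles the fine-grained force tracking between predictions.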


Join us as we break down the methodology, experiments, and groundbreaking results of this study. Don't forget to subscribe for more paper reviews and cutting-edge insights in robotics!


Start making your own machine learning models with an Aloha Kit


Credit:


Bi-ACT: Bilateral Control-Based Imitation Learning via Action Chunking with Transformer

Thanpimon Buamanee, Masato Kobayashi, Yuki Uranishi, Haruo Takemura

arXiv:2401.17698v1 [cs.RO]


Imitation learning for variable speed motion generation over multiple actions

Yuki Saigusa, Ayumu Sasagawa, Sho Sakaino, Toshiaki Tsuji

arXiv:2103.06466v4 [cs.RO]


