MIT debuts a large language model-inspired method for teaching robots new skills

MIT this week showcased a new model for training robots. Rather than the standard set of focused data used to teach robots new tasks, the method goes big, mimicking the massive troves of information used to train large language models (LLMs).

The researchers note that imitation learning — in which the agent learns by following an individual performing a task — can fail when small challenges are introduced, such as changes in lighting, a different setting, or new obstacles. In those scenarios, the robots simply don’t have enough data to draw on to adapt.

The team looked to models like GPT-4 for a kind of brute force data approach to problem solving.

“In the language domain, the data are all just sentences,” says Lirui Wang, the new paper’s lead author. “In robotics, given all the heterogeneity in the data, if you want to pretrain in a similar manner, we need a different architecture.”

The team introduced a new architecture called Heterogeneous Pretrained Transformers (HPT), which pools information from different sensors and different environments. A transformer then consolidates that heterogeneous data into training models, and the larger the transformer, the better the output.
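The core idea — mapping heterogeneous sensor streams into a shared token space that a single transformer trunk can consume — can be sketched as follows. This is a conceptual illustration, not MIT's implementation: the sensor names, feature sizes, and the use of simple linear projections as per-sensor encoders are all assumptions for demonstration.

```python
import numpy as np

D_MODEL = 64  # shared token dimension consumed by the transformer trunk

# Hypothetical per-sensor encoders: each projects that sensor's raw
# feature dimensionality into the shared D_MODEL token space.
rng = np.random.default_rng(0)
stems = {
    "camera": rng.standard_normal((512, D_MODEL)),         # e.g. image features
    "proprioception": rng.standard_normal((16, D_MODEL)),  # e.g. joint angles
    "force_torque": rng.standard_normal((6, D_MODEL)),     # e.g. wrist sensor
}

def encode(sensor_batch):
    """Project each sensor's reading into shared tokens and stack them.

    sensor_batch: dict mapping sensor name -> 1-D feature vector.
    Returns an (n_sensors, D_MODEL) token sequence for a shared trunk.
    """
    tokens = [feat @ stems[name] for name, feat in sensor_batch.items()]
    return np.stack(tokens)

# One robot supplies camera + joint data; another adds force sensing.
# Both end up as token sequences in the same 64-dim space, so a single
# transformer can be pretrained across them despite different sensors.
robot_a = {"camera": rng.standard_normal(512),
           "proprioception": rng.standard_normal(16)}
robot_b = {**robot_a, "force_torque": rng.standard_normal(6)}

print(encode(robot_a).shape)  # (2, 64)
print(encode(robot_b).shape)  # (3, 64)
```

The point of the sketch is that robots with different sensor suites produce token sequences of different lengths but identical width, which is what lets one shared model train on all of them.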

Users then input the robot design, configuration, and the job they want done.

“Our dream is to have a universal robot brain that you could download and use for your robot without any training at all,” CMU associate professor David Held said of the research. “While we are just in the early stages, we are going to keep pushing hard and hope scaling leads to a breakthrough in robotic policies, like it did with large language models.”

The research was funded, in part, by the Toyota Research Institute. Last year at TechCrunch Disrupt, TRI debuted a method for training robots overnight. More recently, it struck a watershed partnership that will unite its robot learning research with Boston Dynamics hardware.




Lisa Holden
Lisa Holden is a news writer for LinkDaddy News. She writes about health, sport, tech, and more. Some of her favorite topics include the latest trends in fitness and wellness, the best ways to use technology to improve your life, and the latest developments in medical research.
