MIT debuts a large language model-inspired method for teaching robots new skills

MIT this week showcased a new model for training robots. Rather than the narrow, task-specific datasets typically used to teach robots new tasks, the method goes big, mimicking the massive troves of information used to train large language models (LLMs).

The researchers note that imitation learning, in which the agent learns by following an individual performing a task, can fail when small challenges are introduced. These could be changes in lighting, a different setting, or new obstacles. In those scenarios, the robots simply don’t have enough data to draw upon to adapt.

The team looked to models like GPT-4 for a kind of brute-force data approach to problem solving.

“In the language domain, the data are all just sentences,” says Lirui Wang, the new paper’s lead author. “In robotics, given all the heterogeneity in the data, if you want to pretrain in a similar manner, we need a different architecture.”

The team introduced a new architecture called Heterogeneous Pretrained Transformers (HPT), which pulls together information from different sensors and different environments. A transformer then consolidates that data into a shared representation used to train models. The larger the transformer, the better the output.

Users then input the robot design, configuration, and the job they want done.
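
The article describes the architecture only at a high level, but the general recipe it points to (modality-specific encoders feeding one shared transformer trunk, with a small per-robot head that outputs actions) can be sketched briefly. The following is a minimal illustration of that idea under assumed choices, not the authors’ implementation; the module names, dimensions, and the camera-plus-joint-state example are all assumptions.

```python
# Minimal sketch (not the authors' code) of the heterogeneous-pretraining idea:
# per-robot "stems" map different observations (camera images, joint states) into
# a shared token space, one transformer trunk is shared across robots and datasets,
# and a small per-robot head decodes actions. Names and dimensions are assumptions.
import torch
import torch.nn as nn

D = 256  # shared token width (assumed)

class VisionStem(nn.Module):
    """Turns an image into tokens via a patch projection (assumed design)."""
    def __init__(self, patch=16, dim=D):
        super().__init__()
        self.proj = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
    def forward(self, img):                    # img: (B, 3, H, W)
        tokens = self.proj(img).flatten(2)     # (B, dim, N)
        return tokens.transpose(1, 2)          # (B, N, dim)

class ProprioStem(nn.Module):
    """Turns a joint-state vector into a single token (assumed design)."""
    def __init__(self, joint_dim, dim=D):
        super().__init__()
        self.proj = nn.Linear(joint_dim, dim)
    def forward(self, joints):                 # joints: (B, joint_dim)
        return self.proj(joints).unsqueeze(1)  # (B, 1, dim)

class SharedTrunk(nn.Module):
    """One transformer shared across every robot and dataset."""
    def __init__(self, dim=D, depth=4, heads=8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)
    def forward(self, tokens):                 # tokens: (B, N_total, dim)
        return self.encoder(tokens)

class ActionHead(nn.Module):
    """Per-robot head mapping pooled trunk features to that robot's action space."""
    def __init__(self, action_dim, dim=D):
        super().__init__()
        self.out = nn.Linear(dim, action_dim)
    def forward(self, feats):                  # feats: (B, N_total, dim)
        return self.out(feats.mean(dim=1))     # (B, action_dim)

# Example: one robot with a camera and 7 joints, producing a 7-DoF action.
vision, proprio = VisionStem(), ProprioStem(joint_dim=7)
trunk, head = SharedTrunk(), ActionHead(action_dim=7)

img = torch.randn(2, 3, 224, 224)
joints = torch.randn(2, 7)
tokens = torch.cat([vision(img), proprio(joints)], dim=1)
action = head(trunk(tokens))                   # shape (2, 7)
```

In this framing, the shared trunk is the part that benefits from pretraining across many robots and datasets, while the lightweight stems and heads are the embodiment-specific pieces specified per robot and task, consistent with the article’s description of users supplying the robot design, configuration, and job.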

“Our dream is to have a universal robot brain that you could download and use for your robot without any training at all,” CMU associate professor David Held said of the research. “While we are just in the early stages, we are going to keep pushing hard and hope scaling leads to a breakthrough in robotic policies, like it did with large language models.”

The research was funded, in part, by Toyota Research Institute. Last year at TechCrunch Disrupt, TRI debuted a method for training robots overnight. More recently, it struck a watershed partnership that will unite its robot learning research with Boston Dynamics hardware.




Lisa Holden is a news writer for LinkDaddy News. She writes health, sport, tech, and more. Some of her favorite topics include the latest trends in fitness and wellness, the best ways to use technology to improve your life, and the latest developments in medical research.
