UC Berkeley researchers have created artificial intelligence, or AI, software that gives robots the ability to grasp and move objects smoothly, making it possible for them to eventually assist in warehouses.
The COVID-19 pandemic has increased the demand for online retail while also reducing warehouse workers’ ability to fulfill orders, according to UC Berkeley postdoctoral researcher and primary author of the study Jeffrey Ichnowski.
In response, Ichnowski and Ken Goldberg, the campus William S. Floyd Jr. Distinguished Chair in Engineering and senior author of the study, collaborated with graduate student Yahav Avigal and undergraduate student Vishal Satish to create AI software built on a deep learning neural network. The software allows robots to use data learned from examples to approximate how to execute an action in different situations, giving them the ability to assist with warehouse tasks, Ichnowski said.
“There is a huge demand for this robotic operation in warehouses; however, automating robots to do warehouse tasks usually done by humans can be difficult,” Ichnowski said. “So what we had to do was find a way to create a repetitive process for robots to pick up different objects and place them somewhere else rapidly.”
The AI improves on the Grasp-Optimized Motion Planner, a prior creation of Ichnowski and Goldberg that allows robots to compute how to pick up and transfer objects from one location to another, according to the study.
According to the study, the original motion planner had two flaws: the “jerk” of its motions, or abrupt changes in acceleration, which could damage the robots, and its long computation time for planning motions. The AI attempts to fix both problems by incorporating a deep learning neural network.
“To speed up the process of planning motion, we trained a neural network to learn from examples and make approximations to execute motions over and over again,” Ichnowski said. “However, while the neural network made motion fast, it was still inaccurate and jerky. So it became necessary to combine it with the motion planner.”
Incorporating the neural network into the motion planner cuts the computation time for planning motion from 29 seconds to 80 milliseconds, according to the study. Additionally, the motion planner refines the neural network’s approximations for picking up objects, eliminating the robots’ “jerky” movements.
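To make the two-stage idea concrete, the sketch below illustrates, in rough Python, how a fast but imprecise neural network guess could be refined by a jerk-minimizing optimizer. It is not the researchers’ actual code; the trajectory sizes, the stand-in `fake_network_guess` function and all other names are hypothetical and only meant to illustrate the warm-start concept described above.

```python
# Illustrative sketch only: a neural network supplies a rough trajectory,
# then an optimizer smooths it while keeping the start and goal fixed.
# All names and numbers are assumptions, not taken from the Berkeley study.
import numpy as np
from scipy.optimize import minimize

N_WAYPOINTS = 20   # trajectory discretization (assumed)
DT = 0.05          # seconds between waypoints (assumed)

def fake_network_guess(start, goal):
    """Stand-in for the trained network: a quick, slightly noisy
    straight-line trajectory between start and goal."""
    alphas = np.linspace(0.0, 1.0, N_WAYPOINTS)[:, None]
    guess = (1 - alphas) * start + alphas * goal
    return guess + 0.01 * np.random.randn(*guess.shape)  # approximation error

def jerk_cost(flat_traj, dim):
    """Sum of squared jerk (third finite difference of position)."""
    traj = flat_traj.reshape(N_WAYPOINTS, dim)
    jerk = np.diff(traj, n=3, axis=0) / DT**3
    return float(np.sum(jerk**2))

def refine(guess, start, goal):
    """Optimizer pass: smooth the network's guess while pinning the
    endpoints, analogous in spirit to the planner refining the
    network's approximation."""
    dim = start.size
    cons = [
        {"type": "eq", "fun": lambda x: x.reshape(-1, dim)[0] - start},
        {"type": "eq", "fun": lambda x: x.reshape(-1, dim)[-1] - goal},
    ]
    res = minimize(jerk_cost, guess.ravel(), args=(dim,),
                   constraints=cons, method="SLSQP")
    return res.x.reshape(N_WAYPOINTS, dim)

start = np.array([0.0, 0.0, 0.0])              # example coordinates
goal = np.array([0.5, 0.3, 0.2])
warm_start = fake_network_guess(start, goal)   # fast but jerky
smooth_traj = refine(warm_start, start, goal)  # refined, low-jerk motion
print(jerk_cost(warm_start.ravel(), 3), jerk_cost(smooth_traj.ravel(), 3))
```

In this sketch, the network guess does the heavy lifting of getting close to a solution quickly, and the optimizer only has to polish it, which is the general intuition behind the speedup the study reports.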
Jason Dong, external officer of Machine Learning at Berkeley, said the student-led organization is inspired by the creation and what it could do for the future.
“This creation is exciting because it allows robots to achieve what workers do just as efficiently, but with a certain level of confidence that they’ll do it correctly every time,” Dong said. “But there are also so many different cases that this can be used for beyond warehouse automation, which is another benefit to this creation.”