
Making robots better co-workers


Robots are everywhere these days. They paint cars, perform surgery, and carry out security patrols. At Amazon, robots play increasingly key roles in warehouses, where they help speed deliveries to millions of customers every day.

Nima Keivan wants to help Amazon’s robots and fulfillment center workers collaborate more effectively, so that robots can handle the more mundane tasks while humans focus on higher-value work. Keivan is a robotics researcher at Amazon. Before that, he served as CTO and co-founder of Canvas, a robotics startup that built autonomous, vision-based mobile robots for material handling in the manufacturing and logistics industries. Canvas was acquired by Amazon in 2019.

A Lego enthusiast as a youngster, Keivan moved from building basic sets to adding motors and other machinery to what had been a static toy. “The stuff with Legos was very mechanical and electrical,” he says. “Later I learned programming and added a software layer. With these three components, I could build a system that sensed its surroundings and could move and do things.”

To Keivan, robots remain limited despite the many tasks they perform in the world today. In particular, their inability to work seamlessly alongside people often means they must be assigned predictable, rote tasks, where they are either separated from people or carefully controlled and slowed down.

“The goal is to build robots that can go ‘outside the fence,’” he says. “We’re working to build robots that are safe, can work seamlessly around people, and can be deployed in large numbers.”

Specifically, Keivan sees three hurdles to clear before robots can take the next step in productivity. These aren’t small steps; each one represents a substantial, sharply delineated improvement in robot capabilities. They are: equipping robots with spatial AI; enabling them to understand what it means to work around people; and giving them sufficient smarts to react to novel situations in real time.

Spatial AI

Robots are behind humans and even animals when it comes to spatial intelligence. For example, dogs have a good sense of their surroundings, know where their bed is located, and know the route for their morning walk. Robots? Not so much. “Robots lack spatial intelligence unless they are very tightly guided, such as with a magnetic strip embedded in the floor,” says Keivan. “For robots to work closely with people it’s a requirement that they have real spatial intelligence. They can’t be getting tangled up with the things around them, or getting lost.”
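The article doesn’t say how Amazon’s robots build that spatial awareness, but a common ingredient in mobile robotics is an occupancy grid: a map the robot fills in from its range sensors so it can distinguish free space from obstacles. The sketch below is a minimal, hypothetical illustration of that idea; the grid size, sensor model, and function names are assumptions, not details of any Amazon system.

```python
import numpy as np

# Hypothetical occupancy-grid sketch: the robot accumulates range-sensor
# hits into a 2-D map of its surroundings so it can tell free space from
# obstacles. Grid size and resolution are arbitrary choices for the demo.

GRID_SIZE = 200          # cells per side
RESOLUTION = 0.05        # metres per cell (5 cm)

grid = np.zeros((GRID_SIZE, GRID_SIZE))  # log-odds of occupancy, 0 = unknown

def world_to_cell(x, y):
    """Convert a world coordinate (metres) to a grid index."""
    return int(x / RESOLUTION) + GRID_SIZE // 2, int(y / RESOLUTION) + GRID_SIZE // 2

def integrate_hit(x, y, hit_weight=0.9):
    """Mark the cell containing a sensor return as more likely occupied."""
    i, j = world_to_cell(x, y)
    if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE:
        grid[i, j] += np.log(hit_weight / (1 - hit_weight))

# Example: a wall detected roughly one metre in front of the robot.
for y in np.arange(-0.5, 0.5, RESOLUTION):
    integrate_hit(1.0, y)

occupied = grid > 0
print(f"{occupied.sum()} cells currently believed occupied")
```

A real system would fuse many sensors and track the robot’s own pose as it moves, but the underlying idea is the same: the robot carries a model of its surroundings rather than relying on a magnetic strip in the floor.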

Working around people

Robots with spatial AI can successfully navigate through workspaces. But moving through that same space when people are behaving unpredictably (as people often do: stopping to talk with co-workers, suddenly bending over to pick up a dropped set of keys, sharply changing the direction they’re walking) requires another step in robots’ skills. “Robots need to be able to react to the continuous movements of people. What if a forklift driver puts down a pallet, and the robot encounters an object that wasn’t there 30 seconds earlier? They need to be able to manage that.”
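One simplified way to picture the pallet example is a robot that compares each fresh sensor snapshot against its remaining path and replans the moment something new blocks it. The toy sketch below illustrates that loop; the waypoints, the sensing, and the one-line “replanner” (which simply drops the blocked cell) are stand-ins for illustration, not how any real warehouse robot works.

```python
# Illustrative sketch: recheck the sensors before every move and replan
# when a newly sensed obstacle, such as a pallet set down by a forklift,
# blocks the remaining path.

def blocked(path, obstacles):
    return any(waypoint in obstacles for waypoint in path)

def run(path, sense, replan, move):
    while path:
        obstacles = sense()              # fresh snapshot of the world
        if blocked(path, obstacles):     # something new is in the way
            path = replan(path[0], obstacles)
        move(path.pop(0))

# Toy demo: waypoints are integers, and an obstacle appears at 3 mid-run.
world = {"obstacle_at": None}
sense = lambda: {world["obstacle_at"]} if world["obstacle_at"] else set()
replan = lambda start, obs: [w for w in range(start, 6) if w not in obs]

def move(w):
    print("moving to", w)
    if w == 2:
        world["obstacle_at"] = 3   # a forklift drops a pallet ahead

run(list(range(6)), sense, replan, move)
```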

But it’s immensely difficult to give robots that ability. People use so many signals in moment-to-moment interaction: they talk to one another, point at things, even give subtle looks with their eyes. “The cues people use to communicate are really hard to replace,” says Keivan.

Reactivity

For robots to succeed in proximity to people, they need to do more than know their way around or avoid people. They need to react in real time to changes in their physical environment or in the behavior of the people they are helping. “If you could manage the first two, and then take an hour to process the third requirement, well, then that system is not useful,” says Keivan. “People’s tolerance for delay in robots ‘outside the fence’ is extremely low. It can be measured in less than a second. If you ask a robot to move out of the way and it doesn’t respond, then you would immediately assume something is wrong.”
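One way to picture that sub-second tolerance is as a latency budget on the robot’s control loop: if sensing and planning cannot produce a command within the budget, the robot falls back to a safe action rather than leave people waiting. The sketch below is a hypothetical illustration of that pattern; the 300 ms budget and the function names are assumptions, not figures from the article.

```python
import time

# Illustrative control loop with a hard latency budget. If perception
# plus planning can't produce a command in time, the robot falls back
# to a safe action instead of keeping the person waiting.

LATENCY_BUDGET_S = 0.3   # assumed sub-second tolerance for a response

def control_step(sense, plan, act, safe_stop):
    start = time.monotonic()
    observation = sense()
    command = plan(observation)
    elapsed = time.monotonic() - start
    if elapsed > LATENCY_BUDGET_S:
        safe_stop()          # too slow to be trusted around people
    else:
        act(command)

# Toy demo with stand-in functions.
control_step(
    sense=lambda: "person ahead",
    plan=lambda obs: "yield right of way" if "person" in obs else "proceed",
    act=lambda cmd: print("acting:", cmd),
    safe_stop=lambda: print("stopping safely: deadline missed"),
)
```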


Giving robots that sort of real-time responsiveness is extremely difficult: as Keivan notes, we can’t throw infinite computing resources at every robot on the shop floor. It will take the careful, layered development of new robotic capabilities. Robots will need to move through spaces that are new to them, carrying some sort of internal map, perhaps like a GPS-driven automobile-routing system. They will have to respond to people and spaces that are constantly evolving.
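The GPS analogy can be made concrete with a graph search over the robot’s internal map. The sketch below plans a route through a tiny grid with breadth-first search; the map, start and goal positions, and obstacle layout are invented for illustration and are not drawn from Amazon’s planners.

```python
from collections import deque

# Illustrative route planner over an internal grid map: breadth-first
# search finds a path around '#' obstacles, much as a GPS router finds
# a path through a road network.

def plan_route(grid, start, goal):
    """Return a list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != "#" \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    if goal not in came_from:
        return None                      # no route exists
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

warehouse = [".#..",
             ".#..",
             "....",
             "..#."]
print(plan_route(warehouse, start=(0, 0), goal=(0, 3)))
```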

Says Keivan: “All of this will require fairly exceptional sensing in challenging environments, such as those with reflective surfaces. Not only does the sensing need to be high-fidelity, but the reasoning around the sensors also has to be able to deal with the consequences or aftereffects of the actions of what is being sensed.”

Altogether, robots with these skills will require suites of sensors; extremely fast computing that does not overly tax the resources of the robot or the cloud it’s connected to; and datasets that allow them to interpret what is going on around them. “My role has me looking across all of that,” says Keivan. It’s a big task, but Keivan thinks progress is being made at a fast rate.

More efficient robots will also confer benefits: greater productivity, and even greater employment opportunities for those who learn to repair, program, and design robots. “More intelligent, better robots are inevitable,” says Keivan. “Their benefits will be enormous.”


