Amazon operates some of the most advanced fulfillment center warehouses in the world. Reading that opening line, it’s a fair bet you are picturing conveyor belts loaded with packages, or flat little robots carrying big yellow pods full of products around. But Amazon is also pushing the frontiers of human-centered engineering — a.k.a. ergonomics — to create the safest, healthiest warehouses in the world for its employees.
“While ergonomics is a relatively modern discipline, its standard approaches feel outdated,” says Mohammad Esfahani, a principal ergonomic research scientist at Amazon. “We are pioneering new ergonomic tools that incorporate virtual reality to help design flexible warehouse systems that can support the individual needs of our associates.”
[Image: high-definition simulation of a potential workstation design]
The responsibility for the design of Amazon’s global fulfillment network — including new generations of buildings and their equipment, tools, and systems — lies with its World-Wide Design Engineering organization (WWDE). WWDE conducts ergonomic research to better understand how associates move while performing their tasks. Analyzing details such as posture, hand and arm movements, bending, movement speed, and repetition can help avert ergonomically driven problems like fatigue and injuries.
But, as with many research endeavors, the validity of the result is only as good as the data on which it’s based. WWDE has been devising approaches to gather realistic data to inform the design of more human-centered workstations and equipment that enhance associates’ wellbeing.
The limits of convention
A key challenge arises when designing the layout of next-generation workstations: They don’t exist yet — physically. So how do designers discover how a new layout or process will impact the people at the center of it? Traditionally, there are two options.
One approach is to create computer models of proposed workstation setups using standard ergonomics-industry software and then manipulate digital mannequins to simulate the movements required to complete the task.
“You can create an animation of a person, but it’s very clunky and robotic — not how a real person would move at all. It could be more accurate,” says Chris Morse, a senior innovation and design engineer working with Esfahani at WWDE.
Such digital human modeling tools can fall short because they predict human motion from predefined assumptions about how people behave, so they cannot fully account for the natural range and variability of how real people actually move.
A second method, which can confirm the results of digital simulations or explore new designs, is to build a prototype and observe people working with it. That’s a lengthy process, which also makes it hard to simultaneously explore multiple design possibilities.
To innovate faster than a prototyping approach allows, and to overcome the limiting assumptions of standard ergonomics tools, the team decided in 2021 to try placing real people — first the researchers themselves; later, Amazon fulfillment center associates — inside high-definition simulations of potential workstation designs. Doing that required a novel combination of virtual reality (VR), motion capture, and ergonomic-analysis tools. Here’s how it all came together.
Virtual workstation development
To create a virtual environment, the team started with a highly realistic, computer-aided design (CAD) model of a proposed workstation. The model was brought into a digital platform widely used to develop games, VR experiences, and other 3D applications. Taking the CAD model as a starting point, the team built a set of animations and potential interactions. With a VR headset and hand-held controllers, a person could place themselves within the novel workstation environment and “pick up” items, manipulate buttons, place packages on conveyor belts, or perform other interactions that might be necessary to complete a warehouse task.
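The article doesn't name the platform or share any of the team's code, but the "pick up" interaction it describes follows a familiar pattern: when a hand controller is close enough to a virtual item and the grip is pressed, the item follows the hand until it is released. Purely as an illustration, here is a minimal, hypothetical sketch of that logic in Python; the names, the grab radius, and the idea of polling controller state each frame are all assumptions, not details from the article.

```python
from dataclasses import dataclass

# Hypothetical sketch of a VR "grab" interaction; a real VR platform would
# supply tracked poses and input events itself.
GRAB_RADIUS_M = 0.10  # assumed: how close the controller must be to grab an item


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def distance_to(self, other: "Vec3") -> float:
        return ((self.x - other.x) ** 2
                + (self.y - other.y) ** 2
                + (self.z - other.z) ** 2) ** 0.5


@dataclass
class VirtualItem:
    name: str
    position: Vec3
    held: bool = False


def update_grab(controller_pos: Vec3, grip_pressed: bool,
                items: list[VirtualItem]) -> None:
    """Attach the nearest in-reach item to the controller while grip is held."""
    for item in items:
        if item.held:
            if grip_pressed:
                item.position = controller_pos  # held item follows the hand
            else:
                item.held = False               # grip released: drop in place
        elif grip_pressed and controller_pos.distance_to(item.position) <= GRAB_RADIUS_M:
            item.held = True                    # close enough: pick it up
```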
So far, so good. But how to accurately capture the movement data of people in the VR environment? This required another layer of technology: a motion capture system to record body movements, specifically the accelerations and rotations of the arms, legs, back, head, and shoulders. The motion data from someone wearing the VR headset and holding the two hand controllers could be used to recreate upper-body postures. By placing additional sensors on the legs and back, the team extended their data capture to the entire body.
These data were then synthesized into a highly accurate animated digital human model that is about as close to reality as you can get (see diagram, picture b).
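The article doesn't describe the data pipeline itself, so the following is only a rough sketch of what fusing tracked-device readings into per-frame pose records for ergonomic analysis might look like. The field names, the "back" sensor label, the single-angle trunk-flexion proxy, and the 45-degree threshold are all assumptions made for illustration; a real system would fit a full skeletal model to every sensor.

```python
from dataclasses import dataclass, field


@dataclass
class TrackedDevice:
    """One tracked source: the VR headset, a hand controller, or a body sensor."""
    name: str                                 # e.g. "headset", "left_controller", "back"
    position: tuple[float, float, float]      # metres, in the workstation frame
    rotation: tuple[float, float, float]      # roll, pitch, yaw in degrees


@dataclass
class PoseFrame:
    """All device readings captured at one timestamp."""
    timestamp_s: float
    devices: dict[str, TrackedDevice] = field(default_factory=dict)


def trunk_flexion_deg(frame: PoseFrame) -> float | None:
    """Rough trunk-flexion estimate from a back sensor's pitch, if one is worn."""
    back = frame.devices.get("back")
    return back.rotation[1] if back else None


def flag_awkward_frames(frames: list[PoseFrame],
                        threshold_deg: float = 45.0) -> list[float]:
    """Return timestamps where estimated trunk flexion exceeds the threshold."""
    return [f.timestamp_s for f in frames
            if (angle := trunk_flexion_deg(f)) is not None and angle > threshold_deg]
```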
In developing the system, the team created a VR simulation of a simple workstation at which Amazon associates were asked to pack items for shipping to customers, a process that included four representative tasks.
Welcome to the power zone
While the team’s virtual reality ergonomics (VRE) tool is designed to explore the usability of potential new workstations, it has already had a positive impact on wellbeing in a long-established part of Amazon’s fulfillment process, known simply as pick.
Most Amazon items are stored in compartments within tall yellow pods in an automated part of a fulfillment center. When a customer places an order, a small robot carries the pod containing the desired item to an associate’s workstation, where the person picks the product out of its compartment.
Many times each day, an associate will pick items from these compartments, which range in height from near the floor to above the associate’s head. As many people know, doing a lot of bending down or reaching up over your head to grab something is more fatiguing than picking things that are in your “power zone” — that sweet spot between your waist and shoulders.
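The article doesn't give exact heights for the power zone, so the sketch below estimates it from a person's stature using rough anthropometric proportions. The 0.58 and 0.82 ratios are assumptions chosen only for illustration, not figures reported by Amazon.

```python
def power_zone_m(stature_m: float) -> tuple[float, float]:
    """Estimate the waist-to-shoulder 'power zone' as a height range above the floor.

    The 0.58 and 0.82 proportions are rough anthropometric assumptions,
    used here only for illustration.
    """
    return 0.58 * stature_m, 0.82 * stature_m


def in_power_zone(compartment_height_m: float, stature_m: float) -> bool:
    """True if a pod compartment sits between the person's waist and shoulders."""
    low, high = power_zone_m(stature_m)
    return low <= compartment_height_m <= high


# Example: a 1.70 m associate reaching a compartment 1.20 m above the floor.
print(in_power_zone(1.20, 1.70))  # True: 1.20 m falls within roughly 0.99-1.39 m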
The team used their new VRE tool to evaluate the ergonomics of a new, individually tailored approach which takes associates’ average power zone into account.
[Image: another view of a high-definition workstation simulation]
Let’s say a customer ordered a particular toy. To boost warehouse flexibility, items for sale are stored in multiple locations within fulfillment centers. Under normal circumstances, the automated system would simply bring the nearest pod containing the toy to the associate for picking, regardless of the height of the compartment the toy was in. The idea with the new approach is that the automated system attempts to make the picking easier for the associate, by finding not necessarily the nearest pod that holds the toy, but the nearest pod that holds the toy at the level of the average associate’s power zone.
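Amazon hasn't published the selection logic, but a minimal sketch of the idea described above might prefer the nearest pod whose compartment holding the item lies within the average power zone, falling back to the nearest pod overall if none does. Every name, distance, and zone bound below is a hypothetical placeholder.

```python
from dataclasses import dataclass

# Assumed average power-zone bounds in metres; illustrative only.
POWER_ZONE_LOW_M, POWER_ZONE_HIGH_M = 1.0, 1.4


@dataclass
class PodCandidate:
    pod_id: str
    travel_distance_m: float      # how far the robot must carry the pod
    compartment_height_m: float   # height of the compartment holding the item


def choose_pod(candidates: list[PodCandidate]) -> PodCandidate:
    """Prefer the nearest pod with the item in the power zone; else the nearest pod."""
    in_zone = [c for c in candidates
               if POWER_ZONE_LOW_M <= c.compartment_height_m <= POWER_ZONE_HIGH_M]
    pool = in_zone if in_zone else candidates
    return min(pool, key=lambda c: c.travel_distance_m)


# Example: a closer pod stores the toy near the floor, a slightly farther pod
# stores it at chest height; the chest-height pod wins.
pods = [PodCandidate("pod-17", 12.0, 0.3), PodCandidate("pod-42", 15.0, 1.2)]
print(choose_pod(pods).pod_id)  # pod-42
```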
“We were able to explore the potential impact and carry out the initial validation much more quickly and easily than would have been possible with real-world testing,” says Morse.
This is a departure from traditional industrial ergonomics, says Esfahani: “Instead of developing standardized systems that work well for most body shapes, our ultimate goal is to create flexible workstations and systems that harmonize with individual associates.”
New and improved
Right now, the team is using its VRE system to explore a range of new systems and improvements to existing workstations. Take T-shirt printing, which associates perform as part of the company’s Merch on Demand offering. For the T-shirt printing process, the team developed VR simulations of the existing printing workstation, as well as two proposed new printer stations. The VRE tool allows the team to work out which setup will be easiest on the associates using the machines, and it will guide decisions on which system will be selected for future warehouses.
It takes advanced ergonomics to find new improvements in Amazon processes, and although the changes are sometimes subtle, they are certainly worthwhile. “In a huge organization like Amazon, even small changes have big impacts,” says Morse.
That said, the impacts of the team’s pioneering approach to ergonomics may one day be felt more powerfully across Amazon and even other industries. “It’s 2024. It’s time to harness the technologies that have arisen in the last two decades to generate new ergonomics guidelines and tools,” says Esfahani.