The Industrial Internet of Things (IIoT) has paved the way for machine automation of simple, routine tasks on smart, connected products (SCPs). In parallel, augmented reality (AR) is empowering workers and yielding significant gains in worker productivity.
In the future, work will be a mix of tasks carried out collaboratively by humans and machines. AR is the next-generation human-machine interface (HMI) that can bridge workflow gaps between those tasks and maximize the efficiency of human-machine interaction.
The emergence of a task ecosystem in which workers and machines operate in sync is displacing dated utopian automation mindsets such as the lights-out factory. Even technology pioneers like Elon Musk have cited the importance of human workers and the pitfalls of excessive automation, pointing in particular to its detrimental impact on Tesla production.
Also increasingly prevalent is the influx of autonomous systems and AI-powered robotics, which are replacing many manual tasks. This is notably different from saying that automation is replacing jobs; McKinsey estimates that fewer than 5% of jobs consist of activities that are 100% automatable. While a worker's future daily tasks are subject to major change, their employment status likely is not, as industries such as manufacturing are forecasting massive worker shortages.
Take an assembly worker, for example: it is likely that, through increasing digitization, much of their monotonous daily paper-based documentation work will be automated, and that the parts of the assembly process that are unfavorable to them (heavy lifting, repetitive actions, etc.) will be completed by machines. However, variable and complex tasks, such as final quality verification of a manufactured product, still require a human expert in the workflow to issue final approval.
Traditional HMIs cannot effectively contextualize and mediate these future workflows, which now span both physical and digital work information. The Boeing 737 MAX is a tragic example of legacy HMIs that could not manage tasks and interactions between autonomous systems and humans: faulty data from a physical airplane sensor triggered a dangerous automated maneuver, and the pilots had no interface through which to interact with that action.
To accommodate the rapid handoff of tasks within workflows, whether in airplanes or in manufacturing processes, robots and humans will need both a collaborative working relationship, with constant cross-exchange of information, and novel methods of interaction. Managing this influx of cyber-physical systems, and the future of work, requires an on-demand, in-context interface that brings human oversight and instruct capabilities into the surrounding environment.
Augmented reality is the purpose-built computer for front-line workers to monitor, and now increasingly to control and optimize, connected machines. This emerging capability is needed as industrial companies strive for flexibility and agility while remaining challenged by downtime.
A cutting-edge example of human-machine collaboration in factories is the intersection of AR for humans and cobots for machines. Traditional industrial robots are expensive, fixed in place, and unsafe for humans to work alongside. Cobots stay true to their collaborative name by providing a more flexible, lower-cost option for manufacturers, one that frees up workers to take on higher-level tasks.
AR provides a natural lens through which workers can instruct machines in their environment and have computation actuate the commands. This could include kinetic machine control and motion programming of a cobot or another industrial robot. Instead of traditional robot reprogramming procedures, in which an operator must leave the area to update the robot's configurations, AR brings a virtual interactive dashboard to the shop floor as a timely, immersive experience, potentially saving thousands in changeover costs; a sketch of what that command path might look like follows.
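As a rough illustration of that command path, here is a minimal Python sketch of an AR tap being translated into a cobot waypoint command. Every name in it (Pose, CobotController, on_ar_tap, the host address) is a hypothetical stand-in invented for this example, not a real vendor SDK; an actual deployment would route the command through the robot manufacturer's API.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A target position/orientation anchored in the shared AR/robot frame."""
    x: float
    y: float
    z: float
    yaw_deg: float = 0.0

class CobotController:
    """Hypothetical stand-in for a vendor cobot SDK connection."""
    def __init__(self, host: str):
        self.host = host

    def move_to(self, pose: Pose, speed: float = 0.25) -> None:
        # A real SDK would stream this to the robot controller;
        # here we just log the command an AR tap would generate.
        print(f"[{self.host}] move_to x={pose.x:.3f} y={pose.y:.3f} "
              f"z={pose.z:.3f} yaw={pose.yaw_deg:.1f} speed={speed}")

def on_ar_tap(world_pose: Pose, robot: CobotController) -> None:
    """Callback an AR dashboard might fire when the operator taps a point.

    The AR session resolves the tap into robot-frame coordinates, so the
    operator can retarget a waypoint without leaving the work cell.
    """
    robot.move_to(world_pose, speed=0.25)

if __name__ == "__main__":
    robot = CobotController(host="cobot-cell-07.local")
    # Simulated tap: operator points at a spot 0.6 m in front of the robot.
    on_ar_tap(Pose(x=0.60, y=-0.15, z=0.30, yaw_deg=90.0), robot)
```

The design point is the thin callback between the AR session and the controller: the operator's gesture, not a teach pendant in a back office, becomes the source of the motion command.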
Combining data from an SCP with AR interact-and-instruct capabilities could create a new learning-from-demonstration method, in which an operator dictates a robot's task by using AR to tether workflows and anchors in physical space, with adjustable constraints driven by IIoT data. An example could be programming a machine through AR and instructing it to weld precisely within certain measurements, with IIoT providing real-time feedback that the welding machine is misaligned or about to break down.
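To make the feedback side of that loop concrete, the sketch below shows one way live IIoT telemetry could be checked against tolerances attached to an AR-demonstrated weld path. The sensor fields, thresholds, and three-state outcome are assumptions made for illustration, not measurements from any real system.

```python
from dataclasses import dataclass

@dataclass
class WeldConstraint:
    """Tolerances attached to an AR-demonstrated weld segment."""
    max_lateral_offset_mm: float   # how far off-seam the torch may drift
    max_tip_temp_c: float          # above this, the torch is failing

def check_weld(telemetry: dict, c: WeldConstraint) -> str:
    """Compare live IIoT telemetry against the demonstrated constraints.

    Returns "ok", "realign", or "stop" so the AR interface can surface
    the decision to the operator in context.
    """
    if telemetry["tip_temp_c"] > c.max_tip_temp_c:
        return "stop"       # imminent failure: halt and flag for service
    if abs(telemetry["lateral_offset_mm"]) > c.max_lateral_offset_mm:
        return "realign"    # misaligned: pause and re-anchor the path
    return "ok"

if __name__ == "__main__":
    constraint = WeldConstraint(max_lateral_offset_mm=0.5, max_tip_temp_c=650.0)
    # Simulated telemetry frames a real deployment would stream from the machine.
    for frame in ({"lateral_offset_mm": 0.2, "tip_temp_c": 540.0},
                  {"lateral_offset_mm": 0.8, "tip_temp_c": 560.0},
                  {"lateral_offset_mm": 0.3, "tip_temp_c": 700.0}):
        print(frame, "->", check_weld(frame, constraint))
```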
Although still at the very beginning of its maturity, this future of human-machine collaboration is already on display in university labs and AR providers' research divisions. Brown University's robotics lab, with its Holobot, is demonstrating the potential of AR as an interface for dictating robotic movements through voice and gesture commands.
The PTC Reality Lab has developed this next-generation AR instruct concept on its feeder machine. The Reality Editor brings logic flows through drag-and-drop programming, letting the operator dynamically control the feeder and set tasks with IIoT-generated data in mind.
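Under the hood, a drag-and-drop logic flow reduces to wiring the output of one block into the input of the next. The toy graph below mimics that idea in Python; its Node class and the hopper-level example are invented for illustration and are not the Reality Editor's actual API.

```python
# Toy logic-flow graph: nodes transform values, edges wire them together.
# This mimics the *idea* of drag-and-drop flows, not any real product API.

class Node:
    def __init__(self, fn):
        self.fn = fn
        self.downstream = []

    def connect(self, other: "Node") -> "Node":
        self.downstream.append(other)
        return other  # return the target so flows can be chained

    def fire(self, value):
        out = self.fn(value)
        if out is not None:            # None means "do not propagate"
            for node in self.downstream:
                node.fire(out)

# Blocks an operator might drag onto the canvas:
sensor    = Node(lambda v: v)                                  # IIoT hopper-level reading
threshold = Node(lambda v: v if v < 20.0 else None)            # only pass low levels
feeder    = Node(lambda v: print(f"feeder: refill (level={v}%)"))

sensor.connect(threshold).connect(feeder)

for level in (85.0, 42.0, 12.5):   # simulated readings; only the last fires
    sensor.fire(level)
```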
The Reality Lab is also extending this interface to mobile cobots with its automated guided vehicle, named Frida. The bot leverages kinetic AR and spatial mapping to program its motion through physical spaces. These instructions include path planning from set waypoints, at which the bot can execute a designated task. The prototype uses AR for intuitive motions and actions, such as following the user's position in real time. In an industrial environment, an operator could use kinetic AR to send a mobile cobot from station to station carrying heavy loads, a typically strenuous task for humans.
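A minimal sketch of that waypoint idea, assuming a flat floor and a simple straight-line motion model: the operator drops anchors in space, the vehicle visits them in order, and a follow mode keeps a set standoff from the user's tracked position. All names and numbers here are illustrative, not Frida's actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def step_toward(pos: Point, goal: Point, step_m: float) -> Point:
    """Advance pos up to step_m meters toward goal (straight-line model)."""
    dx, dy = goal.x - pos.x, goal.y - pos.y
    dist = math.hypot(dx, dy)
    if dist <= step_m:
        return Point(goal.x, goal.y)
    return Point(pos.x + step_m * dx / dist, pos.y + step_m * dy / dist)

def run_waypoints(start: Point, waypoints: list, step_m: float = 0.5) -> Point:
    """Visit AR-dropped waypoints in order, e.g. station-to-station hauling."""
    pos = start
    for i, wp in enumerate(waypoints):
        while (pos.x, pos.y) != (wp.x, wp.y):
            pos = step_toward(pos, wp, step_m)
        print(f"reached waypoint {i} at ({pos.x:.2f}, {pos.y:.2f}); execute task")
    return pos

def follow_step(pos: Point, user: Point, standoff_m: float = 1.5,
                step_m: float = 0.5) -> Point:
    """Follow mode: close on the user's tracked position, keep a standoff."""
    if math.hypot(user.x - pos.x, user.y - pos.y) <= standoff_m:
        return pos  # close enough; hold position
    return step_toward(pos, user, step_m)

if __name__ == "__main__":
    stations = [Point(2.0, 0.0), Point(2.0, 3.5), Point(-1.0, 3.5)]
    run_waypoints(Point(0.0, 0.0), stations)
```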
Maintaining safe interactions with humans on the ground and in the air will be a crucial requirement for cobots. The University of Colorado Boulder's ATLAS Institute is demonstrating on-the-fly programming of a drone to command its flight path and ensure it can safely operate in the same workspace as a worker while completing complementary tasks.
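One way such a safety guarantee can be phrased, as a sketch rather than the lab's actual method: reject any commanded flight segment that enters a keep-out cylinder around the worker's tracked position. The geometry, sampling approach, and margins below are illustrative assumptions.

```python
import math

def segment_clear_of_worker(p0, p1, worker_xy, radius_m=2.0, samples=50):
    """Check a straight flight segment against a keep-out cylinder.

    p0, p1: (x, y, z) endpoints of the commanded segment.
    worker_xy: worker's tracked (x, y) position on the floor.
    The cylinder is treated as infinite in z, i.e. the drone may never
    pass overhead within radius_m of the worker (deliberately conservative).
    """
    for i in range(samples + 1):
        t = i / samples
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        if math.hypot(x - worker_xy[0], y - worker_xy[1]) < radius_m:
            return False
    return True

if __name__ == "__main__":
    worker = (3.0, 1.0)
    print(segment_clear_of_worker((0, 0, 1.5), (6, 0, 1.5), worker))  # 1 m away -> False
    print(segment_clear_of_worker((0, 4, 1.5), (6, 4, 1.5), worker))  # 3 m away -> True
```

A planner would run this check before accepting each AR-drawn segment, falling back to re-planning whenever it returns False.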
AR will increasingly be the tool with which front-line workers interact with the wealth of digitized information spread across industrial environments and instruct machines to complete tasks within them. While guide and visualize capabilities are driving growing adoption across design, operations, sales and marketing, service, and training use cases, a future frontier for AR innovation is its untapped potential as an HMI, bridging the collaboration gap between humans and machines.
David Immerman is a business analyst on PTC's Corporate Marketing team, providing thought leadership on technologies, trends, markets, and other topics. Previously, David was an industry analyst in 451 Research's Internet of Things channel, primarily covering the smart transportation space and automotive technology markets, including fleet telematics, connected cars, and autonomous vehicles. He also spent time researching IoT-enabling technologies and other industry verticals, including industrial. Prior to 451 Research, David conducted market research at IDC.