“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” - Mark Weiser, “The Computer for the 21st Century”
Published in 1991 as the opening line of a Scientific American article, this quotation has stood the test of time, outlining a future of computing that Weiser termed “ubiquitous computing.” While many of the technologies described in the article have come to fruition, the overarching vision – “machines that fit the human environment” – is still very much a work in progress.
The PTC Reality Lab is actively researching how to improve computing within the confines of the industrial space. Factories with hundreds of machines, thousands of sensors, and billions of data points are the ideal playground for this vision. The team is researching and prototyping ways to extend Weiser’s vision of ubiquitous computing with technologies that let workers seamlessly engage with machines in a way that cuts through the complexity.
PTC’s Vuforia Engine is a powerful technology that has enabled the field of Augmented Reality for years. When Vuforia is combined with the vision of ubiquitous computing, we see the emergence of an entirely new field of human-machine interaction called Spatial Computing. This new field of computing is what the Reality Editor was designed for, and it is the foundational platform for the research performed in the PTC Reality Lab. We’ve previously written about the emerging use cases of this technology.
Because the Reality Editor was designed as a flexible platform that allows simple integration of new ideas, services, AR applications, and technologies, it helps to “weave” (to use Weiser’s phrasing) disparate machines and user interfaces together within a single human-machine interface (HMI).
For example, the Reality Lab team’s kinetic AR and collaborative robot research and prototypes are implemented within the Reality Editor. Executed on the Reality Editor platform, this novel technology can make use of all other Reality Editor functionality and allows a user to externalize ideas into the physical space through a powerful user interface. That’s the core concept of Spatial Computing. The spatial programming functionality is used not only to program the logic of machines, but also to coordinate and program the motion of robots within a factory.
Imagine being able to program this process in a smartphone app:
The scenario: A worker needs a robot’s assistance to carry a heavy load from one area of the factory to another, but she is working elsewhere and wants the robot to pick up the load only when it reaches 60 pounds.
The process: The worker opens the Reality Editor app on her smartphone. Within the app, using spatial visual programming, she synchronizes the activation state of a feeder machine with a scale placed on an AGV (autonomous guided vehicle) and sets a limiter logic element to monitor the weight of the bin and stop the feeder when 60 pounds is reached. She also uses the Reality Editor to program a visual motion path for the AGV to execute. By making the last stop of that path an additional condition for the feeder to start, she allows the AGV to reach the feeder, request that the bin be filled, and continue its route once 60 pounds is reached. At any given time, the worker sees a visualization of the bin’s weight in AR or on an HMI, and if the process requires any change, that change can be made instantly.
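To make the control flow concrete, here is a minimal sketch of that limiter logic written in TypeScript. The names and interfaces (Feeder, AGV, WeightLimiter, onScaleReading, and so on) are illustrative assumptions, not the Reality Editor’s actual API; in the app this behavior is composed visually from logic elements rather than written by hand.

```typescript
// Hypothetical model of the logic described in the scenario above.
// These types are illustrative only and do not reflect the Reality Editor API.

type Pounds = number;

interface Feeder {
  start(): void;
  stop(): void;
}

interface AGV {
  atFinalStop(): boolean; // true once the AGV reaches the last waypoint of its programmed path
  continueRoute(): void;  // resume the motion path after the bin has been filled
}

// A "limiter" gate: runs the feeder only while the measured weight stays below the threshold.
class WeightLimiter {
  private filling = false;

  constructor(
    private readonly feeder: Feeder,
    private readonly agv: AGV,
    private readonly thresholdLbs: Pounds = 60,
  ) {}

  // Called whenever the scale on the AGV reports a new reading.
  onScaleReading(weightLbs: Pounds): void {
    if (!this.filling && this.agv.atFinalStop() && weightLbs < this.thresholdLbs) {
      // Both conditions hold: the AGV is parked at the feeder and the bin is not yet full.
      this.feeder.start();
      this.filling = true;
    } else if (this.filling && weightLbs >= this.thresholdLbs) {
      // Target weight reached: stop the feeder and let the AGV continue its route.
      this.feeder.stop();
      this.agv.continueRoute();
      this.filling = false;
    }
  }
}
```

The gate starts the feeder only when both synchronized conditions hold (the AGV is at the final stop of its path and the bin weighs less than 60 pounds), mirroring how the two conditions are combined in the visual program.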
With smart, connected factories becoming increasingly prevalent, driven by the adoption of IIoT and other technologies, there is significant value in the ability to connect disparate systems and processes through a user-friendly platform. One of the key benefits is a reduction in changeover times and downtime (from both a machine and a worker perspective), which allows for greater flexibility and agility across business functions.
That’s been evident within the Reality Lab team’s own research – with each new iteration and idea, new value has been uncovered.
“There’s value in enabling multiple projects to run parallel and combine their functionalities into a powerful AR platform like the Reality Editor,” says Valentin Heun, the Reality Lab’s lead scientist.
“The synergy of the Reality Lab team is enabled by this Reality Editor platform and evident in our research. A single project alone cannot unveil the true potential that is Spatial Computing.”
Nancy White is a content marketing strategist for the Corporate Brand team at PTC. A journalist turned content marketer, she has a diverse writing background—from Fortune 500 companies to community newspapers—that spans more than a decade.