Reality Editor: A Perfect Match for the Smart, Connected Factory

Written By: Nancy White

Date: 06 September 2019



Several years ago, Ben Reynolds, then an undergrad studying at MIT, joined a research project led by Valentin Heun within the MIT Media Lab. The project focused on new ways of interacting with all kinds of home appliances, and even Lego toys, using a visual augmented reality interface. The title of the research project was Reality Editor, and its goal was to make the interface so simple and easy-to-use that even a child could use it.

The result was a new paradigm for spatial drag-and-drop interfaces that could program and connect the disparate functions of everyday physical objects – an app that enables users (young and old, novice and expert) to visualize information from machines, and to control and interact with them, through augmented reality (AR).

Now at the PTC Reality Lab, Heun and Reynolds, PhD and master's graduates of the MIT Media Lab, respectively, are heading a group of engineers researching the technologies that could drive the next wave of industrial innovation. The project has moved beyond those early demonstrations, and the team is now exploring industrial applications for the Reality Editor.

One of the tenets of the early vision behind the Reality Editor is recognition of the value inherent in physical objects. While some physical things have increasingly been replaced by digital alternatives – the calendar or planner is one example – this is not possible, or desirable, for many tools and machines. Many once-simple functions are now encapsulated in desktop-like visual interfaces and thereby removed from the at-hand physicality they once had.

With the abstractions these desktop-style interfaces introduce, the mental load required to connect physical functionality to digital controls has increased significantly, and it will continue to increase with the complexity of the modern IIoT.

“One of the main ideas that excited me about this project from the moment I joined was the idea of using augmented reality as a tool to enhance and augment the things around us,” Reynolds says. “This technology allows us to better understand and control the complexity of modern IIoT.”

Working on solutions for industrial enterprises has provided the team with a wealth of opportunities and use cases for its research.

“When you walk into a factory, you immediately see the complexity and the multitude of machines there are. As you learn more you realize how critical it is for these machines to function correctly, safely, and efficiently, and therefore how critical it is for us humans to fully understand and intuitively be in control of such machines,” Reynolds says.

Reynolds believes the simple, intuitive interface of the Reality Editor is a “perfect match” for industrial machines, which on their own are complicated and hard to understand for inexperienced workers, and even for front-line workers familiar with a given machine. With a mounting skills gap, this lack of knowledge transfer is inhibiting worker productivity.

Spatial Programming in Augmented Reality

The Reality Editor can be imagined as a Swiss Army knife for digital data, a multitool that allows one to configure, control, and understand networked machines. One functionality of this multitool is spatial programming, which aims to make programming machines and systems possible without being an expert programmer, Reynolds says.

“Spatial programming in augmented reality is very different than regular programming; you’re putting ideas into space using the things that are around you,” Reynolds explains. In other words, there’s no complex programming language to learn.
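To make the idea concrete, here is a minimal sketch of the node-and-link model that this kind of spatial programming implies: each machine exposes its capabilities as nodes carrying normalized values, and drawing a link in AR simply routes one node's value into another. The class and function names below are illustrative assumptions, not the Reality Editor's actual API.

```typescript
// A minimal sketch of a node-and-link model for spatial programming.
// Class and function names are illustrative, not the Reality Editor API.

type Listener = (value: number) => void;

// A node represents one addressable capability of a machine,
// e.g. a feeder's speed dial or a conveyor's on/off switch.
class MachineNode {
  private listeners: Listener[] = [];
  private value = 0;

  constructor(public readonly id: string) {}

  // Values are normalized to 0..1 so any node can feed any other node.
  write(value: number): void {
    this.value = Math.min(1, Math.max(0, value));
    this.listeners.forEach((notify) => notify(this.value));
  }

  onChange(listener: Listener): void {
    this.listeners.push(listener);
  }
}

// A link is what the user draws in AR: drag from one node to another.
function link(source: MachineNode, destination: MachineNode): void {
  source.onChange((v) => destination.write(v));
}

// Example: route a temperature sensor into a fan-speed control.
const temperature = new MachineNode("oven/temperature");
const fanSpeed = new MachineNode("fan/speed");
link(temperature, fanSpeed);
temperature.write(0.8); // fanSpeed now follows the sensor
```

Because every value lives on the same normalized scale, any node can be wired to any other node without the user worrying about units or protocols.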

Take a look at the feeder demonstration video on the PTC Reality Lab page to get a better understanding of how the Reality Editor works in an industrial context.



Spatial Programming Use Cases

The Reality Lab not only iterates on the reliability, scalability, and security of this augmented reality tool, but also actively uses the Reality Editor as its internal research platform. It is a highly optimized and powerful platform that lets the researchers invent the future of augmented reality.

Several compelling industrial use cases are emerging:

Spatial Visual Programming

In our modern IoT-connected factory, the Reality Editor not only allows the user to visualize data from a machine in augmented reality, but also takes it one step further: users can program actions for a single machine, or interactions between two or more machines. Using augmented reality, this novel spatial form of programming is as simple as connecting an electric guitar to its amplifier, allowing intuitive machine controls to be programmed on the fly and in situ.
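As a rough illustration, a link drawn between two machines might serialize to a small data structure that a runtime then applies to incoming values. The shape, field names, and machines below are assumptions for illustration, not the Reality Editor's actual format.

```typescript
// A minimal sketch of what a link drawn in AR might serialize to.
// The shape and field names are assumptions for illustration only.

interface SpatialLink {
  from: { object: string; node: string }; // source machine and node
  to: { object: string; node: string };   // destination machine and node
  transform?: (v: number) => number;      // optional logic along the link
}

// "Connect the feeder's vibration sensor to the conveyor's speed,
//  but stop the conveyor entirely above a safety threshold."
const links: SpatialLink[] = [
  {
    from: { object: "feeder", node: "vibration" },
    to: { object: "conveyor", node: "speed" },
    transform: (v) => (v > 0.9 ? 0 : v), // simple inline safety rule
  },
];

// A tiny runtime that applies every matching link to an incoming value.
function dispatch(object: string, node: string, value: number): void {
  for (const l of links) {
    if (l.from.object === object && l.from.node === node) {
      const out = l.transform ? l.transform(value) : value;
      console.log(`${l.to.object}/${l.to.node} <- ${out}`);
    }
  }
}

dispatch("feeder", "vibration", 0.95); // prints: conveyor/speed <- 0
```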

Remote In-Situ Assistance

What sounds like a contradiction becomes a feasible use case: the Reality Editor provides the tools that allow a remote expert to visually dive into a factory, co-locate with its machines, and use every application the Reality Editor offers. A live volumetric video feed underpins this technology, enabling seamless interaction between local and remote people and remote machines.
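One way to picture the plumbing: the remote expert's session could subscribe to the same stream of node updates the on-site AR session sees, so both act on one shared machine state. The endpoint and message shape below are invented for illustration; they are not part of the Reality Editor.

```typescript
// A minimal sketch of how a remote session might mirror on-site state.
// The endpoint and message shape are invented for illustration.

interface NodeUpdate {
  object: string; // which machine
  node: string;   // which capability on that machine
  value: number;  // normalized 0..1
}

// The remote expert subscribes to the same node updates the local
// AR session sees, so both views stay in sync with the real machine.
const socket = new WebSocket("wss://factory.example.com/session/42");

socket.onmessage = (event: MessageEvent) => {
  const update: NodeUpdate = JSON.parse(event.data);
  // Here a real client would re-render the node in the remote view.
  console.log(`${update.object}/${update.node} = ${update.value}`);
};

// Writes from the remote expert travel back over the same channel,
// so "remote" and "in-situ" operate on one shared state.
function remoteWrite(update: NodeUpdate): void {
  socket.send(JSON.stringify(update));
}

socket.onopen = () => {
  remoteWrite({ object: "feeder", node: "speed", value: 0.5 });
};
```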

Role-Based Spatial Applications

With the Reality Editor, workers will be able to activate task-specific applications. These applications are not limited to the purely software side we are used to from a desktop window, such as word processing or a web browser; Reality Editor applications involve the physical functionality of the world as well. It is therefore important to provide the right context for the right task, and each application can be tailored visually in augmented reality to fulfill the required needs. Here are three such examples:

A front-line worker can activate an application that allows him or her to see and interact with the operational functionality of a machine.

A factory manager can keep applications active that provide an overview of the uptime and maintenance cycles of each machine, or the overall status of an entire factory, while walking through the aisles.

A system integrator is granted access to the spatial programming tool to work on the maintenance or assembly of a new factory machine.

These are only three different perspectives, and there are many more possibilities; each, however, requires a unique set of active augmented reality perspectives enabled by role-based spatial applications.
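In code, the role-gating itself can be as simple as a lookup from a worker's role to the set of spatial applications that should activate when a machine comes into view. The role and application names below are hypothetical.

```typescript
// A minimal sketch of role-based spatial applications: the same machine
// exposes different AR tools depending on who is looking at it.
// Role and application names here are hypothetical.

type Role = "frontline-worker" | "factory-manager" | "system-integrator";

// Which spatial applications each role sees when a machine is in view.
const applicationsByRole: Record<Role, string[]> = {
  "frontline-worker": ["machine-controls", "step-by-step-guidance"],
  "factory-manager": ["uptime-dashboard", "maintenance-schedule"],
  "system-integrator": ["spatial-programming", "node-inspector"],
};

function activeApplications(role: Role): string[] {
  return applicationsByRole[role];
}

console.log(activeApplications("factory-manager"));
// -> ["uptime-dashboard", "maintenance-schedule"]
```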

A Flexible and Universal User Interface

Traditional human-machine interfaces (HMIs) are tied to a 2D screen and difficult to program. Novel HMIs are no longer bound to the 2D screen; they can be visualized and operated on 2D screens, head-mounted displays, and augmented-reality-enabled smartphones alike. The Reality Lab is researching the seamless transition between these technology domains, to guarantee intuitive editing and operation and a consistent look and feel across all visualization and interaction domains. The Reality Editor implements user interfaces that work on 2D screens as well as in AR.
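Conceptually, this means defining a control once and rendering it through interchangeable back ends, one per display domain. The renderer interface below is an assumption sketched for illustration, not the Reality Editor's API.

```typescript
// A minimal sketch of one HMI control rendered across display domains.
// The renderer interface is an assumption, not the Reality Editor's API.

interface Slider {
  label: string;
  value: number; // normalized 0..1
}

// Each display target implements the same contract, so a control's
// logic is written once and shown wherever the user happens to be.
interface Renderer {
  render(control: Slider): void;
}

class Screen2DRenderer implements Renderer {
  render(c: Slider): void {
    console.log(`[2D] drawing "${c.label}" slider at ${c.value}`);
  }
}

class ARRenderer implements Renderer {
  render(c: Slider): void {
    console.log(`[AR] anchoring "${c.label}" slider to the machine`);
  }
}

// The same HMI definition, rendered on a desktop and in a headset.
const speed: Slider = { label: "Feeder speed", value: 0.4 };
for (const renderer of [new Screen2DRenderer(), new ARRenderer()]) {
  renderer.render(speed);
}
```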

These use cases are actively being researched and demonstrated within the Reality Lab, and the Reality Editor is the platform that enables their implementation.

Check out the PTC Reality Lab page for more videos and details about their research. Keep tabs on their exciting work by following them on Twitter, @PTCRealityLab.


Tags: Augmented Reality


About the Author

Nancy White

Nancy White is a content marketing strategist for the Corporate Brand team at PTC. A journalist turned content marketer, she has a diverse writing background—from Fortune 500 companies to community newspapers—that spans more than a decade.

Courtesy: PTC Blogs