Product Deep-Dive: Spatial Search!

Zain Raza
8 min read · May 20, 2023


Note: opinions are my own and stories are presented from my own perspective. This is NOT official communication from PTC.

Photo by Shane Aldendorff on Unsplash

Special Credits

Special thanks to Valentin Heun for sharing his UI mockup for Spatial Search (shown below), and Marc-Thomas Schmidt for his help in describing its user journey!

Section 1: The Industrial Metaverse — A Gentle Introduction

Use Case of Interest: Remote Service

In this article, my goal is to describe new capabilities being built by the Industrial Metaverse (abbreviated IMV) team at PTC (with whom I have worked several times over the past six months) to try and fix the problems found in service solutions for industrial companies today. Let’s go.

Industry Analysis

When speaking about the Industrial Metaverse, probably the first question on our minds is: why now? Let’s outline several forces that might be making 2023 a turning point for advancing the state of remote service:

  1. Web-Based Integrations Between SaaS Solutions: web-based SaaS applications are nothing new; there’s a good chance you’re using one right now (thank you, Medium.com)! In the manufacturing space alone, an enterprise might require many software systems to operate effectively: e.g., CAD, PLM, MES, ERP, FSM, and so on. Having all these applications is great, but it can also lead to information silos between several systems of record (abbreviated “SoRs”). That becomes an issue when adopting AR/VR solutions (in this case, remote service): for instance, what if you need to bring 3D assets out of a CAD system when answering a service call? Ideally, an enterprise should be able to pull that information out of its existing CAD data. If we can solve that “hand-off” problem for customers (more on that in just a moment), then the Industrial Metaverse can truly begin to immerse users, by putting all the meta-information useful in solving issues right at their fingertips.
  2. On-Device ML on the Rise: limits on computing power and/or networking on mobile devices can also hinder communication over a remote service call. But that issue might soon be a thing of the past. For example, with the emergence of Apple’s latest consumer iPhones/iPads, it is now possible to reconstruct indoor environments in 3D, with full 360-degree view synthesis, using enhanced on-device processing. What that means for us is enhanced communication: instead of the cramped, claustrophobic feeling of trying to convey a complex industrial apparatus through a mere iPhone camera, remote experts and end users can seamlessly collaborate in a shared immersive space in full 3D, reconstructed on the fly on their phones.
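To make the on-device reconstruction idea a bit more concrete, here is a minimal sketch (my own toy example, not PTC code) of the core geometric step: back-projecting a depth map, like the one a phone’s depth sensor produces, into a 3D point cloud using the pinhole camera model. All parameter values are illustrative.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into camera-space 3D points
    using the pinhole camera model: X = (u - cx) * Z / fx, and similarly
    for Y. Real pipelines would also fuse many frames and camera poses."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a 2x2 depth map where every pixel is 1 m from the camera
depth = np.ones((2, 2))
points = depth_to_point_cloud(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(points.shape)  # (4, 3)
```

A real on-device reconstruction pipeline does far more (pose tracking, fusion, meshing), but every variant starts from this kind of per-frame back-projection.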

Note: if folks would like to learn more about the “Pop-Up Metaverse” concept, please see this article by Shiva Kashalkar, our VP of Product Management for the Industrial Metaverse.

Competitive Landscape

Perhaps you still don’t believe that the Industrial Metaverse is anything special. To motivate the term, let’s compare the kinds of service solutions we’ve relied upon in the past: physical, in-person support; and remote service provided over digital technologies.

When it comes to providing service for industrial products, companies have had to make difficult tradeoffs for decades — do we wish to provide the most accurate/hands-on support to our customers, or the most convenient/easily consumable?

Servicing physical products, in theory, might seem like a straightforward process:

  1. You sell a product
  2. Your customer (inevitably) calls in with a problem
  3. You send a service technician over to fix the problem

In this scenario, we trust the technician, or some other kind of subject matter expert (abbreviated “SME”), to save the day. If they know their stuff and can solve the problem with our product hands-on, or at least accurately troubleshoot the cause of the issue, that helps us honor service-level agreements (SLAs) and ultimately maintain strong customer relationships.

But, what’s the problem with hands-on expert service?

Imagine this:

  • let’s say you’re running a company like Raytheon, and one of your core businesses is to sell missiles and other defense equipment to the US military
  • i.e., you have LOTS of warfighter customers relying on your product who are deployed thousands of miles away, yet still need you to provide critical support FAST when something goes wrong.

Enter remote service: with tools like Scope AR, Vuforia Chalk, or perhaps even a regular ol’ Zoom meeting, you find yourself attempting to deliver the same kind of service to your customers remotely, for the convenience of not having to get on the next plane every time there’s a problem.

Nonetheless, these tools are not without their problems. As it turns out, while it might be convenient to hop on a remote call, it is not always convenient to collect supporting 3D assets (recall the example of needing a CAD model handy for that meeting, or parts information from a PLM system). Secondly, if your end user’s mobile hardware is not powerful enough, or has only limited networking, it can severely degrade the quality of communication between the customer and your technician.

Between these two options (in-person service vs. remote service calls), we can see some of the problems that modern industrial companies face. But the promise of remote service may not be lost for good.

Section 2: Brainstorming Solutions

Briefly, let’s recall the 2 trends in remote service I mentioned in the previous section:

  1. Web-Based Integrations Between SaaS Solutions
  2. On-Device ML on the Rise

Let’s face it — it’s probably not clear at first how these two pieces “fit” together in an actual software solution for service. Or, that’s at least how I felt as a rotational analyst at PTC, when I began a software engineering role in the Office of the CTO (aka “OCTO”) near the start of January this year.

For context, PTC’s OCTO houses several teams that have spent the past 5+ years developing core technologies needed for spatial computing. Let’s tackle two in particular, which nicely map to the trends above.

The Reality Lab (Maps to “On-Device ML on the Rise”)

In the OCTO we have the Reality Lab, driven by Valentin Heun’s vision to leverage “spatial context” in everyday processes, which has developed world-renowned breakthroughs in areas like spatial programming, robotics, and more (e.g., see the Vuforia Spatial Toolbox).

And now you ask: so what? Well, this group is keenly interested in using 3D reconstruction techniques to generate immersive spaces, such as industrial environments. What’s special about their work is that they’re lowering the barrier to entry closer and closer to consumer-level hardware. Having worked with Valentin and followed his work and his class on the Metaverse online (and on YouTube), I’ve observed that their technology used to require installing large, expensive depth cameras throughout an indoor space for a given reconstruction (as you can see in this demo from 3 years ago). Now, suffice it to say, we can do this reconstruction entirely on-device, given merely the depth cameras found in the latest iPhone/iPad Pros.

Advanced Atlas Services (Maps to “Web-Based Integrations Between SaaS Solutions”)

What about that software engineering role I mentioned? It’s not in the Reality Lab — instead, I work on a different OCTO team that builds integrations between our company’s various solutions: the Advanced Atlas Services (AdAS). As you may guess, those integrations include those being developed for the Industrial Metaverse…

And now you ask: so what? At a high level, our team exists so that when end users get their hands on PTC’s Metaverse solutions, they won’t be limited in their access to information found in other systems throughout their enterprise, be it CAD, PLM, IoT, etc. That’s because my team works cross-functionally, enabling developers on teams like the Reality Lab to seamlessly access data across all of PTC’s products (which together can be referred to as the “Digital Thread”). TL;DR: no more information silos!

Section 3: Product Development Deep-Dive

When Valentin’s group at PTC (namely, Ben and Keranie) first approached our AdAS team, the most immediate use case they wanted our support on was a feature called Spatial Search. In a sentence, it is a lightweight search engine that allows frontline employees to effortlessly look up information on a given industrial machine (sourced from whatever software systems are available throughout their enterprise), all within a single UI!

User Journey: Remote Service, Inside the Industrial Metaverse

To give a fuller illustration, let’s consider how your process for delivering product support changes with Spatial Search:

  1. You sell a product
  2. Your customer (inevitably) calls in with a problem
  3. Before you decide to send (or in some cases, fly) a technician over to the customer site, the customer whips out their smartphone, scans the relevant environment around the machine, and creates a “Pop-Up Metaverse” on the spot.
  4. Now it’s time for our frontline worker and SME to hop on a shared call. But instead of the two of them wasting time just to conjure up a shared mental model of the 3D space, they now have a shared, computer-generated 3D model to view on their devices (most likely a desktop for the SME, and a mobile phone/tablet for the frontline personnel).
  5. Then, the task for the SME is merely to select spots of interest in that Pop-Up Metaverse environment, and decorate them with content retrieved from their enterprise’s SoRs.
  6. This is where Spatial Search enters: simply put, one of our users (either the SME or the frontline person) opens a dialog box to select what info should be associated with the spot. They are presented with a search field and enter something like the model name of the product they want to look up. They are then shown a list of information assets associated with that search term (e.g., a parts catalog from Windchill, a popular PLM system). They can navigate the “search results” and select certain assets to be associated with that spot, close to or exactly at the point where the field worker would find them the most useful.
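The association step the journey describes can be sketched as a simple data model: a spatial anchor (a spot in the reconstructed 3D space) paired with references to the assets a search returned. All class and field names below are illustrative, not the actual Spatial Search schema.

```python
from dataclasses import dataclass, field

@dataclass
class AssetRef:
    """Reference to an information asset in an enterprise SoR,
    e.g. a parts catalog entry in a PLM system like Windchill."""
    source_system: str  # e.g. "Windchill"
    asset_id: str
    title: str

@dataclass
class SpatialAnchor:
    """A spot of interest in the Pop-Up Metaverse, decorated with assets."""
    x: float
    y: float
    z: float
    assets: list = field(default_factory=list)

    def attach(self, asset: AssetRef):
        """Associate a search result with this spot in the 3D space."""
        self.assets.append(asset)

# A user searches for a model name, then attaches a result to a spot:
spot = SpatialAnchor(x=1.2, y=0.4, z=2.7)
spot.attach(AssetRef("Windchill", "PC-1042", "Pump parts catalog"))
print(len(spot.assets))  # 1
```

The key design point is that the anchor stores only *references* into the SoRs, so the content itself stays in the system of record and is fetched fresh when a field worker taps the marker.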

Demo Time!

Want to see what Spatial Search looks like in action? Take a look below at the moment PTC’s CEO Jim Heppelmann mentioned it as part of his keynote at LiveWorx 23 (starts around 1:32):

A short snippet of a larger keynote by PTC’s CEO, Jim Heppelmann, in which he speaks about the Industrial Metaverse at LiveWorx (pronounced “live works”).

Although Spatial Search is only shown briefly during the keynote, the demo showcases achievements across the full stack:

  • on the front-end, the IMV team developed new 2D UI component “markers” to mark the spots in the 3D environment that are associated with assets from outside SoRs,
  • and on the back-end, our AdAS team provided an API gateway to several Digital Thread SoRs, while only needing to add one more REST service, and no additional authorization dialogs, to the Spatial Search UI.
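To illustrate the back-end idea, the value of an API gateway is that the client makes one request while the gateway fans the query out to the individual SoRs behind it. The sketch below is my own illustration of that pattern; the connector names, payloads, and behavior are all hypothetical, not AdAS’s actual API.

```python
# Sketch of the gateway fan-out pattern: one client-facing search
# function aggregates results from multiple systems of record.
# All connectors and payloads here are illustrative stand-ins.

def search_windchill(term):
    # In reality, this would call the PLM system's REST API.
    return [{"system": "Windchill", "title": f"Parts catalog for {term}"}]

def search_iot(term):
    # In reality, this would query an IoT platform for live machine data.
    return [{"system": "IoT", "title": f"Telemetry dashboard for {term}"}]

CONNECTORS = [search_windchill, search_iot]

def gateway_search(term):
    """Single entry point a UI like Spatial Search could call; it
    aggregates results from every connected SoR into one response."""
    results = []
    for connector in CONNECTORS:
        results.extend(connector(term))
    return results

print(len(gateway_search("XR-500 pump")))  # 2
```

Because the client talks only to the gateway, adding a new SoR means registering one more connector server-side; the UI and its authorization flow don’t change.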

Conclusion

So there you have it! To say the least, I am excited for the future of remote service. By utilizing features like Spatial Search, it’s possible to let the Digital Thread fade into the background, and let companies streamline the service process by crafting AR experiences tailored to the information found within their own SoRs.

Feedback time: what did you think of this post? Please share your thoughts and questions. And I’m open to any suggestions on what else you’d like to hear about, whether API development at PTC or the Industrial Metaverse. Let’s build this future together!

--

Zain Raza

Software engineer interested in A.I., and sharing stories on how tech can allow us to be more human.