Automating Industrial CAD Workflows: The Shipment Pattern

Zain Raza
Jul 16, 2023 · 5 min read


Photo by Timelab on Unsplash

For those of us familiar with PTC Flow (and if you’re not, please feel free to watch this LiveWorx presentation on it, starting at 15:25), you know it can be a powerful tool for integrating the different software systems used in an enterprise.

But how would one actually use PTC Flow to automate the transfer of their CAD data? As a software engineer at PTC who spent the past 6 months developing a connection between PTC Flow and Onshape, I have sat in on many conversations with Onshape users, developed proof-of-concepts for PTC’s CTO, and collected other use cases. This has led me and my colleagues to coin all-new design patterns for using PTC Flow.

I want to share one such pattern with you today (again, assuming you already know what Flow is), to give you a better idea of how it can accelerate your CAD work by automating steps that would usually be done by hand. We call it the Shipment Pattern!

🌱 Pattern basics

Title

The Shipment Pattern. It deals specifically with the case where you have a CAD model located in one system and need to export, or ship, it to another system.

For those readers already experienced in software development: at a big-picture level, this Shipment Pattern is not really anything new. It functions similarly to the more popular Model-View-Controller (MVC) pattern: the Model is whatever artifact(s) are stored in the source system, the View can be likened to whatever output you wish to send to the destination system, and the Controller is Flow itself. The name “Shipment Pattern” is therefore just here to communicate the specific problem we want to solve when building Flow-based integrations!

Description

“Using (PTC) Flow, a user orchestrates the export of a (potentially large) collection of artifacts out of one system, manipulates it in some way, and imports it into another system.”

Example Flows

Consider this example use case: you’re using Onshape at a startup, and you’ve decided to use Google Drive to back up all the data being authored in the CAD solution. When a change notice comes to the CAD engineers in Onshape, they set about making new revisions of the relevant CAD models. When they’re ready to send it all back to the backup folder in Drive… wait, what exactly will they be sending back to Drive? And how will they do it in a way that’s as efficient as possible?

The Shipment Pattern can help provide an answer here. Let’s walk through the two “flows” you’d probably create for this (if you’re using PTC Flow):

  • Flow 1: Given a Release Package for a product (encapsulating the Onshape data), Flow requests Onshape to generate derivative CAD files for the relevant 3D models found in said release package.
  • Flow 2: Once all the CAD files are created, Flow imports them into a user-specified data store (in this case, Google Drive). A rough sketch of both flows follows this list.
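To make the shape of this orchestration concrete, here is a minimal Node.js sketch of what the two flows do conceptually. The helper functions and data shapes (requestDerivativeExport, uploadToDrive, releasePackage.models) are hypothetical stand-ins for the connector actions you would configure in Flow, not real Onshape or Google Drive APIs.

```javascript
// Conceptual sketch only: these helpers are hypothetical stand-ins for the
// Onshape and Google Drive connector actions you would configure in Flow.
const requestDerivativeExport = async (model, opts) =>
  console.log(`Translate ${model.name} to ${opts.format}`); // stub
const uploadToDrive = async (file, folderId) =>
  console.log(`Upload ${file.name} to Drive folder ${folderId}`); // stub

// Flow 1: ask Onshape to generate derivative CAD files for every 3D model
// referenced by the release package.
async function flowOne(releasePackage) {
  for (const model of releasePackage.models) {
    await requestDerivativeExport(model, { format: 'STEP' });
  }
  // Onshape reports each finished translation via a webhook (more on this
  // below), which is what eventually kicks off Flow 2.
}

// Flow 2: once every derivative file exists, import it into the backup folder.
async function flowTwo(completedFiles, driveFolderId) {
  for (const file of completedFiles) {
    await uploadToDrive(file, driveFolderId);
  }
}
```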

🧐 Pattern Specifications

“Okay Zain, great. So I can send data out of Onshape to other systems. But in general, what would I need to have to actually do this in the real world???”

I like that question! Let’s get into the specifics a little more deeply, so you can apply this pattern to your own use case:

Input

  1. Authorization credentials for at least 2 systems — the Source and the Destination (a sample config sketch follows this list)
  2. Pre-existing resources in the Source that will be used to create the artifact(s)
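As a purely illustrative example of input #1, a companion web server for the Onshape-to-Drive scenario might read its credentials from environment variables like these. The variable names are my assumptions, not anything PTC Flow, Onshape, or Google Drive requires.

```javascript
// Illustrative only: the variable names below are assumptions, not anything
// required by PTC Flow, Onshape, or Google Drive.
const config = {
  source: {
    // Onshape API key pair for the workspace holding the pre-existing models
    accessKey: process.env.ONSHAPE_ACCESS_KEY,
    secretKey: process.env.ONSHAPE_SECRET_KEY,
  },
  destination: {
    // OAuth client credentials for the Google Drive backup folder
    clientId: process.env.GDRIVE_CLIENT_ID,
    clientSecret: process.env.GDRIVE_CLIENT_SECRET,
  },
};

module.exports = config;
```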

Steps of the Pattern

  1. Commissioning the Workload: a request comes to the Flow, typically via a trigger or some other REST mechanism.
  2. The Workload: Flow requests the Source system to kick off some (ideally async) job to generate the artifacts that need to be exported.
  3. The Shipment: once the job completes, Flow retrieves the artifacts and starts importing them into the Destination system. As a best practice, this step should be async when possible!
  4. Output: once the import into the Destination system is complete, the Flow should typically send some kind of notification to the relevant stakeholders (e.g. via email) so they know they can now access those artifacts.

In the Example above:

  1. Onshape sends an individual webhook notification for each CAD file that gets created, which is basically a JSON payload. Our app extension’s web server recognizes these with an if statement, checking that the event field in the notification is set to: onshape.model.translation.complete.
  2. A second if statement then checks whether all the CAD files in the data store have finished translating. At that point, it triggers the flow to do the final export into the destination system (step 3 above). A minimal sketch of this web server follows this list.
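Here is a minimal sketch of what those two if statements could look like in a Node.js (Express) web server. Only the event name, onshape.model.translation.complete, comes from the example above; the route path, the translationId field, the expected file count, and the FLOW_WEBHOOK_URL environment variable are assumptions for illustration.

```javascript
const express = require('express');

const app = express();
app.use(express.json());

// Assumptions for illustration: the route path, the expected file count,
// and the environment variable holding the Flow webhook trigger URL.
const EXPECTED_FILE_COUNT = 5;
const FLOW_WEBHOOK_URL = process.env.FLOW_WEBHOOK_URL;
const translatedFiles = new Set();

app.post('/onshape/webhooks', async (req, res) => {
  const notification = req.body;

  // If statement #1: only react to completed translation notifications.
  if (notification.event === 'onshape.model.translation.complete') {
    translatedFiles.add(notification.translationId); // field name assumed

    // If statement #2: once every CAD file has finished translating,
    // trigger the flow that performs the final export (step 3 above).
    if (translatedFiles.size === EXPECTED_FILE_COUNT) {
      await fetch(FLOW_WEBHOOK_URL, { method: 'POST' }); // Node 18+ global fetch
    }
  }
  res.sendStatus(200);
});

app.listen(3000);
```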

➡️ Additional guidance

FAQs

  • “What if the connector for the Source/Destination has a broken trigger?”

This has happened before. In that case, one workaround you can use is to have the Flow use a simple webhook trigger. Then, write a minimalist web server that can call that Flow (using the URL of the aforementioned webhook) based on the appropriate conditions. A bare-bones sketch is below.
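This is the same mechanism the earlier sketch used to trigger its flow, stripped down to its essentials here. The route path, the condition, and the environment variable name are assumptions; the URL itself is whatever Flow generates for your webhook trigger.

```javascript
const express = require('express');

const app = express();
app.use(express.json());

// URL that Flow generates when you add a plain webhook trigger to the flow.
const FLOW_WEBHOOK_URL = process.env.FLOW_WEBHOOK_URL;

app.post('/events', async (req, res) => {
  // The "appropriate conditions" depend entirely on your use case; this
  // status check is just a placeholder.
  if (req.body.status === 'ready') {
    await fetch(FLOW_WEBHOOK_URL, { method: 'POST' }); // kick off the Flow
  }
  res.sendStatus(200);
});

app.listen(3000);
```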

  • “During my shipment process, I must be able to intermittently update the state of my artifact(s). Any recommendations?”

Never fear! We do have a precedent for this. We got this to work by writing a relatively simple app extension for Onshape, which uses an LRUCache to track the state of the Onshape assets being translated (a rough sketch follows below). Please see this Node.js server on GitHub for the full implementation, and see the Onshape developer documentation for more info on building app extensions!
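For a rough idea of what that state tracking can look like, here is a sketch only; the real implementation lives in the linked repo. It assumes a recent version of the npm lru-cache package, and the cache size and state strings are my own illustrative choices.

```javascript
// Recent versions of the npm "lru-cache" package expose a named LRUCache export.
const { LRUCache } = require('lru-cache');

// Track the translation state of each Onshape asset. The cache size and the
// state strings ("requested", "complete") are assumptions for illustration.
const translationState = new LRUCache({ max: 500 });

function markRequested(assetId) {
  translationState.set(assetId, 'requested');
}

function markComplete(assetId) {
  translationState.set(assetId, 'complete');
}

// The shipment step (step 3) should only fire once every asset is done.
function allComplete(assetIds) {
  return assetIds.every((id) => translationState.get(id) === 'complete');
}

module.exports = { markRequested, markComplete, allComplete };
```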

  • “I have other questions about PTC Flow…”

…well, you are in luck, my friend! At PTC, we work cross-functionally to make sure our customers can mix and match our offerings, almost like the LEGO blocks we played with as kids. If you have feedback or questions about:

  • Integrating with Onshape: please see my last, introductory tutorial on that here;
  • Using Flow in general: see the support docs here;
  • All other things: feel free to leave a comment on this post! I’ll be interested to hear what you think, and the kinds of automations you find interesting…

