December 14, 2025 · 12 min read

Geometry moves products, but semantics move organizations. Most digital threads still center on neutral CAD and manufacturing formats that faithfully ship shapes, meshes, and annotations, yet drop the very meaning that makes those shapes adaptable, auditable, and automatable. The result is a quiet tax on every handoff: humans re-enter constraints, rebuild assemblies, and re-validate rules that already existed upstream. This article sets aside nostalgia for purely geometric exchange and focuses on the practical path to retain and exploit the fuller story: parameters, behaviors, process intent, verification evidence, and governance signals. We’ll contrast what incumbent CAD/AM formats deliver with what modern workflows require, outline a layered model architecture for **rich semantics**, survey the emerging standards and containers that can carry that richness, and finish with a migration playbook and measurable KPIs. If your team is pushing design automation, multi-physics, model-based inspection, or supplier agility, the goal is not a new monolithic file, but a graph-backed, streamable, secure substrate that treats semantics—not just triangles and trimmed surfaces—as first-class data. With that mindset, your geometric payloads remain vital, but become only one layer in a stack purpose-built for resilient collaboration and machine-driven decisions.
Neutral formats have carried the industry for decades because they are pragmatic compromises between fidelity, adoption, and complexity. STEP AP242 delivers strong coverage for precise B-Rep, surface/curve definitions, assembly structures, tessellated representations for visualization, and increasingly robust PMI transport. It’s excellent at preserving exact geometry and assembly decomposition while allowing lightweight previews. 3MF excels in additive workflows with a mesh-centered container that cleanly bundles multiple parts, build orientations, materials, and metadata. Its slice extension enables handing off pre-sliced jobs, and the beam-lattice extension supports lattice-like elements common in AM. Vendors and service bureaus trust these formats because they are well-documented, broadly supported, and align with existing pipelines. The shared strength is reliable geometry interchange, which is still the bedrock for manufacturing and visualization. However, exactly because they prioritize broadly implementable geometry exchange, they often avoid the volatile and tool-specific semantics that actually encode design intent, behavior, and procedural manufacturing knowledge. In other words, they deliver “what” in great detail while leaving “why” and “how” to be rediscovered downstream—and that’s where time, money, and quality often leak away unnoticed.
Modern workflows increasingly depend on machine-readable semantics that persist beyond a single CAD session. They need parameters that drive geometry, constraints that encode intent, and feature graphs that reveal the generative path, not just its final shape. They need explicit descriptions of assembly behavior—mates, kinematics, motion envelopes, and configuration variants—that feed simulation and commissioning without manual rework. They need manufacturing semantics: DFM checks and their results, setup definitions, process plans, references to toolpaths or AM recipes, and lattice definitions that go beyond beams to implicit fields. They need simulation context: load cases, boundary conditions, solver settings, mesh provenance, validation evidence, and uncertainty models that allow surrogate and reduced-order models to be trusted. They need lifecycle and governance links: requirements traceability, approvals, change rationale, digital signatures, and IP rules that can be verified automatically. And they need data operations: partial load, progressive streaming, semantic diffs, branch/merge, and capability negotiation so that each partner shares and receives what they can actually use without flattening everything into triangles.
When the “why” and “how” evaporate at export time, downstream teams rebuild the invisible scaffolding by hand. CAM reinterprets faces and tolerances, recreates fixtures, and re-validates rules that were already checked in design. CMM programming loses the structure of datums, inspection strategies, and measurement tolerances that could auto-populate from model semantics. Simulation engineers receive geometry divorced from load cases and solver settings, forcing manual recreation and undermining reproducibility. Compliance teams struggle to produce consistent evidence and audit trails when approvals, signatures, and rationale get separated from the payload. Supplier collaboration becomes fragile: every vendor bridge reflects a partial understanding of the source tool, and small changes can break automation. Worse, the absence of **semantic diffs** produces costly version churn because minor updates trigger full file transfers, manual comparisons, and requalification. The industry pays for these gaps with delays, ambiguity, duplicated effort, and avoidable errors—none of which are inherent to the problem, but to the incomplete data we choose to exchange.
Solving the semantics gap calls for a layered architecture that separates concerns while linking them in a coherent graph. The idea is simple: geometry is a necessary base, but everything above it—intent, assembly behavior, PMI, manufacturing and materials semantics, simulation context, and governance—must survive transit as queryable, versioned, and verifiable data. A layered model isolates change: updating a fixture setup should not rewrite your B-Rep; modifying solver settings should not invalidate PMI; switching a material lot should carry its certs without renaming faces. This structure also enables selective sharing: stream tessellations for visualization to a partner while withholding solver decks, or share reduced-order models with a supplier without exposing the fine mesh and IP. The layers link via references and identities, forming a **queryable graph** rather than a bag of files. With consistent units, controlled vocabularies, and digital signatures, the layers together provide a resilient substrate for automation and compliance—one that can be implemented incrementally, mapped to standards, and verified with conformance tests.
The geometry layer should support exact B-Rep with analytic curves/surfaces, implicit fields where appropriate, and multi-resolution tessellations for visualization. Units must be explicit and consistent across layers. The intent layer records what makes geometry regenerable: parameters, constraints, expressions, design rules, and the feature/sketch graph that defines the construction path. This enables “edit here” semantics downstream, not just “measure here.” The assembly layer encodes structure, mates, kinematics, motion envelopes, and options/variants. It should also include lightweight and simplified representations for large assemblies, enabling progressive visualization and partial loading. Together, these three layers represent the core of **design intent**: they capture not only the shape but the logic that produces it and the way subcomponents move and combine. When kept machine-readable and linked, downstream tools can replay design decisions, compute the impact of a parameter change, or validate that a configuration obeys its mating constraints—all without reverse engineering.
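To make the layering concrete, here is a minimal sketch of a graph-backed model in which geometry and intent nodes share stable identities and are linked by typed relations. All class names, relation names, and node IDs are hypothetical, chosen only to illustrate the pattern.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Node:
    node_id: str   # stable identity shared across layers
    layer: str     # "geometry", "intent", "assembly", ...
    kind: str      # e.g. "brep_face", "parameter", "feature"
    attrs: dict = field(default_factory=dict)

@dataclass
class ModelGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)  # (src, relation, dst) triples

    def add(self, node: Node) -> None:
        self.nodes[node.node_id] = node

    def link(self, src: str, relation: str, dst: str) -> None:
        self.edges.append((src, relation, dst))

    def neighbors(self, node_id: str, relation: str) -> list:
        return [d for s, r, d in self.edges if s == node_id and r == relation]

# A parameter in the intent layer drives a feature, which produces a face.
g = ModelGraph()
g.add(Node("p.length", "intent", "parameter", {"value": 42.0, "unit": "mm"}))
g.add(Node("f.slot", "intent", "feature"))
g.add(Node("face.17", "geometry", "brep_face"))
g.link("p.length", "drives", "f.slot")
g.link("f.slot", "produces", "face.17")
```

Because the layers meet only through identities and typed edges, updating a fixture setup or a parameter touches its own subgraph without rewriting the B-Rep nodes it references.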
PMI is most valuable when expressed as machine-readable ISO 1101 semantics, not just textual notes or embedded images. The PMI/GD&T layer should define datum systems, tolerances, surface finish, welding symbols, and inspection strategies in a form that CMM and CAM can programmatically consume. The manufacturing layer describes process selection (CNC/AM/molding), setups and fixturing, tool libraries, toolpath references (not necessarily full G-code), and AM process recipes. It must also carry parametric lattice definitions that can be evaluated as implicit or voxel fields, not only beam-lattice abstractions. The materials layer spans specifications, alternatives, micro-to-macro properties, certificates and lot tracking, and the as-specified vs as-built deltas that support traceability and digital twins. With these layers in place, downstream stations can move beyond tribal knowledge: a CAM cell can inherit fixturing intent, a printer can apply the exact recipe with certified powder lots, and a quality cell can pick up target datums and measurement plans without interpretation errors.
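As an illustration of what "machine-readable" means here, the sketch below encodes a single GD&T callout as structured data a CMM programmer could consume directly. The field names are invented for this example and are not drawn from a published schema such as QIF.

```python
# One position callout, ISO 1101 style, as data rather than a note.
# All keys and values here are illustrative, not a standard schema.
callout = {
    "characteristic": "position",
    "tolerance_mm": 0.1,
    "modifier": "MMC",                 # maximum material condition
    "datum_refs": ["A", "B", "C"],
    "applies_to": "hole_pattern.4x",   # hypothetical feature reference
    "inspection": {"method": "CMM", "sample_points": 8},
}

def referenced_datums(c: dict) -> list:
    """A downstream tool can query the datum system without parsing text."""
    return c["datum_refs"]
```

The point is that the datum references, modifier, and inspection strategy are addressable fields, so a measurement plan can be generated rather than re-interpreted from a drawing.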
Simulation is more than meshes and solvers; it is a chain of evidence. The simulation layer needs load cases, boundary conditions, solver configurations, mesh provenance (who meshed what and how), validation status, uncertainty quantification artifacts, and reduced-order models linked back to their full-fidelity pedigree. This enables credible re-use across design, certification, and service scenarios. The governance layer records who did what, when, why, and under which policy: provenance, approvals, requirements links (via MBSE/SysML), IP controls, and digital signatures. It should also encode behavioral rules—e.g., “this subgraph can be shared only with suppliers having attribute X”—so that the same container can flow through the chain without manual re-redaction. Capturing **governance** as first-class data transforms compliance from a document chase into automated checks: models can fail pipelines when signatures are missing, requirements are orphaned, or IP policies are violated, reducing risk while speeding flow. Together, these two layers make analysis auditable and decisions defensible.
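A governance rule like "this subgraph can be shared only with suppliers having attribute X" can be checked mechanically before export. The sketch below, with invented policy and attribute names, shows the attribute-based gate in its simplest form.

```python
# Hedged sketch of an attribute-based sharing check. The policy shape
# ({"require": {...}}) and attribute names are hypothetical.
def can_share(subgraph_policy: dict, partner_attrs: dict) -> bool:
    """True only if the partner satisfies every required attribute."""
    required = subgraph_policy.get("require", {})
    return all(partner_attrs.get(k) == v for k, v in required.items())

policy = {"require": {"export_cleared": True, "tier": "1"}}
cleared_partner = {"export_cleared": True, "tier": "1"}
blocked_partner = {"export_cleared": False, "tier": "1"}
```

In a pipeline, this check runs per subgraph at package-assembly time, so the same container can flow through the chain without manual re-redaction.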
Layers are only useful if a platform can query across them, stream them selectively, and version them meaningfully. A graph-based backbone unifies identities and relations; it powers queries like “which parameters influence this datum?” or “which simulation supports this approval?” Streamable, partial opening allows large assemblies to load quickly and progressively refine geometry or semantics on demand. An extension registry and capability negotiation ensure partners declare what they can read/write, preventing silent data loss. Crucially, **semantic versioning** and diffs must operate at the level of features, PMI, constraints, and kinematics—not just triangles or file timestamps—so review and merge workflows become feasible. Conformance profiles and round-trip fidelity metrics provide objective measurements of exchange quality, building trust and accelerating adoption. With these capabilities, semantics evolve from fragile annotations to dependable, testable interfaces across the lifecycle. That is how we unlock automation without sacrificing the vendor diversity that keeps ecosystems healthy and innovative.
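A query like "which parameters influence this datum?" reduces to reverse reachability over the typed edges of such a graph. The sketch below, over an invented edge list, walks incoming edges from the datum and filters the visited nodes down to parameters.

```python
from collections import deque

# Illustrative edge data; node names and relation labels are invented.
edges = [
    ("param.wall_thickness", "drives", "feature.boss"),
    ("feature.boss", "produces", "face.12"),
    ("face.12", "constrains", "datum.B"),
    ("param.draft_angle", "drives", "feature.rib"),  # unrelated branch
]
kinds = {"param.wall_thickness": "parameter", "param.draft_angle": "parameter"}

def influencing_parameters(target: str) -> set:
    """Breadth-first search upstream from `target`, keeping parameters."""
    incoming: dict = {}
    for src, _, dst in edges:
        incoming.setdefault(dst, []).append(src)
    seen, queue = set(), deque([target])
    while queue:
        node = queue.popleft()
        for src in incoming.get(node, []):
            if src not in seen:
                seen.add(src)
                queue.append(src)
    return {n for n in seen if kinds.get(n) == "parameter"}
```

A production backbone would index these edges rather than scan them, but the query semantics are the same: impact analysis becomes a traversal, not a manual audit.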
We are not starting from scratch. The STEP family continues to evolve: AP242 edition 3 strengthens CAD/PMI coverage and tessellated representations; AP209 links geometry with FEA artifacts; AP238 (STEP-NC) targets machining process plans; AP243 (MoSSEC) addresses simulation context and evidence management. QIF provides a strong spine for metrology and GD&T interoperability with machine-readable semantics trusted by CMM vendors. On the AM side, 3MF keeps expanding with materials, slice, and lattice extensions—still mesh-centric, but effective in print pipelines. In AEC, IFC 4.x and mvdXML offer lessons on rich discipline semantics and model view definitions for role-based filtering. For lifecycle links, SysML v2 and OSLC provide mechanisms to connect requirements, behavior, and cross-tool traceability. The opportunity is to weave these standards into a cohesive, layered exchange rather than choosing a single “winner.” By aligning layer responsibilities to the standards best suited to them and using a common identity and provenance model, we can combine the precision of STEP, the interoperability of QIF, and the traceability of SysML without forcing any one format to do everything.
The container question is about composition, not empire-building. OpenUSD brings strong primitives for composition, layering, variants, and time-sampled data, which can underpin engineering scenarios if extended with industrial schemas and references to external exact-geometry and PMI payloads (e.g., STEP B-Rep, XT/SAT). On the semantics side, an RDF/JSON-LD overlay enables a **knowledge-graph** representation that is naturally queryable and extensible, with QUDT for units and PROV-O for provenance. This split lets you keep heavy geometry in specialized formats while elevating relationships, policies, and intent to first-class graph entities. Lightweight exact-geometry payloads can be swapped or updated independently, and multiple LOD tessellations can live side-by-side for visualization. The result is a hybrid model where containers orchestrate references, graphs express meaning, and engines stream what is needed. It respects the reality of existing CAD kernels and PMI implementations while enabling the richer semantics layer to flourish without waiting for all geometry holders to adopt the same internal representation.
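To show what such an overlay might look like, here is a small JSON-LD node for a parameter, using the QUDT namespace for units and PROV-O for attribution. The `ex:` namespace, the IRIs, and the specific QUDT unit term are illustrative assumptions, not a published schema.

```python
import json

# Hedged sketch of a JSON-LD semantic overlay node. QUDT and PROV-O are
# real vocabularies; the "ex:" terms and IRIs below are invented.
node = {
    "@context": {
        "ex": "https://example.com/eng#",
        "qudt": "http://qudt.org/schema/qudt/",
        "unit": "http://qudt.org/vocab/unit/",
        "prov": "http://www.w3.org/ns/prov#",
    },
    "@id": "ex:param/wall_thickness",
    "@type": "ex:Parameter",
    "qudt:value": 2.5,
    "qudt:unit": {"@id": "unit:MilliM"},   # assumed QUDT millimeter term
    "prov:wasAttributedTo": {"@id": "ex:user/design-lead"},
    "ex:drives": {"@id": "ex:feature/boss-3"},
}
serialized = json.dumps(node, indent=2)
```

Because the overlay is plain RDF, it can reference the heavy geometry payloads (STEP, XT/SAT) by identity while remaining queryable with standard graph tooling.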
Operational excellence determines whether semantics help or hinder. Streamable, chunked transport and level-of-detail let clients open large assemblies quickly and refine on demand: tiled tessellations for visualization, exact geometry only when needed. Semantic diff/merge with packfiles enables CAD-like version control: branch policies, automated checks, and human review that understand features, PMI, and mates. Integrity and confidentiality must be native: signed subgraphs with attribute-based access control allow sensitive parts or layers to be shared selectively, encrypted subassemblies travel safely, and watermarking deters misuse. To make all this interoperable, vendors and customers need an interop discipline: open extension registries, capability profiles, conformance suites, golden models, fuzz testing, and published round-trip scorecards. Together, these practices turn “rich semantics” from an aspirational slide into an operational guarantee. Teams can then adopt at their pace, secure in the knowledge that the same container can be verified, streamed, diffed, and governed across a diverse toolchain without fragile custom bridges.
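The essence of a semantic diff is comparing models as maps of stable entity identities to attributes, so a reviewer sees "this flatness tolerance tightened" rather than a binary file delta. A minimal sketch, with invented entity names:

```python
# Hedged sketch: diff two model versions keyed by stable entity IDs.
def semantic_diff(old: dict, new: dict) -> dict:
    """Report added, removed, and changed entities by ID."""
    added = sorted(set(new) - set(old))
    removed = sorted(set(old) - set(new))
    changed = sorted(k for k in set(old) & set(new) if old[k] != new[k])
    return {"added": added, "removed": removed, "changed": changed}

v1 = {"pmi.flatness.1": {"value": 0.05},
      "mate.hinge": {"type": "revolute"}}
v2 = {"pmi.flatness.1": {"value": 0.02},        # tolerance tightened
      "mate.hinge": {"type": "revolute"},
      "param.offset": {"value": 3.0}}           # new parameter
```

Real implementations must also handle identity renames and structural moves, but even this flat comparison is enough to drive targeted review and requalification instead of full-file churn.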
Change sticks when it is incremental, reversible, and value-verified. Start by keeping your existing geometry/process exports (STEP/3MF) but attach a graph “sidecar” that carries intent, PMI semantics, assembly behavior, simulation context, and governance links. This approach yields quick wins without disrupting production pipelines. Next, define a canonical ontology for units, PMI entities, materials semantics, and provenance—then register extensions for what is unique to your domain. Pilot end-to-end threads that span MBD → CAM → CMM (QIF) → as-built capture → simulation update, instrumenting the flow to measure where semantics survive and where they degrade. Introduce capability negotiation with suppliers: publish minimum viable profiles by use case—manufacturing, metrology, simulation, service—so partners can onboard without guesswork. Add **semantic diff/merge** into PLM/ALM so models behave like code: branches for experimental features, protected merges for releases, automated checks for PMI validity and tolerance rules. Finally, bake in security from day one: sign payloads, encrypt sensitive subgraphs, and log provenance to make audits a query, not a project.
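The sidecar step above can be as simple as a manifest that binds the semantic graph to the unchanged geometry export by content hash, so the semantics stay verifiably tied to the bytes they describe. File content, key names, and the semantics payload below are illustrative.

```python
import hashlib
import json

# Hedged sketch of a graph "sidecar" for an existing STEP export.
def make_sidecar(step_bytes: bytes, semantics: dict) -> str:
    """Build a JSON manifest referencing the geometry payload by hash."""
    manifest = {
        "payload": {
            "format": "STEP AP242",
            "sha256": hashlib.sha256(step_bytes).hexdigest(),
        },
        "semantics": semantics,
    }
    return json.dumps(manifest, indent=2)

step_stub = b"ISO-10303-21; (stand-in for a real STEP file)"
sidecar = make_sidecar(
    step_stub,
    {"param.bore_dia": {"value": 12.0, "unit": "mm",
                        "drives": ["feature.bore"]}},
)
```

A receiver re-hashes the STEP file on import and rejects the pair if the digests disagree, which is what makes the sidecar trustworthy rather than merely adjacent.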
Without metrics, semantics tend to sprawl without delivering repeatable value. Track round-trip fidelity: the percentage of PMI, constraints, features, and mates preserved across tool boundaries. Measure coverage of assembly kinematics and configuration options in downstream tools. Compare delta package sizes to full file transfers; monitor parse/open times and interaction latency under progressive loading. Track automated rule/compliance pass rates: how many checks run without human intervention and how often they fail for semantic rather than geometric issues. Quantify interop defect rates and time-to-fix across the supply chain; use scorecards to identify chronic loss points, whether missing datum semantics or mismatched units. Include security KPIs: percentage of signed subgraphs, coverage of attribute-based access controls, leakage incidents. These measurements turn “rich semantics” into an accountable program: you can decide where to deepen modeling effort, where to standardize, and where to simplify based on observed return rather than intuition.
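Round-trip fidelity, the first KPI above, is straightforward to compute once entities carry stable IDs: the share of semantic entities that survive an export/import cycle unchanged. A sketch with invented entity data:

```python
# Hedged sketch of the round-trip fidelity KPI.
def round_trip_fidelity(sent: dict, received: dict) -> float:
    """Percentage of sent entities preserved byte-for-byte on return."""
    if not sent:
        return 100.0
    preserved = sum(1 for k, v in sent.items() if received.get(k) == v)
    return 100.0 * preserved / len(sent)

sent = {"pmi.1": "flatness 0.05", "mate.1": "revolute", "param.1": "L=42mm"}
received = {"pmi.1": "flatness 0.05", "mate.1": "revolute"}  # param dropped
```

Tracked per tool boundary and per entity class (PMI vs. mates vs. parameters), this single ratio exposes exactly which bridges are leaking semantics.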
Adoption risks are real but manageable. Vendors may resist exposing internals or implementing non-trivial semantics; mitigate with open reference stacks, customer-led conformance programs, and demand signals tied to contracts. IP leakage concerns rise with richer metadata; counter with granular encryption, redaction, and intent abstraction when sharing with tiered suppliers. Performance regressions can occur if everything loads greedily; address with progressive streaming, caching strategies, and profile-driven optimization. Organizationally, treat semantic modeling as a product: owners, backlogs, and SLAs, not a side job. The bigger message is straightforward: the next decade of design interchange hinges on elevating semantics to first-class status. A layered, graph-backed, streamable, and secure exchange—built from evolving standards like AP242/AP243, QIF, and SysML v2, and modern containers such as OpenUSD with engineering schemas—unlocks automation, compliance, and resilient collaboration. With disciplined migration and measurable KPIs, you can move beyond STEP and 3MF without disrupting production, while laying the foundation for truly interoperable, intelligent workflows that capture not only the shape of products, but the reasoning, behavior, and evidence that make them trustworthy.
