Design Software History: Semantics in Design Software: From CAD Attributes to Product Models and Digital Twins

December 01, 2025

Introduction

Why semantics matter now

Computer-aided design began as a pragmatic way to draw faster, but it has steadily become a language for expressing what a product is, how it behaves, and why it should be built a certain way. That shift from pictures to meaning—what engineers and computer scientists call semantics—underpins today’s ability to search, validate, simulate, manufacture, and maintain complex systems across decades. In the 1970s and 1980s, drafting systems were ingenious, yet their “knowledge” lived in human convention: a layer name implied material, a block name implied a part type. By the 1990s and 2000s, standards bodies like ISO, consortia like PDES, Inc., and researchers at NIST began to encode semantics into neutral product models. The result is a continuum that now includes ontologies formalized in RDF/OWL, knowledge-based engineering rule engines inside major CAD suites, and end-to-end manufacturing and building workflows guided by machine-readable PMI, IFC property sets, and digital twins. This article traces that journey—from attributes riding shotgun on geometry to full-fledged product models that support reasoning—by highlighting pivotal software, standards, companies, and people whose work made semantics practical. The aim is not nostalgia; it is to show why the industry’s boldest ideas—lights-out inspection via QIF, automated code checking in BIM, and USD-based interoperability—stand on a deep foundation of semantic modeling.

From Attributes to Product Models: How Semantics Entered CAD

Early semantics in drafting CAD (1970s–1980s)

Early commercial CAD packages like IBM’s CADAM, Intergraph’s I/EMS and I/DRAFT, and Autodesk’s first AutoCAD releases did not “know” design intent in a formal sense; nonetheless, they incubated semantics through conventions that engineers learned to treat as rules. Layers grouped geometry into thematic collections—dimensions on one layer, centerlines on another, material cuts on a third—and those choices conveyed meaning to anyone trained in the shop’s culture. Blocks (AutoCAD) and cells (Intergraph) offered reusable patterns whose names and attributes acted as proto-metadata. When Autodesk added DWG/DXF extensibility with XData (extended entity data) around 1990, users and third-party developers could attach custom name–value pairs to geometry, seeding patterns that later matured into formal product data. As parametrics emerged, design intent entered the foreground. Pro/ENGINEER, created by Samuel Geisberg’s team at PTC, introduced feature-based solid modeling where parameters, relations, and regeneration order embodied how a part should be built, not just its final shape. Unigraphics (later Siemens NX) expressions provided equation-driven control across features, allowing dimensions to become variables governed by logic. These tools made semantics first-class: values were queryable; change propagated deterministically; and feature histories created a causal narrative that could be interpreted by both humans and software. The cumulative lesson from this era was clear: the “meaning” of a drawing—naming, layering, feature tables—carried more weight than its lines and arcs. With that realization, industry and government sponsors began to push beyond proprietary ecosystems toward neutral data models that could preserve meaning across organizations and decades.

  • CADAM, Intergraph, and early AutoCAD codified conventions via layers, blocks/cells, and naming.
  • DWG/DXF with XData enabled custom attributes to ride alongside geometry.
  • PTC Pro/ENGINEER and Unigraphics expressions elevated parameters and relations to “design intent.”
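
To make those conventions concrete, here is a deliberately simplified Python sketch of the pattern: XData-style name–value attributes attached to a geometric entity, plus an expression-driven dimension in the spirit of Unigraphics expressions. The classes and the toy evaluator are illustrative only and do not reflect any vendor’s API.

```python
# Illustrative sketch (no vendor API): name-value attributes riding on geometry,
# plus an expression-driven parameter resolved against known values.
from dataclasses import dataclass, field


@dataclass
class Entity:
    """A geometric entity with proto-metadata attached as name-value pairs."""
    layer: str                                  # convention: layer name implies purpose
    xdata: dict = field(default_factory=dict)   # custom attributes, XData-style


@dataclass
class Parameter:
    name: str
    expression: str                             # e.g. "wall_thickness * 4"


def evaluate(params: list[Parameter], bindings: dict) -> dict:
    """Resolve expressions against known values so dimensions become variables."""
    resolved = dict(bindings)
    for p in params:
        resolved[p.name] = eval(p.expression, {}, resolved)  # toy evaluator only
    return resolved


bracket = Entity(layer="PART_STEEL", xdata={"MATERIAL": "A36", "PART_TYPE": "BRACKET"})
dims = evaluate([Parameter("flange_width", "wall_thickness * 4")], {"wall_thickness": 3.0})
print(bracket.xdata["MATERIAL"], dims["flange_width"])  # A36 12.0
```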

Neutral data models and the push beyond geometry

By the 1980s, the U.S. Department of Defense and the National Bureau of Standards (later NIST) recognized that multi-decade aerospace and defense programs could not depend on a single vendor’s file formats. The IGES standard emerged with participation from Boeing, General Motors, GE, and McDonnell Douglas to shepherd geometry and attributes across systems. IGES did carry notes, layers, and some annotations, but semantics were fragile: vendors extended the spec, meaning was often vendor-specific, and much of the “intent” remained implicit. The 1990s ushered in ISO 10303, better known as STEP, with application protocols like AP203 and AP214 that unified 3D geometry with configuration and process data. PDES, Inc. coordinated industrial pilots, while Martin Hardwick’s STEP Tools commercialized software libraries that made STEP practical for suppliers. The deeper turn came when researchers began to formalize “what is a product?” beyond shape. At NIST, Ram Sriram and Sudarsan Rachuri articulated the Core Product Model, while the Open Assembly Model described assembly structure and constraints with clarity. These blueprints envisioned product definitions that encompassed function, behavior, performance, and lifecycle context—not just the boundary representation (B-rep). The momentum continued through AP242, which later carried semantic PMI, and through PLCS (AP239) for configuration and support. In aggregate, the community moved from shipping geometry with attached notes to shipping product models whose entities and relations were computable across design, manufacturing, and service. That change set the stage for ontologies to enter the scene with logic-based reasoning.

  • IGES enabled cross-vendor exchange but left semantics weak and vendor-flavored.
  • STEP AP203/AP214/AP242 tied shape to configuration and later to semantic PMI.
  • NIST’s Core Product Model and Open Assembly Model defined products beyond mere geometry.
  • PDES, Inc. and STEP Tools (Martin Hardwick) translated standards into industrial practice.

Ontologies and Knowledge-Based Engineering: Naming, Reasoning, and Rules

What “ontology” means in design software

An ontology is a controlled vocabulary with formally defined classes, properties, and constraints that software can reason over. In engineering, this means representing parts, assemblies, requirements, tolerances, materials, processes, and maintenance events as interlinked entities rather than free-text annotations. Technologies like RDF and OWL bring logical semantics—subclass hierarchies, property restrictions, and machine-checkable rules—so that “every bolt with property P must also have property Q” is not just a guideline, but something a reasoner can validate. Ontology work in industry has coalesced around domains: ISO 15926 models the process plant lifecycle from piping specs to instrumentation; STEP AP239 PLCS (Product Life Cycle Support) extends product structures into in-service configurations and maintenance records; and in the built environment, mappings like ifcOWL render IFC (ISO 16739) into an OWL ontology, enabling SPARQL queries and rule-based validation. The intent is to persist knowledge across tools and lifecycle stages. Consider how configuration control and requirements traceability benefit: if a constraint in AP239 asserts that a serialized assembly must be linked to its as-maintained configuration, ontology-based stores can enforce that invariant. Similarly, ifcOWL enables queries like “find all fire doors that violate clearance rules” without ad hoc scripts in each BIM authoring tool. While traditional PDM/PLM schemas often hardwire meaning, ontology approaches provide extensibility, align with web standards, and support federation—crucial for supply chains that will not agree on a single monolithic database.

  • RDF/OWL encode classes, relationships, and constraints for machine reasoning.
  • ISO 15926, AP239 PLCS, and ifcOWL anchor logic in process plants, product support, and BIM.
  • SPARQL enables cross-tool queries; reasoners validate conformance to formal rules.
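
As a small, hedged illustration of the query side, the sketch below uses the open-source rdflib library to build a toy graph in the spirit of ifcOWL and run a SPARQL query for fire doors below an assumed clearance threshold. The URIs, property names, and the 850 mm limit are simplified placeholders, not the published ifcOWL vocabulary or a specific code requirement.

```python
# Toy linked-data graph queried with SPARQL (requires rdflib).
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/building#")  # placeholder namespace

g = Graph()
g.add((EX.Door_12, RDF.type, EX.FireDoor))
g.add((EX.Door_12, EX.clearWidthMm, Literal(780)))
g.add((EX.Door_17, RDF.type, EX.FireDoor))
g.add((EX.Door_17, EX.clearWidthMm, Literal(920)))

query = """
PREFIX ex: <http://example.org/building#>
SELECT ?door ?width WHERE {
    ?door a ex:FireDoor ;
          ex:clearWidthMm ?width .
    FILTER (?width < 850)
}
"""

for door, width in g.query(query):
    print(f"{door} violates the clearance rule: {width} mm")
```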

Knowledge-based engineering (KBE) as applied ontology

The interplay between ontologies and KBE is practical: ontologies name the world; KBE uses those names to act. In the 1980s and 1990s, ICAD pioneered rule-driven design at Boeing and other aerospace firms, allowing engineers to encode parametric rules for airframes and systems that automatically generated geometry, documents, and bills of material. TechnoSoft’s AML (Adaptive Modeling Language) and Engineous Software’s Isight (later part of Dassault Systèmes SIMULIA) extended this ethos to process automation—capturing multidisciplinary workflows, optimization loops, and model transformations as reusable templates. Siemens NX embedded Knowledge Fusion, a declarative KBE language that manipulates feature trees and attributes; PTC’s Pro/PROGRAM and Behavioral Modeling Extension (BMX) turned Pro/ENGINEER (later Creo) models into programmable objects; and Dassault Systèmes’ Knowledgeware and EKL (Enterprise Knowledge Language) knitted rules into CATIA and the 3DEXPERIENCE platform. The practical content of KBE is rarely rocket science; it is corporate memory: supplier envelopes, fastener preferences, minimum bend radii, deflection limits, certification constraints, and sequencing rules. Encoded once, these rules provide leverage—designers explore more options, errors manifest early, and downstream consumers receive models that are consistent by construction. In effect, KBE creates executable knowledge tied to named classes and attributes, which is why a healthy KBE stack increasingly leans on ontology-backed vocabularies to remain coherent over time and across programs.

  • ICAD enabled rule-driven generation of airframes and systems.
  • AML and Isight extended KBE into process automation, capturing optimization loops and multidisciplinary workflows.
  • NX Knowledge Fusion, PTC Pro/PROGRAM/BMX, and CATIA Knowledgeware/EKL embedded rules directly in CAD.
  • Executable knowledge encodes heuristics, standards, supplier rules, and certification constraints.
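
What “executable knowledge” looks like in practice can be sketched in a few lines: a minimum bend-radius rule, keyed to material and thickness, checked against a parametric part record. The rule table, class names, and values below are illustrative assumptions, not any company’s standard or a vendor’s KBE syntax.

```python
# Hedged sketch of a KBE-style check: corporate rules applied to model parameters.
from dataclasses import dataclass

# Rule table: material -> minimum bend radius as a multiple of sheet thickness.
MIN_BEND_RADIUS_FACTOR = {"AL_5052": 1.0, "SS_304": 2.0}


@dataclass
class SheetMetalPart:
    material: str
    thickness_mm: float
    bend_radius_mm: float


def check_bend_radius(part: SheetMetalPart) -> list[str]:
    """Return violations so errors surface at design time, not on the shop floor."""
    factor = MIN_BEND_RADIUS_FACTOR.get(part.material)
    if factor is None:
        return [f"no bend rule defined for material {part.material}"]
    minimum = factor * part.thickness_mm
    if part.bend_radius_mm < minimum:
        return [f"bend radius {part.bend_radius_mm} mm < required {minimum} mm"]
    return []


print(check_bend_radius(SheetMetalPart("SS_304", 2.0, 3.0)))
# ['bend radius 3.0 mm < required 4.0 mm']
```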

BIM and computable building knowledge

Architecture, engineering, and construction adopted semantics through the Industry Foundation Classes (IFC), standardized as ISO 16739 and stewarded by buildingSMART. Unlike pure geometry exchanges, IFC provides entity types (IfcWall, IfcDoor, IfcSpace), relationships (aggregations, containment, connectivity), and property sets that capture performance, codes, and specifications. Chuck Eastman’s long advocacy for building product models helped industry treat buildings as data-rich assemblies, not drawings. On top of IFC, the COBie schema—championed by Bill East—standardized the handover package for facility operations, identifying what assets exist, where they are, and what maintainers need to know. Rule-checking engines like Solibri and others now consume IFC and buildingSMART property sets to evaluate program compliance, code rules, and clashes by reading model metadata, not text notes. This makes building knowledge computable across authoring tools like Autodesk Revit, Graphisoft Archicad, and Trimble Tekla, and across the long life of a building from design to FM. The approach is increasingly ontology-friendly: ifcOWL exposes IFC semantics for linked-data queries, and national classification systems layer domain vocabularies. The result is a discipline in which the meaning of a door, duct, or pump is explicit, enabling software to reason about clearances, capacities, and codes without bespoke scripts for every project.

  • IFC and buildingSMART property sets function as a domain ontology for AEC.
  • COBie formalizes handover semantics for facilities management.
  • Solibri and rule-checking tools assess codes, clashes, and program adherence from structured data.
  • Authoring platforms—Revit, Archicad, Tekla—interoperate via IFC and shared vocabularies.
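
A minimal sketch of rule checking driven by model metadata rather than drawing notes, using the open-source ifcopenshell library (assumed installed): it walks the doors in an IFC file, reads their property sets, and flags fire-rated doors narrower than an assumed threshold. The file path, the 850 mm limit, and the assumption that widths are in millimetres are placeholders.

```python
# IFC property-set check with ifcopenshell; Pset_DoorCommon is a standard
# buildingSMART property set, everything else here is an illustrative assumption.
import ifcopenshell
import ifcopenshell.util.element as element_util

model = ifcopenshell.open("building.ifc")           # placeholder path

for door in model.by_type("IfcDoor"):
    psets = element_util.get_psets(door)            # {pset name: {property: value}}
    common = psets.get("Pset_DoorCommon", {})
    if common.get("FireRating") and (door.OverallWidth or 0) < 850:
        print(f"{door.GlobalId}: fire-rated door narrower than 850 mm")
```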

Smart CAD Models at Work: MBD/PMI, Additive, and Visualization Graphs

Model-Based Definition (MBD) and Product Manufacturing Information (PMI)

Model-Based Definition carries the manufacturing authority in the 3D model, relegating 2D drawings to a derivative view when needed. The standards underpinning MBD crystallized in ASME Y14.41 for 3D annotation practices, STEP AP242 for semantic PMI (machine-readable dimensions, tolerances, finishes, materials), JT as ISO 14306 for lightweight visualization with attribute payloads, and 3D PDF/PRC for shareable dashboards that retain PMI. The industrialization of MBD is visible in CATIA’s “3D Master,” Siemens NX PMI, PTC Creo, and SolidWorks MBD, while PLM systems such as Teamcenter, Windchill, and 3DEXPERIENCE maintain associativity—ensuring that when a model changes, annotations and downstream consumers track the change. Downstream automation closes the loop: CMM and inspection systems ingest QIF (from DMSC) to derive inspection plans directly from toleranced features; tolerance analysis packages like DCS 3DCS and Sigmetrix CETOL 6σ simulate variation effects; CAM and toolpath verification respect GD&T zones and datums. The net effect is a supply chain that treats PMI as code: the same way source code compiles into executables, PMI compiles into plans, fixtures, and measurements. This only works when semantics are not presentation-only; symbolic GD&T must be unambiguous to a machine, and feature logic must travel intact from CAD to PLM to quality and NC. AP242 and QIF are the connective tissue that make that promise concrete.

  • ASME Y14.41, STEP AP242, JT (ISO 14306), and 3D PDF/PRC form the MBD backbone.
  • CATIA “3D Master,” NX PMI, PTC Creo, and SolidWorks MBD enable semantic annotations.
  • Teamcenter, Windchill, 3DEXPERIENCE preserve associativity along the digital thread.
  • QIF-driven CMM, and tools like DCS 3DCS and CETOL 6σ, turn PMI into executable checks.
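
To show the “PMI as code” idea without reproducing the actual AP242 or QIF schemas, here is a hypothetical sketch that compiles toleranced-feature records into inspection characteristics with measurable limits. Real exchanges would carry this information as STEP AP242 semantic PMI and QIF documents; the record layout and method names below are invented for illustration.

```python
# Hypothetical pipeline: machine-readable tolerances -> inspection characteristics.
from dataclasses import dataclass


@dataclass
class TolerancedFeature:
    feature_id: str
    kind: str            # e.g. "hole_diameter", "flatness"
    nominal: float
    tol_minus: float
    tol_plus: float


@dataclass
class InspectionCharacteristic:
    feature_id: str
    lower_limit: float
    upper_limit: float
    method: str


def compile_plan(pmi: list[TolerancedFeature]) -> list[InspectionCharacteristic]:
    """Derive measurable limits from toleranced features, as a QIF-style plan would."""
    methods = {"hole_diameter": "CMM_probe", "flatness": "CMM_scan"}
    return [
        InspectionCharacteristic(
            feature_id=f.feature_id,
            lower_limit=f.nominal - f.tol_minus,
            upper_limit=f.nominal + f.tol_plus,
            method=methods.get(f.kind, "manual"),
        )
        for f in pmi
    ]


plan = compile_plan([TolerancedFeature("H1", "hole_diameter", 10.0, 0.05, 0.05)])
print(plan[0])   # limits 9.95..10.05, measured via CMM_probe
```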

Additive manufacturing semantics

Additive manufacturing brings its own semantic requirements: not just shape, but process parameters, material systems, lattice descriptors, and build instructions. AMF (ISO/ASTM 52915) moved beyond STL’s tessellations to encode curved triangles, units, colors, materials, and meta-information. The 3MF Consortium—spearheaded by Microsoft with contributors like Autodesk, Materialise, Stratasys, and others—focused on a modern ZIP/XML container that captures materials, textures, units, beam lattices, and slices, while defining extensibility for processes and vendor-specific needs. This matters because additive workflows are data-rich by nature: a unit-cell lattice may be defined by parameters that feed both simulation and production; a build plan encodes orientation, support strategy, and scan vectors; a powder reuse policy travels with the job as a constraint. Software stacks such as ANSYS Additive Suite, Autodesk Netfabb, and Siemens AM solutions rely on parametric and material ontologies to keep analysis, optimization, and planning coherent. When a designer specifies a gyroid lattice with target stiffness and density, the meaning must persist: topology optimization and multiscale simulation need it; the slicer and printer need it; downstream inspection needs it to compare as-built porosity against intent. The additive community is thus converging on formats that treat process semantics as first-class citizens, making it feasible to audit, certify, and reproduce parts across machines and years.

  • AMF and 3MF encode materials, colors, units, and lattice/process data absent from STL.
  • Autodesk, Microsoft, Materialise, Stratasys, and partners push interoperable AM semantics.
  • ANSYS Additive, Netfabb, and Siemens AM suites hinge on parametric/material ontologies.
  • Persistent lattice and process semantics enable simulation–print–inspection continuity.
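
Because a 3MF file is a ZIP/OPC package whose model part is plain XML, its semantics are readable with ordinary tooling. The sketch below, using only the Python standard library, lists the units, metadata, and objects from such a package; the file name is a placeholder, and robust code would resolve the model part via the package relationships rather than by file extension.

```python
# Peek inside a 3MF package; the namespace is the 3MF core-spec namespace.
import zipfile
import xml.etree.ElementTree as ET

NS = {"m": "http://schemas.microsoft.com/3dmanufacturing/core/2015/02"}

with zipfile.ZipFile("bracket.3mf") as pkg:                      # placeholder file
    model_name = next(n for n in pkg.namelist() if n.endswith(".model"))
    root = ET.fromstring(pkg.read(model_name))

print("units:", root.get("unit"))
for md in root.findall("m:metadata", NS):
    print("metadata:", md.get("name"), "=", md.text)
for obj in root.findall("m:resources/m:object", NS):
    print("object:", obj.get("id"), obj.get("type"), obj.get("name"))
```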

AEC digital twins and lifecycle metadata

In the built environment, the digitization of operations turns BIM into digital twins layered with time, telemetry, and maintenance semantics. IFC property sets, combined with classification systems like Uniclass and OmniClass, and anchored by the buildingSMART Data Dictionary, provide the scaffold for handover and operations. Authoring platforms—Autodesk Revit, Graphisoft Archicad, Trimble Tekla—export models that, when enriched with sensor IDs, asset tags, and warranty metadata, become operational graphs. Twin ecosystems extend this into live systems: Bentley’s iTwin and iModels stream model states and change sets; Autodesk Tandem integrates model context with commissioning and operations; Siemens Xcelerator links mechanical/electrical twins with SCADA and enterprise asset management. The semantics make cross-domain queries routine: “Show all air handling units with filter ΔP out of spec and overdue maintenance,” or “Highlight dampers that violate fire-safety egress constraints after last renovation.” Under the hood, this requires versioned, queryable graphs and governance: who can update which vocabulary, how are property sets evolved, and how is provenance tracked? The answer emerging in practice is a mix of IFC as the backbone, classification systems as overlays, and cloud graph stores that track deltas and identities over time. With that foundation, twin platforms can align design intent with operational reality, enabling performance analytics, predictive maintenance, and compliance reporting from shared semantics rather than custom integrations per building.

  • IFC property sets, Uniclass/OmniClass, and the buildingSMART Data Dictionary encode AEC semantics.
  • Revit, Archicad, Tekla supply models that become operational graphs when enriched with asset data.
  • Bentley iTwin/iModels, Autodesk Tandem, and Siemens Xcelerator layer time and telemetry onto BIM.
  • Versioned graphs and governance sustain multi-decade digital threads for facilities.
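
As a rough, assumption-laden sketch of such a cross-domain query, the snippet below joins COBie-style asset records with a telemetry feed keyed by asset tag to flag air handling units that are both out of spec and overdue for service. The field names, thresholds, and dates are illustrative, not any twin platform’s schema.

```python
# Toy twin query: handover metadata + telemetry + maintenance dates.
from dataclasses import dataclass
from datetime import date


@dataclass
class Asset:
    tag: str
    asset_type: str
    next_service: date
    filter_dp_limit_pa: float


telemetry = {"AHU-01": 310.0, "AHU-02": 180.0}    # latest filter ΔP readings, Pa

assets = [
    Asset("AHU-01", "AirHandlingUnit", date(2025, 10, 1), 250.0),
    Asset("AHU-02", "AirHandlingUnit", date(2026, 3, 1), 250.0),
]

today = date(2025, 12, 1)
flagged = [
    a.tag for a in assets
    if a.asset_type == "AirHandlingUnit"
    and telemetry.get(a.tag, 0.0) > a.filter_dp_limit_pa
    and a.next_service < today
]
print(flagged)   # ['AHU-01']
```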

Visualization and scene description as semantic carriers

Visualization formats have matured from “what you see” to “what you can compute.” Pixar’s USD, increasingly the lingua franca across DCC and engineering, structures scenes as composition graphs with namespaces, schemas, and non-destructive layering. In NVIDIA Omniverse, USD’s schema-based extensions allow CAD and simulation semantics to coexist with visual assets, enabling workflows where a mechanical assembly’s mass properties, constraints, and materials are not lost when exported for visualization. Material standards like MaterialX and MDL capture the semantics of shading and appearance—what is a coating, what is a substrate—so that rendering and product visualization remain faithful across engines. For distribution, glTF provides a compact, PBR-ready carrier with extensible metadata channels, letting client apps consume geometry, textures, and annotations efficiently. The significance for design software is twofold. First, scene description as a knowledge graph allows interoperability across tools without flattening intent; USD’s variant sets, for example, represent product configurations cleanly. Second, visualization stacks are becoming the shared substrate where CAD, simulation, and controls meet; semantic richness ensures that a “bolt” is not merely triangles, but a typed component with relationships, attributes, and behaviors. As more vendors map their kernels and PLM systems to USD schemas and glTF extensions, visualization ceases to be the last mile—it becomes a primary carrier of design semantics across the enterprise.

  • USD’s composition, schemas, and layering preserve meaning across tools and stages.
  • NVIDIA Omniverse hosts CAD and simulation semantics alongside DCC assets.
  • MaterialX and MDL codify material semantics; glTF carries compact, extensible assets.
  • Visualization acts as a semantic bridge, not a lossy endpoint.
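
The sketch below uses USD’s Python API (the pxr module, assumed available) to give a prim a typed attribute and a configuration variant set, which is roughly how engineering properties and product variants can travel with a scene instead of being flattened into triangles. The attribute names are illustrative choices, not a standardized schema.

```python
# Minimal USD example: typed attributes plus a variant set on a prim.
from pxr import Usd, Sdf

stage = Usd.Stage.CreateNew("bolt.usda")
bolt = stage.DefinePrim("/Assembly/Bolt", "Xform")

# Semantic payload: a typed attribute, not just geometry.
bolt.CreateAttribute("mass_kg", Sdf.ValueTypeNames.Double).Set(0.012)

# Product configurations represented as a variant set.
vset = bolt.GetVariantSets().AddVariantSet("configuration")
for name in ("M6x20", "M6x30"):
    vset.AddVariant(name)
vset.SetVariantSelection("M6x20")
with vset.GetVariantEditContext():
    bolt.CreateAttribute("length_mm", Sdf.ValueTypeNames.Double).Set(20.0)

stage.GetRootLayer().Save()
```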

Conclusion: From Geometry-Centric to Knowledge-Centric Design

The historical arc

The throughline from 1970s drafting to today’s digital threads is a steady climb from implicit convention to explicit, computable meaning. Layers, blocks, and naming in CADAM, Intergraph, and AutoCAD taught teams to carry semantics in shared habits. Parametric features and expressions in Pro/ENGINEER and Unigraphics made intent explicit and queryable. IGES exposed the limits of geometry-only exchange, prompting STEP AP203/AP214 to link shape with configuration and laying groundwork for AP242’s semantic PMI. Parallel strands in AEC saw IFC and buildingSMART property sets define a domain model for buildings; Chuck Eastman’s product-model vision took root in practice. Researchers like NIST’s Ram Sriram and Sudarsan Rachuri specified Core Product Models, while industrial champions including PDES, Inc. and Martin Hardwick’s STEP Tools brought standards into factories and supply chains. Vendors—Dassault Systèmes, Siemens, PTC, Autodesk, Bentley, Trimble—embedded semantics across CAD, PLM, and twin stacks. The consequence is profound: models are no longer just inputs to human interpretation; they are nodes in shared knowledge graphs that feed simulation, manufacturing, inspection, and operations with far less ambiguity and far more automation. This journey has been pragmatic rather than doctrinaire, blending standards with vendor innovation to yield systems that scale beyond any single tool or project.

  • From implicit conventions to explicit parameters and relations.
  • From geometry exchange to product models that carry lifecycle meaning.
  • From isolated files to enterprise knowledge graphs spanning decades.

Key challenges ahead

Despite progress, semantics at scale presents hard problems that sit at the intersection of technology, governance, and economics. Cross-domain harmonization is the foremost: products blend mechanical, electrical, software, and AEC concerns, yet vocabularies and models often live in silos. The industry needs versioned, queryable graphs that federate STEP/AP schemas, SysML/UML systems models, EDA netlists, and IFC assets without crushing them into a monolith. Governance is equally urgent: vocabularies evolve; supplier IP must be partitioned; permissions and data sovereignty constrain graph sharing across borders. Every semantic change—adding a property, deprecating a class—needs provenance, impact analysis, and rollback. Finally, practical reasoning at scale remains a challenge: OWL reasoners are expressive but can be heavy; graph and column stores are fast but less declarative. The likely path is hybrid: embed lightweight constraint checking and incremental validation inside CAD/PLM tools; centralize heavyweight reasoning and analytics in cloud services; and standardize interfaces so that every tool can participate. Without this balance, semantics risks becoming shelfware—beautiful ontologies unconnected to daily design work. With it, the industry can keep semantic integrity while meeting the cadence and scale of real programs.

  • Harmonize mechanical, electrical, software, and AEC semantics across federated graphs.
  • Institute governance for vocabulary evolution, IP/permissioning, and traceability.
  • Blend OWL/RDF reasoning with fast graph/column stores and incremental checks in tools.

Likely next steps

The near future will be marked by consolidation around mature standards and by AI systems leveraging them. In discrete manufacturing, expect wider adoption of AP242-based PMI with QIF-anchored inspection closing the loop: model, annotate, plan, measure, and feed back into tolerance schemas. In AEC, IFC 4.x workflows will mature, with broader use of classification overlays and the buildingSMART Data Dictionary to stabilize property semantics across national variants. On the visualization front, USD will deepen its role as a neutral substrate; vendors will publish schemas that capture design intent—parameters, mates, constraints—rather than just tessellated proxies, while glTF will continue to serve performant delivery. Additive will push 3MF with extensions for process and materials, unifying design–simulation–print pipelines. Most consequentially, AI models trained on standardized ontologies will propose features, enforce rules, and explain designs in context—shifting CAD from drawing what to explaining why. The groundwork is already visible as vendors expose graph APIs and semantic services in platforms like 3DEXPERIENCE, Teamcenter, Windchill, iTwin, and Tandem. The opportunity now is to ensure that these advances remain interoperable: if the industry keeps standards and governance at the center, the next decade will convert semantic potential into everyday productivity across the entire lifecycle.

  • AP242 PMI and QIF adoption close inspection loops; IFC 4.x workflows stabilize AEC semantics.
  • USD schemas encode intent; glTF delivers; 3MF extends to process and materials.
  • AI leverages standardized ontologies to propose, check, and explain designs.


