DEV Community

Oliver Fries

The Digital Product Passport Is Moving From Concept to Execution

For a long time, the Digital Product Passport felt theoretical.

Standards discussions. Working groups. Position papers.

Now it’s different.

Budgets are allocated. Platforms are being selected. Pilot projects are turning into implementation roadmaps. The question is no longer if.

It’s:

How do we actually deliver this in a real system landscape?


The Standard Is Not the Hard Part

The Digital Product Passport builds on the Asset Administration Shell.

The specification is detailed. The semantic structure is defined.

Interoperability is a serious effort. But in industrial reality, the friction is rarely the standard itself.

It’s this:

  • Product master data in ERP
  • Technical attributes in legacy .NET systems
  • Bills of material in PLM
  • Units that differ between systems
  • Field names that evolved over fifteen years
  • “Digital” processes that still rely on Excel exports

You don’t “introduce AAS” into that environment.
You extract data from brownfield systems.

You map fields to semantics.

You normalize units.

You validate mandatory elements.

You automate the process.

That is the actual work.
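
The extract–map–normalize–validate steps above can be sketched in a few lines. This is an illustrative sketch, not FluentAAS code: the legacy field names, semantic IDs, and conversion factors are invented for the example.

```python
# Hypothetical sketch: mapping legacy ERP fields to AAS-style semantic IDs
# and normalizing units at the same boundary. All names here are invented.

# legacy field -> (semantic ID, canonical unit, linear conversion factor or None)
FIELD_MAP = {
    "prod_weight_g": ("0173-1#02-AAB713#005", "kg", 0.001),   # grams -> kilograms
    "op_temp_f":     ("0173-1#02-BAA120#008", "degC", None),  # non-linear, handled below
}

def normalize(field: str, value: float) -> tuple[str, float, str]:
    """Map a legacy field to (semantic_id, value_in_canonical_unit, unit)."""
    semantic_id, unit, factor = FIELD_MAP[field]
    if factor is not None:
        return semantic_id, value * factor, unit
    if field == "op_temp_f":                       # Fahrenheit -> Celsius
        return semantic_id, (value - 32) * 5 / 9, unit
    raise ValueError(f"no conversion defined for {field}")

print(normalize("prod_weight_g", 2500))  # ('0173-1#02-AAB713#005', 2.5, 'kg')
```

The point is not the three lines of arithmetic. It is that the mapping lives in one versioned table, not in fifteen Excel exports.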


Platforms Are Being Bought. The Integration Layer Is Missing.

Many companies are currently evaluating or purchasing DPP platforms.

That step makes sense.

But a platform does not create structured data where none exists.

The real gap is not on the visualization layer. It is on the delivery layer.

What the industry needs now are:

  • Mapping libraries
  • Validation mechanisms
  • Deterministic submodel builders
  • CI-integrated generation processes
  • Clear ownership of source fields

Not more slides.

Engineering-grade building blocks.
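
A "validation mechanism" from the list above can be as small as this sketch: check mandatory elements before anything is exported, and fail the build with readable errors. The element names are illustrative, not a real submodel template.

```python
# Sketch of mandatory-field validation before export.
# Element names are invented for illustration.

MANDATORY = {"serialNumber", "manufacturerName", "productWeight"}

def validate(submodel: dict) -> list[str]:
    """Return human-readable errors; an empty list means the submodel may be exported."""
    errors = []
    for key in sorted(MANDATORY):
        value = submodel.get(key)
        if value is None or value == "":
            errors.append(f"mandatory element missing or empty: {key}")
    return errors

errors = validate({"serialNumber": "SN-001", "manufacturerName": ""})
assert len(errors) == 2  # empty manufacturerName, missing productWeight
```

Run in CI, a non-empty error list blocks the pipeline instead of surfacing months later in an audit.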


Why Open Source Matters at This Stage

DPP is becoming infrastructure. Infrastructure should not depend entirely on opaque vendor logic.

If regulatory compliance depends on generated submodels, then teams need to know:

  • Where each value comes from
  • How it is transformed
  • Which rules validate it
  • How it is versioned

Open source enables that transparency.

You can inspect it.

You can test it.

You can adapt it to your landscape.

When regulation meets software delivery, black boxes become risk factors.
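
The four questions above can be made concrete by attaching a provenance record to every generated value. A minimal sketch, with invented field names:

```python
from dataclasses import dataclass

# Sketch: provenance attached to each generated value, so an auditor can
# answer where it came from, how it was transformed, which rule validated
# it, and which mapping version produced it. Field names are illustrative.

@dataclass(frozen=True)
class Provenance:
    source_system: str      # where the value comes from
    source_field: str
    transformation: str     # how it is transformed
    validation_rule: str    # which rule validates it
    mapping_version: str    # how the mapping itself is versioned

p = Provenance(
    source_system="ERP",
    source_field="prod_weight_g",
    transformation="grams -> kilograms (x0.001)",
    validation_rule="mandatory, value > 0",
    mapping_version="mappings v1.4.2",
)
```

Whether this lives in a dataclass, a log line, or metadata on the submodel element matters less than that it exists and is inspectable.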


I Built FluentAAS

FluentAAS was not created to “explain AAS.”

It was created to build it.

In real .NET environments. With legacy systems. Under delivery pressure.

I needed a way to:

  • Build submodels in a structured, typed manner
  • Encapsulate mapping logic in code
  • Validate mandatory fields early
  • Integrate generation into CI/CD
  • Version output alongside application releases

FluentAAS follows a simple idea:

If DPP becomes part of compliance, submodels must be reproducible artifacts. Not manual exports.

It is not a platform. It is not a full DPP solution. It is a building block.

Because the ecosystem needs composable components, not monolithic tools.



Legacy Is Not the Problem

There is still a reflex in many discussions:

“DPP means we need a new architecture.”

In most industrial environments, that is not realistic.

Legacy systems often contain the most reliable production data in the company.

The real issue is uncontrolled integration.

Rewriting a production-critical .NET system to “enable DPP” introduces risk without solving the mapping problem.

You stabilize first. You define clear boundaries. You build a controlled extraction layer.

Then you map and validate into AAS.

Legacy is not replaced. It is integrated under control.
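
A "controlled extraction layer" can be sketched as a thin, read-only boundary: downstream DPP code depends on an interface, never on the legacy schema. The class and method names below are invented for illustration, and the legacy system is stubbed with a dict.

```python
from abc import ABC, abstractmethod

# Sketch: the legacy system stays untouched; a read-only adapter is the
# only way DPP code sees it. Names are illustrative.

class ProductSource(ABC):
    """The boundary: everything downstream depends on this interface."""
    @abstractmethod
    def fetch_raw(self, product_id: str) -> dict: ...

class LegacyErpAdapter(ProductSource):
    """Read-only adapter over the brownfield system (stubbed with a dict here)."""
    def __init__(self, rows: dict[str, dict]):
        self._rows = rows

    def fetch_raw(self, product_id: str) -> dict:
        return dict(self._rows[product_id])  # defensive copy: callers cannot mutate the source

src: ProductSource = LegacyErpAdapter({"P-100": {"prod_weight_g": 2500}})
assert src.fetch_raw("P-100") == {"prod_weight_g": 2500}
```

The legacy system never learns that DPP exists. That is the point.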


What a Serious AAS Pipeline Requires

If DPP becomes regulatory infrastructure, the pipeline must be:

  • Deterministic
  • Reproducible
  • Automated
  • Versioned
  • Validated

That means:

  • Clear source ownership per field
  • Explicit transformation logic
  • Unit normalization at a defined boundary
  • Mandatory field validation before export
  • CI-integrated generation
  • Traceable versioning tied to product state

If you cannot regenerate the same submodel for the same product version tomorrow, you do not have compliance-grade infrastructure.

You have a snapshot.
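
"Deterministic" and "reproducible" are testable properties. One way to make them checkable in CI, sketched here under the assumption that the submodel can be serialized to JSON: serialize canonically (sorted keys, fixed separators) and hash the result, then assert that regenerating from the same product state yields the same fingerprint.

```python
import hashlib
import json

# Sketch: canonical serialization plus a hash makes "same input, same
# submodel" a CI assertion instead of a hope.

def submodel_fingerprint(submodel: dict) -> str:
    canonical = json.dumps(submodel, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

a = submodel_fingerprint({"weight": 2.5, "serial": "SN-001"})
b = submodel_fingerprint({"serial": "SN-001", "weight": 2.5})  # same data, different key order
assert a == b  # deterministic regardless of input ordering
```

Store the fingerprint alongside the release, and "can we regenerate yesterday's submodel?" becomes a one-line comparison.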


The Shift That Is Happening

The conversation is slowly changing.

From:

“How do we understand the standard?”

To:

“How do we build the integration layer?”

That shift is necessary.

It requires engineers.

It requires discipline.

It requires real tooling.

The DPP ecosystem will mature through libraries, SDKs, validation tools and working pipelines.

Not through architecture diagrams alone.


Closing Thought

Momentum is real.

Now the industry needs delivery capability.

The Digital Product Passport will not be implemented by strategy documents.

It will be implemented by engineers who take responsibility for the last mile:

From legacy data to validated, reproducible AAS submodels.

If you are building in this space, focus on the pipeline.

That is where success or failure will be decided.
