The Double-Edged Sword of Data Standards: Benefits at the Core, Burdens at the Edges

By Chris Lees

Data standards are often celebrated as the bedrock of digital transformation. From the ISO-based Industry Foundation Classes, to Uniclass for classification, to OSCRE’s Industry Data Model for real estate — they offer a shared structure and language for exchanging, integrating, and automating data.

And within their originating communities, they work exceptionally well. Standards help core stakeholders streamline processes, enable automation, and unlock powerful analytics. These benefits — well captured in industry business cases — tend to flow to the community at the centre of standard-setting.

But in practice, no organization is an island.

A developer might align to IFC and OSCRE, yet depend on the emerging Open Digital Planning (ODP) data from local authorities to assess planning constraints. They may need to incorporate official search data — as specified by the NLIS hub, from HM Land Registry, or the Mining Remediation Authority — into due diligence workflows.

Each of these standards or data sources emerges from different contexts, with different priorities and perspectives. And herein lies the challenge: the further an organization or stakeholder is from the core of any one standards community, the harder it becomes to apply a given standard without friction.

At the Edges, Fit and Benefit Degrade

Stakeholders on the periphery — including regulators, service providers, and tech vendors — often operate in environments where a given standard doesn’t align neatly with their domain logic, data capture methods, or information needs. Even within the built environment, what makes sense in construction doesn’t always make sense in operations. A schema designed for asset owners may not suit local authorities.

As a result, peripheral stakeholders are often left mapping, translating, or reconciling between standards from multiple communities. Or worse, they’re expected to conform to standards that weren’t designed with them in mind — increasing compliance costs and reducing efficiency. Exactly the inverse of the benefits case made for the standards in the first place.

This is especially true when organizations are exposed to multiple overlapping data standards. A property developer, for example, might need to reconcile planning data (ODP), legal boundaries (Land Registry), geotechnical hazards (Mining Authority), and design/construction data (IFC, Uniclass, OSCRE). These standards and formats often evolved in isolation, and while each has its own internal logic, they rarely fit together neatly.

And that’s before we consider the even wider supply chains including manufacturers whose customer segmentation may cover everything from wellness to space exploration.

Standardization Should Serve the Whole Ecosystem

It’s easy to assume more standardization always leads to better outcomes. But unless we design with the entire ecosystem in mind, taking the system-of-systems approach being advocated within the Council for Science and Technology, we risk creating disbenefits at the edges: duplicated work, conflicting definitions, brittle integrations, and disengagement from stakeholders who feel excluded from the standard-setting process.

The million-dollar question is this: do those disbenefits in aggregate exceed the benefits?

Efforts like nima’s Project Indigo, the Information Management Initiative with the Construction Leadership Council, OSCRE International’s Semantic Data Infrastructure strategy, and the work of The RED Foundation are building on previous work done by ISO and others to avoid this. This involves foundational elements, including:

  • Focusing less on the format and more on the semantics.
  • Enabling and expecting data standards publishers to interoperate so their work can be reused.
  • Publishing machine-readable standards that reduce the reliance on people to perform activities like data mapping (a minimal sketch follows below).
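
To make that last point concrete, here is a minimal sketch of what a machine-readable mapping might look like in practice. The schemas, field names, and mapping table are entirely hypothetical and not drawn from any published standard; the point is simply that once a mapping is expressed as data rather than held in someone's head, software can apply it automatically and repeatably.

    # A minimal, hypothetical sketch: a field mapping expressed as data,
    # so translation between two schemas can be automated rather than
    # performed by hand. None of these field names come from a real standard.

    # Declarative mapping: source field -> target field
    PLANNING_TO_INTERNAL = {
        "site_reference": "asset_id",
        "constraint_type": "planning_constraint",
        "decision_date": "approval_date",
    }

    def translate(record: dict, mapping: dict) -> dict:
        """Rename the fields of a record according to a declarative mapping.

        Fields without a mapping entry are carried through unchanged, so
        the translation is easy to audit.
        """
        return {mapping.get(key, key): value for key, value in record.items()}

    # Example usage with a made-up planning record
    planning_record = {
        "site_reference": "SR-0042",
        "constraint_type": "conservation_area",
        "decision_date": "2024-03-01",
    }

    print(translate(planning_record, PLANNING_TO_INTERNAL))
    # {'asset_id': 'SR-0042', 'planning_constraint': 'conservation_area',
    #  'approval_date': '2024-03-01'}

The mapping itself is the artifact worth standardizing and publishing: if standards bodies shipped such mappings alongside their schemas, peripheral stakeholders would not each have to rediscover them.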

Standards are powerful tools — but only when they’re built for connection, not isolation.

Let’s move toward a future where our standards reflect the interconnected reality of the data ecosystems we work in. I believe this is achievable, today.
