What is the Testbed on Trusted Data and Systems?
The Testbed on Trusted Data and Systems is a collaborative innovation programme coordinated by the Open Geospatial Consortium (OGC) and anchored within the international geospatial community. Originally launched under the name Testbed Europe, the programme was renamed after organisations from outside Europe expressed interest in participating, reflecting its growing global scope. It is designed to help National Mapping and Cadastral Agencies (NMCAs) and other authoritative public data providers modernise, adapt, and thrive in a rapidly evolving data landscape.
It is not a research project. It is not a standards committee. The Testbed on Trusted Data and Systems is a practical, results-oriented initiative in which real engineering challenges are tackled by real organisations, producing open, reusable outputs that any NMCA can adopt.
The programme brings together two kinds of activity. Two of its work streams are delivered in direct partnership with EuroGeographics, reflecting shared priorities around modernising service delivery and establishing fair commercial models for authoritative data. These carry joint branding from both organisations. OGC leads the remaining work streams in response to needs raised by its own members and by individual NMCAs from Europe and beyond. EuroGeographics members are welcome to participate in these, but they are managed and branded by OGC independently.
The Testbed on Trusted Data and Systems exists because NMCAs around the world share common problems that no single agency can solve alone, and because solving them together, with the right expertise at the table, is faster, cheaper, and more durable than solving them in isolation.
The OGC Testbed Model
OGC has run innovation testbeds for over 25 years. The model is straightforward.
Sponsors, typically public agencies or industry organisations, identify concrete requirements and contribute funding. OGC assembles a curated team of technology providers, domain experts, and standardisation specialists best suited to address those requirements. The team works collaboratively and openly to produce engineering reports, prototype implementations, and reusable specifications. The budget contributed by sponsors is redistributed to the participating organisations doing the work, making sponsorship an investment in solutions rather than overhead.
OGC operates at the pre-production level. It does not build or operate production systems. Instead, it develops prototypes and engineering guidance that show how future production systems should be designed. The production itself is carried out by the agencies and companies that adopt the results.
The Testbed on Trusted Data and Systems adapts this proven model to the specific context of NMCAs: the regulatory environment, the INSPIRE legacy in Europe, the need for better integration, and the resource pressures that characterise public geospatial agencies today.
Who is the Testbed on Trusted Data and Systems for?
The Testbed on Trusted Data and Systems is primarily designed to serve National Mapping and Cadastral Agencies and other authoritative public data providers, who produce and maintain the foundational geographic data on which governments, businesses, and citizens depend.
At the same time, it is built to engage the private sector: technology companies, platform providers, and data integrators who rely on NMCA data and who have both the capacity and the incentive to contribute to its improvement. This engagement from both sides is central to the programme’s design.
Work Streams
The following nine topics represent the work streams for the Testbed on Trusted Data and Systems. They have been identified through dialogue with NMCAs, EuroGeographics, and OGC members, and reflect the most pressing shared challenges in the geospatial sector today. Each work stream will be scoped in detail in accordance with sponsor requirements. The descriptions below define the problem space and the intended direction, not a fixed specification.
Work streams 1 and 2 are delivered in partnership between OGC and EuroGeographics. They address the modernisation of data services and the establishment of fair commercial models for authoritative data. These two streams carry joint branding from both organisations.
Work streams 3 through 9 are led by OGC in response to needs identified by its members and by individual NMCAs. These streams are managed and branded by OGC. EuroGeographics members are welcome to participate, but these activities do not carry EuroGeographics branding or endorsement.
Partnership Work Streams: OGC and EuroGeographics
1. Renew the Engine: From Legacy Services to Modern Web APIs
OGC + EuroGeographics
Challenge: Many NMCAs still operate on legacy OGC service standards such as WFS and WMS that predate the modern web. These ageing interfaces limit interoperability, scalability, and adoption by contemporary applications. As NMCAs look to modernise their backend systems, there is a clear and shared need for practical migration guidance and tested reference implementations.
Approach: The testbed will define and pilot migration pathways to OGC API standards, including OGC API Features, Tiles, Maps, and Coverages. This will enable NMCAs to modernise their data delivery infrastructure step by step and at manageable cost. The work stream will produce practical guidance, reference implementations, and reusable migration templates that any agency can adopt regardless of its current technical starting point.
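The difference between the two generations of interfaces can be sketched as request construction. The endpoint URLs and collection names below are hypothetical; only the request shapes follow the published standards:

```python
from urllib.parse import urlencode

LEGACY_BASE = "https://example-nmca.org/wfs"  # hypothetical legacy endpoint
API_BASE = "https://example-nmca.org/api"     # hypothetical OGC API endpoint

def wfs_get_feature(type_name: str, count: int) -> str:
    """Build a classic WFS 2.0 GetFeature request (key-value-pair encoding)."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "count": count,
        "outputFormat": "application/json",
    }
    return f"{LEGACY_BASE}?{urlencode(params)}"

def ogcapi_items(collection: str, limit: int) -> str:
    """Build the equivalent OGC API - Features request: a plain resource URL."""
    return f"{API_BASE}/collections/{collection}/items?{urlencode({'limit': limit})}"

print(wfs_get_feature("cadastre:parcels", 10))
print(ogcapi_items("parcels", 10))
```

The second form is an ordinary web resource that any HTTP client, browser, or crawler can consume, which is largely why the OGC API family lowers the barrier for contemporary applications.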
2. Fair Exchange Between Authoritative Data Providers and Commercial Platforms
OGC + EuroGeographics
Challenge: Large commercial platforms derive significant value from authoritative public geodata, yet National Mapping and Cadastral Agencies receive little in return: no revenue, no quality feedback, and no technology transfer. The relationship is imbalanced, and there is no established model for correcting it.
Approach: The testbed will investigate governance and technical models inspired by Wikimedia Enterprise: tiered access, reciprocal data contribution, and structured commercial licensing that rewards public data providers while keeping open access intact for noncommercial use. The goal is to establish a practical framework in which the value of authoritative data is recognised and fairly shared between public providers and the commercial platforms that depend on it.
OGC-Led Work Streams
3. Addressing Resource Shortages: Outsourcing Research and Enhancement to OGC
OGC
Challenge: NMCAs face growing demands for data modernisation but lack the internal research and development capacity to keep pace. Critical enhancement tasks go unaddressed not for lack of will, but for lack of available people and expertise. Some agencies have no difficulty recruiting additional staff, while others are severely constrained.
Approach: OGC can act as a trusted intermediary, commissioning targeted research and innovation tasks on behalf of NMCAs through its global network of member organisations. This brings the most capable and innovative companies to the table while shielding NMCAs from procurement complexity. The work stream responds to a clear need from agencies that want to move forward but cannot staff additional work internally.
4. Semantic Interoperability: Aligning Data Models Without Forcing Convergence
OGC
Challenge: Data produced by different agencies is often structurally compatible but semantically fragmented: the same real-world feature is described differently across datasets, making it difficult and expensive to combine or compare data from multiple sources. This is a fundamental barrier to effective data use, including by AI systems that must understand what the data means in order to process it correctly.
Approach: OGC has developed a new approach to semantic interoperability that does not require all parties to agree on a single, rigid international schema. Instead, organisations agree on a small shared core model and then derive their own profiles in a standardised way. Because the derivation process is itself standardised, machines can reconcile different profiles automatically. This work stream will test and refine this approach, enabling agencies to keep their own data models while dramatically improving the ability to integrate data across organisational boundaries. The approach is attracting strong interest internationally and is directly relevant to the broader discussion about how future data frameworks, including any successor to INSPIRE, should balance rigidity with flexibility.
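A minimal sketch of the derivation idea, using hypothetical profile names and attributes: each organisation publishes a machine-readable mapping from its local schema to the shared core, and because the mappings are uniform, records can be translated between profiles automatically:

```python
# Shared core model agreed by all parties (attribute names are illustrative).
CORE = {"geometry", "feature_type", "valid_from"}

# Each profile declares how its local attributes derive from core attributes.
PROFILE_A = {"geom": "geometry", "kind": "feature_type", "since": "valid_from"}
PROFILE_B = {"shape": "geometry", "category": "feature_type", "start_date": "valid_from"}

def to_core(record: dict, profile: dict) -> dict:
    """Lift a profile-specific record onto the shared core model."""
    return {core_attr: record[local]
            for local, core_attr in profile.items() if local in record}

def translate(record: dict, source: dict, target: dict) -> dict:
    """Translate a record between two profiles via the core model."""
    core = to_core(record, source)
    return {local: core[core_attr]
            for local, core_attr in target.items() if core_attr in core}

a_record = {"geom": "POINT(8.5 47.4)", "kind": "building", "since": "2020-01-01"}
b_record = translate(a_record, PROFILE_A, PROFILE_B)
print(b_record)  # the same feature, expressed in profile B's vocabulary
```

Neither organisation has to change its schema; only the mapping to the core is standardised, which is the essence of aligning models without forcing convergence.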
5. Protecting Sensitive Geospatial Data: Technical Approaches
OGC
Challenge: Several NMCAs are facing growing pressure from security and defence communities to reconsider what geospatial data is made publicly available. There are concerns that detailed data about critical infrastructure may be exploited. Once data has been released openly and downloaded by third parties, it cannot simply be retracted, and agencies are asking what technical measures can still be applied.
Approach: This work stream will explore technical methods for managing the release of sensitive geospatial data going forward. This includes techniques such as spatial obfuscation, selective resolution reduction, and controlled access mechanisms that can be applied to new data releases without breaking existing systems. The focus is on practical, implementable solutions rather than policy guidance. National decisions about what data to release and under what conditions remain with national authorities and their relevant ministries.
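As an illustration of the simplest of these techniques, resolution reduction can be implemented by snapping coordinates to a coarser grid before release, so exact locations of sensitive features are never published. Grid size and coordinates below are illustrative:

```python
def snap_to_grid(lon: float, lat: float, cell_deg: float) -> tuple[float, float]:
    """Snap a coordinate to the centre of its grid cell (cell size in degrees)."""
    def snap(v: float) -> float:
        return (int(v // cell_deg) + 0.5) * cell_deg
    return snap(lon), snap(lat)

# A point near hypothetical sensitive infrastructure, published at
# roughly 0.01 degree (about 1 km) resolution:
print(snap_to_grid(8.54321, 47.37887, 0.01))
```

Because the transformation is applied at publication time, downstream systems keep receiving valid coordinates in the same formats, which is what allows such measures to be introduced without breaking existing clients.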
6. Common Identifiers: Bridging Different Systems for Geographic Features
OGC
Challenge: Multiple identifier systems exist for geographic features, each serving a different community and purpose. There is no single universal identifier, and imposing one is neither realistic nor desirable. The challenge is that organisations and users who work across these systems cannot easily link or reconcile records that refer to the same real-world feature.
Approach: This work stream will investigate dynamic mapping environments that allow organisations to retain their own identifier systems while still being able to discover and link to equivalent features identified under other schemes. Rather than standardising a single identifier, the approach focuses on building bridges between existing systems, including those from INSPIRE, Wikidata, national registers, and commercial providers (e.g., GERS).
7. Data Aggregation and Assembly at the National Level
OGC
Challenge: In federal countries and other states where data responsibilities are distributed across multiple regional or local authorities, assembling a coherent national dataset is a significant operational challenge. The aggregation pipelines involved are often bespoke, brittle, and expensive to maintain.
Approach: This work stream will prototype standardised approaches for assembling national datasets from distributed subnational sources within a single country. It will define interfaces, quality checks, and assembly workflows that make this process more reliable, repeatable, and efficient. The focus is on national-level aggregation to support domestic needs such as infrastructure planning, traffic simulation, and emergency response.
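A minimal sketch of one assembly step, with made-up provider names and a deliberately simple quality rule: regional batches are merged into a single national list, and records that fail the quality gate are dropped:

```python
# Required attributes per record; a real profile would be far richer.
REQUIRED = {"id", "geometry", "provider"}

def quality_check(record: dict) -> bool:
    """Reject records missing required attributes (illustrative quality gate)."""
    return REQUIRED.issubset(record)

def assemble(*regional_batches: list[dict]) -> list[dict]:
    """Merge regional batches into one national list, deduplicating by id."""
    merged: dict[str, dict] = {}
    for batch in regional_batches:
        for record in batch:
            if quality_check(record):
                merged[record["id"]] = record
    return list(merged.values())

# Hypothetical subnational contributions:
region_a = [{"id": "r-1", "geometry": "POINT(11 48)", "provider": "A"}]
region_b = [{"id": "r-2", "geometry": "POINT(13 51)", "provider": "B"},
            {"id": "r-3", "provider": "B"}]  # missing geometry: fails the gate
national = assemble(region_a, region_b)
print(len(national))  # 2
```

The point of standardising such interfaces and checks is precisely to replace the bespoke, brittle pipelines the challenge describes with a shared, repeatable pattern.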
8. Integrity, Provenance, and Trust
OGC
Challenge: As data passes through multiple systems, transformations, and organisations, its lineage becomes difficult to trace. The geospatial data landscape is changing rapidly. Alongside established providers such as national space agencies, there is now a growing number of commercial satellite operators, and civil drone operators are expected to multiply significantly in the coming years. At the same time, AI makes it possible to synthesise realistic data efficiently. In this environment, users increasingly need ways to verify that the data they receive is authentic, unmodified, and fit for purpose.
Approach: This work stream will implement and evaluate provenance tracking mechanisms and integrity verification approaches, including cryptographic signatures, standardised lineage metadata, and trust frameworks. The goal is to enable data consumers to assess origin and quality with confidence. The work is relevant to both civil and defence communities and responds to growing concerns about data reliability in an environment with many new and diverse data providers.
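A minimal integrity check can be sketched with a keyed digest from the Python standard library; real deployments would more likely use asymmetric signatures, so consumers need only a public key. The key and payload here are illustrative:

```python
import hashlib
import hmac

PROVIDER_KEY = b"demo-shared-secret"  # hypothetical; never hard-code real keys

def sign(payload: bytes, key: bytes) -> str:
    """Produce a hex digest the provider publishes alongside the data."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, digest: str, key: bytes) -> bool:
    """Constant-time check that the received payload matches the digest."""
    return hmac.compare_digest(sign(payload, key), digest)

data = b'{"type":"FeatureCollection","features":[]}'
tag = sign(data, PROVIDER_KEY)
print(verify(data, tag, PROVIDER_KEY))         # True: payload is unmodified
print(verify(data + b" ", tag, PROVIDER_KEY))  # False: tampering detected
```

Combined with standardised lineage metadata recording who produced and transformed the data, such digests let a consumer detect any modification between provider and point of use.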
9. Future-Ready Geospatial Metadata for AI and Advanced Technologies
OGC
Challenge: Geospatial metadata standards in widespread use today were designed primarily for human consumption and catalogue-based discovery. Standards such as ISO 19115 and its national profiles have provided long-standing consistency, but their structure reflects an earlier era of file-based workflows. With the rapid growth of AI, agentic systems, and cloud computing, metadata must increasingly be read, interpreted, and acted upon by machines rather than people. Current standards are not well suited to this shift, and there is no agreed path for evolving them.
Approach: This work stream will identify what geospatial metadata needs to look like in an AI-driven future. It will evaluate the fitness of current international metadata standards, including ISO 19115, GeoDCAT, OGC API Records, STAC, and schema.org, against the requirements of advanced technologies. Based on this evaluation, it will develop recommendations for how existing standards should evolve and what new approaches may be needed. The work stream will also prototype and test modern metadata approaches, including demonstrations of how AI-enabled metadata can support natural language discovery tools, automated dataset identification through APIs, and seamless consumption of multilingual metadata assets. Attention will be given to the ability of metadata frameworks to support multiple languages and to mechanisms for assessing metadata quality and completeness. The ultimate goal is a clear understanding of the limitations of existing standards, the changes required, and the practical steps needed to transition to a metadata environment that serves both machines and people.
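To make the machine-readable direction concrete, the sketch below assembles a minimal STAC-style item, one of the metadata forms the work stream will evaluate. All field values are illustrative; consult the STAC specification for the full model:

```python
import json

# A minimal STAC-style item: structured JSON with specified field semantics,
# so an agent can discover and select data without human interpretation.
item = {
    "type": "Feature",
    "stac_version": "1.0.0",
    "id": "ortho-tile-2024-001",  # hypothetical dataset identifier
    "geometry": {
        "type": "Polygon",
        "coordinates": [[[8.0, 47.0], [8.1, 47.0], [8.1, 47.1],
                         [8.0, 47.1], [8.0, 47.0]]],
    },
    "bbox": [8.0, 47.0, 8.1, 47.1],
    "properties": {
        "datetime": "2024-06-01T10:00:00Z",
        "title": "Orthophoto tile (demo)",
    },
    "links": [],
    "assets": {
        "image": {
            "href": "https://example-nmca.org/tiles/001.tif",  # hypothetical
            "type": "image/tiff; application=geotiff",
        },
    },
}

print(json.dumps(item, indent=2)[:80])
```

Because every element is addressable and typed, such records can feed natural language discovery tools and API-driven dataset identification directly, in contrast to document-oriented catalogue records written for human readers.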
Join the Conversation
Input and engagement from across the geospatial community are essential to ensuring the Testbed on Trusted Data and Systems remains relevant, balanced, and practically useful. To contribute or learn more, please contact Muthu Kumar at mk****@*gc.org.