Cognizant Blog

With 70% of public sector organisations reporting fragmented data landscapes, cross-departmental collaboration remains elusive despite AI's potential. Here’s how marketplaces and governance frameworks are breaking down silos to deliver transformational efficiency gains.

Data sharing has become central to the UK government's mission to modernise public services and drive innovation. This urgency intensifies as the UK seeks to boost productivity amid fiscal tightening and a projected GDP growth rate of just 1.2% this year, according to recent CBI calculations.

The National Data Strategy, the 2025 Industrial Strategy update, and the AI Opportunities Action Plan converge on a singular truth: better access to high-quality data powers artificial intelligence, improves decision-making, and delivers faster, more personalised services.

Yet most departments struggle with fundamental discovery problems. They don't know what data exists, who owns it, or how to access it across organisational boundaries. The State of Digital Government Review, published in January, found that 70% of survey respondents across government reported that their data landscape is not well-coordinated or interoperable and fails to provide a unified source of truth. Data sits trapped in various formats with inconsistent quality and unclear permissions, making effective data sharing difficult.

Where AI is already delivering results

AI is already transforming public services where data sharing works effectively. In the NHS, for instance, AI helps diagnose breast cancer significantly faster, cutting patient notification times from 14 days to just three and reducing hospital admissions, as a recent pilot has shown.

Elsewhere, the Home Office uses AI to analyse immigration and border data, detecting risks earlier through real-time risk scoring algorithms and streamlining visa processing. As another example, HMRC leverages AI to identify fraud patterns and tailor services to individual taxpayers.

The economic potential of data sharing is staggering. The State of Digital Government Review also shows that improved data sharing could save £10-15 billion annually across government by reducing duplication and enhancing decision-making. Process automation based on shared data could cut administrative costs by 25-40% in transactional services.

The 2025 Cyber Security and Resilience Bill begins addressing these challenges by mandating clearer cross-departmental data sharing standards, breach reporting protocols, and metadata tagging requirements. However, legislation alone won't solve the architectural and cultural barriers that prevent valuable insights from being shared.

Building the infrastructure for collaboration

Effective data sharing requires what we call "golden record" management for core citizen data. Names, addresses, dates of birth, and national insurance numbers form the bedrock of public sector operations, yet inconsistencies across systems create matching nightmares that undermine service delivery.
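A minimal sketch of the golden-record idea, assuming deterministic matching on a normalised National Insurance number and date of birth (the fields, sources, and last-write-wins survivorship rule here are invented for illustration; production master data management uses probabilistic matching and weighted survivorship):

```python
from dataclasses import dataclass

@dataclass
class CitizenRecord:
    # Hypothetical fields; real systems hold richer, audited attributes.
    source: str
    name: str
    dob: str        # ISO date
    ni_number: str  # National Insurance number

def normalise_ni(ni: str) -> str:
    """Strip spaces and upper-case so 'qq 12 34 56 c' matches 'QQ123456C'."""
    return ni.replace(" ", "").upper()

def build_golden_records(records):
    """Group records by normalised NI number plus date of birth, so the same
    citizen held inconsistently across systems collapses into one entry."""
    golden = {}
    for rec in records:
        key = (normalise_ni(rec.ni_number), rec.dob)
        golden[key] = rec  # naive last-write-wins survivorship for this sketch
    return golden

records = [
    CitizenRecord("passports", "Jane Smith", "1980-01-05", "QQ 12 34 56 C"),
    CitizenRecord("hmrc", "Jane A. Smith", "1980-01-05", "QQ123456C"),
]
golden = build_golden_records(records)
print(len(golden))  # the two source records collapse into one golden record
```

Even this crude normalisation step shows why inconsistent formatting across systems creates the matching problems described above.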

While the National Data Library (NDL) lays the groundwork for a modern government data marketplace, the real impetus comes from implementing the National Data Strategy within major, potentially federated operational delivery departments. Doing so would pave the way for golden records for services across immigration, borders, asylum, passport and civil registration, improving consistency across services and reducing verification overhead.

The technical foundation equally matters. Our experience across government shows the need for a coherent enterprise architecture that works across large, federated departments and supports continuous improvement principles. This requires an API-first approach, event-driven integration, and federated query capabilities that enable data to remain within departmental boundaries while facilitating cross-government analytics.
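The federated query pattern can be sketched as follows, assuming in-process functions standing in for per-department APIs (the department names, data, and criteria are hypothetical; in practice these would be authenticated endpoints behind each department's boundary):

```python
# Hypothetical stand-ins for departmental query APIs.
def home_office_count(criteria):
    data = [{"status": "granted"}, {"status": "pending"}]
    return sum(1 for row in data if row["status"] == criteria["status"])

def hmrc_count(criteria):
    data = [{"status": "granted"}]
    return sum(1 for row in data if row["status"] == criteria["status"])

DEPARTMENT_APIS = {"home_office": home_office_count, "hmrc": hmrc_count}

def federated_count(criteria):
    """Fan the query out to each department and aggregate only the result
    counts: raw records never leave departmental boundaries."""
    return {name: api(criteria) for name, api in DEPARTMENT_APIS.items()}

print(federated_count({"status": "granted"}))  # {'home_office': 1, 'hmrc': 1}
```

The design choice is that only aggregates cross the boundary, which is what lets data remain within departments while still supporting cross-government analytics.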

Data marketplaces represent the most promising solution to discovery and access challenges. Rather than requiring departments to negotiate individual agreements for each data exchange, marketplaces provide secure platforms for publishing, discovering, and accessing datasets through standardised processes.
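A toy version of such a marketplace, assuming a single in-memory catalogue with publish, discover, and access-request operations (all names and fields are invented for this sketch):

```python
class DataMarketplace:
    """Illustrative catalogue: one standardised interface for publishing,
    discovering, and requesting datasets, replacing bilateral agreements."""

    def __init__(self):
        self.catalogue = {}
        self.access_log = []

    def publish(self, dataset_id, owner, description, tags):
        self.catalogue[dataset_id] = {
            "owner": owner, "description": description, "tags": set(tags),
        }

    def discover(self, tag):
        return [d for d, meta in self.catalogue.items() if tag in meta["tags"]]

    def request_access(self, dataset_id, requester):
        # Every request is recorded, giving the audit trail governance needs.
        self.access_log.append((requester, dataset_id))
        return dataset_id in self.catalogue

mp = DataMarketplace()
mp.publish("border-crossings-2024", "home_office",
           "Daily border crossing aggregates", ["borders", "immigration"])
print(mp.discover("borders"))                              # ['border-crossings-2024']
print(mp.request_access("border-crossings-2024", "hmrc"))  # True
```

The point is the standardised process: owners publish once with metadata, and consumers discover and request through the same audited path.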

But marketplaces only work when supported by comprehensive data management frameworks. This requires nine critical capabilities:

  1. Comprehensive metadata and data lineage enable users to understand what data represents and its origin.
  2. Rigorous data quality processes ensure accuracy and consistency across sources.
  3. Master data management creates those golden records for core citizen information.
  4. Modern data architecture provides a scalable and agile foundation that enables rapid innovation.
  5. Security, privacy, and compliance controls safeguard sensitive information while allowing authorised access.
  6. Clear data governance defines roles, responsibilities, and decision rights.
  7. AI governance frameworks ensure algorithmic accountability and explainability, critical for maintaining public trust as automation expands.
  8. Cultural transformation embeds data sharing as a strategic priority backed by leadership incentives.
  9. Cybersecurity and resilience capabilities implement zero-trust architecture and breach notification protocols.

The encouraging news is that AI and AI agents are accelerating this transformation. Capabilities such as automated metadata tagging, self-healing data quality rules, AI-powered policy creation, and automated detection of Personally Identifiable Information (PII) help departments implement these frameworks in weeks rather than years. Platforms like Informatica, Databricks, Amazon Web Services and accelerators from Cognizant enable outcomes that previously required massive manual effort.
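To make automated PII detection concrete, here is a deliberately simple sketch that tags and redacts two PII categories with regular expressions (the patterns are illustrative only; production detection combines trained models with contextual rules):

```python
import re

# Illustrative patterns only, not exhaustive validators.
PII_PATTERNS = {
    "ni_number": re.compile(r"\b[A-Z]{2}\s?\d{2}\s?\d{2}\s?\d{2}\s?[A-D]\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def tag_pii(text):
    """Return the set of PII categories detected in a free-text field."""
    return {name for name, pat in PII_PATTERNS.items() if pat.search(text)}

def redact(text):
    """Replace detected PII with category placeholders."""
    for name, pat in PII_PATTERNS.items():
        text = pat.sub(f"<{name}>", text)
    return text

sample = "Contact jane@example.gov.uk, NI QQ 12 34 56 C"
print(sorted(tag_pii(sample)))  # ['email', 'ni_number']
print(redact(sample))           # Contact <email>, NI <ni_number>
```

Tagging feeds the metadata catalogue; redaction feeds the sharing pipeline, so sensitive values never reach consumers who lack authorisation.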

Most recently, Cognizant’s partnership with Anthropic and its Claude models will accelerate data marketplace adoption by embedding agentic AI that operates autonomously within governance frameworks. These capabilities streamline compliance, coordinate workflows, and resolve data quality issues, reducing manual effort and operational risk and enabling departments to scale data sharing with greater confidence.

Some of the early enablers include:

  • Automated governance: Real-time enforcement of UK regulatory and ethical standards.
  • Smarter data discovery: AI-driven metadata tagging and lineage mapping for faster, more reliable dataset search.
  • Secure, auditable sharing: Agents manage access requests and approvals through transparent workflows.
  • Self-healing data quality: Automated detection and correction of anomalies and PII.
  • Legacy modernisation: AI-assisted code refactoring and API exposure to improve interoperability without service disruption.
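The "self-healing data quality" enabler above can be sketched as a rule that repairs values where it safely can and quarantines the rest (the postcode-normalisation rule and record shape here are invented for illustration):

```python
def heal_postcode(value):
    """Normalise a UK postcode-like string: upper-case, with a single space
    before the final three characters. Illustrative rule only."""
    compact = value.replace(" ", "").upper()
    if len(compact) < 5:
        return None  # cannot heal safely; quarantine instead
    return compact[:-3] + " " + compact[-3:]

def run_quality_rules(rows):
    """Split rows into healed (auto-corrected) and quarantined (for humans)."""
    healed, quarantined = [], []
    for row in rows:
        fixed = heal_postcode(row["postcode"])
        if fixed is None:
            quarantined.append(row)
        else:
            healed.append({**row, "postcode": fixed})
    return healed, quarantined

rows = [{"id": 1, "postcode": "sw1a1aa"}, {"id": 2, "postcode": "xx"}]
healed, quarantined = run_quality_rules(rows)
print(healed)       # [{'id': 1, 'postcode': 'SW1A 1AA'}]
print(quarantined)  # [{'id': 2, 'postcode': 'xx'}]
```

The key design point is the quarantine path: automated correction applies only where the rule is unambiguous, keeping humans in the loop for the rest.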

Why shared technical assets multiply data value

Beyond shared data standards, an effective government-wide strategy should promote shared technical assets: feature stores for machine learning reuse, standardised data APIs, CI/CD templates, and reusable code libraries. These assets, integrated into data management frameworks and marketplace platforms, enable consistent architecture while increasing delivery velocity.

This approach transforms how departments think about data and technology development. Rather than building isolated solutions, teams design assets for collective benefit across government-wide digital services. This requires strategic alignment, cross-departmental collaboration, and leadership that encourages shared accountability. In today's constrained macroeconomic environment—where public debt sits near 96% of GDP—maximising value from existing datasets becomes essential for sustaining high-quality public services.

Starting data-sharing initiatives without proper frameworks can cause them to slow down, stall, or be cancelled due to a lack of perceived value. Meanwhile, departments with robust data management foundations can implement marketplaces where they confidently publish and access datasets, enabling secure, compliant, and efficient data exchange.

The need for strategic action grows more urgent with each passing month. Early movers will benefit from capabilities that compound over time; those that delay will face rising costs from maintaining fragmented systems and a diminishing ability to serve citizens effectively.

Our experience working with UK government departments on data transformation reflects this pattern. We’ve supported the modernisation of legacy systems and the development of data offices and data platform ecosystems across areas such as immigration services, regulatory oversight, and cross-departmental analysis, always with a focus on maintaining critical services while improving the use of data for decision-making and citizen outcomes.

In a world where AI-enhanced services become the baseline expectation, data sharing transforms from an operational nice-to-have into a strategic necessity. The economics are unforgiving, but the opportunity remains substantial for departments ready to act decisively.


Ritesh Singh

Head of AI, Data Delivery and Solutions - Public Sector, Cognizant 





Dr Pal Kulandaivel

Client Partner for Home & Justice - Public Sector, Cognizant





