Gavin Starks: Smart Data Insights
Gavin Starks helps unlock data-enabled value at sector, national and international scale. He has created over a dozen companies, employing hundreds of people and delivering hundreds of millions of dollars in measurable impact. He co-chaired the development of the Open Banking Standard, co-chaired the UK Smart Data Council, and was founding CEO of the Open Data Institute. He has mentored, chaired, and been a director of over 40 organisations across diverse sectors. He currently runs the data governance organisation Icebreaker One (ib1.org) to accelerate our net-zero future.
How does an astrophysicist get distracted by data, financial services, and infrastructure, and pulled into making net-zero a core focus of his work?
“It’s actually less of a jump than it sounds. Astrophysics is fundamentally about understanding complex systems through data. At Jodrell Bank, the radio telescope where I worked, we were mapping the universe: you learn to deal with incomplete information and vast datasets, and to pull meaningful signals out of the noise.
When I moved to work on the early web, I realised those same principles applied to markets and the data that shapes and enables them. The web mirrors society in its complexity and systems challenges.
One of the biggest systems challenges we face is the climate emergency. Aside from some of the politics, which is material and distracting, the challenge is not a lack of ambition or capital. The challenge is how to realise market incentives at scale, and here that requires bringing data from the real economy into the financial economy, across sectors. This is where trusted data infrastructure can enable markets to work properly.
Back in 2006 I set up AMEE (the Avoiding Mass Extinctions Engine) to enable the calculation of the footprint of everything on Earth - an open-data, open-API-based, venture-backed initiative. Through that work we clearly set out the scale of the challenge and helped to work through many of the economic, legal, policy and technical points of friction (e.g. this was before smart meters, and SaaS was still early). The business case is clear: if finance cannot reliably quantify and track risk and opportunity in the real economy, it cannot allocate capital effectively. Much of my work has become about building the plumbing of systems so we can measure, understand, and act at scale.”
You have deep experience in setting up data sharing infrastructure; what are some of the key lessons you’ve learned over the years?
“The biggest lesson is that the technology is the easy part; governance is hard.
Most successful data-sharing initiatives rely on three things: first, clear rules about who can do what with data, and under what conditions; second, institutions that participants trust to convene and define those rules; third, shared incentives, so that everyone benefits from cooperation and reciprocity is clear.
Another important lesson is that interoperability must be designed in from the start: not just technical interoperability, but legal and process interoperability too. Without strong coordination, ecosystems fragment into incompatible or higher-friction standards and closed platforms; progress slows and costs rise. Adoption rarely happens through mandates alone: practical examples and use cases show participants how collaboration reduces friction, creates new value, and lowers risk. Without addressing “what’s in it for me”, no one will act.”
You were instrumental in setting up the Open Data Institute; how did you sell the concept to the government, and how do you see that legacy playing out in the age of smart data?
“We created two major interventions. The first was to reframe data as infrastructure rather than ‘a technology’. The second was to define the Data Spectrum, shifting the culture from thinking about data as a collection of bits to treating it as information with rights, which can be open, shared or closed. Without knowing where data sits on the Data Spectrum you can’t create incentives and an economic model around it.
Governments had invested heavily in collecting information but had not yet realised that releasing it openly could create an entire economic ecosystem around it. The ODI helped demonstrate that Open Data could drive innovation, transparency, and economic growth. What we are seeing now with Smart Data is the next phase of this journey: enabling secure, permissioned sharing of data between organisations so that markets function more efficiently.
Open data unlocked discovery and experimentation. Smart data is about operationalising data flows across the economy. A legacy of the ODI is that it helped normalise the idea that data governance and infrastructure are matters of national economic policy.”
What led to you setting up Icebreaker One, and how does it encapsulate the principles you’ve learned from your data sharing experiences?
Icebreaker One builds on lessons learned from three things: AMEE, the Open Data Institute, and the Open Banking Standard, whose creation I co-chaired.
AMEE addressed data portability, open APIs and shared data rights between businesses to dramatically lower the cost and friction of measuring impact.
The ODI showed that shifting the culture and perceptions around data requires strong collaboration to unlock innovation ecosystems. It is not enough to publish data or build tools: you need institutions, standards, and rules that make the system trustworthy and sustainable. We also set up an incubator and an accelerator to help stimulate private sector startups. Without a joined-up approach, value creation is stifled.
Open Banking demonstrated the next step: governments can create regulated frameworks in which competitors safely share data through common rules and standards, unlocking entirely new markets and services. Open Banking is now projected to be a £43B opportunity for the UK.
Icebreaker One brings these lessons together as a public-benefit non-profit that is non-partisan and open-by-default in what it creates. This is an important design decision, and it does two things.
Firstly, it focuses on going far together: building multi-actor coalitions that co-design data-sharing Schemes within a strong governance framework. These coalitions define the governance, rules, and assurance mechanisms needed for markets to exchange trusted data at scale. In the case of climate and net zero, that allows financial institutions to see credible emissions and performance data from the real economy and create market-facing incentives.
Secondly, it delivers Trust Services. These include operating a fully open-source Trust Framework (TF) that enables the implementation of Schemes, as well as implementations of assurability and data-sensitivity assessments. Rather than competing with existing TF providers, we design for interoperability - for example, with Open Banking.
You are spearheading a movement to make all finance “sustainable” - what does that mean in practical terms, and how is Project Perseus proving that it is possible?
In practice it means that environmental performance becomes a normal input into financial decision-making, rather than a separate reporting exercise.
Sustainability data is often slow, expensive, and disconnected from the systems that actually drive financial decisions. Our flagship initiative, Perseus, demonstrates that this can be automated: assurable data flows directly from operational systems into accounting and finance workflows, across sectors and providers, at scale.
Getting industry (irrespective of sector) to voluntarily work together to build big things, or to transform their operating models, is close to miraculous - yet you’ve managed to do it. What is key to making that cooperation happen?
In my experience, two things really matter: shared purpose and neutral governance.
Organisations rarely collaborate just because someone asks them to. They collaborate when the value of participation (or the cost of not cooperating) becomes obvious. Climate transition, regulatory pressure, and the complexity of modern supply chains are forcing companies to confront problems that no single organisation can solve alone, and this drives the need to share data that is currently high-friction.
The second ingredient is trust. Participants need to know that the initiative is not simply a disguised attempt by one company to dominate the market, or create a new oligopoly. Independent governance, transparent rules, and open standards create the conditions where competitors can collaborate safely.
When these foundations are in place, cooperation becomes much easier because everyone can see the collective benefit. Once the foundations are established, constant effort is needed to keep them going (as Open Banking has also demonstrated).
Where do you see smart data gaining the most traction in the next five years, and what are the top five factors that will determine its success and adoption?
Smart data will gain traction wherever markets can create value by reducing the friction around fragmented data that gets ‘stuck’. This will have an impact economy-wide: energy, finance, property, and supply chains are obvious candidates because they involve complex networks of actors and decisions that depend on trusted information.
Five factors will determine success:

1. Clear governance frameworks: define rights, responsibilities, and accountability.
2. Interoperable technical standards: reduce the risk of ecosystems fragmenting into incompatible systems.
3. Regulatory alignment: give participants confidence the model will endure, whether mandatory or voluntary in nature.
4. Demonstrable economic value: ‘show the thing’ - evidence that participation reduces cost or creates opportunity.
5. Trusted institutions: ensure there are the skills and capacity to steward these systems over the long term.

To find out more about Icebreaker One: https://ib1.org/. For more thoughts about data infrastructure from Gavin: agentgav.medium.com