Fairness Innovation Challenge

Key Features

UK registered organisations can apply for a share of up to £400,000 for projects that deliver new solutions to address bias and discrimination in AI systems. This funding is from the Centre for Data Ethics and Innovation (CDEI).

Programme:     Innovate UK

Award:     Share of up to £400,000

Opens: 16th Oct 2023

Closes: 13th Dec 2023

! This scheme is now closed


Innovate UK will work with the Centre for Data Ethics and Innovation (CDEI), part of the Department for Science, Innovation and Technology (DSIT), to invest up to £400,000 in innovation projects.


The aim of this competition is to drive the development of novel solutions to address bias and discrimination in artificial intelligence (AI) systems.

Our objectives are to:

  • encourage the development of socio-technical approaches to fairness
  • test how strategies to address bias and discrimination in AI systems can comply with relevant regulation including the Equality Act 2010, the UK General Data Protection Regulation (GDPR) and the Data Protection Act 2018
  • provide greater clarity about how different assurance techniques can be applied in practice

Assurance techniques include the methods and processes used to verify and ensure that systems and solutions meet certain standards, including those related to fairness.

Despite increased interest in addressing bias and discrimination in AI systems, organisations continue to face numerous challenges, including:

  • a lack of clarity around best practice for the use of fairness metrics and toolkits
  • limitations associated with technical approaches
  • risks of breaching UK legislation

This competition aims to tackle these challenges in practice. You must propose a solution to address bias and discrimination in an AI system in one of the following real-world use cases:

  • the provided healthcare use case
  • an open use case

Your proposal must include:

  • a description of the process you would adopt to detect and address bias and discrimination in the selected use case, including potential technical and socio-technical interventions
  • an explanation of why you have selected this particular approach, for example, why you have chosen to use a particular fairness metric or socio-technical intervention
  • an explanation of how you will also ensure broader ethical or legal fairness within the UK context, for example compliance with data protection legislation and equalities law, beyond just looking at technical and mathematical fairness

Your proposed solution must also address at least two of the following stages in the process of addressing bias and discrimination in AI systems:

  • accessing demographic data (for bias detection)
  • bias detection
  • bias mitigation
  • ongoing monitoring and evaluation
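To illustrate the bias detection stage above, the following is a minimal sketch, assuming a binary classifier and a single protected attribute. All data, function names and group labels are hypothetical; it computes two widely used fairness metrics, the demographic parity difference and the disparate impact ratio:

```python
# Minimal sketch of the "bias detection" stage: compare positive-outcome
# rates between two demographic groups. All data here is hypothetical.

def positive_rate(outcomes):
    """Fraction of positive (1) outcomes in a list of 0/1 predictions."""
    return sum(outcomes) / len(outcomes)

def demographic_parity_difference(preds_a, preds_b):
    """Absolute gap in positive-outcome rates between groups A and B."""
    return abs(positive_rate(preds_a) - positive_rate(preds_b))

def disparate_impact_ratio(preds_a, preds_b):
    """Ratio of the lower positive rate to the higher one.
    A common rule of thumb flags values below 0.8 (the "four-fifths rule")."""
    ra, rb = positive_rate(preds_a), positive_rate(preds_b)
    return min(ra, rb) / max(ra, rb)

# Hypothetical model predictions, split by a protected attribute.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # 6/8 = 0.75 positive rate
group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # 3/8 = 0.375 positive rate

print(demographic_parity_difference(group_a, group_b))  # 0.375
print(disparate_impact_ratio(group_a, group_b))         # 0.5
```

A purely statistical check like this is only a starting point; the competition asks applicants to situate such metrics within a broader socio-technical process.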

Your proposed solution must adopt a socio-technical, rather than purely mathematical or statistical approach to achieving fairness.

A socio-technical approach considers the broader historical, social and cultural context in which an AI system is embedded and seeks to address both statistical and structural biases associated with the use of AI systems.

Possible socio-technical interventions include but are not limited to:

  • participatory forms of data collection, audit or mitigation
  • governance interventions addressing organisational biases
  • intersectional bias analysis
  • custom context-specific bias metrics
  • engagement with subject matter experts
  • investigating bias in human decision making processes surrounding the system

You can access information about socio-technical approaches to fairness in this paper and on page 10 of this guidance from the National Institute of Standards and Technology (NIST).

If successful, on completion of your funded project you will be required to attend a showcase event to present evidence of your outcomes.

You will also be required to share the outputs and outcomes of your project. This will include, at a minimum:

  • a White Paper explaining the solution you developed, its impact, and lessons others can learn from your project
  • if a method or tool is developed as part of the challenge, the code or description must be made available and open source
  • if a proprietary method or tool is used as part of the challenge, a transparency record must be completed and made publicly available, for example using the Algorithmic Transparency Recording Standard (ATRS) or a model card

Use Cases

Your project must focus on one of the following use cases.

Healthcare use case:

This use case asks participants to submit fairness solutions to address bias and discrimination in the CogStack Foresight model developed by King's Health Partners and Health Data Research UK, with the support of NHS AI Lab. This is a generative AI model for predicting patient outcomes based on Electronic Health Records.

CogStack is a platform that has been deployed in several NHS hospitals. The platform includes tools for unstructured (text) health data centralisation, natural language processing for curation, as well as generative AI for longitudinal data analytics, forecasting and generation.

This generative AI, Foresight, is a Generative Pretrained Transformer (GPT) model. Foresight can forecast the next diagnostic codes, as well as other standardised medical codes including medications and symptoms, based on its source dataset. Foresight can also generate synthetic longitudinal health records that match the probability distributions of the source data, allowing pilots on synthetic data without direct access to private data.

As these AI models have been trained on real-world data, they inherit the biases of their historical datasets, including demographic biases, styles of historical practice and biased missingness arising from data capture.

Open use case:

For this option, you can propose your own use case. This includes AI models, systems and solutions at different stages of prototyping or deployment that are believed to be at risk of bias and discrimination.

If you are proposing your own use case, you must provide additional information in your application about:

  • background or context: what you are using an AI-enabled system for, what the model is, why it is being used and what problem it solves
  • potential risks to fairness: what are the fairness challenges associated with this system for this specific use case or context, why is it difficult to make this system fairer
  • technical details: describe the data set, including the size of the data set and any variables, as well as the learning algorithms used to train the models

Your use case and proposed solutions will need to be published or shareable. This challenge is only open to use cases that are transparent about their models, tools and data, as well as the challenges and potential solutions to fairness.

Technical Briefing Document

You can access more information and guidance about the healthcare and open use case in the attached technical briefing document.

Fairness Innovation Challenge Technical briefing.pdf


Your project must:

  • have total project costs of up to £130,000
  • carry out its project work in the UK
  • intend to exploit the results from or in the UK
  • start by 1 May 2024
  • end by 31 March 2025

Projects must always start on the first of the month and this must be stated within your application. Your project start date will be reflected in your grant offer letter if you are successful.

You must only include eligible project costs in your application.

Under current restrictions, this competition will not fund any procurement, commercial, business development or supply chain activity with any Russian or Belarusian entity as lead, partner or subcontractor. This includes any goods or services originating from a Russian or Belarusian source.

Subcontractors are allowed in this competition. We recognise that developing socio-technical solutions to address bias and discrimination in AI systems requires a breadth of knowledge and skills that may require you to work with different organisations as subcontractors.

Subcontractors can be from anywhere in the UK and you must select them through your usual procurement process.

You can use subcontractors from overseas but must make the case in your application as to why you could not use suppliers from the UK.

You must also provide a detailed rationale, evidence of the potential UK contractors you approached and the reasons why they were unable to work with you.

Innovate UK expects all subcontractor costs to be justified and appropriate to the total eligible project costs. We will not accept a cheaper cost as a sufficient reason to use an overseas subcontractor.

An eligible organisation can lead on any number of distinct projects.

You can use a previously submitted application to apply for this competition.


Innovate UK is not funding projects that:

  • do not adopt a socio-technical approach to fairness
  • do not address at least two of the stages in the process of addressing bias and discrimination in AI systems
  • do not evidence the potential for the proposed innovation to generate positive economic or societal impact

If you are proposing your own use case, we will not accept projects that are not transparent and open about the models, data and risks to fairness that your use case presents.

Innovate UK cannot fund projects that are:

  • not allowed under De minimis regulation restrictions
  • not eligible to receive Minimal Financial Assistance
  • dependent on export performance, for example giving an award to a baker on the condition that they export a certain quantity of bread to another country
  • dependent on domestic inputs usage, for example if we give an award to a baker on the condition that they use 50% UK flour in their product

Funding Costs

Innovate UK has allocated up to £400,000 to fund innovation projects in this competition.

Your total project costs will be 100% funded up to the £130,000 maximum. Your total project costs, detailed within your application, must not exceed this maximum and must match the funding sought.

Interested in applying for this competition?

Book an appointment to speak to one of our advisors to discuss your eligibility to apply for this Grant Funding opportunity.