Deadline: 13 December 2023
UK registered organisations can apply for a share of up to £400,000 for projects resulting in new solutions to address bias and discrimination in AI systems.
Innovate UK will work with the Centre for Data Ethics and Innovation (CDEI), part of the Department for Science, Innovation and Technology (DSIT), to invest in innovation projects.
The aim of this competition is to drive the development of novel solutions to address bias and discrimination in artificial intelligence (AI) systems.
The objectives are to:
- encourage the development of socio-technical approaches to fairness
- test how strategies to address bias and discrimination in AI systems can comply with relevant regulation including the Equality Act 2010, the UK General Data Protection Regulation (GDPR) and the Data Protection Act 2018
- provide greater clarity about how different assurance techniques can be applied in practice
Your proposal must address bias and discrimination in one of the following use cases:
- the provided healthcare use case
- an open use case of your own
Use Cases
Your project must focus on one of the following use cases.
- Healthcare use case:
- This use case asks participants to submit fairness solutions to address bias and discrimination in the CogStack Foresight model, developed by King's Health Partners and Health Data Research UK with the support of NHS AI Lab. This is a generative AI model for predicting patient outcomes based on Electronic Health Records.
- CogStack is a platform that has been deployed in several NHS Hospitals. The platform includes tools for unstructured (text) health data centralisation, natural language processing for curation as well as generative AI for longitudinal data analytics, forecasting and generation.
- This generative AI, Foresight, is a Generative Pretrained Transformer (GPT) model. Foresight can forecast the next diagnostic codes and other standardised medical codes, including medications and symptoms, based on its source dataset. Foresight can also generate synthetic longitudinal health records that match the probability distributions of the source data, allowing pilots on synthetic data without direct access to private data.
- As these AI models have been trained on real-world data, they carry the biases of their historical datasets, including demographic biases, historical styles of practice and biased missingness in data capture.
- Open use case:
- For this option, you can propose your own use case. This includes AI models, systems and solutions at different stages of prototyping or deployment that are believed to be at risk of bias and discrimination.
- If you are proposing your own use case, you must provide additional information in your application about:
- background or context: what you are using the AI-enabled system for, what the model is, why it is being used and what problem it solves
- potential risks to fairness: the fairness challenges associated with this system in this specific use case or context, and why it is difficult to make the system fairer
- technical details: a description of the dataset, including its size and any variables, as well as the learning algorithms used to train the models
- Your use case and proposed solutions will need to be published or shareable. This challenge is only open to use cases that are transparent about their models, tools and data, as well as the challenges and potential solutions to fairness.
Project Size
- Your project’s total costs can be up to £130,000.
Projects They Will Not Fund
- They are not funding projects that:
- do not adopt a socio-technical approach to fairness
- do not address at least two of the stages in the process of addressing bias and discrimination in AI systems
- do not evidence the potential for the proposed innovation to generate positive economic or societal impact
- If you are proposing your own use case, they will not accept projects that are not transparent and open about the models, data and risks to fairness that your use case presents.
- They cannot fund projects that are:
- not allowed under De minimis regulation restrictions
- not eligible to receive Minimal Financial Assistance
- dependent on export performance, for example giving an award to a baker on the condition that they export a certain quantity of bread to another country
- dependent on domestic inputs usage, for example if they give an award to a baker on the condition that they use 50% UK flour in their product
Eligibility Criteria
- Your project must:
- carry out its project work in the UK
- intend to exploit the results from or in the UK
- start by 1 May 2024
- end by 31 March 2025
- To lead a project your organisation must be a UK registered:
- business of any size
- academic institution
- research and technology organisation (RTO)
- charity
- not for profit
- public sector organisation
- Subcontractors are allowed in this competition. They recognise that developing socio-technical solutions to address bias and discrimination in AI systems requires a breadth of knowledge and skills that may require you to work with different organisations as subcontractors.
- Subcontractors can be from anywhere in the UK and you must select them through your usual procurement process.
- You can use subcontractors from overseas but must make the case in your application as to why you could not use suppliers from the UK.
- You must also provide a detailed rationale, evidence of the potential UK contractors you approached and the reasons why they were unable to work with you.
- They expect all subcontractor costs to be justified and appropriate to the total eligible project costs. They will not accept a cheaper cost as a sufficient reason to use an overseas subcontractor.
- An eligible organisation can lead on any number of distinct projects.
For more information, visit Innovate UK.