Systemic AI Safety Grants

Key Features

This programme will fund researchers who will collaborate with the UK government to advance systemic approaches to AI Safety.

Programme: AI Safety Institute

Award: Share of up to £200,000

Opens: 15th Oct 2024

Closes: 26th Nov 2024

! This scheme is now closed

Overview

To fully address systemic AI risks, we must consider both the capabilities of AI models and their potential impact on people, society and the systems they interact with.

Systemic AI safety is focused on safeguarding the societal systems and critical infrastructure into which AI is being deployed—to make our world more resilient to AI-related risks and to enable its benefits.

The AI Safety Institute (AISI), in partnership with the Engineering and Physical Sciences Research Council (EPSRC) and Innovate UK, part of UK Research and Innovation (UKRI), is excited to announce support for impactful research that takes a systemic approach to AI safety. We are offering a round of seed grants of £200,000 for 12 months, and plan to follow with more substantial awards in future rounds. Successful applicants will receive ongoing support, computing resources where needed, and access to a community of AI and sector-specific domain experts.

What AISI Systemic Safety Grants are funding

AISI are seeking applications focused on a range of safety-related problems: this could involve monitoring and anticipating how AI is used and misused in society, or the risks it exposes in particular sectors. We want to see applications that could enhance understanding of how government could intervene where needed, with new infrastructure and technical innovations, to make society more resilient to AI-related risks.

AISI conceive of systemic AI safety as a very broad field of research and interventions. Below we introduce some examples of the kinds of research we are interested in. A longer list of example projects is available here.

  • A systems-informed approach for how to improve trust in authentic digital media, protect against AI-generated misinformation, and improve democratic deliberation.
  • Targeted interventions that protect critical infrastructure, such as energy or healthcare provision, from AI-mediated cyberattacks.
  • Ideas about how to measure or mitigate the potentially destabilising effects of AI-driven transformations of the labour market.
  • Ways to measure, model, and mitigate the secondary effects of AI systems that take autonomous actions on digital platforms.

AISI recognise that future risks from AI remain largely unknown. We are open to a range of plausible assumptions about how AI technologies will develop and be deployed over the next 2-5 years. We are excited about work that addresses both ongoing and anticipated risks, as long as it is credible and evidence-based.

Benefits of working with the AI Safety Institute

  • Access to technical experts in the field of AI safety, including researchers who have previously worked at OpenAI, Google DeepMind, and Cambridge University.
  • Access to compute infrastructure to help turn projects and applications into tangible, innovative solutions.
  • A supportive community across government and research organisations to promote system-wide interventions in AI safety.

In the future, we will build on the outputs of this first phase and make larger, longer-term investments in specific interventions that show promise for increasing systemic safety. Projects in the first phase will be prioritised according to how well they can inform these second-phase decisions.

What do we expect from successful applicants?

In addition to delivering your proposed project, you will be expected to:

  • produce quarterly progress updates against financial and non-financial performance metrics;
  • participate in regular progress meetings with AISI and UKRI;
  • participate in workshops organised by AISI on a regular basis;
  • engage with the programme officer to increase the impact of your work.

These expectations will be laid out in the grant agreement terms and conditions.

Interested in applying for this competition?

Book an appointment to speak to one of our advisors to discuss your eligibility to apply for this grant funding opportunity.