
The Assurance Gap: A £4 Billion Market Opportunity in Defence Autonomous Systems

Adnan Mahmud

The 2025 UK Strategic Defence Review commits £4 billion to autonomy this Parliament and mandates that at least 10% of the equipment procurement budget go to novel technologies, including artificial intelligence and autonomous systems, from April 2025. Yet existing certification frameworks cannot adequately address machine learning systems. This analysis examines how the structural mismatch between surging investment and immature assurance frameworks creates significant commercial opportunities for specialists in probabilistic reasoning, representation of epistemic and aleatoric uncertainty, and interpretable artificial intelligence. Drawing on policy documents from the UK Ministry of Defence, NATO, and the European Union, this post maps specific funding programmes, capability gaps, and market entry points across the defence innovation ecosystem.

Introduction

European defence spending is undergoing a structural transformation not witnessed since the Cold War. Germany's €100 billion special fund, the European Union's €150 billion ReArm Europe initiative, and the United Kingdom's commitment to allocate 2.5% of GDP to defence by 2027 collectively represent the most significant rearmament programme in decades (UK Parliament, 2025). This geopolitical shift creates substantial commercial opportunities, particularly for technical specialists whose capabilities align with emerging procurement priorities.

The UK Ministry of Defence, NATO, and EU defence agencies have all identified Test, Evaluation, Verification and Validation (TEV&V) for artificial intelligence as a critical bottleneck constraining the deployment of autonomous systems (Defence and Security Accelerator, 2025). The Defence AI Centre explicitly requires "continuous assurance throughout the development lifecycle" and "revalidation at the speed of relevance"—problems that fundamentally require real-time uncertainty quantification and sophisticated digital twin architectures (GOV.UK, 2025a).

UK Defence Funding Landscape

The UK Ministry of Defence has established multiple active programmes with clearly articulated requirements in AI assurance. The most immediately relevant opportunity is the Defence and Security Accelerator's (DASA) "Delivering Future Advantage Through Testing and Evaluation" competition, with Phase 1 funding of £1 million and individual project awards of approximately £250,000 targeting Technology Readiness Level 6 at completion (Defence and Security Accelerator, 2025).

Challenge 2 of this competition explicitly seeks "solutions enabling continuous assurance throughout the development lifecycle" and cites "AI Test, Evaluation and Assurance platforms that efficiently revalidate AI components in changing contexts" as an exemplar requirement (GOV.UK, 2025b). This competition connects to a broader T&E Transformation Fund with £32 million ringfenced through 2030, indicating sustained investment in precisely these capabilities.

The Defence Science and Technology Laboratory's (DSTL) AI Programme operates at approximately £80 million over four years, whilst the Autonomy Programme commits £30-40 million annually for five years (TechUK, 2025). These figures exclude classified programmes but indicate the scale of publicly accessible funding.

The Defence AI Centre and Model Arena

The Defence AI Centre's AI Model Arena, launching at DAIC Connect in March 2026, represents a significant commercial validation pathway. Developed in partnership with Advai and the National Security Strategic Investment Fund, this platform evaluates AI models across performance, reliability, robustness, and security against JSP 936 requirements (GOV.UK, 2025a). The capability to assess 100 suppliers simultaneously suggests substantial procurement activity is anticipated.

Key procurement routes include the R-Cloud Dynamic Procurement System (DSTL's replacement for Serapis, open for registration), the CCS RM6200 AI Dynamic Purchasing System with contracts up to four years, and UK Defence Innovation with a £400 million ringfenced budget launching in 2025 (GOV.UK, 2025c).

NATO's €1 Billion Innovation Architecture

NATO has established the most comprehensive international framework for responsible AI in defence applications, creating significant demand for compliance expertise. The Alliance's six Principles of Responsible Use—lawfulness, responsibility, explainability, reliability, governability, and bias mitigation—now require concrete implementation mechanisms (NATO, 2024a). The Data and Artificial Intelligence Review Board (DARB), established in October 2022, is actively developing a user-friendly AI certification standard translating these principles into verifiable requirements (NATO, 2024b).

DIANA: Defence Innovation Accelerator for the North Atlantic

DIANA selected 150 companies from 24 NATO countries for its 2026 Challenge Programme in December 2025 (NATO, 2025). Two of the ten priority challenge areas directly align with uncertainty quantification capabilities: "Autonomy and unmanned systems" and "Data-assisted decision making." Funding includes €100,000 per company with up to €300,000 additional for top performers. The six-month accelerator began in January 2026, with access to over 200 test centres across 32 nations (Innovate UK Business Connect, 2025).

NATO Innovation Fund

The NATO Innovation Fund represents €1 billion in venture capital across 24 Allied nations over 15 years, with initial investments reaching €15 million and substantial follow-on reserves (NATO Innovation Fund, 2024a). Portfolio companies already include ARX Robotics for scalable robotic systems and Tekever for UAV platforms, indicating active deployment of autonomous capabilities requiring assurance frameworks (NATO Innovation Fund, 2024b). The Fund's ethical guidelines explicitly require technologies "designed and deployed with Principles of Responsible Use" and "adequate accountability and governance" (NATO Innovation Fund, 2024c).

European Defence Fund: Opportunities and Constraints

The European Defence Fund's €7.9 billion budget for 2021-2027 includes substantial AI and autonomy components, though UK entities face specific constraints following Brexit. Whilst UK organisations can participate in EDF projects, they receive no EDF funding and face intellectual property retention restrictions (European Commission, 2024). Understanding this landscape remains valuable for partnership strategies and competitor analysis.

Active EDF projects directly addressing AI assurance include FaRADAI (Frugal and Robust AI for Defence Advanced Intelligence), KOIOS (secure, robust, frugal, resilient and explainable AI), and AIDA (AI Deployable Agent for autonomous cyber defence). The 2025 Work Programme allocates approximately €1.065 billion with specific topics on risk, robustness and resilience for autonomous vehicles in military operations (European Commission, 2025).

The EU AI Act and Defence Exemptions

The EU AI Act explicitly exempts AI systems used "exclusively for military, defence or national security purposes" under Article 2(3) (EU Artificial Intelligence Act, 2024a). However, dual-use systems with both military and civilian applications fall within scope, creating complex compliance requirements. High-risk system rules apply from 2 August 2026, with embedded product requirements effective from 2 August 2027 (EU Artificial Intelligence Act, 2024b). Defence suppliers increasingly align with AI Act principles regardless of exemption status, as EDF calls now routinely reference "explainable AI," "robust AI," and "trustworthy" systems (European Commission, 2025).

Defence Prime Capability Gaps

Major UK defence contractors are investing heavily in autonomous capabilities whilst acknowledging fundamental challenges in certification. These gaps create direct subcontracting opportunities for specialists with relevant expertise.

BAE Systems and the GCAP/Tempest Programme

BAE Systems leads the Tempest/GCAP sixth-generation fighter programme featuring deep learning AI, manned/unmanned flight modes, and swarming drone control. The platform uses VxWorks 653 with DO-178C DAL B certification—a standard designed for conventional software rather than machine learning systems (Wind River, 2022; Aerospace Testing International, 2022). BAE has publicly stated that "AI-enabled autonomy for military use is held to a higher standard" and seeks systems that are "dependable, adaptable, and trustworthy enough for military use" (BAE Systems, 2024). Current job postings for Lead Technologist positions specifically require "engineering assurance for AI/ML research topics" (BAE Systems, 2025).

MBDA and Autonomous Weapons Systems

MBDA is integrating ORCHESTRIKE AI into SPEAR cruise missiles—the first weapon-to-weapon collaborative AI communications system (Airforce Technology, 2025). Their published position explicitly states the requirement for "trusted AI that must be explainable, secure, and robust against cyberattack risks" and the "requirement for guarantees of robustness before AI introduction" (MBDA, 2020). The Crossbow loitering munition uses AI-enabled image-based navigation in GNSS-denied environments, presenting significant uncertainty quantification challenges (Naval Technology, 2025).

Leonardo UK and Rotary-Wing Autonomy

Leonardo UK holds a £60 million Proteus contract for the Royal Navy's Rotary Wing Uncrewed Air System. Technical challenges include "flight control laws and algorithms for large autonomous VTOL aircraft" and digital twin development with AI/ML algorithms (Leonardo UK, 2025). The programme encompasses autonomous take-off and landing, cargo delivery, and operation from Royal Navy vessels with first flight preparation targeting mid-2025 (Breaking Defense, 2025; Aerospace Manufacturing, 2025).

Thales and Hybrid AI Architectures

Thales has published extensively on AI safety challenges, noting that "the integrity of an AI system is below 99%, far from the required safety level" where the "probability of catastrophic failure should be one in 10 million or below" (Aviation Week, 2024). In response, Thales has adopted hybrid architectures employing dual computers—high-performance systems for AI processing and separate safety-critical systems for verification—indicating recognition that current AI cannot meet certification requirements independently.
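The dual-computer pattern Thales describes is essentially a runtime guard: a simple, verifiable safety channel vets every command from the high-performance AI channel and substitutes a conservative fallback when a check fails. The sketch below illustrates the pattern only; the controllers, envelope limits, and state fields are hypothetical stand-ins, not Thales's implementation.

```python
import random

def ai_controller(state):
    # Hypothetical stand-in for a high-performance learned controller:
    # proposes an altitude-rate command that may occasionally be unsafe.
    return state["target_alt"] - state["alt"] + random.uniform(-50.0, 50.0)

def safety_monitor(state, command, max_rate=20.0, min_alt=100.0):
    """Simple envelope check suitable for the safety-critical channel:
    reject commands that exceed the rate limit or would breach minimum altitude."""
    if abs(command) > max_rate:
        return False
    if state["alt"] + command < min_alt:
        return False
    return True

def fallback_controller(state):
    """Conservative, independently certifiable fallback: hold altitude."""
    return 0.0

def hybrid_step(state):
    """One control cycle of the hybrid architecture: AI proposes, monitor disposes."""
    cmd = ai_controller(state)
    if safety_monitor(state, cmd):
        return cmd, "ai"
    return fallback_controller(state), "fallback"
```

The certification argument then rests on the monitor and fallback, which are conventional software amenable to DO-178C-style evidence, rather than on the learned component.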

Regulatory Framework Gaps

The regulatory framework for defence AI certification remains fundamentally incomplete, driving sustained demand for interpretation and implementation expertise. The Military Aviation Authority has established comprehensive RPAS categorisation (Open/Specific/Certified) and is conducting a major review of Def Stan 00-970 Part 9 for RPAS—creating immediate consultancy demand. However, no specific AI/ML certification guidance exists from the MAA, leaving a critical gap (UK Parliament, 2025).

NATO's AI Certification Standard development, initiated in February 2023 with an original completion target of end-2023, remains delayed. The Rapid Adoption Action Plan established in June 2025 mandates a maximum 24-month technology adoption timeline, intensifying pressure for streamlined certification approaches (NATO, 2024b).

Six Critical Technical Capability Gaps

Synthesising across all research streams, six specific capability shortfalls emerge where specialist expertise commands premium value.

AI Safety Case Development and Uncertainty Quantification

Traditional safety case methodologies such as Def Stan 00-56 and DO-178C cannot adequately address machine learning behaviour. Every major programme—GCAP/Tempest, Proteus, ORCHESTRIKE—requires novel approaches to demonstrate ALARP compliance for AI components. Bayesian networks and probabilistic graphical models directly address the requirement to represent epistemic uncertainty (model limitations) and aleatoric uncertainty (inherent randomness) in safety arguments. This capability gap aligns with the DASA T&E Transformation programme (£32 million) and the DSTL AI Programme (£80 million).
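The epistemic/aleatoric distinction above can be made concrete with the law of total variance, as used in ensemble methods: each ensemble member predicts a mean and a noise variance, and the total predictive variance splits into the members' average noise estimate (aleatoric) plus the disagreement between their means (epistemic). A minimal sketch, assuming an ensemble of probabilistic regressors:

```python
from statistics import mean, pvariance

def decompose_uncertainty(member_means, member_variances):
    """Law-of-total-variance decomposition for an ensemble where member i
    predicts mean mu_i and noise variance sigma_i^2 at a given input.

    aleatoric = E_i[sigma_i^2]  -- noise the members agree is irreducible
    epistemic = Var_i[mu_i]     -- disagreement between members (model uncertainty)
    """
    aleatoric = mean(member_variances)
    epistemic = pvariance(member_means)
    return aleatoric, epistemic, aleatoric + epistemic
```

In a safety argument, the epistemic term is the actionable one: it shrinks with more representative training data, whereas the aleatoric term bounds what any model can achieve.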

Real-Time Uncertainty Quantification for Decision Support

NATO's MDO AI and MAINSAIL programmes, UK defence AI deployments, and weapons systems like SPEAR all require confidence-calibrated outputs. Defence buyers need systems that, in essence, "know what they don't know." The gap between current AI confidence outputs and required safety integrity levels—where probability of catastrophic failure must be less than 10⁻⁷—spans orders of magnitude. This requirement aligns with NATO DIANA challenges and Defence AI Centre validation frameworks (NATO ACT, 2024).
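Confidence calibration has a standard, simple diagnostic: Expected Calibration Error (ECE), which bins predictions by reported confidence and measures the gap between average confidence and actual accuracy in each bin. A self-contained sketch:

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """Expected Calibration Error: per-bin |accuracy - mean confidence|,
    weighted by bin occupancy. 0 means the model's reported confidence
    matches its observed accuracy; large values indicate over/underconfidence."""
    bins = [[] for _ in range(n_bins)]
    for conf, ok in zip(confidences, correct):
        idx = min(int(conf * n_bins), n_bins - 1)  # clamp conf == 1.0 into top bin
        bins[idx].append((conf, ok))
    n = len(confidences)
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        accuracy = sum(1 for _, ok in b if ok) / len(b)
        ece += (len(b) / n) * abs(accuracy - avg_conf)
    return ece
```

A model reporting 99% confidence while being right half the time scores an ECE near 0.49—exactly the kind of gap between stated and actual reliability that safety integrity levels cannot tolerate.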

Digital Twin Assurance for Autonomous Systems

Leonardo UK, QinetiQ, and DSTL have all identified digital twin validation as critical to autonomous system deployment. The ability to demonstrate that synthetic environment testing adequately represents operational conditions—and to quantify the uncertainty inherent in that representation—enables faster certification pathways. This capability aligns with the DSTL Astrid Framework for Synthetic Environments and the Leonardo Proteus programme.
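One basic ingredient of demonstrating synthetic-to-operational representativeness is a distributional comparison between simulated and real sensor data. As an illustration only (not any programme's actual validation method), the two-sample Kolmogorov–Smirnov statistic quantifies the worst-case gap between the two empirical distributions:

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical gap
    between the empirical CDFs of two samples. A large gap is evidence that
    the synthetic environment (sample_a) fails to represent operational
    data (sample_b) for the measured quantity."""
    a, b = sorted(sample_a), sorted(sample_b)
    all_vals = sorted(set(a + b))
    d = 0.0
    for v in all_vals:
        cdf_a = sum(1 for x in a if x <= v) / len(a)
        cdf_b = sum(1 for x in b if x <= v) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

A certification-oriented digital twin would apply tests like this per sensor channel and per operating condition, and report the residual gap as quantified representation uncertainty rather than claiming equivalence outright.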

Explainable AI for Certification Evidence

JSP 936 requires "understanding" of AI systems with "explanation mechanisms." MBDA requires "explainable" AI. NATO's Principles of Responsible Use mandate "explainability and traceability." Interpretable machine learning techniques that maintain performance whilst enabling audit are essential for certification bodies to accept AI systems. This requirement aligns with EDF KOIOS and the DAIC AI Model Arena assessment criteria.
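The kind of audit-friendly explanation mechanism referred to above can be as simple as model-agnostic attribution. Permutation importance, for example, measures how much accuracy drops when one input feature is shuffled, breaking its relationship with the output—evidence a certifier can inspect without access to model internals. A minimal sketch (the lambda-style model interface is an assumption for illustration):

```python
import random

def permutation_importance(model, X, y, feature_idx, n_repeats=10, seed=0):
    """Model-agnostic feature attribution: average accuracy drop when the
    chosen feature's column is shuffled across rows. Near-zero importance
    means the model does not rely on that feature."""
    rng = random.Random(seed)

    def accuracy(rows):
        return sum(1 for row, label in zip(rows, y) if model(row) == label) / len(y)

    base = accuracy(X)
    drops = []
    for _ in range(n_repeats):
        col = [row[feature_idx] for row in X]
        rng.shuffle(col)
        X_perm = [row[:feature_idx] + [v] + row[feature_idx + 1:]
                  for row, v in zip(X, col)]
        drops.append(base - accuracy(X_perm))
    return sum(drops) / n_repeats
```

Because it treats the model as a black box, the same audit can be run unchanged across candidate systems—the property an assessment platform like the AI Model Arena would need.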

Human-Machine Teaming Assessment

EASA's Level 2 guidance, MOD human factors requirements, and NATO's emphasis on "appropriate human-machine interaction" all drive demand for rigorous assessment frameworks. This includes quantifying appropriate automation levels and demonstrating safe handover between human and autonomous control. Relevant funding includes NATO Science and Technology Organization programme HFM-247 and the EPSRC AI Safety Sandpit.

Continuous Assurance and Runtime Monitoring

The DAIC's explicit requirement for "continuous assurance" and "revalidation at the speed of relevance" addresses systems that learn or adapt post-deployment. Conventional certification assumes static systems; defence requires frameworks for systems that improve over time whilst maintaining safety guarantees. This requirement directly aligns with DASA T&E Transformation Challenge 2.
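Continuous assurance in practice starts with runtime monitoring: comparing what the deployed model sees (or how confident it is) against the distribution recorded at certification time, and flagging when revalidation is needed. A minimal sliding-window sketch under that assumption—the thresholds and window size are illustrative, not drawn from any MOD guidance:

```python
from collections import deque
from statistics import mean, stdev

class DriftMonitor:
    """Sliding-window runtime monitor: flags when recent observations (e.g.
    model inputs or confidence scores) drift from the reference distribution
    captured at certification time, triggering revalidation."""

    def __init__(self, reference, window=50, threshold=3.0):
        self.ref_mean = mean(reference)
        self.ref_std = stdev(reference)
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def update(self, value):
        """Record one observation; return True when drift exceeds threshold."""
        self.window.append(value)
        if len(self.window) < self.window.maxlen:
            return False  # not enough evidence yet
        z = abs(mean(self.window) - self.ref_mean) / (self.ref_std or 1e-12)
        return z > self.threshold  # True => revalidation needed
```

Coupling a detector like this to an automated re-test pipeline is one plausible reading of "revalidation at the speed of relevance": the monitor decides when to revalidate, and the certified test suite decides whether the system still passes.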

Conclusion

UK defence has committed billions to autonomy whilst acknowledging certification frameworks remain immature. Defence primes are pursuing AI-enabled weapons, aircraft, and maritime systems without adequate tools to demonstrate safety. NATO has established principles but lacks implemented standards. Regulators are adapting aviation-era approaches to machine learning systems for which they were never designed.

This structural gap—between ambition and assurance capability—creates sustained commercial opportunity. A technical specialist combining Bayesian reasoning for uncertainty representation, digital twin expertise for synthetic validation, and interpretable AI for certification evidence addresses fundamental requirements that defence buyers have identified but cannot adequately source from existing contractors.

The most immediate opportunities include DASA's T&E Transformation competition with awards up to £250,000 per project, NATO DIANA's autonomy challenges with funding between €100,000 and €400,000, and subcontracting to primes on Tempest, Proteus, and ORCHESTRIKE. Medium-term positioning should target emerging ISO/IEC 42001 and TS 22440 certification services, NATO AI certification standard compliance consulting, and digital twin assurance for safety-critical systems as these frameworks mature through 2027-2030.

References

Aerospace Manufacturing (2025) 'Leonardo unveils new uncrewed rotorcraft design', Aerospace Manufacturing, available at: https://www.aero-mag.com/leonardo-unveils-new-uncrewed-rotorcraft-design.

Aerospace Testing International (2022) 'BAE Systems picks Wind River for Tempest fighter software development', Aerospace Testing International, available at: https://www.aerospacetestinginternational.com/news/software/bae-picks-wind-river-for-tempest-fighter-software-development-and-test.html.

Airforce Technology (2025) 'Orchestrike AI unveiled for the first time on SPEAR cruise missile', Airforce Technology, available at: https://www.airforce-technology.com/news/orchestrike-ai-unveiled-for-the-first-time-on-spear-cruise-missile/.

Aviation Week (2024) 'Thales bets on hybrid AI for future automation', Aviation Week, available at: https://aviationweek.com/air-transport/safety-ops-regulation/thales-bets-hybrid-ai-future-automation.

BAE Systems (2024) 'What is AI-enabled autonomy?', BAE Systems, available at: https://www.baesystems.com/en-us/definition/what-is-ai-enabled-autonomy.

BAE Systems (2025) 'Lead Technologist – AI & ML' [job posting], BAE Systems, available at: https://jobsearch.baesystems.com/job/lead-technologist-ai-ml-121679.

Breaking Defense (2025) 'Leonardo unveils British Navy Proteus rotary wing UAS final design', Breaking Defense, available at: https://breakingdefense.com/2025/01/leonardo-unveils-british-navy-proteus-rotary-wing-uas-final-design/.

Defence and Security Accelerator (2025) 'DASA competition: Delivering future advantage through testing and evaluation', Defence and Security Accelerator, available at: https://myresearchconnect.com/dasa-competition-delivering-future-advantage-through-testing-and-evaluation/.

EU Artificial Intelligence Act (2024a) 'Article 2: Scope', EU Artificial Intelligence Act, available at: https://artificialintelligenceact.eu/article/2/.

EU Artificial Intelligence Act (2024b) 'Recital 24', EU Artificial Intelligence Act, available at: https://artificialintelligenceact.eu/recital/24/.

European Commission (2024) 'European Defence Fund (EDF) – Official webpage of the European Commission', European Commission, available at: https://defence-industry-space.ec.europa.eu/eu-defence-industry/european-defence-fund-edf-official-webpage-european-commission_en.

European Commission (2025) 'EDF indicative multiannual perspective 2025–2027', European Commission, available at: https://defence-industry-space.ec.europa.eu/document/download/4e253565-aa45-445d-86e4-5489869f00fa_en.

GOV.UK (2025a) 'Launching the AI Model Arena', GOV.UK, available at: https://www.gov.uk/government/news/launching-the-ai-model-arena.

GOV.UK (2025b) 'Delivering future advantage through testing and evaluation: Competition document', GOV.UK, available at: https://www.gov.uk/government/publications/delivering-future-advantage-through-testing-and-evaluation-phase-1/delivering-future-advantage-through-testing-and-evaluation-competition-document.

GOV.UK (2025c) 'Procurement at MOD', GOV.UK, available at: https://www.gov.uk/government/organisations/ministry-of-defence/about/procurement.

Innovate UK Business Connect (2025) 'NATO Defence Innovation Accelerator (DIANA) 2026 cohort', Innovate UK Business Connect, available at: https://iuk-business-connect.org.uk/opportunities/nato-defence-innovation-accelerator-diana-2026-cohort/.

Leonardo UK (2025) 'Leonardo unveils design of Proteus Uncrewed Rotorcraft Technology Demonstrator', Leonardo UK, available at: https://uk.leonardo.com/en/news-and-stories-detail/-/detail/leonardo-unveils-design-of-proteus-uncrewed-rotorcraft-technology-demonstrator.

MBDA (2020) 'The potential of AI in the battlefield and in managing effects', MBDA, available at: https://www.mbda-systems.com/2020/12/15/the-potential-of-ai-in-the-battlefield-and-in-managing-effects/.

NATO (2024a) 'Summary of NATO’s revised Artificial Intelligence (AI) strategy', NATO, available at: https://www.nato.int/en/about-us/official-texts-and-resources/official-texts/2024/07/10/summary-of-natos-revised-artificial-intelligence-ai-strategy.

NATO (2024b) 'NATO starts work on Artificial Intelligence certification standard', NATO, available at: https://www.nato.int/cps/en/natolive/news_211498.htm.

NATO (2025) 'NATO Defence Innovation Accelerator announces largest-ever cohort of 150 innovators to work on ten defence and security challenges in 2026', NATO, available at: https://www.nato.int/en/news-and-events/articles/news/2025/12/10/nato-defence-innovation-accelerator-announces-largest-ever-cohort-of-150-innovators-to-work-on-ten-defence-and-security-challenges-in-2026.

NATO ACT (2024) 'Digital transformation', NATO Allied Command Transformation, available at: https://www.act.nato.int/activities/digital-transformation/.

NATO Innovation Fund (2024a) 'About', NATO Innovation Fund, available at: https://www.nif.fund/about/.

NATO Innovation Fund (2024b) 'NATO Innovation Fund makes first investments to secure the future of the Alliance’s 1 billion citizens', NATO Innovation Fund, available at: https://www.nif.fund/news/nato-innovation-fund-makes-first-investments-to-secure-the-future-of-the-alliances-1-billion-citizens/.

NATO Innovation Fund (2024c) 'Accelerating autonomy: How NATO and the NATO Innovation Fund are shaping the future of defence with Task Force X', NATO Innovation Fund, available at: https://www.nif.fund/news/accelerating-autonomy-how-nato-and-the-nato-innovation-fund-are-shaping-the-future-of-defence-with-task-force-x/.

Naval Technology (2025) 'DSEI 2025: Latest MBDA weapons systems revealed', Naval Technology, available at: https://www.naval-technology.com/news/dsei-2025-latest-mbda-weapons-systems-revealed/.

TechUK (2025) 'tech2035: UK defence back in the spotlight ahead of industrial strategy', TechUK, available at: https://www.techuk.org/resource/tech2035-uk-defence-back-in-the-spotlight-ahead-of-industrial-strategy.html.

UK Parliament (2025) 'Use of drones in defence (CDP-2025-0176)', UK Parliament, available at: https://researchbriefings.files.parliament.uk/documents/CDP-2025-0176/CDP-2025-0176.pdf.

Wind River (2022) 'Wind River selected by BAE Systems for Team Tempest advanced combat air systems development', Wind River, available at: https://www.windriver.com/news/press/news-20220426.