Palestine Action Targets Elbit Systems' Use of AI

Six Palestine Action activists face criminal damage charges after an August 2024 break-in at an Elbit Systems factory in Filton, near Bristol. They told Woolwich Crown Court they aimed to disable computers and drones because the site was an R&D facility supplying what they described as AI-driven decision support systems and weaponry used by Israel. Defendants including Zoe Rogers and Jordan Devlin said they damaged computers to prevent the development of systems they believe enable lethal targeting. The case turns on the claim that attacking early-stage hardware and software can save lives by disrupting weapons development pipelines; prosecutors view the actions as criminal damage to a defence contractor.
What happened
Six defendants charged with criminal damage gave evidence at Woolwich Crown Court over a break-in at an Elbit Systems factory in Filton, near Bristol, in August 2024. They said their objective was to disable computers and hardware because the site conducted R&D on systems they described as "deadly AI" used in Israeli weaponry. The defendants named are Zoe Rogers, Leona Kamio, Jordan Devlin, Charlotte Head, Fatema Rajwani, and Samuel Corner.
Technical details
Defendants framed the site as an R&D facility developing advanced weapon-support technology, citing the company's work on AI-driven decision support systems and on ground and in-flight simulators. Court testimony referenced the targeted items:
- quadcopter drones and small unmanned systems
- workshop computers and engineering workstations
- simulator and training systems that underpin control and targeting software
They said the intention was to damage computers early in the development cycle to impede software and hardware integration that could enable lethal deployments.
Context and significance
This is a legal and ethical flashpoint where activism, defence contracting, and AI-enabled weapons intersect. For practitioners, the case highlights two operational realities: first, that advanced software and AI for defence are concentrated in accessible R&D sites; second, that civil disobedience is now explicitly focused on AI components rather than only kinetic hardware. The defendants' stated strategy of interrupting development pipelines underscores a growing public perception of AI as a direct multiplier of lethality in conflict. The case also raises questions about supply-chain security, physical protection of development environments, and responsible disclosure versus public protest.
What to watch
Monitor court findings about the specific systems described and any technical evidence entered about how those systems operate. Also watch for broader policy or industry reactions from defence suppliers on hardening R&D sites and from researchers debating dual-use responsibilities.
Scoring Rationale
The trial is significant for security and ethics discussions around AI in weapons and shows activists targeting development pipelines, but it is a legal and political event rather than a technical breakthrough.