Policy Watch, Scotland

Biometrics Strategy 2025–2030: Why “Public Trust” Can Still Mean Permanent Data for the Unconvicted

Police Scotland and the Scottish Police Authority (SPA) are consulting on a Joint Biometrics Strategy for 2025–2030. It promises ethics, legality, transparency, and public trust. But if you’ve ever been falsely accused, you know the real question isn’t the slogan. It’s what happens to your data when the allegation fails.

accused.scot submitted a full written response to the official consultation and is publishing this analysis so readers can follow the paper trail.

Read the source material

What this strategy really changes

Biometrics is no longer just fingerprints and DNA. The draft pulls in facial images and other biometric data from CCTV, body-worn video, VIPER identification procedures, drones, digital forensics, and evidence-sharing systems. This is a full pipeline: collect, store, search, match, share, and delete.

Plain English: once your biometric data enters the system, it can keep working on you long after the allegation has stopped working in court.

When legislation is silent, policy expands

The draft acknowledges a lack of primary legislation governing the acquisition, retention, sharing, use, and deletion of facial images, with reform deferred to a future review.

That distinction matters. Law is debated, constrained, and enforceable through courts. Policy can be rewritten internally, quietly, and at pace. If long-term biometric retention rests mainly on policy rather than statute, the public is being asked to trust an evolving internal rulebook with permanent consequences.

That is not a footnote. For the unconvicted, “we’ll review later” is how long-term harm begins.

Deletion and audit are the whole ball game

Public trust depends on two unglamorous capabilities: deletion and audit. Can the system reliably delete data when it should, and can it prove who accessed data, when, and why?

The draft itself states that Police Scotland and the SPA will “seek technical solutions” for biometric systems with limited or lacking “weeding capacity”, and for systems with limited audit and logging. This wording appears in the strategy’s own description of current capability gaps. It is a quiet admission that key safeguards may not yet be consistently enforceable across all systems.

Where deletion and access trails are unreliable, retention becomes a one-way ratchet: easy to collect, hard to remove, hard to challenge.

Where a system cannot reliably delete and audit, it should not be allowed to expand collection on the promise of future fixes.

If a system can’t reliably delete, and can’t reliably prove access, then “trust” is being asked on credit. That credit is paid by the innocent.
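To make the two capabilities concrete, here is a minimal sketch of what “weeding” plus a provable access trail might look like. This is an illustrative toy model only, not a description of any real police system; every name in it (`BiometricStore`, `weed`, `trail`) is invented for the example.

```python
import datetime

class BiometricStore:
    """Toy model of the two safeguards the draft treats as capability gaps:
    reliable deletion ("weeding") and a provable access trail (audit).
    Illustrative only; not based on any real police system."""

    def __init__(self):
        self._records = {}   # record_id -> (data, retain_until)
        self._audit = []     # append-only trail: (who, when, action, record_id)

    def _log(self, who, when, action, record_id):
        self._audit.append((who, when, action, record_id))

    def store(self, who, when, record_id, data, retain_until):
        self._records[record_id] = (data, retain_until)
        self._log(who, when, "store", record_id)

    def access(self, who, when, record_id):
        # Every read leaves a trace, even a failed lookup.
        self._log(who, when, "access", record_id)
        return self._records.get(record_id)

    def weed(self, who, when):
        """Delete every record past its retention date, logging each removal."""
        expired = [rid for rid, (_, until) in self._records.items() if until <= when]
        for rid in expired:
            del self._records[rid]
            self._log(who, when, "delete", rid)
        return expired

    def trail(self, record_id):
        """Answer the audit question: who touched this record, when, and how."""
        return [entry for entry in self._audit if entry[3] == record_id]
```

The point of the sketch is the pairing: without `weed`, collection is the one-way ratchet described above; without `trail`, nobody can prove who accessed the data or why. A system missing either half cannot honestly claim either safeguard.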

Why this hits false sexual allegations especially hard

Sexual offence allegations carry unique procedural and social weight. Even without conviction, the consequences can be severe: employment, housing, relationships, and mental health. In that context, state data practices matter more, not less.

A common pattern: a case collapses or ends without conviction, but the individual remains trapped in an administrative struggle to remove records, images, and digital traces gathered at the point of accusation. The legal process may end, but the data persists across systems.

The innocent person’s problem

  • Data is collected early, before credibility is properly tested.
  • Image and device evidence multiplies rapidly.
  • Retention can fuel future suspicion after the case collapses.
  • Weak deletion rules turn legal clearance into an administrative fight.

This is why accused.scot pressed for non-conviction safeguards: clear retention limits, automatic review points, deletion following collapsed cases, and meaningful independent oversight.

“Maximising technology” can mean maximising harm

The draft commits to expanding facial matching capabilities, improving custody image quality, and widening biometric data sharing. It also references future developments such as Live Facial Recognition.

Tech isn’t neutral. It amplifies whatever rules already exist. If the system over-relies on “matches”, or treats them as certainty, investigative errors become harder to unwind.

This is not an argument against modern investigative tools. It is an argument that stronger tools require stronger controls: clear legal footing, measurable performance standards, independent auditing, and enforceable deletion rules.

Sharing makes mistakes travel

The strategy outlines local, national, and international sharing routes, including cross-border arrangements and Prüm 2.

What is Prüm 2?
Prüm 2 is the EU-wide framework for automated cross-border sharing of biometric data, including DNA, fingerprints, and facial images. Once data enters these systems, deletion and correction become far harder to enforce across jurisdictions.

If data is shared before it is properly reviewed or weeded, it can become effectively permanent even after the original allegation collapses, because any correction or deletion request must then chase every jurisdiction that received a copy.

What real public trust would look like

The draft promises more published statistics and a joint Biometrics Annual Report. Good. But trust requires answers to uncomfortable questions.

The questions below are not abstract ideals. They should be treated as a standing litmus test for every Biometrics Annual Report published from this point onward.

  • Non-conviction retention: how much data is held without a conviction?
  • Deletion outcomes: how often is deletion refused or delayed?
  • Audit reality: can access trails actually be proven?
  • Secondary use: how often is data reused beyond the original case?
  • Challenge routes: can ordinary people contest retention without specialist help?

The long game: record, resist, repeat

Biometrics policy will expand because it is framed as efficiency and safety. The only counterweight is enforceable limits: law, retention rules, deletion triggers, and independent audit.

accused.scot’s submission is now part of the record. If safeguards are watered down, nobody can claim no one raised concerns.

If you’re innocent and under suspicion

Your case might end. Your data might not. The fight is not only in court; it is also with the systems that retain, share, and reuse what was gathered when the allegation was at its loudest.

Published by accused.scot. Policy analysis only. Not legal advice.