Some people, when they want to improve public transport safety, hire more staff, fix the lighting, or maybe even try being on time.
The British Transport Police, however, have gone full Black Mirror, deciding the best way to protect you from crime on your morning commute is to point cameras at your face and feed your biometric soul into a machine.
Yes, for many Britons, facial recognition is coming to a railway station near them. Smile. Or don’t. It makes no difference. The algorithm will be watching anyway.
In the coming weeks, British Transport Police (BTP) will be trialling Live Facial Recognition (LFR) tech in London stations. It’s being sold as a six-month pilot program, which in government-speak usually means it will last somewhere between forever and the heat death of the universe.
The idea is to deploy these cameras in “key transport hubs,” which is bureaucratic code for: “places you’re likely to be standing around long enough for a camera to decide whether or not you look criminal.”
BTP assures us that the system is “intelligence-led,” which doesn’t mean they’ll be targeting shady characters with crowbars, but rather that the cameras will be checking your face against a watchlist generated from police data systems.
They’re looking for criminals and missing people, they say. But here’s how it works: if your face doesn’t match anyone on the list, it gets deleted immediately. Allegedly. If it does match, an officer gets a ping, stares at a screen, and decides whether you’re a knife-wielding fugitive or just a man who looks like one.
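For the technically curious, the matching flow described above boils down to something like the sketch below. To be clear, this is purely illustrative: the function names, the similarity threshold, and the watchlist structure are all assumptions for the sake of explanation, not anything BTP has published.

```python
# Illustrative sketch of the watchlist-matching flow described above.
# All names, thresholds, and data structures here are hypothetical;
# BTP has not published its implementation details.

from dataclasses import dataclass

MATCH_THRESHOLD = 0.75  # assumed similarity cutoff; real systems tune this carefully


@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # face template derived from police data systems


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def alert_officer(entry: WatchlistEntry) -> None:
    """Flag a possible match for a human officer to review and decide on."""
    print(f"Possible match: {entry.name} -- officer review required")


def process_face(embedding: list[float], watchlist: list[WatchlistEntry]) -> None:
    """Compare one captured face against the watchlist, then discard it."""
    best = max(
        watchlist,
        key=lambda e: cosine_similarity(embedding, e.embedding),
        default=None,
    )
    if best and cosine_similarity(embedding, best.embedding) >= MATCH_THRESHOLD:
        alert_officer(best)
    # No match: nothing is retained; the capture simply goes out of scope
    # here -- the "deleted immediately" part of the official pitch.
```

The human-in-the-loop step matters: a match above the threshold only triggers an alert, and it is then an officer, not the algorithm, who decides whether to act.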
And you have to love the quaint touch of QR codes and signs stuck up around the station letting you know that, yes, your biometric identity is being scanned in real time.
Chief Superintendent Chris Casey would like you to know that “we’re absolutely committed to using LFR ethically and in line with privacy safeguards.”
The deployments, we’re told, will come with “internal governance” and even “external engagement with ethics and independent advisory groups.”