The UK government’s love affair with tech-enabled surveillance knows no bounds.
[This story is a little dated, having first surfaced in the Financial Times on Monday. That day I decided to write instead about what I thought was an even more pressing issue: Italy’s “no jab, no job” vaccine mandate, which threatens to render millions of people unemployed. But this story from the UK is such an outrageous example of creeping surveillance in the so-called “liberal” West that I thought it still worth sharing.]
As the pink paper reported, nine schools in the Scottish council area of North Ayrshire have started using facial recognition systems as a form of contactless payment in cashless canteens (cafeterias in the US). The BBC later reported that two schools in England were also piloting the system. At a time when many schools in the UK are facing crippling budget cuts, this speaks volumes about the local councils’ educational priorities.
In response to the revelations, the Information Commissioner’s Office issued a weak-tea statement, encouraging schools to “carefully consider the necessity and proportionality of collecting biometric data before they do so.”
A statement from children’s digital rights group Defend Digital Me packed a meatier punch: “Biometrics should never be used for children in educational settings — no ifs, no buts. It’s not necessary. Just ban it.”
Normalising Biometric Surveillance
In its defence, North Ayrshire council said it had sent out a flyer explaining the technology to the children’s parents ahead of enrolment. That flyer included this lovely little nugget: “With Facial Recognition, pupils simply select their meal, look at the camera and go, making for a faster lunch service whilst removing any contact at the point of sale.”
Apparently a whopping 97% of the school children or their parents consented to be enrolled in the pilot scheme. It seems that the council believes that preteens and teenagers are adequately equipped to decide for themselves whether or not the installation of facial recognition technologies in the school canteen infringes their privacy.
Similar facial recognition systems have been in use in the United States for years, though usually as a security measure. In the case of the schools in North Ayrshire, this is all about ease, speed and efficiency. Or so we are told.
“It’s the fastest way of recognising someone at the till,” said David Swanston, the managing director of CRB Cunninghams, the company that provided the system. Swanston added that the average transaction time using the system was five seconds per pupil: “In a secondary school you have about a 25-minute period to serve potentially 1,000 pupils. So we need fast throughput at the point of sale”.
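As a rough back-of-the-envelope check on those figures, purely for illustration and using only the numbers Swanston cites (nothing reported by the schools themselves), a 25-minute service window at five seconds per transaction works out to around 300 pupils per till, so feeding 1,000 pupils would still require several tills running in parallel:

```python
# Illustrative arithmetic only, based on the figures quoted by CRB Cunninghams:
# a 25-minute lunch service, roughly 1,000 pupils, and 5 seconds per transaction.
import math

service_window_s = 25 * 60       # 25-minute lunch period, in seconds
pupils = 1_000                   # pupils to be served in that window
seconds_per_transaction = 5      # claimed average time per facial-recognition payment

pupils_per_till = service_window_s // seconds_per_transaction   # 300 pupils per till
tills_needed = math.ceil(pupils / pupils_per_till)               # 4 tills in parallel

print(f"Each till can serve about {pupils_per_till} pupils in {service_window_s // 60} minutes")
print(f"Serving {pupils} pupils therefore needs at least {tills_needed} tills")
```

In other words, even at the claimed five seconds per pupil, the promised throughput only materialises if the canteen already runs several tills at once.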
One wonders how school cafeterias were able to cope with demand for so long without digital and biometric payment technologies. But critics argue that these pilot schemes have a much darker purpose than expediting school lunch queues; they are about conditioning children to the widespread use of facial recognition and other biometric technologies.
“It’s normalising biometric identity checks for something that is mundane,” Silkie Carlo of the UK campaign group Big Brother Watch told the FT. “You don’t need to resort to airport style [technology] for children getting their lunch.”
Checkpoint Britain
The UK has on average one surveillance camera for every 6.5 people, according to a 2019 analysis by IHS Markit. That is more cameras per person than any other country in the world except China (one camera per 4.1 people), the US (4.6) and Taiwan (5.5). This data was featured in a 2019 CBS article warning about the US’ increasing adoption of surveillance technologies:
“During the past few years, coverage of the surveillance market has focused heavily on China’s massive deployments of cameras and artificial intelligence technology. What’s received far less attention is the high level of penetration of surveillance cameras in the United States,” report author Oliver Philippou, an analyst at IHS Markit, said in a note. “With the U.S. nearly on par with China in terms of camera penetration, future debate over mass surveillance is likely to concern America as much as China.”
Like their US counterparts, UK authorities have been trialling live facial recognition (LFR) surveillance in public places for several years. Many of the trials were monitored by the activist group Big Brother Watch. In a 2019 article for Yahoo, Silkie Carlo wrote that to watch “these live facial recognition trials is to watch your civil liberties slip away before your eyes.” She recounted an anecdote from an LFR trial in Romford, East London. When a passing pedestrian called John (not his real name) noticed the police cameras, he pulled his jumper up over his chin. It was, Carlo says, “a small act of resistance to encroaching surveillance in his town”, for which he ended up paying a price:
I watched a plainclothes officer who had been loitering near us radio through to uniformed officers, instructing them to stop him. John was then surrounded and grabbed by officers, pushed to a wall and questioned. They demanded to know why he was covering his face.
“If I want to cover my face, I’ll cover my face,” he said. “Don’t push me over when I’m walking down the street.”
The plainclothes officer then took a photo of him on a mobile device anyway “for facial recognition.” Police had not told us, or anyone, that they had not only the capability to scan faces with fixed cameras but to “point and shoot” with handheld, mobile devices. He was made to hand over his ID as well. After police aggravated him and threatened to handcuff him, they issued him with a £90 ($115) fine for disorderly behaviour.
The New Frontier of Facial Recognition Surveillance
Since the start of the Covid-19 pandemic, police forces around the world have been given much broader surveillance powers. In August this year, the Mayor of London quietly green-lit a controversial proposal that will permit the Metropolitan Police, the UK’s biggest police force, to use Retrospective Facial Recognition (RFR) as part of a £3 million deal with Japanese tech firm NEC Corporation, reports Wired magazine:
The system examines images of people obtained by the police before comparing them against the force’s internal image database to try and find a match.
“Those deploying it can in effect turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years,” says Ella Jakubowska, policy advisor at European Digital Rights, an advocacy group. Jakubowska says the technology can “suppress people’s free expression, assembly and ability to live without fear”.
The purchase of the system is one of the first times the Met’s use of RFR has been publicly acknowledged. Previous versions of its facial recognition web page on the Wayback Machine show references to RFR were added at some stage between November 27, 2020, and February 22, 2021. The technology is currently used by six police forces in England and Wales, according to a report published in March. “The purchase of a modern, high-performing facial recognition search capability reflects an upgrade to capabilities long used by the Met as well as a number of other police forces,” a spokesperson for the Met says.
Critics argue that the use of RFR encroaches on people’s privacy, is unreliable and could exacerbate racial discrimination. “In the US, we have seen people being wrongly jailed thanks to RFR,” says Silkie Carlo, director of civil liberties group Big Brother Watch. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless and rights-abusive police technologies.”
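A simple base-rate sketch helps show why critics call the technology unreliable. The accuracy and crowd figures below are purely hypothetical assumptions for illustration (nothing here comes from the Met, NEC or Big Brother Watch): when nearly everyone scanned is innocent, even a small false-positive rate generates alerts that rival or outnumber the genuine matches.

```python
# Hypothetical illustration of the base-rate problem with facial recognition watchlists.
# All of these numbers are assumptions chosen for the sketch, not reported figures.

crowd_size = 10_000          # faces scanned at, say, a station or protest
on_watchlist = 10            # people in that crowd who really are on a watchlist
true_positive_rate = 0.90    # assumed chance a watchlisted face is correctly flagged
false_positive_rate = 0.001  # assumed 0.1% chance an innocent face is wrongly flagged

expected_true_alerts = on_watchlist * true_positive_rate
expected_false_alerts = (crowd_size - on_watchlist) * false_positive_rate
share_wrong = expected_false_alerts / (expected_true_alerts + expected_false_alerts)

print(f"Expected genuine alerts: {expected_true_alerts:.0f}")
print(f"Expected false alerts:   {expected_false_alerts:.0f}")
print(f"Share of alerts pointing at the wrong person: {share_wrong:.0%}")
# With these assumed numbers, roughly half of all alerts would target innocent
# people, which is how wrongful identifications can happen even with a system
# that sounds highly accurate on paper.
```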
The Backlash Begins (in Brussels)
In September, the Geneva-based Human Rights Council (HRC) published a report recommending that the protection of human rights must be front and centre of the development of AI-based systems. While the report conceded that AI “can be a force for good,” it also flagged concerns about how data is stored, what it’s used for, and how it might be misused.
“AI technologies can have negative, even catastrophic, effects if they are used without sufficient regard to how they affect people’s human rights,” Michelle Bachelet, the United Nations High Commissioner for Human Rights, said in a statement.
One of the biggest problems is the prevailing legal vacuum that makes it all but impossible to guarantee the safety of all the biometric data being harvested. As I reported for NC a few weeks ago, the facial recognition data of over 16 million UK residents, handed over to the NHS App, is being managed by undisclosed companies. The NHS appears to be sharing some of that facial recognition data with law enforcement bodies, according to The Guardian. The data is also likely to be of interest to UK and foreign intelligence services.
Yesterday (Oct. 21), continental rail operator Eurostar unveiled a new pilot scheme for biometric identity verification technology intended to offer seamless travel across borders. The system allows travellers to register their face and passport details ahead of travel for touch-free passage through border checks. It’s fast, smooth, convenient and easy. One just has to hope that one’s facial recognition data is in safe hands.
The European Parliament does not seem convinced. In a recent vote, MEPs called for a ban not only on police use of facial recognition technology in public places but also on private facial recognition databases. The vote was non-binding, of course, and some EU Member States are desperate to deploy facial recognition technologies to fortify their security apparatuses, but at least it’s a step in the right direction.
More reason to keep wearing masks in public. It’s a twofer.
First they came for the queue jumpers…
And yet, amazingly, no camera footage of the Skripals’ movements around Salisbury, either on foot or by car, or of them collapsing on a bench on that fateful day, was considered suitable for public consumption.
That happens in the US too. Remember how the footage from the parking garage surveillance across from the Pentagon on ‘9-11’ just happened to disappear? The Jeffrey Epstein jail surveillance camera just so happened to switch itself off for just the right amount of time?
Unreliable for anything other than draining the public treasury. The tech is even more immature than those using it. Facial recognition and machine learning (ML) are sciences younger than my dog, and the people deploying them are not scientists. They’re carnival barkers, former used-car salesmen and PMC careerists (Mayor Pete’s phone app crew). The real message for me? Sovereign immunity has to go, for everything: use of violence (say $500 a shove), neglect (for when all that private info gets leaked), and outright theft (think the Ferguson municipal court fines and penalties scam). The only way to flush the parasites out of the system is to bankrupt them personally. If that happens enough times, they’ll scurry away and go into something less harmful to society like … bitcoin.
I’d add that even when “mature” this tech will surely produce so many false positives as to make it useless to actually catch someone accurately. But then that’s never the purpose. Aside from the obvious carnival barker sales pitch and profit a la the murderer Elon Musk and his ilk, and just like now when our forensics theater regularly imprisons the wrong person, it will just be one more piece of “evidence” to convict whoever the system randomly, or not, decides to pin to the wall.
Whitney Webb is scrupulously documenting all the dystopian nightmares the super wealthy are dreaming up for us. These people think they are winning when anyone with eyes can see the “system” they’ve constructed (more accurately the one they deconstructed) is so weak one more push and all the central bank money printing in the zuckerberg approved metaverse won’t save them.
Thanks Yankee, great link and thanks for your participation in NC’s comments.
“Whitney Webb is scrupulously documenting all the dystopian nightmares the super wealthy are dreaming up for us. ”
************
WW does yeo-lady work in her analysis; she does not get distracted by minutiae but goes right to the heart of the subject.
Keep in mind that the Elite Masters have multiple game plans and scenarios and if one of them is resisted too much, just start pushing another. Keep pushing on all fronts and some objectives will be achieved. Behavioural Scientists right beside them advising which Human Buttons to push.
I remember that incident of John being stopped and fined for not wanting his mug snapped. NC featured the original video clip, and around the same time London police were giving a pub in London a lot of grief because the owners did not want to set up a camera outside their door to record the faces of their customers going in and out of that pub. Now that the Pandemic is over (going by media coverage in Oz), will that mean that by next year authorities will try to discourage, or even outlaw, the wearing of masks? Will those that want to wear a mask be required to get a doctor’s certificate saying that wearing one is necessary on medical grounds?
I thought it pretty bad when I read several years ago how some US schools were forcing students to give their fingerprints if they wanted to make use of the school library, but this is worse. No doubt some businesses will be thinking that this would be a great way to control employees going in and out of their buildings, and maybe public transport operators like London’s Tube will want to deploy this system so that they can do away with most tickets and passes. Imagine a mass protest. The police will be able to scan all the faces in a crowd of thousands and come up with a list of names, addresses, mobile numbers, etc., so will no longer need to kettle a crowd. They will already have their identities. And the UK would be a great place to trial this system due to the density of cameras.
This was already deployed in the wake of the 2011 riots, and I’m sure at many other protests since. The claim is that only a handful of arrests were made based on facial recognition technology, but the 24-hour kangaroo courts had no shortage of suspects to process (I believe over 2,000 were sentenced); whether the combing through CCTV surveillance is automated or manual makes little material difference, I suppose. By the way, the average custodial sentences received for offences committed during the riots were twice as harsh as normal sentencing guidelines would suggest.
I have lived in the UK for most of my life, first as an asylum seeker and now as a citizen; subjectively, all I have seen is a continual and accelerating lurch towards more authoritarianism, a more stratified class system and more open racism. It is a much less pleasant place to live than it was, though I am of course still grateful that it allowed my family to escape the horrors of war. Better pull my finger out and become an immigrant once more!
But Toni, where would one go? People should be reminded that it was American taxpayer dollars that paid for these and other biometrics to be developed in Afghanistan. The software and processes were then sold to the Chinese to practice and use on the Uyghur minority – who are now imprisoned by the thousands. The Chinese have been fine-tuning it, and it seems to have come full circle to the UK, US, and …………
Thanks very much for this important article.
…started using facial recognition systems as a form of contactless payment in cashless canteens (cafeterias in the US). The BBC later reported that two schools in England were also piloting the system. …
In response to the revelations, the Information Commissioner’s Office issued a weak-tea statement, encouraging schools to “carefully consider the necessity and proportionality of collecting biometric data before they do so.”
Behind all this is an enormous, worldwide effort to financially profit from all that biometric data. The piece-by-piece rollout in several countries reminds me of the plot line in the old Sherlock Holmes movie “The Secret Weapon”, where the good guy breaks his invention into 5 parts and has each part manufactured separately by men who are unaware of each other and have no idea they’re working on part of something much bigger once assembled. (Love those old Basil Rathbone Sherlock Holmes movies.)
I left this url in today’s links section. I think it also fits here.
https://thegrayzone.com/2021/10/19/health-wealth-digital-passports-surveillance-capitalism/
What I don’t understand is any parent being all in on this. Many years ago (I wish I could find the cite) more parents than I’d like to see in the UK were all in for chipping their kids. Holy cow. If people can’t figure out the enormity of data gathering by now (smdh)… I remain astounded. It just never stops. Since 2000, it’s been the creeping, nonstop insanity of the surveillance state. In the US, imo, it’s just much worse than it outwardly appears: the Ring doorbell and Amazon’s appliances are on in your house 24/7 and shared with law enforcement, despite denials such as assertions that Ring doesn’t store data. Do I believe Mr. Alexa? Nahh. But people are all in, and it’s getting worse. Big $$ for third parties for big surveillance.
https://kvia.com/video/2021/10/01/amazons-controversial-vision-for-the-future-of-your-home-security-2/
Student IDs with pictures are downloaded into many US school lunch point-of-service systems. The kid either inputs their ID number or scans their ID card, and their photo is pulled up for the cashier to ensure the right kid is being charged or credited for the meal. It also retrieves any medically identified allergy restrictions the parents have chosen to share with the district. While there are no facial recognition scanners, there is a tying of the child to the meal served. This information is deleted annually and must be reloaded for current enrollment use.
There is a lot of money in these programs for the corporations that provide the systems.
“Welcome my friend. Welcome. To the machine”
Pink Floyd
Let’s ban facial recognition altogether, and from smartphones too! All of the above makes a ban seem the only option to save democracy, private life and the freedom of individuals. We should also protect the identity of journalists and whistleblowers. Imagine that technology coupled with huge databases and the use of drones. Remember, the camera in your smartphone can be turned on remotely.