Face up to it – this surveillance of kids in school is creepy | Stephanie Hare

A few days ago, a friend sent me a screenshot of an online survey sent by his children’s school and a company called ParentPay, which provides technology for cashless payments in schools. “To help speed up school meal service, some areas of the UK are trialling using biometric technology such as facial identity scanners to process payments. Is this something you’d be happy to see used in your child’s school?” One of three responses was allowed: yes, no and “I would like more information before agreeing”.

My friend selected “no”, but I wondered what would have happened if he had asked for more information before agreeing. Who would provide it? The company that stands to profit from his children’s faces? Fortunately, Defend Digital Me’s report, The State of Biometrics 2022: A Review of Policy and Practice in UK Education, was published last week, introduced by Fraser Sampson, the UK’s biometrics and surveillance camera commissioner. It is essential reading for anyone who cares about children.

First, it reminds us that the Protection of Freedoms Act 2012, which protects children’s biometrics (such as face and fingerprints), applies only in England and Wales. Second, it reveals that the information commissioner’s office has still not ruled on the use of facial recognition technology in nine schools in Ayrshire, which was reported in the media in October 2021, much less the legality of the other 70 schools known to be using the technology across the country. Third, it notes that the suppliers of the technology are private companies based in the UK, the US, Canada and Israel.

The report also highlights some gaping holes in our knowledge about the use of facial recognition technology in British schools. For instance, who in government approved these contracts? How much does this cost the taxpayer? Why is the government using a technology that is banned in several US states and which regulators in France, Sweden, Poland and Bulgaria have ruled unlawful on the grounds that it is neither necessary nor proportionate and does not respect children’s privacy? Why are British children’s rights not held to the same standard as their continental counterparts?

The report also warns that this technology does not just identify children or allow them to transact with their bodies. It can be used to assess their classroom engagement, mood, attentiveness and behaviour. One of the suppliers, CRB Cunninghams, advertises that it scans children’s faces every three months and that its algorithm “constantly evolves to match the child’s growth and change of appearance”.

So far, MPs have been strikingly silent on the use of such technology in schools. Instead, two members of the House of Lords have sounded the alarm. In 2019, Lord Clement-Jones put forward a private member’s bill for a moratorium and review of all uses of facial recognition technology in the UK. The government has yet to give this any serious consideration. Undaunted, his colleague Lord Scriven said last week that he would put forward a private member’s bill to ban its use in British schools.

It’s difficult not to wish the two lords well when you return to CRB Cunninghams’ boasts about its technology. “The algorithm grows with the child,” it proclaims. That’s great, then: what could go wrong?

Stephanie Hare is a researcher and broadcaster. Her new book is Technology Is Not Neutral: A Short Guide to Technology Ethics