What should social media giants do to protect children?
This week, tech leads from GCHQ and the National Cyber Security Centre made a powerful intervention in an incredibly contentious debate: what should social media companies do to protect children on their platforms?
But that is not how the intervention was received by all parties. Others heard something quite different: tired arguments against end-to-end encryption, dressed in new clothes but mounting the same attack on privacy rights, with the same excuse that is always invoked by the forces of law and order.
From our story:
Tech companies should move ahead with controversial technology that scans users’ phones for child abuse images, technical officials from GCHQ and the UK’s National Cyber Security Centre have said.
So-called “client-side scanning” would involve service providers such as Facebook or Apple building software that monitors communications for suspicious activity without the need to share message content with a centralized server.
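As a rough illustration of how such on-device matching works, here is a minimal sketch. It uses exact cryptographic hashes for simplicity; real deployments use perceptual hashes (such as PhotoDNA or Apple’s NeuralHash) so that edited copies of an image still match, and the hash list and function names here are purely hypothetical:

```python
import hashlib

# Hypothetical list of hashes of known abuse images, shipped to the device.
# (Real systems use perceptual hashes, not SHA-256, so near-duplicates match.)
KNOWN_HASHES = {
    # This happens to be sha256(b"test"), used here as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_on_device(image_bytes: bytes) -> bool:
    """Return True if this content matches the known list.

    Only the match result, never the content itself, would leave
    the device -- that is the core privacy claim of the approach.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# Each photo is checked locally; non-matching content stays private.
print(scan_on_device(b"test"))           # matches the list -> True
print(scan_on_device(b"holiday photo"))  # unknown content  -> False
```

The point of the design is that the server never sees message content, only (at most) a signal that a known item was found on the device.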
Ian Levy, technical director of the NCSC, and Crispin Robinson, technical director of cryptanalysis (codebreaking) at GCHQ, said the technology could protect children and privacy at the same time. “We have found no reason why client-side scanning techniques cannot be implemented safely in many of the situations one will encounter,” they wrote in a new discussion paper.
You may remember the client-side scanning debate from a year ago. To quote myself:
Apple is taking a big step into the unknown. Its version of this approach, a first from a major platform, will scan photos on users’ hardware, rather than waiting for them to be uploaded to the company’s servers.
By normalizing on-device scanning for CSAM [child sexual abuse material], critics worry, Apple has taken a dangerous step. From there, they argue, it is simply a matter of degree until our digital lives are surveilled, online and offline. It is one small step in one direction to extend scanning beyond CSAM; one small step in another to extend it beyond simple photo libraries; one small step in a third to go beyond exact matches of known images.
So why is Levy and Robinson’s intervention important? To me, it reads as a sincere attempt to engage with the concerns of those critics, to set out the benefits of client-side scanning in combating specific categories of threat, and to offer meaningful solutions to common fears.
The devil is in the details
To take one example from the 70-page paper: the pair try to dispel fears that the lists of scanned CSAM images will expand beyond known CSAM to include, say, politically sensitive images. In concrete terms: what would stop China from requiring Apple to include the famous Tank Man images in its scanning database, and forcing the company to flag any iPhone containing that image as potentially criminal?
Robinson and Levy suggest a system that would prevent exactly that. They propose that the list of images be compiled by child protection groups around the world, organizations like the National Center for Missing and Exploited Children in the US or the Internet Watch Foundation (IWF) in Britain. Each of these groups already maintains a database of “known” CSAM, which they cooperate to keep as complete as possible, and the scanning database would consist only of images that appear on every group’s list.
They could then publish a hash, a cryptographic fingerprint, of that database when they hand it over to the tech companies, who in turn can verify the same hash when the database is loaded onto your phone. Even if China could force its national child protection group to include Tank Man on its list, it could not do the same to the IWF, so the image would never be pushed to devices; and if that prompted Apple to load a different database for China, the hash would change accordingly and users would know the system was no longer trustworthy.
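The scheme Levy and Robinson describe can be sketched in a few lines: intersect the lists from independent child-protection groups, then publish a hash of the result so a device can detect any substituted database. The group names, list entries, and helper function below are hypothetical, chosen only to mirror the Tank Man scenario:

```python
import hashlib

# Hypothetical hash lists from two independent child-protection groups.
ncmec_list = {"img_a", "img_b", "tank_man"}  # one group coerced into adding an image
iwf_list   = {"img_a", "img_b"}              # the other group did not add it

# The scanning database holds only entries present on EVERY group's list,
# so a single coerced group cannot inject an image on its own.
database = sorted(ncmec_list & iwf_list)     # tank_man is excluded

# Publish a cryptographic fingerprint of the agreed database. A phone that
# receives a database with a different fingerprint knows it was swapped.
published_hash = hashlib.sha256("\n".join(database).encode()).hexdigest()

def device_accepts(received_db: list, expected_hash: str) -> bool:
    """Accept the database only if it matches the published fingerprint."""
    actual = hashlib.sha256("\n".join(received_db).encode()).hexdigest()
    return actual == expected_hash

print(device_accepts(database, published_hash))                 # True
print(device_accepts(database + ["tank_man"], published_hash))  # False
```

Any attempt to ship a tampered database, whether by adding an image or loading a country-specific list, changes the fingerprint and is therefore visible, which is the “details matter” point the paper is making.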
The point, Levy and Robinson write, is not that their proposed solution is the best possible way to solve the problem, but to demonstrate that “details matter”: “Discussing the subject in generalities, using ambiguous language or hyperbole, will almost certainly lead to the wrong outcome.”
Fear and rage are genuine
In a way, it’s a powerful rhetorical move. Insisting that the conversation focus on the details is an insistence that people who reject client-side scanning on principle are wrong to do so: if you believe that the privacy of personal communications is, and should remain, an inviolable right, then Levy and Robinson are effectively arguing that you should be cut out of the conversation in favour of more moderate voices willing to discuss compromise.
But frustratingly, much of the response has consisted of the same generalities that accompanied Apple’s announcement a year ago. The technology news site The Register, for example, ran a furious editorial saying: “The same argument has been used many times before, usually against one of the four horsemen of the infocalypse: terrorists, drug traffickers, child sexual abuse material (CSAM), and organized crime.”
I’ve spent enough time talking to people who work in child protection to know that the fear and anger over the harm done on the platforms of some of the world’s biggest companies is real, whether or not it is correctly targeted. I don’t pretend to know Levy and Robinson’s motivations, but this paper represents an effort to start a conversation, rather than to continue a shouting match between two irreconcilable sides of an argument. It deserves to be treated as such.
It’s not ‘your job’
Minecraft is big. You may have heard of it. So when the game makes a moderation decision, it’s a bit more important than when Bungie decided to nerf scout rifles in Destiny 2. Particularly when the moderation decision goes like this:
Minecraft will not allow the use of non-fungible tokens (NFTs) on the popular gaming platform, with the company describing them as contrary to Minecraft’s “creative inclusion and play-together values”.
Minecraft represented an attractive potential market for NFTs, with a user base – estimated at over 141 million as of August 2021 – already engaged in sharing unique digital items developed for the game.
But the Microsoft-owned development studio behind Minecraft, Mojang, has shut down speculation that NFTs might be allowed in the game. In a blog post on Wednesday, the developers said blockchain technology was not allowed, stating that it was against the values of Minecraft.
The incredible success of Minecraft is due to its extensibility. In addition to the game’s built-in creative aspects – often described as the 21st century’s answer to Lego – users can modify it in larger or smaller ways, yielding new experiences. This flexibility proved tempting to NFT creators, who decided to create new features in Minecraft and sell them as digital assets.
In theory, this is the perfect NFT opportunity: a digitally native creation, with a genuinely plausible use case and a demonstrably willing market. Startups flocked to the field: NFT Worlds sells pre-generated Minecraft worlds on which people can build experiences and resell them at a profit; Gridcraft operates a Minecraft server with its own crypto-based economy.
Or they did. Now, it seems NFTs have become such a toxic phenomenon that even passive acceptance is too much for a company like Mojang. If you want to succeed in this world, you have to go it alone.