Tuesday 6 June 2017

Theresa May Blaming the Internet

Theresa May outside No 10: ‘Ban WhatsApp, and would-be terrorists will find another app, as will those struggling against dictators.’ Photograph: Will Oliver/EPA
We can feel pretty certain that the London Bridge attackers did the following things: owned smartphones; and used Google, YouTube, Facebook and WhatsApp. That isn’t because owning those things and using those services marks you out as a terrorist: it’s because it marks you out as someone living in the west in the 21st century.
The problem, as those companies (actually only two: Google owns YouTube, and Facebook owns WhatsApp) are discovering, is that politicians aren’t too picky about the distinction. Speaking outside 10 Downing St this morning, Theresa May was much more aggressive in her tone than previously. The London Bridge attack had its roots in Islamic extremism, she observed: “We cannot allow this ideology the safe space it needs to breed. Yet that is precisely what the internet, and the big companies that provide internet-based services, provide.” She continued: “We need to work with allied democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremism and terrorism planning.”
Which goes to show that when you need a scapegoat, the internet will always be there, as will big internet companies. The latter are by now becoming familiar with the process: there is an attack; the dots are joined to show how their services were used for tuition and/or planning; governments demand “action”; the companies improve their methods for removing extremist content; and we await the next cycle.
But nothing substantive is different. The question is whether it ever can be. Germany is seeking a law which brings in hefty fines for being too slow to remove hate speech. But that isn’t the same as preventing it, or stopping planning.
Video: Theresa May responds to ‘brutal terrorist attack’ in London
“The kneejerk ‘blame the internet’ that comes after every act of terrorism is so blatant as to be embarrassing,” commented Paul Bernal, a law lecturer at the University of East Anglia who has worked with the police. The pressure, he says, comes from the politicians. For an example look no further than John Mann, MP for Bassetlaw since 2001, who this morning said: “I repeat, yet again, my call for the internet companies who terrorists have again used to communicate to be held legally liable for content.”
Perhaps he has forgotten the 1970s, when in the pre-mobile phone era the IRA would use phones to organise its attacks – without anyone calling for the phone network to be shut down (nor were there online social networks to “radicalise” would-be IRA members, but still they joined). The authoritarian sweep of Mann’s idea is chilling: since legal liability is meant to deter, the companies would need people to monitor every word you wrote and every video you watched, and compare it against some manual of dissent. It’s like a playbook for the dystopia of Gilead in The Handmaid’s Tale (which, weirdly enough, most resembles Islamic State’s framework for living).
The problem is this: things can be done, but they open a Pandora’s box. The British government could insist that the identities of people who search for certain terror-related words on Google or YouTube or Facebook be handed over. But then what’s to stop the Turkish government, or embassy, demanding the same about Kurdish people searching on “dangerous” topics? The home secretary, Amber Rudd, could insist that WhatsApp hand over the names and details of every communicant with a phone number. But then what happens in Iran or Saudi Arabia? What’s the calculus of our freedom against others’?
Similarly, May and Rudd and every home secretary back to Jack Straw keep being told that encryption (as used in WhatsApp particularly) can’t be repealed, because it’s mathematics, not material. People can write apps whose messages can’t be read in transit, only at the ends. Ban WhatsApp, and would-be terrorists will find another app, as will those struggling against dictators.
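To illustrate why that is, here is a minimal sketch of end-to-end encryption using the open-source PyNaCl library. This is an assumption chosen for illustration – WhatsApp itself uses the more elaborate Signal protocol – but the principle is the same: only the two endpoints hold the keys, so whoever carries the message in transit, whether an app company or a government with a warrant, sees only ciphertext.

```python
# A minimal end-to-end encryption sketch using PyNaCl (libsodium bindings).
# Illustrative only: real messengers use more elaborate protocols, but the
# principle is the same - only the endpoints can read the message.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts to Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual place at six")

# Anything sitting between the two phones sees only this:
print(ciphertext.hex())

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext).decode())
```

The mathematics in that snippet is freely available to anyone; banning one app that wraps it simply pushes people to the next one.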
It’s true that the internet companies’ business models, selling adverts against your attention, mean they never had an incentive to be careful about what gets on to their platforms. We’re living with the unintended consequences. Yet speaking to people in those companies, one still hears enormous resistance to the idea of pre-filtering. Instead it is becoming an article of faith that “artificial intelligence” or “machine learning” will learn to spot this stuff and act on it. That is far from proven, however. These companies are struggling with a problem of their own making that dwarfs their present capabilities.
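As a rough illustration of what that article of faith amounts to, here is a toy content classifier built with scikit-learn. It is a sketch with invented labels and examples, and nothing here reflects how Google or Facebook actually build their moderation systems; but even the toy version shows the core weakness – the model only learns surface patterns from its labelled examples, and the same words appear in news reports, research and counter-speech.

```python
# Toy "extremist content" classifier - an illustrative sketch only.
# The training examples and labels below are invented for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: 1 = flagged, 0 = allowed.
texts = [
    "join us and take up arms against the unbelievers",        # flagged
    "instructions for building a device to cause harm",        # flagged
    "police report on yesterday's attack near the bridge",     # allowed
    "documentary examining how young people are radicalised",  # allowed
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# A news article quoting extremist material looks, to the model, much like
# the material itself - which is why automated pre-filtering at scale is hard.
test = ["news analysis quoting a call to take up arms"]
print(model.predict_proba(test))  # probabilities, not understanding
```

Real systems are vastly larger, but the underlying difficulty – telling advocacy from reporting, satire or rebuttal – does not disappear with scale.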
So what can be done? It might seem obvious, but while (to quote a famous hacker) “You cannot arrest an idea”, you can certainly stifle its attractiveness. Driving Isis out of Mosul and into the desert will cut its funding and its voice. Not helping countries that help jihadi groups would be smart too. Theresa May mentioned working with “allied democratic governments”. But it’s really the undemocratic ones – Saudi Arabia, for example – where one might start work. Ideas fade. The internet, though, isn’t going anywhere.

