Published 23 August 2023 · Digital rights

The rhetoric used to harm your digital rights

Samantha Floreani

The first time someone said that I must not care about children’s safety, it was as though a trapdoor had opened underneath me. I was relatively new to digital rights advocacy, and we were having a professional policy discussion about end-to-end encryption when the conversation came to an abrupt halt. I was totally unprepared, left gobsmacked and scrambling. This is how I began to learn that one of the quickest ways to shut down a debate is to misrepresent your critics as sympathetic to a loathsome, indefensible extreme.

Want a privacy-enhancing approach to contact tracing? I guess you don’t care about public health. Want to prevent widespread proactive surveillance online? You must want children to suffer. Want to ensure the protection offered by end-to-end encryption? How can you say that while terrorists are out there?

These manipulative debate tactics plague the work of many socially progressive activists. In the realm of digital rights, our denigrators seek to dismiss valid concerns regarding privacy, censorship and surveillance through bombastic statements and personal misrepresentations that effectively wipe away any nuance or complexity in tech policy debate. It’s a blatantly dishonest strategy, but it’s also very effective.

As Covid-19 began to take hold, the digital rights community anticipated not only that such a crisis would create conditions for increased surveillance and techno-solutionism, but that it would also come with an onslaught of political messaging designed to crush critique. In an attempt to pre-empt this move, over one hundred human rights groups urged governments around the world to uphold human rights in their digital pandemic response. In Australia, lawyer and digital rights activist Lizzie O’Shea called for a politics of care and solidarity rather than coercion and fear.
Over the course of 2020, Australian governments—generally speaking—prioritised technological responses to the pandemic that emphasised a punitive, surveillance-based approach, and bombarded people with political rhetoric to make criticism or resistance unfavourable. Those who called for data protection for the QR code check-in system or suggested privacy- and security-enhancing alternatives to the centralised model of the COVIDSafe App were accused of being unsympathetic to the severity of the crisis. Calling it a ‘Team Australia Moment’ and saying it was a ‘time for trade-offs’ finished the job.

The divisive you’re-with-us-or-against-us dichotomy was exploited by the far-right using Covid-19 conspiracies to gain fresh recruits for the so-called ‘freedom rallies’. At the time, Jeff Sparrow warned against allowing the fascist right to claim the rhetoric of liberty—and that’s precisely what they did. As a result, the concept of freedom shifted further out of the grasp of the left.

Digital rights activists, privacy advocates and security experts weren’t trying to get in the way of taking action in response to the pandemic. Many were, instead, working to ensure that the government’s use of tech was both rights-respecting and effective. We emphasised that robust human rights protections were essential to both survive the crisis and emerge with rights intact at the other end.

Australian governments could have employed technology to improve and enhance support for the population they were responsible for. Instead, they prioritised tech projects based on an ideology of surveillance and control. There’s much to be learned from the COVIDSafe App’s embarrassing tale, including that ignoring rights advocacy only served to undermine the outcome in the long run. While much of the increased government surveillance in Australia through the height of the pandemic appears to have eased, the same cannot be said for other countries around the world.
(It also cannot be said for the persistent increase in non-government surveillance in workplaces and schools, which appears to be here to stay.)

Yet there is little comfort to be drawn in reflecting upon how quickly our governments turned to punitive surveillance rather than robust community support, and how little there is to prevent this from happening again. Nor should we take comfort in the fact that one of the key reasons Australia failed to develop immensely powerful digital tools for surveillance was not respect for human rights or freedoms, but sheer incompetence.

The danger of using polarising language and tactics in a rights context is that it forces people to make false choices. It was never privacy or public health, just as it’s also not privacy or safety. In times of crisis, governments often rush through extraordinary powers—many of which undermine human rights—that are rarely wound back upon return to relative calm. For instance, Australia hastily passed just shy of one hundred national security laws replete with unprecedented rights-infringing powers, justified as necessary to respond to the immediate threat of terrorism in the wake of the September 11 attacks. Over twenty years later, only one significant power has been repealed.

But what happens when we’re always in crisis? Before the pandemic, terrorism was the galvanising force behind legislation to undermine encryption or to establish the metadata retention scheme. Before that, it was drugs and organised crime. Today, it’s the spread of Child Sexual Abuse Material (CSAM) and concerns regarding children accessing online pornography. Without dismissing the gravity of any of these issues, we are never without a looming Big Bad ready to justify undermining privacy and security, or to strengthen mechanisms for surveillance or censorship. Big, emotive justifications for human rights encroachments are politically effective, but what should alarm us most are the outcomes.
For example, the controversial metadata retention scheme was established in 2015 under grand assurances that the data would only be used to protect Australians from the most serious of crimes—like terrorism. Pushback on the proposal was met at the time with political disdain, positioning those with concerns as lacking appropriate regard for national security. Fast forward to 2016 and over sixty agencies applied to access the data, including local councils seeking to chase down fines. Jump to 2021, and the Ombudsman found that all agencies investigated had accessed Australians’ metadata without proper authorisation. In 2023, metadata is being used to check welfare recipients’ relationship status.

To continue to do digital rights activism despite the seemingly Sisyphean nature of the task requires stamina, resilience and solidarity. Especially because absolutely none of this is new. In 1998, Tim May proposed that the ways that privacy and anonymity would be attacked would be by invoking the ‘Four Horsemen of the Infocalypse’—namely ‘terrorists, paedophiles, drug dealers, and money launderers.’ Over fifteen years later, Cory Doctorow highlighted how the very same tactics used to undermine encryption in the ‘Crypto Wars’ of the 90s were back again. While the digital rights movement has thankfully evolved away from cyber-libertarianism, the very same arguments are still being used around the world today to push for invasive and repressive tech policy that threatens our collective human rights and safety.

Today, the major justification for tech policy that poses threats to our rights, security and safety is the highly politicised notion of protecting children online. Proposals for age-verification systems to restrict access to certain material—such as pornography—are rife across Australia, the UK and the US despite constant controversy and criticism.
Such systems are not particularly effective at preventing children from accessing pornography, and they create critical privacy and security risks. Calls for increased automated content moderation on social media platforms similarly seek to protect children from seeing inappropriate material, but ignore the persistent challenges of bias, inaccuracy and over-capture—resulting in, among other things, censorship of LGBTQ+ people, sex workers, and sexual health and education advocates. Moves to undermine end-to-end encryption, or to sidestep it altogether with proposals for proactive detection and client-side scanning for particular content, are also framed as necessary to protect children, but ignore how weakening digital security makes everyone less safe, including children.

None of this is to say that we should do nothing, but it is to say that we should be thinking very critically about the kind of technology we usher into existence and allow to become the norm. It would be dishonest to suggest that I don’t sometimes become shaken when people use rhetorical tactics to suggest that I personally must not care or worry about abuse, violence or public health. Many of us fight for a liberatory digital future precisely because we care about these issues. We must find ways to use technology to build a future that doesn’t equate justice with punishment, or safety with surveillance.

Technology—the devices, the industry, the political ideology underpinning it, and the policy and legislative agendas around it—presents critical tensions for rights and freedoms in the digital age which stand to impact all of us. So how can we have the requisite discussions when the same rhetorical weapons are repeatedly used to crush criticism?

The first time I met James Clark—a long-term social movement organiser—he made a compelling observation.
The digital rights movement, he said, appeared to be stuck where the climate movement had been several years ago: we are relying on convincing people with research and logic, and it’s not working. Echoing Bill McKibben, he emphasised the importance of not mistaking an argument for a fight. You can win an argument with facts and logic, but to win a fight you need power.

The arguments employed by those pushing repressive tech policy aren’t designed to be effective at convincing those with expertise in technology and digital rights—they’re designed to influence those reading, listening, or otherwise tuning in. Privacy defender Meredith Whittaker has said: ‘We are right, our arguments are robust, and we have done the reading. But if we want to defend privacy, we’ll need to be coordinated and bold, and not make the mistake of assuming that being correct is in itself a strategy.’

There is a vital role to play in continuing to show up and to hold ground. Having the same debates, bearing witness, and documenting injustices doesn’t always feel groundbreaking, but the stakes couldn’t be higher. By continuing to critique and offer alternative approaches to surveillance and control, digital rights activism is laying the groundwork so that one day, those ideas will be picked up and used to help build a better future for all of us.

In moments of doubt, I take comfort in remembering that the Luddites—a group of working-class political organisers so purposefully misrepresented that their name is now often wielded as an insult—laid essential groundwork for thinking critically about, and rebelling against, exploitative and oppressive uses of technology under capitalism. The Luddites were punished for their rebellion and had their name besmirched by the elite, but their work was not for nothing: it is increasingly being referenced and revitalised today. It should inspire us to keep showing up every day, despite the fatigue of fighting the same recurring rhetorical battles.
But we also need to avoid becoming trapped in an argument to the detriment of the fight. From digital rights to climate action, from prison abolition to LGBTQ+ rights, our fights are connected. We can all keep having our own individual arguments and attempting to convince people to care about our respective causes, but in the end what will really shift the dial is to recognise our common ground and build power together across movements. Socially progressive activism requires resilience and persistence—a muscle we exercise in every frustrating debate—but it also requires solidarity and care.

Image: Flickr

Samantha Floreani is a digital rights activist and writer based in Naarm.