During the past few months, I’ve been doing a lot of writing work for IF, a technology studio founded by the award-winning designer and entrepreneur Sarah Gold in 2015. IF helps organisations design software that uses data and machine learning in a way that works better for people. It examines the way software asks for our consent, bombards us with impenetrable terms and conditions, and makes itself difficult to understand or modify, then asks what an alternative that prioritises transparency, accountability and public safety might look like. They’re growing fast and have a lot on, so I’ve been helping them write and publish more: structuring and editing blog posts, speeches and pitch decks, as well as coming up with catchy names for abstract concepts.
I’ve been interested for some time in the ethics of the internet and digital technology, and in the project of imagining better alternatives to our current reality, so this was work I was excited to get involved with. As for many people with an interest in the ethics of our technological age, it was probably the Snowden leaks that first made me think seriously about the way my online activity was being monitored and what that meant.
A graphic novel called The Private Eye made me think more about the effects of surveillance on our everyday lives. Part of what grabbed me was the dissonance between my own thoughts and feelings about privacy. Intellectually, I understood that internet surveillance wasn’t so different from a neighbour monitoring me through a window, which would obviously creep me out. But in practical terms, I clicked ‘agree’ and downloaded all the apps that everyone else did, freely giving up control over data about me. I couldn’t make myself feel the importance of protecting my information.
To sort through my thoughts, I talked to a lot of people doing practical work to protect people from online surveillance, especially overpoliced and other vulnerable groups, and wrote an article about it for Huck centred on the grassroots ‘cryptoparty’ movement. That was in early 2017, when the aftermath of both the 2016 US presidential election and the British referendum on EU membership loomed large. It wasn’t clear until a year later, when ex-Cambridge Analytica employee Christopher Wylie revealed that the firm had used people’s Facebook data without their consent to influence political processes, just how outsized a role illicit data use had played in the unexpected outcomes of those two events.
I came across Sarah Gold’s work at about this time, and was impressed by her combination of practicality, deep research experience and a strong ethical viewpoint. Rather than simply talking about privacy, IF works with organisations to improve how their data-driven services and machine-learning systems actually work.
I’ve worked with Sarah on her most recent blog posts, which are up on IF’s site; more will be rolled out in the coming months. In the meantime, I’m also slowly making my way through Shoshana Zuboff’s The Age of Surveillance Capitalism, which is galvanising and eye-opening. For a taster, I recommend this interview with Zuboff on the LRB-affiliated Talking Politics podcast.