Following on from last week’s data-focused newsletter, let’s talk about doppelgängers.
When I first came across the notion of data doppelgängers I was fascinated by the concept - what the researchers I’d been reading referred to as “a person’s unwanted digital second self”.
Something I really liked about those researchers’ perspective in particular is the way they connect the idea of the data doppelgänger to the historical use of the doppelgänger in literature. Unfortunately, it doesn’t appear the full paper was ever published (it wasn’t then, and I still can’t find it now). If you’re out there, Alexander Reppel and Isabelle Szmigin - I want to know more!
If you’re someone who enjoys podcasts, I’d also recommend this episode of The Digital Human which presents a slightly broader overview of doppelgängers in the digital age. It’s a pretty massive topic and certainly much bigger than this newsletter can cover.
Spotify knows me better than I know myself
In short, you have a data doppelgänger. An uncanny approximation of you based on your online data trail. I say uncanny because, whilst constructed from elements of you, your data doppelgänger reflects back an image that may be familiar but isn’t entirely exact. Big brands and advertisers don’t really care about minor inaccuracies - they have enough to go on to sell you stuff - but you might be more affected by those discrepancies than you’d first think.
“This can have consequences for the user when they and their doppelgänger become misaligned, such as in the case of women who have suffered miscarriages still receiving targeted content designed for expectant mothers (Merrick, 2014). In this example, whilst the user has experienced a miscarriage, her data doppelgänger is still pregnant.”
- me, I wrote this
We don’t necessarily notice our data doppelgänger until it shows itself through inaccuracy - when you laugh at a ridiculous advert that clearly doesn’t make sense for you and think, who does the algorithm think I am? But most of the time the sheer level of personalisation on the web is invisible, because there is no way of knowing what’s being hidden from you. Your experience of the web is directly mediated through the eyes of the doppelgänger. As a result, you get shaped by the algorithms: the parts of you that don’t fit get rounded off as they guide you into measurable categories.
Like how some days I wonder whether I’ve spent so long using Spotify’s recommendation algorithm that my music taste is completely different from what it might otherwise have been. Pick any song from my 2021 playlist so far and let Spotify do the rest. It will keep playing songs I like - many of which feature on that same 2021 playlist I constructed for myself - even if you’re not in any way connected to my account. Because “music Edie likes” is in fact an invisible, unnamed category (“choppy chill-out electro pop feat. weird noises”?) that they’re probably serving up to more than just me. What feels like me-ness is probably many other people’s-ness.
I wrote a whole essay on how this is kinda wild and a little bit scary - and goes much deeper, with far bigger consequences, than mere Spotify recommendations. LMK if you fancy a read.
PS This is also the reason that no, your phone isn’t listening to you - it just knows you way better than you think it does. It appears we are frighteningly predictable.
What has this got to do with anything?
This interest and research inspired a piece of artwork I produced in 2018 called The Doppelgänger (which I’ve mentioned briefly before). It featured a live-streamed view of an unseen user’s computer screen, giving hints as to the type of person they might be through personalised search results and advertising suggestions.
I’m currently revisiting this work and some of its themes in my exploration of apophenia and delusional thinking. That’s where I think these ideas of unseen algorithmic actors, prediction and personalisation sit really interestingly within the unreality paradigm.
One of the first things a doctor checking for delusional thinking will ask is whether you feel like TV or radio shows are talking directly to you - but in our current personalised reality, is that really so improbable?
If you’ve noticed I’ve been a bit less active on the interwebs lately, that’s because I’ve been working my ass off on a v exciting (but time-consuming) augmented reality commission called Folklorica. It launches next Monday (17th) at www.folklorica.art and IRL in the vicinity of Warwick Arts Centre.
Warwick Arts Centre is a really interesting organisation to be working with right now, as it connects the University of Warwick (which is actually in Coventry) with Coventry’s year as UK City of Culture 2021. Folklorica features a bunch of hand-crafted digital sculptures I created in response to the urban legends and folklore of the region, as told by people connected with the Arts Centre.
By this time next week that will be live! So wish me luck and see you next Wednesday pals xoxo