Vertical video. Content is king. Personalisation. 2021 rolled around, and so did the ‘trends’ pieces – each with its own shiny take on the future of social and digital that sounded eerily like every trend piece before it.

Daunted by the prospect of trying to add a new flavour to these stories after a year that showed just how little anyone could predict, two of TMW’s strategy team began to explore the more dystopian landscape of our digital futures.

Looking toward the next quarter, year, decade and century, Esme Noble and Olivia Wedderburn explored abstract concepts and scarily adjacent realities that, should life imitate sci-fi, would fundamentally change how consumers operate across online spaces.


Explanation

Individuals can register key components of their identity with an AI database that alerts them if they’re being impersonated elsewhere on the web. They can register their physical likeness, fingerprints, iris ID and any other traits they deem personal and open to impersonation, such as their voice.
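For the technically curious, here is a very rough sketch of the shape such a registry might take. Everything in it is hypothetical (the IdentityRegistry class, the fingerprint helper, the wording of the alert), and a real system would compare biometric embeddings against a similarity threshold rather than exact hashes; the SHA-256 digests below are only a stand-in to keep the example self-contained.

    # Hypothetical sketch of the registry described above: a person registers
    # 'fingerprints' of the traits they want monitored, and anything found
    # elsewhere on the web is checked against that registry before an alert is
    # raised. Exact hashes stand in for real biometric matching.

    import hashlib
    from dataclasses import dataclass, field

    def fingerprint(raw_data: bytes) -> str:
        # Stand-in for a biometric embedding.
        return hashlib.sha256(raw_data).hexdigest()

    @dataclass
    class IdentityRegistry:
        traits: dict = field(default_factory=dict)  # trait name -> registered fingerprint

        def register(self, trait_name: str, raw_data: bytes) -> None:
            self.traits[trait_name] = fingerprint(raw_data)

        def check(self, trait_name: str, found_data: bytes, source: str):
            # Return an alert message if content found at `source` matches a registered trait.
            if self.traits.get(trait_name) == fingerprint(found_data):
                return f"Your registered {trait_name} appears at {source}. Was this you?"
            return None

    # Register a face, then simulate a crawler spotting the same data elsewhere.
    registry = IdentityRegistry()
    registry.register("face", b"holiday-photo-pixels")
    print(registry.check("face", b"holiday-photo-pixels", "a dating app in Hanoi"))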

In this world

You wake up in the morning to a notification on your phone that your face has been spotted on a dating application in Hanoi at 2.37am. You are asked to verify a) that this is your face and b) that it was your action. You click on the link and see that it is a publicly accessible photo of you on holiday. You confirm that it is you, but that it wasn’t you who uploaded it, and flag the IP address that posted it as a high impersonation risk. You carry on with your day.

The next week, you wake to another notification. This time, your likeness has been used in a video uploaded to a well-known pornography site from a location in the US. The video is not you, but it is almost impossible to tell: the face and body profile are alarmingly similar. You flag that this is not you but a deepfake of your likeness. The video is immediately removed, and the IP address is forwarded to the local authorities to decide how to proceed.

Will it happen?

Deepfake detection tech already exists, and there are various tools out there to help identify whether your image is being used in ‘catfish’ scenarios. But with deepfake technology evolving faster than detection can keep up, there is nothing yet that alerts you in real time when your image is being manipulated. However, with multiple deepfaked videos of government officials already in circulation, and Channel 4 doing a pretty good job of deepfaking the Queen for its 2020 alternative Christmas message, it is incredibly likely that this technology will be developed in the near future.
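To give a flavour of how the ‘is this my image?’ half of the problem is handled today, the sketch below compares two images by perceptual hash using the third-party Pillow and ImageHash packages (our choice for illustration, not a tool any of the services above is confirmed to use). Near-duplicate matching of this kind is roughly what reverse-image-search and catfish-checking tools rely on; spotting a convincing deepfake is a much harder problem that this sketch does not touch.

    # Rough sketch of image-reuse matching with perceptual hashes. A perceptual
    # hash changes little when an image is resized or recompressed, so a small
    # Hamming distance between hashes suggests the 'found' image is a copy of
    # the registered one. Requires the Pillow and ImageHash packages.

    from PIL import Image
    import imagehash

    # Placeholder images so the example is self-contained; in practice these
    # would be the photo you registered and a photo a crawler found online.
    registered_photo = Image.new("RGB", (256, 256), color=(200, 120, 80))
    found_photo = registered_photo.resize((128, 128))  # simulated lower-resolution re-upload

    registered_hash = imagehash.phash(registered_photo)
    found_hash = imagehash.phash(found_photo)

    # Subtracting two ImageHash objects gives the Hamming distance between them.
    distance = registered_hash - found_hash
    if distance <= 10:  # threshold is an assumption; tune on real data
        print(f"Likely a copy of your registered photo (distance {distance}) - raise an alert")
    else:
        print(f"Probably a different image (distance {distance})")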