Machinic Infrastructures of Truth



In today’s saturated cyberscape of post-truth politics, social media platforms use blue verification marks as guarantees of authentic content, supposedly helping users discriminate within a binary rating system of ‘true’ and ‘fake’. These verification systems are my entry point as a researcher in the field of cyberwarfare: they allow me to reverse-engineer the mechanics by which content comes to be deemed ‘fake’. This process depends on the scalability of the content, also referred to as its virality – a factor which, when analyzed, reveals the relational nature of the system.



To investigate the fluid identity of what is considered fake, I map the ‘Machinic Infrastructures of Truth’ (M.I.T.) – the term I use for the back-end of verification systems, powered by interconnected automated trackers. These systems discreetly measure user attention through cookies, transparent image files, and other devices. Yet by turning user engagement into a standardized commodity, M.I.T. establish new protocols that alter the very behaviour they aim to measure. Although these back-end infrastructures were initially designed to capture and retain users, they now generate feedback loops of misinformation and containment that the platforms profiting from them can no longer control.
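

To make this tracking layer concrete, the sketch below shows one of the simplest devices named above: the transparent image file, a 1×1 ‘tracking pixel’. It is a minimal, hypothetical illustration written in Python for this text – not the code of any actual platform – in which a server hosts the invisible pixel and logs which visitor loaded which page, simply because the browser requests the image and sends identifying headers along with it.

# Minimal sketch of a 1x1 "tracking pixel" server (hypothetical example).
# A page embeds <img src="http://tracker.example/pixel.gif?page=article-42">;
# when the browser fetches the image, the tracker logs who viewed what.

from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from datetime import datetime, timezone

# Smallest valid transparent GIF (43 bytes), served as the "invisible" image.
TRANSPARENT_GIF = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00"
    b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
    b"\x00\x02\x02D\x01\x00;"
)

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        # Everything the tracker needs arrives with the image request:
        # the page identifier, the visitor's IP, cookies, and referrer.
        record = {
            "time": datetime.now(timezone.utc).isoformat(),
            "page": query.get("page", ["unknown"])[0],
            "ip": self.client_address[0],
            "cookie": self.headers.get("Cookie", ""),
            "referer": self.headers.get("Referer", ""),
            "user_agent": self.headers.get("User-Agent", ""),
        }
        print(record)  # stand-in for writing to an attention-metrics database

        # Respond with the invisible image so the page renders normally.
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(TRANSPARENT_GIF)))
        self.end_headers()
        self.wfile.write(TRANSPARENT_GIF)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), PixelHandler).serve_forever()

Even in this toy form the asymmetry is visible: the page looks unchanged to the visitor, while every view is quietly converted into a standardized record on the tracker’s side – the raw material of the attention measurement described above.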



The project has three layers: an in-depth investigation dissecting cyberwar infrastructures, interviews with media scholars and counter-propaganda activists, and facial filters for users to apply when navigating machinic infrastructures of truth.