

Reconsidering Cyberwar:
an interview with Svitlana Matviyenko



A key thinker of digital militarism discusses cyber conflicts, the labour of users, and the geometries of the internet.


      Anna Engelhardt: There have been quite a few conversations about appealing to the materiality of cyberinfrastructures, such as internet cables and servers, instead of a vague notion of the global web. These discussions brought into the spotlight the differences in practices of data production and analysis, showing these infrastructures as functioning under the jurisdictions of particular countries, situated in specific power constellations. So, instead of starting by outlining a sweeping definition of cyberwar, can I ask you to sketch out the main modes and structures through which it operates? What frictions arise between these cyberwar diagrams if we look from the perspectives of Russia and the U.S., as if through a stereoscope?


        Svitlana Matviyenko: There is an important inconsistency between how cyberwar is imagined or discussed and how it actually erupts and disseminates. This is because cyberwar is often described by military and security specialists in terms of binaries. Most typically, it is still an old Cold War binary that structures the U.S. / Russia and/or the U.S. / China enmity, to ensure and support that stereoscopy, if I can use your wording, in which two different (if not opposing) perspectives collaboratively produce a shared stereogram that is deeply decisive and that forecloses the complexity of cyberwar logistics: its hybrid nature, its non-linear multidimensionality and asymmetric dissemination, its shifting proxy battlefields. Cyberwar may be there where it is not (seen), as well as where it actually is (seen).

Cyberwar appears as a very murky domain, with multiple and contested activities and practices ranging from cybercrime, cyberespionage, and cyberactivism to both conventional and nuclear war, all of which are, of course, included in the military and security definitions. But there are several crucial aspects that are omitted and overlooked. One is a politico-economic analysis of cyberwar, which has become the subject of several books, and I want to draw your attention to at least one of them, The Real Cyber War: The Political Economy of Internet Freedom (2015) by Shawn Powers and Michael Jablonski, which was, perhaps, the closest in its conceptualization of cyberwar to Nick Dyer-Witheford's and mine in Cyberwar and Revolution. They define cyberwar broadly, conceptualizing it as "the utilization of digital networks for geopolitical purposes, including covert attacks against another state's electronic systems, but also, and more importantly, the variety of ways the internet is used to further a state's economic and military agendas" (2). Building on their and others' work, we focus on its complexity, where Clausewitz's "fog of war" becomes the "fog of cyberwar" as it operates against visibility and transparency, as a veritable fog machine: cyberwar operations are usually conducted covertly and are often intended to confuse; hacks are often hidden from view and, when discovered, are laden with misdirection; the attribution of attacks is often suspended; signals intelligence implodes into infoglut. This is where the epistemological condition of cyberwar oscillates between no clarity and too much clarity, between no proofs and too many proofs.

What I also like about this very useful metaphor of stereoscopy is how it captures the complementarity between the imaginary enemies. It reminds me of a 1970 American science fiction thriller film, Colossus: The Forbin Project, directed by Joseph Sargent, about a giant computer, Colossus, "the perfect defense system," which, upon its activation, immediately identifies another similar computer on the Soviet side, Guardian, and sends a message to the team of military staff and engineers, "There is another system," with which Colossus wants to connect, under the ultimatum that it will otherwise initiate a nuclear war. When allowed, these two defense system computers engage in an act of machinic love, speaking in a language only they understand, and in doing so, they excommunicate the humans. In a panic, people disconnect the computers, and the computers launch two nuclear missiles. This is where Dr Charles A. Forbin, the chief designer of the American computer, finally realizes that "Colossus may be built better than [they] thought." War (or, let us say, war as it is perceived by humans) is, in a sense, an ultimate demand for fusion with a complementary or related system. (It is similar to how Wendy Chun speaks about the "promiscuity" of our machines, which constantly initiate connections among themselves, beyond the awareness of their owners, leaking and sharing information.) In this sense, it may serve as a metaphor for many non-sci-fi scenarios where human users are an essential part of the war machine's operations insofar as they are an excommunicated or alienated appendix. Users are the labour force of cyberwar: sharing, liking, hating, commenting, arguing and, in the process, revealing their own data and the data of their worldwide networks. Along with those who are part of the paid-for-hire troll armies, they are the precarious workers of war, exploited in different obvious ways, but also in a not so obvious way: on the level of the unconscious.


        A.E.: It struck me how, in Cyberwar and Revolution, you portray the rise of cyberwar as connected to the rise of platform capitalism. I think this perspective might be extremely intriguing for Russian-speaking users, whose imaginary is understandably preoccupied with state governance of all things digital. Platform monopolists such as Yandex and VKontakte are typically portrayed as bringing life and innovation, a process that has been violently disrupted by the direct influence of the state. What can we learn from the role Google and Facebook played in bringing cyberwar to life?


        S.M.: These platforms, and I am speaking about Google and Facebook, played a very important role, of course, in the rise of cyberwar (as we define it in our book). But this should not be surprising at all, because the genealogy of these platforms, or of their essential elements, as well as the genealogy of the Internet itself, goes back to military research. The latter is probably well known to you and the readers: the story of the ARPANET, funded by the Advanced Research Projects Agency (ARPA) of the United States Department of Defense, which became the first wide-area packet-switching network, constructed to survive a war and therefore for war. Although it was re-imagined in the 1980s, when the Internet dropped its military identification and became a mass medium, the encryption of war values is in its infrastructure; it lives. Here I must admit that when Putin says that the Internet is "a CIA project," it sounds right… And we need to understand the extent of the military impact on the development of most of the popular technologies and platforms. While everyone speaks about the ARPANET, it is less known that Google Maps, for example, grew out of the acquisition of Keyhole, a small Silicon Valley company supported by venture capital from the CIA's front company In-Q-Tel, which, as Powers and Jablonski remind us, worked to "develop fast, accurate and searchable digital maps for the U.S. Armed Forces" (84). Or, as Mariana Mazzucato has shown in her work The Entrepreneurial State: Debunking Public vs Private Sector Myths (2013), the research behind almost every component of Apple's iPods, iPhones, and iPads was funded almost exclusively by government agencies, predominantly by the U.S. Department of Defense. There is no end to such examples. At the same time, we cannot deny the role and agency of users in the development of the Internet; they are, indeed, as Janet Abbate wrote in Inventing the Internet (1999), "the most neglected element." The Internet, and in fact most of it, is a collective creation, which remains unacknowledged by the corporations. For decades, they have successfully managed to persuade their patrons, the users, that technology is there for them to use "for free." Instead, the users "are cast merely as its conscious linkages" (Marx), whose labour is appropriated by platform owners, extracting profit and miraculously making their billions. This schema is as old as capitalism; it is beautifully articulated in Marx's Grundrisse back in 1858, in its very cybernetic "Fragment on Machines," where he explains that "labour appears, rather, merely as a conscious organ, scattered among the individual living workers at numerous points of the mechanical system; subsumed under the total process of the machinery itself, as itself only a link of the system, whose unity exists not in the living workers, but rather in the living (active) machinery, which confronts his individual, insignificant doings as a mighty organism" (693). Cyberwar is a capitalist war; it operates like any other capitalist process, although it is significantly more aggressive.

This is why the ideas of progress associated with platforms, foreign or domestic, should always be confronted by problematizing the conduct of our engagement with technology, in a way that draws attention to the most overlooked aspect of cyberwar: users, who have become a key element of the cyberwar machine, and the corporate platforms are responsible for that. I am saying this because, as much as cyberwar is conducted by algorithmic means, users are essential because they function as almost automatic, mechanized relays in the process of information transmission. They are dehumanized (for example, in the way they often exhibit robotic behaviour; in how they are reified and treated as data; and, of course, when their lost lives are qualified as random "collateral damage" on the margins of kinetic wars). At the same time, users are exploited precisely through their "humanness": desire, fear, anxiety, knowledge or the lack thereof, all of which is used to polarize online publics and sustain the reproduction of antagonism and toxicity within various accidental and non-accidental echo chambers and filter bubbles. This is what Peter Sloterdijk describes as the collapse of the global world, now replaced by a continuous "war of foams."

Such polarization is another structural principle of cyberwar. And I want to suggest that on the level where it seems most visible to us, i.e., in Facebook battles, such polarization is without substance. The division between opposing Facebook camps, mobilized by identity politics, is blurry and unstable. It functions to foreclose the real division: between those who profit from the volumes of useful data produced by ideological agonies, the State and the Corporation, and those whose fears and desires are instrumentalized by war, the platform users.

(For the Russian context, I want to draw attention to the recently released first transparency report by Yandex, where you can learn that it refuses only an extremely small percentage (16%) of the government's requests, a refusal that, of course, could easily be compensated for by what users make publicly available on social media platforms, so that, as a result, the assembled user profile indeed reveals much more than users, pure data-subjects, know about their own lives themselves. This is how they become prone to sophisticated (or even not so sophisticated) disinformation and manipulation, or victims of leverage, now or in the future.) Getting back to the point I made earlier about the substantial division, you can think of it through the category of class, which reflects the precarious positioning of users in the neoliberal process of data production and their uneven access to resources and possibilities: they are excluded by platform algorithms from receiving certain information or, on the contrary, receive only the information selected for them by algorithms on the basis of biases and stereotypes embedded in software (see, for example, the works of Frank Pasquale or Safiya Umoja Noble). This division is crucial because it reproduces and reinforces other substantial divisions within societies, established by the expansion of modern capitalism: the divisions by race and gender. These substantial divisions, with their legacies of colonialism and imperialism, continue to enable and support structural racism and structural inequality. But as long as they are foreclosed by the stereoscopic image of the "war of foams," the unsubstantial Facebook battles, we will not be able to address these heavy legacies that tear us apart.


        A.E.: How would you define the connection between surveillance and propaganda in the field of cyberwar? In line with the narrative referred to in the previous question, I would love to look into the long shadow that Russian state surveillance casts on the companies' tracking and information distribution ecosystems. In particular, the various approaches towards personalisation and targeting intrigue me, silenced as they are by the omnipresence of the state. What are the new modes of personalisation and depersonalisation deployed in cyberwar through such state-capital collaboration? How does it weaponise the defining characteristic of social platforms, their capacity to make information go viral?


        S.M.: I could address it through the notion of "communicative militarism," which I develop drawing on Jodi Dean's notion of "communicative capitalism," introduced more than a decade ago. Back in 2005, Dean wrote that "the fantasy of activity or participation [in the networked society] … materialized through technology fetishism," which was, and still is, often confused with freedom or democracy. She wrote about the sense of ineffectiveness of online communication, as those in power learned to ignore attempts to hold governments accountable for their actions, while the continuous flow of such communication only fed capitalist platforms. At that time, Dean described it as a "post-political" world. A decade later, we are witnessing how capital has learned to monetize this communication by highly politicizing and militarizing it.

As we evaluate the role of corporate platforms in cyberwar, it is crucial to pay attention to problematic complementarities and reciprocities that often remain hidden. For example, there is a troubling reciprocity between surveillance and mobilization. It is very disturbing that the power, energy, tactics and techniques of social mobilization, examples of which we have been seeing consistently throughout the last decades, have been constantly hijacked by the state, the police, or corporations. Surveillance practices are updated after each protest; states are growing their databases of suspects; corporations are finding out more about users, their troubles, sensitivities, hopes. And the same data archive is used for targeting users for political or military reasons and for commercial reasons, with advertising. That is regarding the responsibility of the platforms; but users, too, are responsible. Living in "surveillance capitalism" (Shoshana Zuboff), we are all complicit, agreeing to this ongoing deception that we have normalized and no longer find menacing. By letting the platforms occupy our lives, we have submitted our bodies and the bodies of those close to us as living advertising platforms that operate according to the YouTube regime. We agree that it is okay to be a little bit deceived. But an ad is a blueprint of fake news. Fake news enters our thoughts in a very similar way, playfully, as funny or silly digital parasites that, we think, could be easily dismissed or forgotten. But they live and make connections with other half-remembered hints and clues, which creates a perfect environment for breeding the various conspiracy theories that are so needed in a time of general uncertainty. The only thing that can serve as an antidote here is critical thinking, but to develop it, one needs time and appropriate infrastructures. And even then, there is, or can be, disinformation to trick anyone; it just depends on the amount of data available about a user to craft a perfect product and the amount of funds a government can spend on catching you.


        A.E.: To address the politics of virality, it would be essential to employ your insights on how we experience software as prosthetic, which you touch upon in the book The Imaginary App. What are the tensions between relating to something as a prosthesis as opposed to an extension or modification? How do these embodied relations to the online space manufacture cyberwar? Could you draw from such feelings about the digital landscape a knowledgeable way to talk about viral affects?


        S.M.: Indeed, several years ago I was doing work on apps, when they became "a thing." One of the ways apps were often interpreted was as "extensions" (with an allusion either to McLuhan's "media, the extensions of man" or to Mark Weiser's "ubiquitous computing," where the "quiet" computer was described as an extension of the user's unconscious). I noticed that the notion of "extension" was often used interchangeably with the notion of "prosthesis"/"prosthetic," which made me want to highlight the distinction between these concepts. While an "extension" is something more than you, a "prosthesis" compensates for what was lost; but here "extension" and "prosthesis" constitute a cycle of transformations, where every "extension" eventually becomes an essential "prosthesis." In the cybernetic cycles of merging with technologies, for example, by growing dependent on them for social, economic or other reasons, by delegating jobs to our machines, we transform in many curious ways, but often at the cost of allowing an unprecedented closeness with the machine. The machine, of course, remains a foreign mechanical body, operated by the corporation or accessed by the state, a part of them, regardless of how cutely it is personalized and accessorized to represent its users. This misrecognized imaginary closeness is exploited by cyberwar.


        A.E.: Analysis of cyberspace and cyberwar is saturated with spatial metaphors. Even though, as you recount in Cyberwar and Revolution, cyberspace was proclaimed in 2009 by the U.S. Department of Defense "as a military domain equivalent to land, sea, air, and outer space," this did not resolve the tension that one might feel trying to think of cyber as space. What directions and geometries would help us reflect along these lines?


        S.M.: I find any "global" and "spherical" metaphors very problematic when we speak about communication, economy or war. They are inherited from the 1960s' fantasies about the "global village" (Marshall McLuhan) or the cybernetic world "where we are free of our labors and joined back to nature, returned to our mammal brothers and sisters, and watched over by machines of loving grace," as it was ironically documented by the American poet Richard Brautigan in 1967. Today it is pretty clear that the "global Internet," the utopian concept of the "network of networks" as a unifying apparatus, has imploded. Instead, we find ourselves in a curved and traversed topological realm of ruptures, folds, gaps, chambers, bubbles and other panopticons that erect invisible walls between the user's eye and the gaze of power or the gaze of war. Here space surrenders to time because, without the consideration of time, these shapes remain incomprehensible. You need to spend time to realize that a topological figure is never what it looks like: in transformation, it preserves the relations between its points, nodes and ties while hiding them from you, so that continuities may look like ruptures and vice versa.

For example, Zygmunt Bauman and his several co-authors described data-subjects as caught in a Möbius-strip-like network space, their digital sovereignty claimed by various cloud sovereigns such as Amazon, Facebook, Google, Baidu, Tencent, VKontakte or Yandex. The two sides of the Möbius strip are the local national security apparatuses and global transnational surveillance. They constitute one twisted surface of continuous data flow that corporate platforms disseminate planetarily, which transforms users into data-subjects of conflicting laws. While totalitarian regimes silence users, neoliberal regimes nudge them to speak "freely," and it is almost impossible to say where the biggest danger is. But the longer we live the experience of divided data-subjects, the better we become aware of cyberwar's perfect trap, where everything said, and even more so the unsaid, will be used against us.
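[As an editorial aside for readers who want to see why the Möbius metaphor does this work: the strip's defining property is that its two apparent sides are one continuous surface. A minimal sketch, using the standard textbook parametrization of a Möbius band (an illustration added here, not drawn from Bauman or from Cyberwar and Revolution):

$$
\begin{aligned}
x(u,v) &= \left(1 + \tfrac{v}{2}\cos\tfrac{u}{2}\right)\cos u,\\
y(u,v) &= \left(1 + \tfrac{v}{2}\cos\tfrac{u}{2}\right)\sin u,\\
z(u,v) &= \tfrac{v}{2}\sin\tfrac{u}{2},
\qquad u \in [0, 2\pi],\ v \in [-1, 1].
\end{aligned}
$$

Because of the half-angle $u/2$, the point reached at $(u + 2\pi, v)$ coincides with the point at $(u, -v)$: travelling once around the band delivers you to what looked like the opposite side. Locally there appear to be two sides, the national security apparatus and transnational surveillance; globally there is only one twisted surface of data flow, which is precisely the property the metaphor turns on.]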


About the author


Svitlana Matviyenko is Assistant Professor of Critical Media Analysis in the School of Communication of Simon Fraser University in Vancouver, where she is also Associate Director of the Digital Democracies Institute. Her research and teaching are focused on information and cyberwar, political economy, media and environment, infrastructure studies, and STS. She writes about practices of resistance and mobilization; digital militarism, dis- and misinformation; Internet history; cybernetics; psychoanalysis; posthumanism; Soviet and post-Soviet techno-politics; and nuclear cultures, including the Chernobyl Zone of Exclusion. She is a co-editor of two collections, The Imaginary App (MIT Press, 2014) and Lacan and the Posthuman (Palgrave Macmillan, 2018). She is a co-author of Cyberwar and Revolution: Digital Subterfuge in Global Capitalism (Minnesota UP, 2019), a winner of the 2019 book award of the Science, Technology and Art in International Relations (STAIR) section of the International Studies Association and of the Canadian Communication Association 2020 Gertrude J. Robinson book prize. She is currently working on the co-edited volume Cyberwar Topologies: In Struggle for a Post-American Internet.