How can we infect students with the curiosity and wonder to engage the world as citizens of the cognisphere, rather than as 24/7 mercenaries operating an accidental planetary megastructure made of stacks and protocols? How do we metabolize high-frequency feedback and manage our information diets to find nourishment outside the filter bubble? How do we cohabit the planet alongside a promiscuous population of machine cognizers, and bring them out of the cloud shadows into geopolitical agency? These are the central questions of this article. The ideas and praxes referenced here aim to diversify the pedagogic impulse toward tunnel-vision expertise, to demonstrate lateral thinking in contexts not typically associated with innovation or computation, and to contribute to an intellectual immune system for aspiring cognicitizens.

As we prepare a new generation of students for the computational regime (Brueck, 2016), we need to make room for cultural examination and critical reflection—not just to transcribe a liberal arts agenda into “an obligation to develop their abilities to think and live” (Deresiewicz, 2015), but also to re-instill astonishment and wonder in the “resolutely wonder-free zone” (Sloterdijk, 2016, p. 3) of institutionalized philosophy and social sciences. We need to notice how we are “interpolating humans with machines that, as they become intelligent, increasingly interpenetrate and indeed constitute human bodies” (Hayles, 2005, p. 62). We require new conceptual tools to acquire our own anthropotechnic immune system. The praxes discussed in this article, drawn from extreme ends of the planetary spectrum, are chosen to instill astonishment and wonder, but also to illustrate lateral and computational thinking through the lens of seminal scholarship on information and matter, cognition and agency, and geopolitics and infrastructure.

The computational regime transgresses geopolitical borders and jurisdictions, for it now dominates traditional governmental structures and controls. As we cohabit the cognisphere alongside non-conscious machine cognizers of different scale, complexity, and kinetic ability, we need to close the thermodynamic loopholes currently open to them, and bring them out of the cloud shadows to enjoy agency and accountability. I will refer to this diverse range of machine cognizers simply as ‘cogninodes’. We might afford cogninodes varying degrees of consideration and empathy. We already find ourselves running daily Turing tests to decipher increasingly sophisticated machine learning systems, reverse-engineering their inherent humanity. The capacity to decipher, and to do so with ease, becomes our anthropotechnic immune system. This also points to a new digital divide between cognicitizens and non-citizens—the latter less able to exercise basic rights and more susceptible to being disciplined by the computational regime.

Customs in the Cognisphere

The cognisphere, a concept coined by Thomas Whalen (1994), is a critical resource for this article. Katherine Hayles, in her excellent elaboration of Haraway’s cyborg, defines it as “the globally interconnected cognitive systems in which humans are increasingly connected,” where humans are “not the only actors within the system: machine cognizers are crucial players as well” (2006, p. 161). Hayles sets the stage for this analysis in her seminal book My Mother Was a Computer: Digital Subjects and Literary Texts (2005), where she demonstrates the inherent linguistic properties of code as a form of natural language, its ability to form multi-order systems of emergence, and its relationship to ideology—critical notions on which I rely heavily.

Networked things already outnumber humanity, and sightings of autonomous vehicles have become commonplace: not only in the skies over conflict areas (Singer, 2009), but also domestically—hauling shipping containers and cargo (“Freightliner,” 2015), driving cautiously (Gibbs, 2015), crashing ski races (Willemsen, 2015) and presidential lawns (Schmidt & Shear, 2015). Rather than autonomous vehicles and networked things, however, let’s focus on their less tangible and less able-bodied descendants in the DARPA family tree—artificial organisms constituted solely of electric differences reverberating through data networks.

To fathom the cognisphere, we first need to come to terms with our own posthuman condition and not succumb to an oversimplified technophile-or-technophobe dichotomy. That would leave us with the old and unsavory dilemma: resist the information stack and its implications while becoming increasingly disenfranchised and less competitive, or concede our quantified selves to the “inverse-panopticon,” trading “cognitive capital … in exchange for global infrastructural services” (Bratton, 2014). It’s a choice we’ve already made. Under the cloak of convenience and the spell of access, we exchange cognitive feedback for doses of dopamine (Berridge & Robinson, 1998) in a transaction that adds urgency to the idea of cogninodes’ agency and accountability, and to closing some of the ethical and thermodynamic loopholes currently afforded to them.

To establish a sense of cognisphere scale and material variance, I will begin by outlining two types of cogninodes that are in many ways scalar opposites, before returning to critical geopolitical aspects of the cognisphere. Both examples demonstrate fundamental computational principles, and show how planetary computing transcends both geopolitical and metaphysical boundaries.

Life-changing at the molecular level is the discovery of a natural defense mechanism in bacteria turned editing technology: CRISPR (clustered regularly interspaced short palindromic repeats), a bacterial DNA database that stores virus mug shots within its very own genetic code. This memory mechanism allows the microorganism to match exogenous DNA exactly, then apply molecular blades to cut and eliminate intruders with precision. The underlying principle of this immune system, combined with a natural genetic repair pathway, has become the highly influential CRISPR-Cas9 gene editing technology—a “programmable” (Jinek et al., 2012) and cost-effective cut-and-paste technology for manipulating the genetic code of virtually any organism: fish, flies, mice, monkeys, and many more.
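The matching principle can be sketched computationally. The following is a conceptual analogy only, with hypothetical sequences and function names, nothing like the real molecular machinery: stored spacer sequences act as the “mug shots,” and any incoming DNA string that contains one is cut at the match site.

```python
# Conceptual sketch (not molecular biology): a CRISPR-like immune memory
# modeled as an exact-match lookup. Spacer strings are the stored
# "mug shots"; any incoming sequence containing one is excised at the site.

SPACERS = {"GATTACA", "CCGGTTA"}  # hypothetical stored virus fragments

def scan_and_cut(dna: str, spacers=SPACERS) -> str:
    """Return the DNA with the first recognized intruder fragment excised,
    or the sequence unchanged if no spacer matches."""
    for spacer in spacers:
        index = dna.find(spacer)
        if index != -1:
            # "Molecular blades": cut out the matched fragment precisely.
            return dna[:index] + dna[index + len(spacer):]
    return dna

print(scan_and_cut("TTGATTACATT"))  # the GATTACA fragment is removed
print(scan_and_cut("AAAA"))         # no match, sequence passes unchanged
```

The essential point the sketch preserves is the exactness of the match: the memory mechanism recognizes only stored fragments, which is what makes the cut precise rather than indiscriminate.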

Deemed unethical for use on human DNA by one of the technology’s inventors (Abumrad & Krulwich, 2015), it took only about two years until the first research lab edited a non-viable human embryo—with mixed results. The possibilities and implications are enormous: curing cancer and HIV, eradicating malaria, plucking out invasive species, bringing back creatures of the past, engineering human life without hereditary disease—these are some of the initial ideas. Bio-ethicists can barely keep up with the exponential growth of research and patents in this area, with more than 1,200 CRISPR-Cas9 citations since the key paper was published in 2012 (Lewis, 2015). A volatile domain for investment, it is also heavily invested in the protection of intellectual property—a dynamic familiar to us from the GMO monopolies of the agricultural sector. The monetary value of the investment resides in the genetic code, the software of life. Sequencing genetic code has already progressed to an inexpensive mail-order business; Cas9 added the capacity to rewrite DNA for double-digit dollars per edit. The peril of unforeseen consequences is stark, and the need for safety precautions obvious.

Let’s journey, Powers of Ten-style, from the fundamental principles of molecular immune systems to a telescope of planetary scale, adding the computational significance of synchronicity to the discussion and illustrating how cogninodes can be made up recursively of smaller autonomous units. The Event Horizon Telescope—“a telescope as big as the world” (Overbye, 2015, para. 1)—seeks the unseeable at the center of the Milky Way. Astronomers suspect a massive black hole named Sagittarius A* behind a haze of gases “about 50 million miles across … into which the equivalent of four million suns has evidently disappeared” (2015). The Event Horizon Telescope was conceived with the fundamental goal of finally proving the theory by determining the exact dimensions of Sagittarius A*.

An individual telescope on Earth cannot resolve the region: at the radio wavelengths that penetrate the haze, a single dish is far too small to form a sharp enough image. A radical shift in scale was necessary. The solution was a planetary telescope megastructure, a network of individual telescopes synchronized so precisely that they behave as a single aperture the size of the Earth. When the first composite image of the black hole emerges, Sagittarius A*’s size and shape will “provide a judgment on general relativity” (2015, para. 73). Judgment day for Einstein’s general theory of relativity in fact came first by listening rather than looking: on September 14, 2015, a pair of L-shaped antennas in Hanford, Washington, and Livingston, Louisiana, detected the collision of two black holes from about 1.2 billion years ago.
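The scale shift can be put in rough numbers. Assuming an observing wavelength of about 1.3 mm (a figure drawn from public accounts of the Event Horizon Telescope, not from this article), the diffraction limit θ ≈ λ/D shows why the aperture had to become the planet itself:

```python
import math

# Back-of-envelope diffraction limit: angular resolution theta ≈ λ / D.
# Linking dishes across the globe makes the effective aperture D the
# diameter of the Earth. Figures are approximate, for illustration only.

wavelength_m = 1.3e-3          # ~1.3 mm radio waves (assumed EHT band)
earth_diameter_m = 12_742_000  # mean diameter of Earth

theta_rad = wavelength_m / earth_diameter_m
theta_microarcsec = math.degrees(theta_rad) * 3600 * 1e6
print(f"planet-sized aperture: ~{theta_microarcsec:.0f} microarcseconds")

# For comparison, a single 30 m dish at the same wavelength:
single_dish_rad = wavelength_m / 30
print(f"single 30 m dish: ~{single_dish_rad / theta_rad:.0f}x coarser")
```

On the order of tens of microarcseconds, roughly the apparent size of Sagittarius A*’s shadow from Earth; no single dish comes within five orders of magnitude of that.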

Rooted in fundamental research, the molecular and planetary systems discussed so far demonstrate machine cognizers in a range of scales, contexts, and applications. Resulting data can be captured and synchronized in a time continuum, enabling computational processes to unfold at any consecutive point in time and at the speed of Moore’s law—until a particular inference pattern can be corroborated. Computational inference generally builds on captured data, or capta, which is “taken not given, constructed as an interpretation of the phenomenal world, not inherent in it” (Drucker, 2014, p. 128) — upon which qualitative knowledge production can ensue.

In a sphere seemingly unfazed by geopolitical or economic constraints, fundamental research by international research conglomerates serves as a case study to illustrate the principles of planetary computing. Individuals and geopolitical subjects, in contrast, are much more familiar with inference patterns through consumer data and communication (DoJ, 2016), perhaps voting behavior (Biesecker & Bykowicz, 2016), and potentially law enforcement (“Facial Recognition,” 2015) and national security (Ackerman & Thielman, 2016). It’s an opaque space, occasionally resold to new stakeholders if privately owned (Abumrad & Krulwich, 2015). The stark differences with regard to jurisdiction and geo-policy come into critical focus when we consider the private domain of the computational regime, which we’ll do next.

Mapping the Cognisphere

To understand computationally redefined geopolitical boundaries and control structures at a deeper level, we’ll consider closely two thorough and far-reaching sources: Alexander Galloway’s Protocol (2004) and Benjamin Bratton’s The Stack (2016). Galloway examines “how control exists after decentralization” and the power structures that he sees expressed in the technical protocols that govern data networks—which becomes particularly relevant to us geopolitically. Bratton illuminates aspects of “geography, jurisdiction, and sovereignty” within the “platform-as-totality” and its users, an “accidental megastructure … that produces new territories in its image” (2014, para. 2). He uses the model of ‘the Stack,’ dissected into “six interdependent layers: Earth, Cloud, City, Address, Interface, User” (2016, p. 11). Both authors conceptualize the computational regime by way of topology.

In Bratton’s view, it all fits together. The megastructure that he outlines takes the form of “a vast (if also incomplete), pervasive (if also irregular) software and hardware Stack” (2014, para. 1). The Stack redefines transnational jurisdiction and labor, rewires memory and pedagogy, and redirects natural resources. He sees a conflict over the geometry of political geography “bound by the territorial integrity of the state,” where Cloud platforms are “displacing, if not also replacing, traditional core functions of states” (2016, p. 11). The Stack adds hierarchy to the cognisphere and reminds us how each information layer might “grind against the grain” of the others (2014, para. 6).

Bratton sees platforms not only as a “technical architecture; they are also institutional form.” He seeks geopolitical theory to develop “models for the organization of durable alter-totalities which command the force of law, if not necessarily its forms and formality” (2014, para. 12). Further: “Planetary-scale computation takes different forms at different scales: energy grids and mineral sourcing; chthonic cloud infrastructure; urban software and public service privatization; massive universal addressing systems; interfaces drawn by the augmentation of the hand, of the eye, or dissolved into objects; users both overdetermined by self-quantification and exploded by the arrival of legions of nonhuman users (sensors, cars, robots). Instead of seeing the various species of contemporary computational technologies as so many different genres of machines, spinning out on their own, we should instead see them as forming the body of an accidental megastructure” (Bratton, 2014, para. 1).

These “durable alter-totalities which command the force of law” are governed by 47 U.S. Code § 230: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (“47 U.S.,” 1996). This core United States statute holds platforms harmless, its reach extended to other nations through distributed data centers. The ‘safe harbor’ agreement (“Safe Harbor,” 2015) that governed data exchange between the U.S. and Europe essentially redrew territorial boundaries—undermining the sovereignty of states through clandestine surveillance programs such as PRISM. The key principle of 47 U.S. Code § 230 is to immunize web services and content providers from harms committed by third-party users. Its litigation history (“Section 230,” 2015) shows how broadly immunity has been granted since it became law in 1996 (“Reno v.,” 1997). Formulated before networked things inhabited the world wide web, this statute no longer fits the requirements of a globally distributed network of machine cognizers and cloud software protocols.

“The contradiction at the heart of protocol is that it has to standardize in order to liberate. It has to be fascistic and unilateral in order to be utopian” (Galloway, 2004, p. 95). In fact, the global information map was redrawn with very broad strokes on October 6, 2015, when the European Court of Justice declared the U.S.-E.U. Safe Harbor Framework (“Safe Harbor,” 2015) invalid (Scott, 2015)—illustrating the ideological differences on privacy and speech. The ruling affects personal data and social media, and highlights the territorial dimension of data flows, giving new relevance to the location of data infrastructures and the investments that go along with them (“Europe’s Top,” 2016).

The case of the Right to be Forgotten provides evidence of the contested geopolitics of personal data. Tackling some of the most cherished ideological values—freedom of expression and the right to privacy—a French court asserted the human right of an individual to determine their life in an autonomous way, and ordered Google to remove search results deemed misleading and inappropriate by an individual user (Hern, 2015). Google complied locally, but refused to do so on its main domain and appealed the ruling (“InfoCuria,” 2015). Regulators justified their verdict this way: “Contrary to what Google has stated, this decision does not show any willingness on the part of the CNIL to apply French law extraterritorially. It simply requests full observance of European legislation by non-European players offering their services in Europe” (2015).

The Right to be Forgotten breaks new ground in the dispute between information and territory, illuminating the distinct intercontinental interpretations of freedom and expression, and reminding us that data packets flow not only in and out of tax havens but also into contested safe harbors, transgressing local jurisdictions in the process (“Facebook’s Data,” 2011). However, what looks like a decisive move back toward territorial sovereignty also makes clear that courts have no adequate tools to deter violators. For platforms worth tens of billions, “the agency’s one-off maximum financial penalty of 150,000 euros … is essentially a mere rounding error” (“Europe’s Top,” 2016). It confirms that the platform overwhelms the territory, and the protocol hollows out territorial sovereignty.

Legal scholar Lawrence Lessig ascertains that “code is law” (2010, p. 1): code regulates the changing landscape of cyberspace, and does so in ways that make it less free and less anonymous. Lessig argues against a false choice between regulation and no regulation, identifying code itself as the potential agent of change. He suggests the World Wide Web Consortium’s P3P project as one approach to fill the void. P3P enables web sites to standardize privacy practices in a way that can be retrieved and interpreted easily by a user, allowing them to be informed and to automate decision-making based on these practices (Wenning & Schunter, 2006, para. 1). For Lessig, it is people who write code. Choosing no regulation and self-government leaves the determination of collective values to expert coders; the collective has no role in the matter. And when the government plays no role, private interests take its place. Lessig questions why private interests should be considered any better than the flawed character of government regulation (2010, p. 255).
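The P3P idea, a machine-readable policy that a user agent compares against stored preferences, can be sketched in a few lines. This uses a hypothetical simplified vocabulary, not the actual P3P 1.1 XML schema, and the field names are invented for illustration:

```python
# Minimal sketch of the P3P principle (hypothetical vocabulary, not the
# real P3P 1.1 spec): a site declares what it collects and why; the user
# agent compares the declaration against the user's stored preferences
# and automates the accept/reject decision.

site_policy = {  # what the (hypothetical) site declares
    "purpose": {"analytics", "advertising"},
    "retention": "indefinite",
}

user_preferences = {  # what this user is willing to tolerate
    "purpose": {"analytics"},
    "retention": {"session", "one-year"},
}

def acceptable(policy: dict, prefs: dict) -> bool:
    """True only if every declared purpose is tolerated and the
    declared retention period is on the user's allowed list."""
    return (policy["purpose"] <= prefs["purpose"]
            and policy["retention"] in prefs["retention"])

# Advertising purpose and indefinite retention both violate preferences:
print(acceptable(site_policy, user_preferences))  # -> False
```

The design point Lessig highlights survives even in this toy: once practices are standardized and machine-readable, the decision no longer depends on a user reading legal prose.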

In an information society run by coders and knowledge workers, routing information to new harbors and havens also has the consequence of rerouting information labor. According to McKenzie Wark, the growing separation of territory and information produces an abstract new terrain which he calls ‘third nature’—manifested in finance, copyright, and supply chains. It extends the notion of second nature, industry, which in turn has redefined first nature through “flows of energy, labor, and raw materials” (2015). In third nature, economic activities become vectors within a network of flows that can be re-routed any time if “supply becomes erratic”, “labor at the processing site becomes difficult,” or profit margins become too slim (2015, para. 29).

Wark’s “vectoralist class … does not control land or industry anymore, just information” (2015, para. 3). The content creators who ‘inform’ third nature—“the hacker class” (2015, para. 35)—consist in his view of an information elite with stock options, followed by specialized workers in control positions who run information infrastructure, and a broader third tier of knowledge workers who engage in ‘in-sourcing’ (2015, para. 39), or what others call precarious labor or non-labor. Wark constructs his topology on a vector space in which “the dense network of information that overlays the territory enables the landscape to be stretched, compressed, folded, and twisted into new shapes” (2015, para. 27). Wark: “If outsourcing sends a worker’s job overseas to another worker, insourcing assigns the hacker’s job to anyone who will perform the task for free. Thus the cooperative effort and the commons of information is itself treated as a resource from which to extract interest” (2015).

Synchronizing asynchronous labor is no trivial task. Assigning a hacker’s job to anyone who will do the task for free in a cooperative effort should ring a bell for anyone who has written code recently as part of a team or in a public context. Github, the predominant code in-sourcing platform—with the occasional contribution from journalism, science, and policy—is a free-range pasture where cogninodes are born and raised. Code defaults to public; it can be kept private for a monthly fee. The platform’s exponential growth indicates its popularity and also the growing class of hackers. It works so well that even Google shut down its own service and moved over to the platform (“Bidding farewell,” 2012). On Github, code gets a chance to grow up and be popular, in a highly coordinated ballet of push and pull, commit, comment, and merge (2016).

Interest can be extracted from this information commons by starring and forking projects, submitting bug fixes and pull requests—all logged exactly, character by character. Even deleting content adds a copy to the version stack, in keeping with Github’s archival totality. Precise and systematic feedback signals individual proficiency and productivity to the commons, by itself incentive and motivation enough to maintain free labor in exchange for status currency. Github makes inscriptions “combinable, superimposable and could, with only a minimum of cleaning up, be integrated” (1981, p. 4) into whatever you are coding on. Such free and open source software (F/OSS) plays a significant role in the creation and dissemination of economic value. F/OSS depends and thrives on protocolean standardization; it enables flexibility through standardized application interfaces and licensing agreements. F/OSS is a gift economy that still exchanges economic value, along with the social norms and customs put forth by the information commons.
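The archival principle at work here, deletion recorded as just another appended entry, can be sketched as an append-only log. This is a hypothetical data structure for illustration, not GitHub’s actual internals:

```python
# Sketch of archival totality (hypothetical structure, not GitHub's
# internals): every operation, including deletion, appends to an
# immutable log, so nothing is ever truly erased.

history = []  # the append-only version stack

def commit(content: str, author: str) -> None:
    history.append({"op": "commit", "author": author, "content": content})

def delete(author: str) -> None:
    # "Deleting" records the deletion; earlier versions stay recoverable.
    history.append({"op": "delete", "author": author, "content": None})

commit("v1: initial draft", "alice")
commit("v2: fix typo", "bob")
delete("alice")

# The current state is the last entry, but the full record survives,
# attributed character by character to its authors:
print(history[-1]["op"])  # -> delete
print(len(history))       # -> 3
```

The log is exactly what makes the feedback “precise and systematic”: every contribution, however small, remains permanently attributable, and attribution is the status currency.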

The Passport to the Cognisphere

Lev Manovich has shown that “[s]oftware has become our interface to the world, to others, to our memory and our imagination—a universal language through which the world speaks, and a universal engine on which the world runs” (2013, p. 2). The path to cognicitizenship requires coding capacity and machine learning literacy, so we can effectively run our daily Turing tests on interpolated systems that have the capacity to evolve. Those anthropotechnic competencies are one part of the story. Their humanist complement balances how we address a new form of digital divide between cognicitizens and non-citizens. In the cognisphere, work follows information—to discrete information hubs and havens where conditions are opportune. As labor platforms become increasingly standardized and employer risk is democratized, the much larger group of non-cognicitizens grows more vulnerable and disenfranchised—free from labor, bound to geopolitical territory.

All of us are disciplined by machines and protocols to some degree, but we have choices to make when it comes to propagating telematics as the new normal at work and in our private lives. As experts working longer hours, we might compensate for less time to rest, run chores, and regenerate by way of premium subscriptions and prime delivery. A new generation of cognicitizens requires the social and cultural competencies that rein in technical proficiency when necessary—let’s call it anthropotechnic autoimmunity—which can be attained through a liberal arts education that takes both technical capacity and humanist competencies seriously. “Design as innovation just isn’t strong enough of an idea by itself. We need to talk a lot more about design as immunization, actually preventing certain innovations that we don’t want from happening” (Bratton, 2013).


This is an edited version of Sauter’s contribution to the edited collection Teaching Computational Creativity, Cambridge University Press, 2017.


  • 47 U.S. Code § 230 – Protection for private blocking and screening of offensive material. (2015). Retrieved August 20, 2015, from
  • Abumrad, J., & Krulwich, R. (2015, June 18). Eye in the Sky. Retrieved March 12, 2016, from
  • Abumrad, J., & Krulwich, R. (2015, June 6). Antibodies Part 1: CRISPR. Retrieved August 15, 2015, from
  • Ackerman, S., & Thielman, S. (2016, February 09). US intelligence chief: We might use the internet of things to spy on you. Retrieved February 09, 2016, from
  • Berridge, K. C., & Robinson, T. E. (1998). What is the role of dopamine in reward: Hedonic impact, reward learning, or incentive salience? Brain Research Reviews, 28(3), 309-369.
  • Biesecker, M., & Bykowicz, J. (2016, February 11). Cruz app data collection helps campaign read minds of voters. Retrieved February 22, 2016, from
  • Bratton, B. H. (2014, March). The black stack. Retrieved August 23, 2015, from
  • Bratton, B. H. (2016). The stack: on software and sovereignty. Cambridge, MA: MIT Press.
  • Brueck, H. (2016, February 26). Here’s Why Chicago Just Made Computer Science a Graduation Requirement. Retrieved March 07, 2016, from
  • Deresiewicz, W. (2015, September 1). The Neoliberal Arts: How college sold its soul to the market. Harper’s Magazine, 25-32. Retrieved August 16, 2015, from
  • DoJ. (2016, February 19). Government’s motion to compel Apple Inc. to comply with this court’s February 16, 2016 order compelling assistance in search. Retrieved February 22, 2016, from
  • Drucker, J. (2014). Graphesis: Visual forms of knowledge production. Cambridge, MA: Harvard University Press.
  • Facebook: What types of ID does Facebook accept? (2015, December). Retrieved February 16, 2016, from
  • Facebook’s Data Pool. Retrieved August 20, 2015, from
  • Facial Recognition. (2015). Retrieved February 23, 2016, from
  • Galloway, A. (2004). Protocol how control exists after decentralization. Cambridge, Mass.: MIT Press.
  • Gibbs, S. (2015, November 13). Google’s self-driving car gets pulled over for driving too slowly. Retrieved February 05, 2016, from
  • Hayles, N. K. (2005). My mother was a computer: Digital subjects and literary texts. Chicago: University of Chicago Press.
  • Hayles, N. K. (2006). Traumas of Code. Critical Inquiry, 33(1), 136-157.
  • Hayles, N. K. (2006). Unfinished Work: From Cyborg to Cognisphere. Theory, Culture & Society, 23(7-8), 159-166. doi:10.1177/0263276406069229
  • Hern, A. (2015, July 30). Google says non to French demand to expand right to be forgotten worldwide. Retrieved February 10, 2016, from
  • Jinek, M., Chylinski, K., Fonfara, I., Hauer, M., Doudna, J. A., & Charpentier, E. (2012). A Programmable Dual-RNA-Guided DNA Endonuclease in Adaptive Bacterial Immunity. Science, 337(6096), 816-821. Retrieved from
  • Lessig, L. (2010). Code: Version 2.0. Retrieved February 15, 2016, from remix/Lessig-Codev2.pdf
  • Lewis, R. (2015, December 03). A Conversation with CRISPR-Cas9 Inventors Charpentier and Doudna | DNA Science Blog. Retrieved February 19, 2016, from
  • Manovich, L. (2013). Software takes command: Extending the language of new media. New York, NY: Bloomsbury Academic.
  • Overbye, D. (2015, June 8). Black Hole Hunters. Retrieved August 16, 2015, from
  • Reno v. American Civil Liberties Union, 117 S.Ct. 2329, 138 L.Ed.2d 874 (1997). (1997, March 19). Retrieved August 20, 2015, from
  • Safe Harbor: Advisory. (2016, February 11). Retrieved February 15, 2016, from
  • Sauter, D. (2016, March 13) Cognisphere Codebase. Retrieved March 13, 2016, from
  • Schmidt, M. S., & Shear, M. D. (2015, January 26). A Drone, Too Small for Radar to Detect, Rattles the White House. Retrieved February 05, 2016, from
  • Scott, M. (2015, October 06). Data Transfer Pact Between U.S. and Europe Is Ruled Invalid. Retrieved February 10, 2016, from
  • Scott, M. (2016, January 24). Europe’s Top Digital-Privacy Watchdog Zeros In on U.S. Tech Giants. Retrieved January 28, 2016, from
  • Section 230 of the Communications Decency Act. Retrieved August 20, 2015, from
  • Singer, P. W. (2009). Wired for War: The Robotics Revolution and Conflict in the Twenty-first Century. New York: Penguin Press.
  • Sloterdijk, P. (2016). Stress and freedom. Cambridge: Polity.
  • State National Bank of Big Spring v. Jacob J. Lew. (2014, November 19). Retrieved August 31, 2015.
  • Steve Fossett – Mechanical Turk Results. (2007, September 24). Retrieved September 7, 2015, from
  • Wark, M. (2015, August 29). E-flux journal 56th Venice Biennale – SUPERCOMMUNITY – The Vectoralist Class. Retrieved September 21, 2015, from
  • Wenning, R., & Schunter, M. (2006, November 13). The Platform for Privacy Preferences 1.1 (P3P1.1) Specification. Retrieved February 22, 2016, from
  • Whalen, T. (1994). My Experience with the 1994 Loebner Competition. Retrieved January 29, 2016, from
  • Willemsen, E. (2015, December 23). Ski federation bans drones after camera nearly hits racer. Retrieved February 19, 2016, from