Appeals to human origins are a surprisingly common feature of discourse in academic computer science and the commercial software industry alike. Startup founders tie their business plans to evolutionary psychology (Levy 2010); conference talks show indigenous people, framed as “primordial man,” using contemporary information technologies. Yuval Noah Harari, a medievalist who has found great popular success in a turn to species-scale histories like Sapiens (2014) and Homo Deus (2016), has been described as “Silicon Valley’s favorite historian” (Ungerleider 2017). If the ordinary work of programming computers to do things seems boring, like staying in one place and typing all day, these stories change it into transformative action on the largest possible scale.

We can think of these stories as scaling devices. They establish the scope of discussion, telling us that we are not talking about minor acts of coding, but rather enduring abstract problems of human existence. “Scale-climbing,” as Judith Irvine has argued, is an ideological operation: by claiming the broader view, people try to encompass each other within their own explanatory frameworks (2016, 228; Gal and Irvine 1995). Epochal software stories set human species-being within a computational frame, recasting practically all social activities as precursors to their narrators’ technological projects. David Golumbia, in The Cultural Logic of Computation (2009), has termed this expansionist tendency in the rhetoric of computing “computationalism,” which allows the work of computing to alternately lay claim to the future and the past: new companies figure themselves as both innovators and inheritors of timeless human truths.

Anthropology is well positioned to critique the accuracy of such stories because they often drape themselves in anthropological garb, as the titles of Harari’s books suggest. Michelle Rosaldo anatomized such appeals to origins in “The Use and Abuse of Anthropology” (1980): “the ‘primitive’ emerges in accounts like these as the bearer of primordial human need.” These vernacular anthropologies are typically stuck in an evolutionist mode, finding in their origin stories a universalized version of contemporary concerns, “the image of ourselves undressed” (392). While these just-so stories often borrow a sense of scientific authority from evolutionary psychology (McKinnon 2005), they are not factual in any concrete sense. Konstan is not really claiming that his allegory of the cave actually happened, but rather that it, or something like it, must have happened. Overload is taken to be a constitutive part of human existence: not merely a recent challenge to be answered by human sociality and technology, but a force that has shaped the evolution of humans and our techniques. It can be known from personal experience and readily extrapolated to any situation, however remote.

In his 1961 short story “The Overloaded Man,” J. G. Ballard depicted a business school professor losing his mind, overwhelmed by the world outside his head. Harry Faulkner secretly quits his job and sits all day on the veranda of his university apartment, in a modernist housing complex called “the Bin.” He has trained himself to cognitively dissociate from the world, using “cut-off switches” in his mind to render objects around him as unrecognizable shapes. Eventually, Faulkner loses touch with the world entirely, unable to bear “its overlay of nagging associations.” When his wife discovers him at home during the workday in a stupor, Faulkner’s effort to disconnect culminates in a depersonalized murder-suicide, described through an abstracted interplay of shapes and colors. Ballard captures some of information overload’s signature features: the description of the brain and perception as manipulable mental circuitry, and the prototypical overloaded person as a man, a businessman or intellectual, aggravated by the excesses of modern life into a state of unreason (see Erickson et al. 2013, How Reason Almost Lost Its Mind).

At the beginning of the Cold War, the year after she died, the Yale Poetry Review published a piece by Gertrude Stein: a paragraph titled “Reflection on the Atomic Bomb” (1947/1998). Atomic bombs were simply too big to care about, she wrote: “if they are really as destructive as all that there is nothing left and if there is nothing there is nobody to be interested and nothing to be interested about” (1998, 823). But where the overwhelming destructiveness of the bomb led Stein to apathy, it drove others into an unnatural frenzy. Stein concluded: “Everybody gets so much information all day long that they lose their common sense. They listen so much that they forget to be natural. This is a nice story” (1998, 823).

Stein’s story captured the apathy and anxiety of everyday life as it first fell under the shadow of the bomb—what Joe Masco has described as “the overstimulation of the body produced by an all-or-nothing Cold War cosmology” (17)—but it also marked the beginnings of what would eventually be described as a new historical age: the Information Age. It was not the scale of nuclear war that drove people mad, according to Stein, but the scale of information, overwhelming their common sense. The following year would see the publication of Claude Shannon’s “A Mathematical Theory of Communication” (1948), which formalized a definition of information that became so influential, spreading so quickly across scientific fields, that Shannon was soon compelled to disavow “the bandwagon” (1956) that had carried it away from its technical roots in communication engineering. While scientists from biology to sociology tried to recast their fields in informational terms, in a dense post-war “epistemic ecology” (Boyer), anxieties about information and its growing scale spread through popular culture as well.