The Postmodernity of Big Data

or data after metanarratives

Michael Pepi


The following is a version of a talk I delivered at the Theorizing the Web conference in April 2014 on the subject of "Suggestive Results: Algorithms, Information, and Control". My topic was the "Postmodernity of Big Data." It was delivered in under 15 mins (amazingly). I've edited it slightly and added citations and some of the slides. Reference is made to an essay I wrote for The New Inquiry in December of 2013 that you can find here.


Three years ago, in a plenary address at Theorizing the Web, George Ritzer lamented that literature about digital technology somehow seemed to ignore postmodern theory altogether. 1 This was in spite of the fact that many of these theories seemed even more applicable after the rise of digital networks. What follows is a continuation in that vein, though here I focus on the emerging discourse around “big data.” I see this as an exercise in Platform Studies. I am concerned with big data’s claims about, and implications for, epistemology, historiography, and aesthetics.

A note on terms: though I refer to the circulation of postmodern ideas in philosophy, I privilege the concept as manifested in art history and practice: for example, the closing of modernist criticism, the rise of conceptualism, the demise of judgment, the retreat of the object, and a general suspicion of traditional aesthetic categories.

What I am really looking at are two discourses (in fact two very diverse discourses) each with a lifetime of texts to consider. 

On the one hand, the discourse of postmodernity, but also the end of modernist criticism; perhaps in both cases we refer to Jean-François Lyotard’s “incredulity toward metanarratives.” 2

And secondly, the new discourse around “big data,” which I take to be laden with an ideology that implicates several disciplines outside of so-called “data science” or computer science. For me, big data is a discourse, but it is also a reference to a set of tools used when an organization creates more data than it can reasonably process and store, and so turns to analytics platforms that rely on new parallel programming architectures.

Specifically, I ask how the new discourse around big data draws upon the postmodern condition. To go a step further, we might interrogate how big data employs the skepticism of postmodernist discourse as reformatted under the language of capitalism. Or, we might identify the prefix “big” in big data as simply data reconstituted by capitalism, which builds platforms to aggregate, mine, and exploit it for its surplus value.



I would like first to show how big data’s practical applications filter throughout our language to impact a range of disciplines. Big data owes its popularity and “buzzword” status to the generality of its claims. It is a slippery, catch-all term used to various ends. Like the term “postmodernism” itself, it is enigmatic, at once over-hyped and cringe-worthy.

One example is a recent panel discussion on big data in frieze magazine. 3 The discussion was telling in the degree to which it obscured almost all of the technical tools at the very heart of big data operations. In 3,200 words, the following terms were never mentioned:

  1. database
  2. server
  3. MapReduce
  4. schema
  5. cloud
  6. analytics
  7. metadata
  8. SQL
  9. unstructured data
  10. variability
  11. volume
  12. velocity
  13. query
  14. predictive analytics
  15. veracity
  16. noSQL
  17. Hadoop
  18. shard(ing)
  19. Oracle
  20. IBM
  21. SAP
  22. memory
  23. MongoDB
  24. index
  25. Pig
  26. real-time
  27. transactional
  28. storage
  29. petabyte

The absence of technical terms is understandable, but it is still indicative of a significant characteristic of the term “big data”: it is used loosely to signify any manner of societal anxieties about the insufficiency of human intuition and judgment, cultural criticism, and the networked humanities. Discussions range from questions of hardware, through the rare discussion of philosophy, all the way up to the most popular references to surveillance and consumer services.

© Mike Pepi 2014

Yet at the same time, the big data vendors that perform this evangelical work make arguments about the rise of such platforms that touch upon the concerns of the humanities. A major component of big data is the ability to gather, store, and query data using recent hardware and software advancements. Platform vendors refer to noSQL, a transition away from the more encumbering SQL, the Structured Query Language that runs relational databases. There is a tacit opposition embedded in the rhetoric surrounding the values of SQL vs. noSQL that is consistent with the epistemological questions present at the shift from modernity to postmodernity.

Take Intel’s description, where the “complexity of unstructured data” drives a “shift from relational to nonrelational.” 4 They speak of the “orderly, structured, dense” world of relational databases, contrasted with the “network oriented, semistructured” big data solutions that “scale horizontally.” David Harvey likewise described a monotonous, universal, planned, fixed modernist version of the world as it was met by “fragmentation, indeterminacy, and intense distrust of universal or totalizing discourses.” 5
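The opposition the vendors describe can be sketched in miniature. What follows is a minimal illustration, with invented table and field names, of “schema-on-write” versus “schema-on-read”: the relational store fixes its structure in advance and rejects what does not fit, while the document store accepts heterogeneous records and defers structure to the moment of the query.

```python
import json
import sqlite3

# Schema-on-write: the relational table fixes its columns in advance,
# and any row that does not fit the declared structure is rejected.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE artworks (title TEXT, year INTEGER)")
con.execute("INSERT INTO artworks VALUES (?, ?)", ("Untitled", 1968))
try:
    # A record carrying an extra, undeclared field cannot be stored as-is.
    con.execute("INSERT INTO artworks VALUES (?, ?, ?)",
                ("Untitled", 1968, "video"))
except sqlite3.OperationalError:
    rejected = True

# Schema-on-read: a document store in the noSQL style accepts
# heterogeneous records whose internal structure varies freely.
documents = [
    {"title": "Untitled", "year": 1968},
    {"title": "Untitled", "medium": "video", "tags": ["systems", "network"]},
]
blobs = [json.dumps(doc) for doc in documents]

# Structure is imposed only at query time, tolerating absent fields.
years = [json.loads(b).get("year") for b in blobs]  # [1968, None]
```

Nothing here is specific to any vendor’s platform; sqlite3 and JSON blobs merely stand in for the relational and document models, respectively.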



I begin by positing a modernity that could not scale. Today the historical subject is too various, voluminous, and unstructured, and easily creates an environment that overwhelms modernity’s critical apparatus. Firms running off a relational database and modernist criticism were similar in the sense that both defined and marked all of the relevant inputs in their system with fixed criteria and rules that determined meaning in an operational totality. The so-called “bottleneck” of relational databases addressed by noSQL solutions draws in part on the breakdown of modernist ideas about narrative. It made less and less sense to exclude new inputs for the sake of a telos, and it in fact appeared increasingly advantageous to seek out these inputs in a rejection of the notion of absolute truth. Where modernity arrived at logical, historically justified, final conclusions, postmodernism and big data both assume the perpetual role of acquiring new information.

The revelations about the contingent nature of meaning that animated much critical theory were in their own way a declaration that the ways we received, stored, and analyzed data were insufficient for entire sections of cultural production. Thus both are characterized by Lyotard’s “incredulity toward metanarratives.” Both collapsed models, whether cultural or scientific. Today the sheer scale of big data provides new ammunition to reanimate the critiques leveled at narrative at the end of modernity.

Likewise, for big data vendors, current data are too unstructured, voluminous, and variegated, and are being created at far too fast a rate to be handled by legacy database tools. In the same vein, these data embarrass modernist judgment, or the idea that there is a uniform history or trajectory of society. Though “analytics” carries positivist overtones, it is useful to note how often big data analytics is interested in debunking myths (even those that may have arisen out of a positivist drive). In big data’s interpretive logic, such narratives rarely survive as the volume of new data increases.

Postmodernist intellectuals likewise spoke of the contingencies of meaning, of deconstructing language, of challenging teleology and canonical progression. We see that both discourses share a radical acceptance of unfamiliar inputs, of fragmentation, and of indeterminate and heterogeneous relationships. While not a clean break with the history of legacy databases, big data (in part) relies upon postmodernism’s rejection of totalities so as to constantly ask for more data.

At our current rate we will create an estimated 18 million Libraries of Congress worth of data by next year. The EMC Corporation estimates that 90% of it will be unstructured. 6 And perhaps it was postmodern discourse that first embraced the principle of unstructured data; that is to say, it prized a critical language accustomed to indeterminacy, one that destabilized certainty. Even Lyotard had little sense of the amount of data that “computerized” society would be able to produce. Yet he was still interested in the “status of knowledge” as metanarratives proved obsolete, offering that “the nature of knowledge cannot survive unchanged.” 7

Referring to digitization he wrote: “We can predict that anything in the constituted body of knowledge that is not translatable in this way will be abandoned and that the direction of new research will be dictated by the possibility of its eventual results being translatable into computer language.” 8

Thus the “old poles of attraction represented by nation-states, parties, professions, institutions, and historical traditions” lose traction. “Identifying with the great names” of history became more and more difficult, just as challenging as scaling out a dynamic data model on a relational database. 9



I want to turn to the work of the critic Jack Burnham, who in 1968 wrote “Systems Esthetics,” which described the shift from an object-oriented to a systems-oriented culture. Burnham was among the earliest to think about what he termed the “implementation of the art impulse in an advanced technological society.” 10 He helps tie the rise of information theory to the end of the modernist object-oriented critical system, using language that relates directly to issues relevant to contemporary big data discourse.

In the formalist lineage leading up to the 1970s, the ultimate goal was a sort of self-contained judgmental system that could sustain criticism as well as avant-garde art production. In Greenbergian formalism, and in modernism more generally, judgment emanated from the content of the object. The judged aesthetic value of a work was synonymous with its content, and this required highly structured inputs, rules, and critical outcomes.

In a teleology that was bound up with and terminated by various means (the white canvas, minimalism, and conceptual art), new information was rarely sought from outside the object itself, that is, until the artists Burnham cites acknowledged a systems-oriented model. Judgment ceased to be the primary factor in aesthetic value once non-immanent information began to enter into the discourse, ripping the object away from the logic of its material category and into a position in a pluralized network of information.

Recently, Lane Relyea has pointed to theorists like Rosalind Krauss who have described how aesthetic discourse organized itself around architectural structures such as the exhibition wall. The wall is an example of a feature of modernist object storage that legitimated its own critical framework.

Relyea states: “The wall’s continuous lateral reach supplied the necessary register not only for establishing the paradigmatic set of canonical signifiers, but also for the linear left-to-right sequencing of those signifiers to form master narratives about nation, culture, and civilization.” 11

However, postmodernist thought was cognizant of such legitimizing structures, and instead provided for discourse that scaled out horizontally, empowering the indeterminacy of unstructured sources. These were striking for the way they undermined the validity of modernist cultural models. One thinks of the curatorial impulses already latent in Harald Szeemann’s “When Attitudes Become Form” (1969) and Documenta 5 (1972), or Jean-Hubert Martin’s “Magiciens de la terre” (1989). Coming full circle, we can also consider Lyotard’s own exhibition, curated in 1985 at the Centre Pompidou and entitled “Les Immatériaux,” described by John Rajchman as an attempt to present “a kind of ‘post-industrial’ techno-scientific condition” bound up with the postmodern condition’s “dramaturgy” of information. 12

So here again we might draw a line: just as the west’s hegemonic exhibition space became a target of institutional critique at a time when modernist canons were under attack, today big data discourse targets similar legacy models of capturing, storing, and interpreting data in a new era of accelerated data creation.



Next, I consider a major blind spot in my original essay: big data’s relationship to objectivity. The discourse seems to assume there is something “out there” to be discovered, as the marketing rhetoric obsesses over “truth” and “insights.”



But instead of the worn-out question of subject-object relations, it seems that, below the surface, big data’s new (non-modernist) totality is actually more interested in successfully mapping a subjectivity, synthesizing various and contingent data points in order to help show that what we know is an “artificial,” subjective construction of meaning.

The “big” in big data functions relationally. “Big” takes a step past modernist ideas about history. Scientific models cannot provide what they promise, but we can perpetually aim for a contingent “reliability without truth,” a term I borrow from Antoinette Rouvroy. “Big” means to be skeptical of a limited set of perceived inputs, to cast doubt on such a convenience when there are so many more potentially significant data points available to an algorithm.



I’d argue that big data, as an institution, a discourse with an inherent structure, has risen to the level of a new “legitimation” of knowledge: one that welcomes unstable data, contingencies, context collapses, and shifting inputs, all founded on an ideal of empiricism, though often built with specific operations in mind. Its requirement that unstructured data be “handled” is the new criterion by which legacy narrative elements are eliminated, never again to form false subjectivities.


This empowers the “performativity” of an institution that “legitimizes” itself by an allegiance to the primacy of digital information. Big data, as the omniscient and incorrigible receiver and sender of these data, threatens to always directly affect its subjects in this manner, to become, in a sense, a new “performativity.” Science’s “rational argument” is replaced by the fastest query.



Is the discourse around big data simply postmodernist skepticism reformatted under the language of capitalism? 

The key here is that unlike past institutions, this one is by and large private. Its main engines of intellectual production and ownership emanate from a few companies, with niche players trying to nudge in with historic claims about what big data can “unlock.” Thus the production of meaning that can be attached to these utterances or result sets is tipped toward the logic of capitalism. A significant shift occurs when our questions about knowledge production and validity become indistinguishable from questions of database structure and implementation.

What is excluded is twofold:

1. information not aggregated or properly stored and scaled.

2. those subjects that inherently perform narrative.

Database theory then becomes less a specialist, technical, engineering problem and starts to bleed into ideas about historiography and scientific progress. After big data, questions about the validity of knowledge have much to do with issues of database applications and corporate “best practices.”

To be “data-driven” now is not the same as being empiricist, because to be data-driven now means to be driven by data that can be stored, manipulated, and updated in real time, all while being algorithmically engineered. To be data-driven means to be limited to the hegemonic mechanisms of capital, to collapse humanistic judgment and historiography into the class interests of aggregators and data brokers.



When new sources of data suddenly hit a system and there are not enough tools or schemas to deal with them, a common reaction is to deny the agency of the data sources. Then, for a brief period, to suspend interpretation until new tools are built to handle the new variety. My fundamental point is that this occurs both with cultural discourses as well as with information. Big data, then, is likewise an attempt at forming new flexible schemas in order to continue the project of interpretation under radically new conditions.

The discourse around big data continually asks for more data because it mistrusts any formula or theory that implicates subjects whose data have not yet been collected. To participate in big data analytics is to fully close the book on modernity’s narrative, the scientific method’s claims to objectivity, and the “language games” played by historical agents.

So in this sense, big data might be viewed as a sort of newly enacted postmodernity: a new grand-narrative suppressant arriving on the heels of the skepticism and doubt left in the wake of a foreclosed modernist framework. However, while independent examples of big data analytics may counter narrative elements, when taken together as an unfolding discourse or institution, big data serves as a new form of legitimation, in a sense forming a new metanarrative.






1. Ritzer, George. “Why the Web Needs Post-Modern Theory.” Plenary address, Theorizing the Web 2011, University of Maryland, College Park, 29 April 2011.

2. Lyotard, Jean-François. “The Postmodern Condition: A Report on Knowledge.” Theory and History of Literature, Vol. 10 (Manchester University Press, 1984)

3. “Safety in Numbers.” Frieze magazine. March 2014. Issue 161

4. Intel IT Center, "Big Data 101: Unstructured Data Analytics, A crash course on the IT landscape for big data and emerging technologies", online:

5. Harvey, David. “The Condition of Postmodernity” (Blackwell, Oxford) 1990. 9

6. Intel IT Center, "Big Data 101: Unstructured Data Analytics"

7. Lyotard, Jean-Francois.  “The Postmodern Condition: A Report on Knowledge”

8. Ibid. pg. 4

9.  Ibid. pg. 14

10. Burnham, Jack. “Systems Esthetics,” Artforum 7, no. 1 (September 1968). 31

11. Relyea, Lane, "Your Everyday Art World" (MIT Press, Cambridge) 2013. 172

12. John Rajchman, "Les Immatériaux or How to Construct the History of Exhibitions" Tate Papers (autumn 2009) accessed online at