Post-Cinema / Post-Phenomenology

Following my talk last week at the Texas State Philosophy Symposium, details have now been finalized for another talk at Texas State: this time in the context of the Philosophy Department’s Dialogue Series, where I’ll be talking about post-cinema (i.e. post-photographic moving image media such as video and various digital formats) and what I’ve been arguing is an essentially post-phenomenological system of mediation (see, for example, my talk from the 2013 SCMS conference or these related musings). For anyone who happens to be in the area, the talk will take place on Monday, April 14, 2014 at 12:30 pm (in Derrick Hall 111). UPDATE: The time has been changed to 10:00 am.

Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media (full text)

As I recently announced, I was invited to give the keynote address at the 17th annual Texas State University Philosophy Symposium. Here, now, is the full text of my talk:

Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media

Shane Denson

The title of my talk contains several oddities (and perhaps not a few extravagances), so I’ll start by looking at these one by one. First (or last) of all, “philosophy of media” is likely to sound unusual in an American context, but it denotes an emerging field of inquiry in Europe, where a small handful of people have started referring to themselves as philosophers of media, and where there is even a limited amount of institutional recognition of such appellations. In Germany, for example, Lorenz Engell has held the chair of media philosophy at the Bauhaus University in Weimar since 2001. He lists as one of his research interests “film and television as philosophical apparatuses and agencies” – which, whatever that might mean, clearly signals something very different from anything that might conventionally be treated under the heading of “media studies” in the US. On this European model, media philosophy is related to the more familiar “philosophy of film,” but it typically broadens the scope of what might be thought of as media (following provocations from thinkers like Niklas Luhmann, who treated everything from film and television to money, acoustics, meaning, art, time, and space as media). More to the point, media philosophy aims to think more generally about media as a philosophical topic, and not as mere carriers for philosophical themes and representations – which means going beyond empirical determinations of media and beyond concentrations on media “contents” in order to think about ontological and epistemological issues raised by media themselves. Often, these discussions channel the philosophy of science and of technology, and this strategy will indeed build the bridge in my own talk between the predominantly European idea of “media philosophy” and the context of Anglo-American philosophy.

OK, but if the idea of a philosophy of media isn’t weird enough, I’ve added this weird epithet: “postnatural.” The meaning of this term is really the crux of my talk, but I’m only going to offer a few “notes towards” a postnatural theory, as it’s also the crux of a big, unwieldy book that I have coming out later this year, in which I devote some 400 pages to explaining and exploring the idea of postnaturalism. As a first approach, though, I can describe the general trajectory through a series of three heuristic (if oversimplifying) slogans.

First, in response to debates over the alleged postmodernity of (Western) societies at the end of the twentieth century, French sociologist and science studies pioneer Bruno Latour, most famous for his association with so-called actor-network theory, claimed in his 1991 book of the same title that “We have never been modern.” What he meant, centrally, was that the division of nature and culture, nonhuman and human, that had structured the idea of modernity (and of scientific progress) could not only be seen crumbling in contemporary phenomena such as global warming and biotechnology – humanly created phenomena that become forces of nature in their own right – but had in fact been an illusion all along. We have never been modern, accordingly, because modern scientific instruments like the air pump, for example, were simultaneously natural, social, and discursive phenomena. The idea of modernity, according to Latour, depends upon acts of purification that reinforce the nature/culture divide, but an array of hybrids constantly mix these realms. In terms of a philosophy of media, one of the most important conceptual contributions made by Latour in this context is the distinction between “intermediaries” and “mediators.” The former are seen as neutral carriers of information and intentionalities: instruments that expand the cognitive and practical reach of humans in the natural world while leaving the essence of the human untouched. Mediators, on the other hand, are seen to decenter subjectivities and to unsettle the human/nonhuman divide itself as they participate in an uncertain negotiation of these boundaries.

The NRA, with their slogan “guns don’t kill people, people kill people,” would have us believe that handguns are mere intermediaries, neutral tools for good or evil; Latour, on the other hand, argues that the handgun, as a non-neutral mediator, transforms the very agency of the human who wields it. That person takes up a very different sort of comportment towards the world, and the transformation is at once social, discursive, phenomenological, and material in nature.

With Donna Haraway, we could say that the human + handgun configuration describes something on the order of a cyborg, neither purely human nor nonhuman. And Haraway, building on Latour’s “we have never been modern,” ups the ante and provides us with the second slogan: “We have never been human.” In other words, it’s not just in the age of prosthetics, implants, biotech, and “smart” computational devices that the integrity of the human breaks down, but already at the proverbial dawn of humankind – for the human has co-evolved with other organisms (like the dog, who domesticated the human just as much as the other way around). From an ecological as much as an ideological perspective, the human fails to describe anything like a stable, well-defined, or self-sufficient category.

Now the third slogan, which is my own, doesn’t so much try to outdo Latour and Haraway as to refocus some of the themes that are inherent in these discussions. Postnaturalism, in a nutshell, is the idea not that we are now living beyond nature, whatever that might mean, but that “we have never been natural” (and neither has nature, for that matter). Human and nonhuman, natural and unnatural agencies are products of mediations and symbioses from the very start, I contend. In order to argue for these claims I take a broadly ecological view and focus not on discrete individuals but on what I call the anthropotechnical interface (the phenomenal and sub-phenomenal realm of mediation between human and technical agencies, where each impinges upon and defines the other in a broad space or ecology of material interaction). This view, which I develop at length in my book, allows us to see media not only as empirical objects, but as infra-empirical constraints and enablers of agency such that media may be described, following Mark Hansen, as the “environment for life” itself. Accordingly, media-technical innovation translates into ecological change, transforming the parameters of life in a way that outstrips our ability to think about or capture such change cognitively – for at stake in such change is the very infrastructural basis of cognition and subjective being. So postnaturalism, as a philosophy of media and mediation, tries to think about the conditions of anthropotechnical evolution, conceived as the process that links transformations in the realm of concrete, apparatic media (such as film and TV) with more global transformations at a quasi-transcendental level. Operating on both empirical and infra-empirical levels, media might be seen, on this view, as something like articulators of the phenomenal-noumenal interface itself.

So the more I unpack this thing, the weirder it gets, right? Well, let me approach it from a different angle. Here’s where the first part of my title comes into play: “Philosophy of Science De-Naturalized.” Now, I mentioned before that postnaturalism does not postulate that we are living “after” nature; what I want to emphasize now is that it also remains largely continuous with naturalism, conceived broadly as the idea that the cosmos is governed by material principles which are the object, in turn, of natural science. And, more to the point, the first step in the derivation of a properly postnatural theory, which never breaks with the idea of a materially evolving nature, is to work through a naturalized epistemology, in the sense famously articulated by Willard V. O. Quine, but to locate within it the problematic role of technological mediation. By proceeding in this manner, I want to avoid the impression that a postnatural theory is based on a merely discursive “deconstruction” of nature as a concept. Against the general thrust of broadly postmodernist philosophies, which might show that our ideas of nature and its opposites are incoherent, mine is meant to be a thoroughly materialist account of mediation as a transformative force. So the “Philosophy of Science De-Naturalized,” as I put it here, marks a particular trajectory that takes off from what Ronald Giere has called “Philosophy of Science Naturalized” and works its way towards a properly postnatural philosophy of media.

Giere’s naturalized philosophy of science is of interest to me because it aims to coordinate evolutionary naturalism (in the sense of Darwin) with revolutionary science (in the sense of Thomas Kuhn). In other words, it aims to reconcile the materialism of naturalized epistemology with the possibility of radical transformation, which Kuhn sees taking place with scientific paradigm shifts, and which I want to attribute to media-technical changes. Taking empirical science as its model, and taking it seriously as an engagement with a mind-independent reality, an “evolutionary epistemology” posits a strong, causal link between the material world and our beliefs about it, seeing knowledge as the product of our biological evolution. Knowledge (and, at the limit, science) is accordingly both instrumental, or praxis-oriented, and firmly anchored in “the real world.” As a means of survival, it is inherently instrumental, but in order for this instrumentality to be effective – and/or as the simplest explanation of such effectivity – the majority of our beliefs must actually correspond to the reality of which they form part. But, according to Kuhn’s view of paradigm shifts, “after a revolution scientists work in a different world” (Structure of Scientific Revolutions 135). This implies a strong incommensurability thesis that, according to critics like Donald Davidson, falls into the trap of idealism, along with its attendant consequences; i.e. if paradigms structure our experience, revolution implies radical relativism or else skepticism. So how can revolutionary transformation be squared with the evolutionary perspective?

Convinced that it contains important cues for a theory of media qua anthropotechnical interfacing, I would like to look at Giere’s answer in some detail. Asserting that “[h]uman perceptual and other cognitive capacities have evolved along with human bodies” (384), Giere’s is a starkly biology-based naturalism. Evolutionary theory posits mind-independent matter as the source of a matter-dependent mind, and unless epistemologists follow suit, according to Giere, they remain open to global arguments from theory underdetermination and phenomenal equivalence: since the world would appear the same to us whether it were really made of matter or of mind-stuff, how do we know that idealism is not correct? And because idealism contradicts the materialist bias of physical science, how do we know that scientific knowledge is sound? According to Giere, we can confidently ignore these questions once the philosophy of science has itself opted for a scientific worldview. Of course, the skeptic will counter that naturalism’s methodologically self-reflexive relation to empirical science renders its argumentation circular at root, but Giere turns the tables on skeptical challenges, arguing that they are “equally question-begging” (385). Given the compelling explanatory power and track record of modern science and evolutionary biology in particular, it is merely a feigned doubt that would question the thesis that “our capacities for operating in the world are highly adapted to that world” (385); knowledge of the world is necessary for the survival of complex biological organisms such as we are. But because this is essentially a transcendental argument, it does not break the circle in which the skeptic sees the naturalist moving; instead, it asserts that circularity is an inescapable consequence of our place in nature. In large part, this is because “we possess built-in mechanisms for quite direct interaction with aspects of our environment. 
The operations of these mechanisms largely bypass our conscious experience and linguistic or conceptual abilities” (385).

So much for the evolutionary perspective, but where does revolutionary science fit into the picture? To answer this question, Giere turns to the case of the geophysical revolution of the 1960s, when a long-established model of the earth as a once much warmer body that had cooled and contracted, leaving the oceans and continents more or less fixed in their present positions, was rapidly overturned by the continental drift model that set the stage for the now prevalent plate tectonics theory (391-94). The matching coastlines of Africa and South America had long suggested the possibility of movement, and drift models had been developed in the early twentieth century but were left, by and large, unpursued; it was not just academic protectionism that preserved the old model but a lack of hard evidence capable of challenging accepted wisdom – accepted because it “worked” well enough to explain a large range of phenomena.

The discovery in the 1950s of north-south ocean ridges suggested, however, a plausible mechanism for continental drift: if the ridges were formed, as Harry Hess suggested, by volcanism, then “sea floor spreading” should be the result, and the continents would be gradually pushed apart by its action. The discovery, also in the 1950s, of large-scale magnetic field reversals provided the model with empirically testable consequences (the Vine-Matthews-Morley hypothesis): if the field reversals were indeed global and if the sea floor was spreading, then irregularly patterned stripes running parallel to the ridges should match the patterns observed in geological formations on land. Until this prediction was corroborated, there was still little impetus to overthrow the dominant theory, but magnetic soundings of the Pacific-Antarctic Ridge in 1966, along with sea-floor core samples, revealed the expected polarity patterns and led, within the space of a year, to a near complete acceptance of drift hypotheses among earth scientists.

According to Giere, naturalism can avoid idealistic talk of researchers living “in different worlds” and explain the sudden revolution in geology by appealing only to a few very plausible assumptions about human psychology and social interaction – assumptions that are fully compatible with physicalism. These concern what he calls the “payoff matrix” for accepting one of the competing theories (393). Abandoning a pet theory is seldom satisfying, and the rejection of a widely held model is likely to upset many researchers, revealing their previous work as no longer relevant. Resistance to change is all too easily explained. However, humans also take satisfaction in being right, and scientists hope to be objectively right about those aspects of the world they investigate. This interest, as Giere points out, does not have to be considered “an intrinsic positive value” among scientists, for it is tempered by psychosocial considerations (393) such as the fear of being ostracized and the promise of rewards. The geo-theoretical options became clear – or emerged as vital rather than merely logical alternatives – with the articulation of a drift model with clearly testable consequences. We may surmise that researchers began weighing their options at this time, though it is not necessary to consider this a transparently conscious act of deliberation. What was essential was the wide agreement among researchers that the predictions regarding magnetic profiles, if verified, would be extremely difficult to square with a static earth model and compellingly simple to explain if drift really occurred. Since researchers shared this basic assumption, the choice was easy when the relevant data came in (394).

But the really interesting thing about this case, in my opinion, is the central role that technology played in structuring theoretical options and forcing a decision, which Giere notes but only in passing. The developing model first became truly relevant through the availability of technologies capable of confirming its predictions: technologies for conducting magnetic soundings of the ocean floor and for retrieving core samples from the deep. Indeed, the Vine-Matthews-Morley hypothesis depended on technology not only for its verification, but for its initial formulation as well: ocean ridges could not have been discovered without instruments capable of sounding the ocean floor, and the discovery of magnetic field reversals depended on a similarly advanced technological infrastructure. A reliance on mediating technologies is central to the practice of science, and Giere suggests that an appreciation of this fact helps distinguish naturalism from “methodological foundationism” or the notion that justified beliefs must recur ultimately to a firm basis in immediate experience (394). His account of the geological paradigm shift therefore “assumes agreement that the technology for measuring magnetic profiles is reliable. The Duhem-Quine problem [i.e. the problem that it is logically possible to salvage empirically disconfirmed theories by ad hoc augmentation] is set aside by the fact that one can build, or often purchase commercially, the relevant measuring technology. The background knowledge (or auxiliary hypotheses) are embodied in proven technology” (394). In other words, the actual practice of science (or technoscience) does not require ultimate justificational grounding, and the agreement on technological reliability ensures, according to Giere and contra Kuhn, that disagreeing parties still operate in the same world.

But while I agree that Giere’s description of the way technology is implemented by scientists is a plausible account of actual practice and its underlying assumptions, I question his extrapolation from the practical to the theoretical plane. With regard to technology, I contend, the circle problem resurfaces with a vengeance. As posed by the skeptic, Giere is right, in my opinion, to reject the circle argument as invalidating naturalism’s methodologically self-reflexive application of scientific theories to the theory of science. Our evolutionary history, I agree, genuinely militates against the skeptic’s requirement that we be able to provide grounds for all our beliefs; our survival depends upon an embodied knowledge that is presupposed by, and therefore not wholly explicatable to, our conscious selves. But as extensions of embodiment, the workings of our technologies are equally opaque to subjective experience, even – or especially – when they seem perfectly transparent channels of contact with the world. Indeed, Giere seems to recognize this when he says that “background knowledge (or auxiliary hypotheses) are embodied in proven technology” (394, emphasis added). In other words, scientists invest technology with a range of assumptions concerning “reliability” or, more generally, about the relations of a technological infrastructure to the natural world; their agreement on these assumptions is the enabling condition for technology to yield clear-cut decision-making consequences. Appearing neutral to all parties involved, the technology is in fact loaded, subordinated to human aims as a tool. Some such subordinating process seems, from a naturalistic perspective, unavoidable for embodied humans. However, agreement on technological utility – on both whether and how a technology is useful – is not guaranteed in every case.
Moreover, it is not just a set of cognitive, theoretical assumptions (“auxiliary hypotheses”) with which scientists entrust technologies, but also aspects of their pre-theoretically embodied, sensorimotor competencies. Especially at this level, mediating technologies are open to what Don Ihde calls an experiential “multistability” – capable, that is, of instantiating to differently situated subjectivities radically divergent ways of relating to the world. But it is precisely the consensual stability of technologies that is the key to Giere’s contextualist rebuttal of “foundationism.”

Downplaying multistability is the condition for a general avoidance of the circle argument, for a pragmatic avoidance of idealism and/or skepticism. This, I believe, is most certainly the way things work in actual practice; (psycho)social-institutional pressures work to ensure consensus on technological utility. But does naturalism, self-reflexively endorsing science as the basis of its own theorization, then necessarily reproduce these pressures? Feminists in particular may protest on these grounds that the “nature” in naturalism in fact encodes the white male perspective historically privileged by science because embodied by the majority of practicing scientists. What I am suggesting is that the tacit, largely unquestioned processes by which technological multistability is tamed in practice form a locus for the inscription of social norms directly into the physical world; for in making technologies the material bearers of consensual values (whether political, epistemic, psychological, or even the animalistically basic preferability of pleasure over pain) scientific practice encourages certain modes of embodied relations to the world – not just psychic but material relations themselves embodied in technologies. It goes without saying that this can only occur at the expense of other modes of being-embodied.

More generally stated, the real problem with naturalism’s self-reflexivity is not that it fails to take skeptical challenges seriously or that it provides a false picture of actual scientific practice, but that in extrapolating from practice it locks certain assumptions about technological reliability into theory, embracing them as its own. While it is contextually – indeed physically – necessary that assumptions be made, and that they be embodied or exteriorized in technologies, the particular assumptions are contingent and non-neutral. This may be seen as a political problem, which it is, but it is also more than that. It is, moreover, an ontological problem of the instability of nature itself – not just of nature as a construct but of the material co-constitution of real, flesh-and-blood organisms and their environments. Once we enter the naturalist circle – and I believe we have good reason to do so – we accept that evolution dislodges the primacy of place traditionally accorded human beings. At the same time, we accept that the technologies with which science has demonstrated the non-essentiality of human/animal boundaries are reliable, that they show us what reality is really, objectively like. This step depends, however, on a bracketing of technological multistability. If we question this bracketing, as I do, we seem to lose our footing in material objectivity. Nevertheless convinced that it would be wrong to concede defeat to the skeptic, we point out that adaptive knowledge’s circularity or contextualist holism is a necessary requirement of human survival, that it follows directly from embodiment and the fact that the underlying biological mechanisms “largely bypass our conscious experience and linguistic or conceptual abilities” (Giere 385).
But if we admit that technological multistability really obtains as a fact of our phenomenal relations to the world, this holism seems to lead us back precisely to Kuhn’s idealist suggestion that researchers (or humans generally) may occupy incommensurably “different worlds.” If we don’t want to abandon materialism, then we have to find an interpretation of this idea that is compatible with physicalism.

Indeed, it is the great merit of naturalism that it provides us with the means for doing so; however, it is the great failure of the theory that it neglects these resources. The failure, which consists in reproducing science’s subordination of technology to thought – in fact compounding the reduction, as contextually practiced, by subordinating it to an overarching (i.e. supra-contextual) theory of science – is truly necessary for naturalism, for to rectify its oversight of multistability is to admit the breakdown of a continuous nature itself. To consistently acknowledge the indeterminacy of human-technology-world relations and simultaneously maintain materialism requires, to begin with, that we extend Giere’s insight about biological mechanisms to specifically technological mechanisms of embodied relation to the world: they too “bypass our conscious experience and linguistic or conceptual abilities.” If we take the implications seriously, this means that technologies resist full conceptualization and are therefore potentially non-compliant with human (or scientific) aims; reliance on technology is not categorically different in kind from reliance on our bodies: both ground our practice and knowledge in the material world, but neither is fully recuperable to thought. Extending naturalism in this way means recognizing that not only human/animal but also human/technology distinctions are porous and non-absolute. But whereas naturalism tacitly assumes that the investment of technology with cognitive aims is only “natural” and therefore beyond question, the multistability of non-cognitive investments of corporeal capacities implies that there is more to the idea of “different worlds” than naturalism is willing or able to admit: on a materialistic reading, it is nature itself, and not just human thought or science, that is historically and contextually multiple, non-coherently splintered, and subject to revolutionary change. 
Serious consideration of technology leads us, that is, to embrace a denatured naturalism, a techno-evolutionary epistemology, and a material rather than social constructivism. This, then, is the basis for a postnatural philosophy of media.

Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media

I am very honored to have been invited to hold a keynote address at the Texas State University Philosophy Department’s annual philosophy symposium on April 4, 2014. Having studied as an undergraduate at Texas State (which back then was known as Southwest Texas State University, or SWT for short), this will be something of a homecoming for me, and I’m very excited about it!

In fact, one of the first talks I ever delivered was at the 1997 philosophy symposium — the very first year it was held. My talk back then, titled “Skepticism and the Cultural Critical Project,” sought to bridge the divide between, on the one hand, the analytical epistemology and philosophy of science that I was studying under the supervision of Prof. Peter Hutcheson and, on the other hand, the Continental-inspired literary and cultural theory to which I was being exposed by a young assistant professor of English, Mark B. N. Hansen (before he went off to Princeton, then University of Chicago, and now Duke University).

In a way, my effort back then to mediate between these two very different traditions has proved emblematic for my further academic career. For example, my dissertation looked at Frankenstein films as an index for ongoing changes in the human-technological relations that, I contend, continually shape and re-fashion us at a deeply material, pre-subjective, and extra-discursive level of our being. The cultural realm of monster movies was therefore linked to the metaphysical realm of what I call the anthropotechnical interface, and my argument was mounted by way of a lengthy “techno-scientific interlude” in which I revisited many of the topics in Anglo-American epistemology and philosophy of science that I had first thought about as an undergrad in Texas.

Thus, without my knowing it (and it’s really only now becoming clear to me), my talk back in 1997 marked out a trajectory that it seems I’ve been following ever since. And now it feels like a lot of things are coming full circle: A book based upon my dissertation, for which Mark Hansen served as reader, is set to appear later this year (but more on that and a proper announcement later…). In addition, as I announced here recently, I will be moving to North Carolina this summer to commence a 2-year postdoctoral fellowship at Duke, where I will be working closely with Hansen. Now, before that project gets underway, I have the honor to return to the philosophy symposium in San Marcos, Texas and, in a sense, to revisit the place where it all started.

I thought it would be appropriate, therefore, if I delivered a talk that continued along the trajectory I embarked upon there 17 years ago (wow, that makes me feel old…). My talk, titled “Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media,” takes a cue from Ronald N. Giere’s “Philosophy of Science Naturalized” — which sought to reconcile Thomas Kuhn’s idea of revolutionary paradigm shifts in the history of science with W. V. O. Quine’s notion of “Epistemology Naturalized,” i.e. a theory of knowledge based more in the material practice and findings of natural science (especially evolutionary biology) than in the “rational reconstruction” of ideal grounds for justified true belief. As I will show, my own “postnaturalism” — which is ultimately a philosophy of media rather than of knowledge or science — represents not so much a break with such naturalism as a particular manner of thinking through issues of technological mediation that emerge in that context, issues that I then subject to phenomenological scrutiny and ultimately post-phenomenological transformations in order to arrive at a theory of anthropotechnical interfacing and change.

DAAD Postdoctoral Fellowship at Duke University

At long last, I am excited to announce that my application for a 2-year postdoctoral fellowship at Duke University has been approved for funding through the DAAD (German Academic Exchange Service). At Duke, I will be working closely with Mark B. N. Hansen and other scholars of media and culture to develop a media-archaeological perspective on serialization processes in video games and digital media culture more generally. The fellowship, which runs from August 2014 to July 2016, will allow me to conduct archival research in the US that will supplement and expand my work in the project “Digital Seriality” that I co-direct with Andreas Jahn-Sudmann in the context of the DFG Research Unit “Popular Seriality — Aesthetics and Practice.” Needless to say, I am very excited about this, and I will continue to post updates here! More soon…

CFP: Digital Seriality — Special Issue of Eludamos: Journal for Computer Game Culture


I am pleased to announce that my colleague Andreas Jahn-Sudmann and I will be co-editing a special issue of Eludamos: Journal for Computer Game Culture on the topic of “Digital Seriality.” Here, you’ll find the call for papers (alternatively, you can download a PDF version here). Please circulate widely!

Call for Papers: Digital Seriality

Special Issue of Eludamos: Journal for Computer Game Culture (2014)
Edited by Shane Denson & Andreas Jahn-Sudmann

According to German media theorist Jens Schröter, the analog/digital divide is the “key media-historical and media-theoretical distinction of the second half of the twentieth century” (Schröter 2004:9, our translation). And while this assessment is widely accepted as a relatively uncontroversial account of the most significant media transformation in recent history, the task of evaluating the distinction’s inherent epistemological problems is all the more fraught with difficulty (see Hagen 2002, Pias 2003, Schröter 2004). Be that as it may, since the 1990s at the latest, virtually any attempt to address the cultural and material specificity of contemporary media culture has inevitably entailed some sort of (implicit or explicit) evaluation of this key distinction’s historical significance, thus giving rise to characterizations of the analog/digital divide as caesura, upheaval, or even revolution (Glaubitz et al. 2011). Seen through the lens of such theoretical histories, the technical and especially visual media that shaped the nineteenth and twentieth centuries (photography, film, television) typically appear today as the objects of contemporary digitization processes, i.e. as visible manifestations (or remnants) of a historical transition from an analog (or industrial) to a digital era (Freyermuth and Gotto 2013). Conversely, despite its analog pre-history, today’s digital computer has primarily been addressed as the medium of such digitization processes – or, in another famous account, as the end point of media history itself (Kittler 1986).

The case of digital games (as a software medium) is similar to that of the computer as a hardware medium: although the differences and similarities between digital games and older media were widely discussed in the context of the so-called narratology-versus-ludology debate (Eskelinen 2001; Juul 2001; Murray 1997, 2004; Ryan 2006), only marginal attention was paid in these debates to the media-historical significance of the analog/digital distinction itself. Moreover, many game scholars have tended to ontologize the computer game to a certain extent and to treat it as a central form or expression of digital culture, rather than tracing its complex historical emergence and its role in brokering the transition from analog to digital (significant exceptions like Pias 2002 notwithstanding). Other media-historiographical approaches, like Bolter and Grusin’s concept of remediation (1999), allow us to situate the digital game within a more capacious history of popular-technical media, but such accounts relate primarily to the representational rather than the operative level of the game, so that the digital game’s “ergodic” form (Aarseth 1999) remains largely unconsidered.

Against this background, we would like to suggest an alternative angle from which to situate and theorize the digital game as part of a larger media history (and a broader media ecology), an approach that attends to both the representational level of visible surfaces/interfaces and the operative level of code and algorithmic form: Our suggestion is to look at forms and processes of seriality/serialization as they manifest themselves in digital games and gaming cultures, and to focus on these phenomena as a means to understand both the continuities and the discontinuities that mark the transition from analog to digital media forms and our ludic engagements with them. Ultimately, we propose, the computer game simultaneously occupies a place in a long history of popular seriality (which stretches from pre-digital serial literature, film, radio, and television, to contemporary transmedia franchises) while it also instantiates novel forms of a specifically digital type of seriality (cf. Denson and Jahn-Sudmann 2013). By grappling with the formal commensurabilities and differences that characterize digital games’ relations to pre-digital (and non-ludic) forms of medial seriality, we therefore hope to contribute also to a more nuanced account of the historical process (rather than event) of the analog/digital divide’s emergence.

Overall, seriality is a central and multifaceted but largely neglected dimension of popular computer and video games. Seriality is a factor not only in explicitly marked game series (with their sequels, prequels, remakes, and other types of continuation), but also within games themselves (e.g. in their formal-structural constitution as an iterative series of levels, worlds, or missions). Serial forms of variation and repetition also appear in the transmedial relations between games and other media (e.g. expansive serializations of narrative worlds across the media of comics, film, television, and games). Additionally, we can grasp the relevance of games as a paradigm example of digital seriality when we think of the ways in which the technical conditions of the digital challenge the temporal procedures and developmental logics of the analog era, e.g. because series installments that once appeared successively are now increasingly available for immediate, repeated, and non-linear forms of consumption. And while this media logic of the database (cf. Manovich 2001: 218) can be seen to transform all serial media forms in our current age of digitization and media convergence, a careful study of the interplay between real-time interaction and serialization in digital games promises to shed light on the larger media-aesthetic questions of the transition to a digital media environment. Finally, digital games are not only symptoms and expressions of this transition, but also agents in the larger networks through which it has been navigated and negotiated; serial forms, which inherently track the processes of temporal and historical change as they unfold over time, have been central to this media-cultural undertaking (for similar perspectives on seriality in a variety of media, cf. Beil et al. 2013, Denson and Mayer 2012, Jahn-Sudmann and Kelleter 2012, Kelleter 2012, Mayer 2013).

To better understand the cultural forms and affective dimensions of what we have called digital games’ serial interfacings and the collective serializations of digital gaming cultures (cf. Denson and Jahn-Sudmann 2013), and in order to make sense of the historical and formal relations of seriality to the emergence and negotiation of the analog/digital divide, we seek contributions for a special issue of Eludamos: Journal for Computer Game Culture on all aspects of game-related seriality from a wide variety of perspectives, including media-philosophical, media-archaeological, and cultural-theoretical approaches, among others. We are especially interested in papers that address the relations between seriality, temporality, and digitality in their formal and affective dimensions.

Possible topics include, but are not limited to:

  • Seriality as a conceptual framework for studying digital games
  • Methodologies and theoretical frameworks for studying digital seriality
  • The (im)materiality of digital seriality
  • Digital serialities beyond games
  • The production culture of digital seriality
  • Intra-ludic seriality: add-ons, levels, game engines, etc.
  • Inter-ludic seriality: sequels, prequels, remakes
  • Para-ludic seriality: serialities across media boundaries
  • Digital games and the limits of seriality

******************************************************************************

Paper proposals (comprising a 350-500 word abstract, 3-5 bibliographic sources, and a 100-word bio) should be sent via e-mail by March 1, 2014 to the editors:

  • a.sudmann[at]fu-berlin.de
  • shane.denson[at]engsem.uni-hannover.de

Papers will be due July 15, 2014, and will appear in the fall 2014 issue of Eludamos.

*******************************************************************************

References:

Aarseth, Espen. 1999. “Aporia and Epiphany in Doom and The Speaking Clock: The Temporality of Ergodic Art.” In Marie-Laure Ryan, ed. Cyberspace Textuality: Computer Technology and Literary Theory. Bloomington: Indiana University Press, 31–41.

Beil, Benjamin, Lorenz Engell, Jens Schröter, Daniela Wentz, and Herbert Schwaab. 2012. “Die Serie. Einleitung in den Schwerpunkt.” Zeitschrift für Medienwissenschaft 2 (7): 10–16.

Bolter, J. David, and Richard A. Grusin. 1999. Remediation: Understanding New Media. Cambridge, Mass.: MIT Press.

Denson, Shane, and Andreas Jahn-Sudmann. 2013. “Digital Seriality: On the Serial Aesthetics and Practice of Digital Games.” Eludamos: Journal for Computer Game Culture 7 (1): 1–32. http://www.eludamos.org/index.php/eludamos/article/view/vol7no1-1/7-1-1-html.

Denson, Shane, and Ruth Mayer. 2012. “Grenzgänger: Serielle Figuren im Medienwechsel.” In Frank Kelleter, ed. Populäre Serialität: Narration – Evolution – Distinktion. Zum seriellen Erzählen seit dem 19. Jahrhundert. Bielefeld: Transcript, 185-203.

Eskelinen, Markku. 2001. “The Gaming Situation.” Game Studies 1 (1). http://www.gamestudies.org/0101/eskelinen/.

Freyermuth, Gundolf S., and Lisa Gotto, eds. 2012. Bildwerte: Visualität in der digitalen Medienkultur. Bielefeld: Transcript.

Glaubitz, Nicola, Henning Groscurth, Katja Hoffmann, Jörgen Schäfer, Jens Schröter, Gregor Schwering, and Jochen Venus. 2011. Eine Theorie der Medienumbrüche. Massenmedien und Kommunikation, Vol. 185/186. Siegen: Universitätsverlag Siegen.

Hagen, Wolfgang. 2002. “Es gibt kein ‘digitales Bild’: Eine medienepistemologische Anmerkung.” In: Lorenz Engell, Bernhard Siegert, and Joseph Vogl, eds. Archiv für Mediengeschichte Vol. 2 – “Licht und Leitung.” München: Wilhelm Fink Verlag, 103–12.

Jahn-Sudmann, Andreas, and Frank Kelleter. 2012. “Die Dynamik serieller Überbietung: Zeitgenössische amerikanische Fernsehserien und das Konzept des Quality TV.” In Frank Kelleter, ed. Populäre Serialität: Narration – Evolution – Distinktion. Zum seriellen Erzählen seit dem 19. Jahrhundert. Bielefeld: Transcript, 205–24.

Juul, Jesper. 2001. “Games Telling Stories? – A Brief Note on Games and Narratives.” Game Studies 1 (1). http://www.gamestudies.org/0101/juul-gts/.

Kelleter, Frank, ed. 2012. Populäre Serialität: Narration – Evolution – Distinktion: Zum seriellen Erzählen seit dem 19. Jahrhundert. Bielefeld: Transcript.

Kittler, Friedrich A. 1986. Grammophon, Film, Typewriter. Berlin: Brinkmann & Bose.

Manovich, Lev. 2001. The Language of New Media. Cambridge, Mass.: MIT Press.

Mayer, Ruth. 2013. Serial Fu Manchu: The Chinese Supervillain and the Spread of Yellow Peril Ideology. Philadelphia: Temple University Press.

Murray, Janet H. 1997. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. Cambridge: MIT Press.

Murray, Janet H. 2004. “From Game-Story to Cyberdrama.” In Noah Wardrip-Fruin and Pat Harrigan, eds. First Person: New Media as Story, Performance, and Game. Cambridge, MA: MIT Press, 2-10.

Pias, Claus. 2002. Computer Spiel Welten. Zürich, Berlin: Diaphanes.

Pias, Claus. 2003. “Das digitale Bild gibt es nicht. Über das (Nicht-)Wissen der Bilder und die informatische Illusion.” Zeitenblicke 2 (1). http://www.zeitenblicke.de/2003/01/pias/.

Ryan, Marie-Laure. 2006. Avatars of Story. Minneapolis: University of Minnesota Press.

Schröter, Jens. 2004. “Analog/Digital – Opposition oder Kontinuum?” In Jens Schröter and Alexander Böhnke, eds. Analog/Digital – Opposition oder Kontinuum? Beiträge zur Theorie und Geschichte einer Unterscheidung. Bielefeld: Transcript, 7–30.

“What if the camera / really do / take your soul?”

If the medium is the message, as Marshall McLuhan famously claimed, then the so-called “selfie” may be less about the face that constitutes the recognizable content of such an image, and more about a deeper, less obvious form of material-aesthetic mediation with respect to the transformation of “self” in an age of ubiquitous post-cinematic cameras.

Clearly, such acts of mediation have many levels. On the one hand, we “stage” or “perform” our selves for ourselves and for our friends (and, of course, for our Facebook “friends”); at the same time, though, we do so with an awareness of the machinery of geolocated surveillance and algorithmic facial recognition systems that we feed and help to optimize with the offering of our selfies (and the metadata they contain). Is this a self-destructive tendency or an act of defiance? Do we taunt and shake our fists at the invisible all-seeing God of Hyperinformatic Imagery (or the NSA), heroically asserting our autonomy despite our knowledge of its baselessness? Or is it just that we have resigned ourselves to the new “situation,” in which Berkeley’s maxim esse est percipi has been made a reality through a media-technical dispositif that renders superfluous the whole apparatus of angelic and divine perceptions that Bishop Berkeley still needed to keep his system from falling apart?

But the post-cinematic camera is a post-perceptual camera. Esse is now post-percipi in the sense that networks of digital and increasingly “smart” cameras are not just collecting images of “you” or “me” but instituting radical changes in the fine-grained, “molecular” scale of temporal becoming that subtends subjective (or “molar”) perception. As I have been arguing recently (see here, for example), post-cinematic cameras produce “metabolic images” — images that operate outside of visual or perceptual registers and modulate our pre-personal relations to the environment, directly influencing us at the level of our metabolic processing of duration and relation through which our embodied agencies are defined. This has to do with (among other things) the sheer speed of computational processes, which outstrip our own cognitive and perceptual processing abilities. But it also has to do with the affective density that post-cinematic cameras themselves accrue by virtue of the gap — what Bergson would call a “center of indeterminacy,” or simply a body — that these cameras install between the input and output of images, in the space of their microtemporal computational processing. On this basis, a synchronization of human and technical temporalities is made possible at the micro-level. And perhaps this is the hidden message of the medium: the selfie is not just a paradoxical performance of self (in the way that, say, reality shows problematize authenticity), it is in fact the product of a whole new ecology of agency, an ecology of anthropotechnically co-ordinated metabolisms invisibly subtending the visible images by which we seek to represent our “selves.”

With every selfie, we experiment with this interplay of visible manifestation and invisible infrastructure. Who can we be, now, and in relation to an environment filled with rapidly proliferating digital images, where everything is in flux, nothing apparently stable? Perhaps we encounter here, and try to dispel, an old fear in a new guise: that the camera is capable of stealing our souls — both through integration into systems of surveillance, and in the dissolution of our former agencies when set in relation to the molecular, metabolic processes embodied by the post-cinematic camera. In the words of Montreal-based indie rock band Arcade Fire:

What if the camera
Really do
Take your soul?
Oh no...

Hit me with your flashbulb eyes!
Hit me with your flashbulb eyes!
You know I've got nothing to hide
You know I got nothing
No I got ... nothing

Above, my own mixed-media “reflections” on the problem of the selfie in the age of metabolic modulation. Featuring artworks by Thomas Böing (Ohne Titel [Museum König], 2006), currently on display at the impressive Kolumba — Art Museum of the Archdiocese of Cologne as part of the exhibition “show cover hide. Shrine. An exhibition on the aesthetics of the invisible,” which runs until August 25, 2014.