Frankenstein, Film, and Chemistry

slideshow1

The following is the text of my short presentation on “Frankenstein, Film, and Chemistry,” which I delivered on October 18, 2019 at the “Chemistry and Film: Experiments in Living” joint symposium of the Stanford Departments of Chemistry and of Art & Art History.

“Frankenstein, Film, and Chemistry”

Shane Denson

It has been estimated, perhaps with a bit of exaggeration, that there are over 200 cinematic adaptations of Frankenstein. These films are sometimes more, sometimes less true to the letter and the spirit of Mary Shelley’s Gothic novel, but they are all, in the words of today’s symposium title, what might be called “experiments in living”—aesthetic, narrative, and technological experiments in giving, shaping, and controlling life.

slideshow2

The most iconic versions of the tale (such as James Whale’s 1931 film starring Boris Karloff as the flat-headed monster with electrodes in his neck) generally frame these efforts as studies in electrical engineering rather than applied chemistry.

slideshow3

However, a minor strand of Frankenstein films, including Thomas Edison’s 1910 film and Hammer Studios’ 1957 Curse of Frankenstein, also attends to chemistry—picking up on Victor Frankenstein’s keen interest in alchemy, as suggested in Shelley’s novel, and refocusing it on the cinematic medium’s own photochemical processes. In this way, narrative and medial chemistries mix in self-reflexive spectacles that foreground the material processes of giving life to, or “animating,” moving images as much as the monsters they depict.

slideshow4

Both the self-reflexive impulse of these films and their focus on chemistry have their roots in Mary Shelley’s novel, first published in 1818. Most famously, with respect to self-reflexivity, Shelley established a relation between literary authorship and the book’s monstrous act of creation when, in the Introduction to the 1831 revised edition of the novel, she “bid [her] hideous progeny go forth and prosper.” Author and creator, medium and monster were thus linked around the question of life-giving and its legitimate means. But what were these means?

slideshow5

The novel devotes just a single paragraph to the creation scene, eliding any detailed description of “the instruments of life” employed, but suggestively indicating Frankenstein’s goal of “infus[ing] a spark of being” into the creature. Shelley’s 1831 Introduction returned to this scene, but it focused more on the author’s own act of creation during a dreary summer spent at Lake Geneva, where Shelley listened to ghost stories as well as reports of scientific advances, prior to having a waking dream that revealed to her the terrifying sight of the monster stirring to life.

slideshow6

The Introduction does add one other detail to the fictional act of creation, attributing it to “the working of some powerful engine.”

slideshow8

Furthermore, the author reports listening to conversations between Lord Byron and Percy Bysshe Shelley on “the nature of the principle of life,” paraphrasing or elaborating that “Perhaps a corpse would be re-animated; galvanism had given token of such things.” Connecting the “spark of being” to galvanism, this would seem to be the source of the electrical interpretation of Frankensteinian creation.

slideshow9

However, it is important to note that the novel itself indicates that the young Frankenstein was obsessed with alchemical authors, and he later declares that “natural philosophy, and particularly chemistry, in the most comprehensive sense of the term, became nearly my sole occupation.” A recent article by Mary Fairclough argues convincingly that, for both Mary Shelley and the characters in her novel, electricity and chemistry were intimately entwined, and with them also questions of physiology.

slideshow11

It is possible that the “powerful engine” envisioned by Shelley was a “battery” of “galvanic piles,” a device invented by Alessandro Volta to refute Galvani’s claim that he had discovered so-called “animal electricity.” The device, applied widely in early 19th-century chemistry, was alternately seen as demonstrating that chemical reactions produced electricity, or that electricity was responsible for producing said chemical reactions. In any case, life—and the physiological reactions of living and dead bodies to the operation of the galvanic pile—remained a crucial site of practical and theoretical concern around this intersection of chemistry and electricity.

slideshow12

In the 20th century, the cinema would return again and again to Frankenstein, due in no small part to the fact that Shelley’s tale provided a seductive allegory for the life-giving powers of the medium of film itself. Like Frankenstein selecting parts from corpses and infusing life into a composite body, filmmakers utilized the technical means of film to (re)animate the “dead” (or photographically preserved) traces of living organisms (such as actors) into new visual narrative compositions. Discourses of animation, or the giving of life, were central to the early cinema’s understanding of itself, as is evidenced in the names of companies and devices such as Vitascope and Biograph. Given this overarching concern with the life-giving force of moving images, the Frankenstein tale provided an opportunity for reflecting on the changing conditions of animation, or the life of kinetic images. Seen in this light, a focus on electricity versus chemistry is less a matter of competing interpretations of Shelley’s narrative, and more a matter of cinema’s interpretation of its own media-historical situation at any given juncture.

Edison1

Consider, for example, Thomas Edison’s Frankenstein from 1910, which stands between the early, image-technology oriented “cinema of attractions” and the coming narrative-oriented classical Hollywood style that would take shape around 1917. In 1910, the medium was in transition, torn between lowbrow technological spectacle and an uncertain reorientation along the lines of the respectable theater. Accordingly, advertising for the film emphasized both the “photographic marvel” of the creation sequence and the story’s origin in “Mrs. Shelley’s […] work of art.” The film aimed to be both visual and technological spectacle and narrative high culture. And this multiple address relates directly to the uncertain significance of “animation” at this historical juncture.

Edison2

The creation sequence’s so-called “marvel” consists of footage of a burning mannequin projected in reverse—a bit of cinematic magic that, in the context of early film, served to exhibit cinematic technology by focusing attention on the filmic images themselves rather than the objects they depict. Frankenstein’s reactions here channel the scopophilic pleasure of a “primitive” viewer, for whom he stands in as a proxy. Importantly, “animation” here is a self-reflexive topos which links the monster’s creation with the term “animated photography,” still common in 1910 as a description of film in general. And the vaguely chemical or alchemical means of creation serve to link Frankenstein’s act to that of photochemical exposure and development processes—to the chemistry of film itself as the material basis of the images onscreen.

Whale

I have already noted how James Whale’s Frankenstein from 1931 changes course and places electricity at the center of creation, but this is again less about a specific reading of Shelley’s novel, and more about the cinema’s then-current means of animation. The film follows right on the heels of the massive transition from silent to sound cinema. And cinematic sound was widely associated with electricity, due to the revolution in vacuum-tube amplification and the spread of radio in the 1920s. The noisy electrified lab where the monster is created gives objective form to the association, but it ironically gives birth to a mute, hyperphotographic monster—which is to say, a monster that didn’t quite live up to the electric promise of sound but instead reverted to the chemical base of filmic inscription. In any case, these media-historical aspects and associations faded from view as the Karloff monster was rendered an icon, or a visual cliché. After the sound transition faded from memory, only the narrative role of electricity as a life-giving force remained.

slideshow15

But there would be further opportunities for chemical reflection, most notably following the introduction of Eastmancolor, a cheaper alternative to the expensive and proprietary Technicolor process, which required studios to rent equipment and a technician to operate a special camera that recorded colors onto three separate strips of film. Eastmancolor, which recorded its images on a single strip of film, and thus allowed filmmakers to use standard black-and-white cameras for color shooting, was an innovation in terms both of chemistry and of economics. In terms of chemistry, the new color film superimposed three emulsion layers onto a single support, each sensitized to blue, green, or red light. Each layer contained silver halide grains, gelatin as a binder, and what technicians at the time called a “color former,” a chemical that reacts with certain developing agents to produce a colored dye in the immediate vicinity of exposed silver halide grains.

teenage-frankenstein

In terms of economics, this new process made color filmmaking available to lower budget productions, including B-movie exploitation films like I Was a Teenage Frankenstein (from 1957), which switched from black-and-white to color for the final, shocking scene. The result was a heightened “medium sensitivity” on the part of viewers, who were effectively confronted with the material difference between black-and-white and color film stocks—and made sensitive, on a visceral if not cognitive level, to the different sensitivities of their emulsions.

slideshow18

Also in 1957, the British Hammer Studios made creation itself a thoroughly chemical process in Curse of Frankenstein. Here, the camera positively dwells on lab equipment with bubbling liquids. Unlike the electric lab of Whale’s 1931 film, this is a thoroughly chemical lab—a perfect negative, in fact, to the darkroom lab in which the chemical processes of color film processing take place. Red, in particular, is a recurring color, associated with blood: the blood infused into the body of a monstrous corpse, and the spilled blood of innocent victims. But, ultimately, it would seem, a remark made by Jean-Luc Godard in reference to his own filmmaking rings true: Ce n’est pas du sang, c’est du rouge. It’s not blood, it’s red. Indeed, life is at stake, but the significance of red is unstable, wavering between the lifeblood of biology and the chemical emulsion of film’s own body.

curse

James Leo Cahill, “Neo-Zoological Dramas: Comparative Anatomy by Other Means”

Cahill-poster

I am happy to announce that James Leo Cahill, Associate Professor and Director of the Cinema Studies Institute at the University of Toronto, will be holding a workshop session devoted to his recent book Zoological Surrealism at Stanford on November 13, 2019 (2-4pm in McMurtry 370). We are aiming for a discussion more than a lecture, and participants are asked to read the first chapter of Cahill’s book prior to the event.

Chemistry and Film: Experiments in Living

Chem-and-Film

Coming up on October 18, I am happy to be a part of this event on the topic of “Chemistry and Film: Experiments in Living,” a symposium jointly sponsored by the Departments of Art & Art History and Chemistry at Stanford. I will be presenting on “Frankenstein and the Chemistry of Film.”

Ends of Cinema: Center for 21st Century Studies 2018 Conference at UW Milwaukee

C21-Ends-of-Cinema-poster

I am excited to be participating in the Ends of Cinema conference at the Center for 21st Century Studies, taking place May 3-5, 2018 at the University of Wisconsin-Milwaukee. There are some great keynote speakers, including my colleague Jean Ma and lots of other wonderful people. The C21, under the expert leadership of Richard Grusin (who is now back at the helm after a short hiatus), has put on some of my personal favorite conferences, and I expect this one to be no less exciting and thought-provoking.

My own contribution will be a paper titled “Post-Cinematic Realism” — work in progress for my current book project Discorrelated Images. Here is the abstract:

Post-Cinematic Realism

Shane Denson, Stanford University

In its classical formulation, cinematic realism is based in the photographic ontology of film, i.e. in the photograph’s indexical relation to the world, which grants to film its unique purchase on reality; upon this relation also hinged, for many realist filmmakers, the political promise of realism. Digital media, meanwhile, are widely credited with disrupting indexicality and instituting an alternative ontology of the image. David Rodowick, for example, argues that the interjection of digital code disrupts film’s “automatisms” and eradicates the index in favor of the symbolic. But while such arguments are in many respects compelling, I contend that the disruption of photographic indexicality might also be seen to open up spaces in which to explore new automatisms that communicate reality and/or realism with and through post-indexical technologies.

Whereas André Bazin privileged techniques like the long take and deep focus for their power to approximate our natural perception of time and space, theorists like Maurizio Lazzarato and Mark Hansen emphasize post-cinematic media’s ability to approximate the sub-perceptual processing of duration executed by our pre-personal bodies. The perceptual discorrelation of computational images gives way, in other words, to a more precise calibration of machinic and embodied temporalities; simultaneously, the perceptual richness of Bazin’s images becomes less important, while “poor images” (in Hito Steyerl’s term) communicate more directly the material and political realities of a post-cinematic environment. As I will demonstrate with reference to a variety of moving-image texts employing glitches, drones, and other computational objects, post-cinematic media might in fact be credited with a newly intensified political relevance through their institution of a new, post-cinematic realism.

Rethinking Temporalities in Cinema and Digital Media, SLSA 2017

SLSA-2017

At this year’s SLSA conference, “Out of Time,” hosted by Arizona State University, I will be chairing a panel titled “Rethinking Temporalities in Cinema and Digital Media” (Saturday, November 11, 2017; 4:00-5:30pm). My own talk is titled “Pre-Sponsive Gestures: Post-Cinema Out of Time.” Here is the complete list of panelists and topics:

2017-11-04 12.29.24 pm

Speculative Data: Full Text, MLA 2016 #WeirdDH

SpeculativeData-jpg.001

Below you’ll find the full text of my talk from the Weird DH panel organized by Mark Sample at the 2016 MLA conference in Austin, Texas. Other speakers on the panel included Jeremy Justus, Micki Kaufman, and Kim Knight.

***

Speculative Data: Post-Empirical Approaches to the “Datafication” of Affect and Activity

Shane Denson, Duke University

A common critique of the digital humanities questions the relevance (or propriety) of quantitative, data-based methods for the study of literature and culture; in its most extreme form, this type of criticism insinuates a complicity between DH and the neoliberal techno-culture that turns all human activity, if not all of life itself, into “big data” to be mined for profit. Now, it may sound from this description that I am simply setting up a strawman to knock down, so I should admit up front that I am not wholly unsympathetic to the critique of datafication. But I do want to complicate things a bit. Specifically, I want to draw on recent reconceptions of DH as “deformed humanities” – as an aesthetically and politically invested field of “deformance”-based practice – and describe some ways in which a decidedly “weird” DH can avail itself of data collection in order to interrogate and critique “datafication” itself.

SpeculativeData-jpg.002

My focus is on work conducted in and around Duke University’s S-1: Speculative Sensation Lab, where literary scholars, media theorists, artists, and “makers” of all sorts collaborate on projects that blur the boundaries between art and digital scholarship. The S-1 Lab, co-directed by Mark Hansen and Mark Olson, experiments with biometric and environmental sensing technologies to expand our access to sensory experience beyond the five senses. Much of our work involves making “things to think with,” i.e. experimental “set-ups” designed to generate theoretical and aesthetic insight and to focus our mediated sensory apparatus on the conditions of mediation itself. Harnessing digital technologies for the work of media theory, this experimentation can rightly be classed, alongside such practices as “critical making,” in the broad space of the digital humanities. But due to their emphatically self-reflexive nature, these experiments challenge borders between theory and practice, scholarship and art, and must therefore be qualified, following Mark Sample, as decidedly “weird DH.”

SpeculativeData-jpg.003

One such project, Manifest Data, uses a piece of “benevolent spyware” that collects and parses data about personal Internet usage in such a way as to produce 3D-printable sculptural objects, thus giving form to data and reclaiming its personal value from corporate cooptation. In a way that is both symbolic and material, this project counters the invisibility and “naturalness” of mechanisms by which companies like Google and Facebook expropriate value from the data we produce. Through a series of translations between the digital and the physical—through a multi-stage process of collecting, sculpting, resculpting, and manifesting data in virtual, physical, and augmented spaces—the project highlights the materiality of the interface between human and nonhuman agencies in an increasingly datafied field of activity. (If you’re interested in this project, which involves “data portraits” based on users’ online activity and even some weird data-driven garden gnomes designed to dispel the bad spirits of digital capital, you can read more about it in the latest issue of Hyperrhiz.)

SpeculativeData-jpg.004

Another ongoing project, about which I will say more in a moment, uses data collected through (scientifically questionable) biofeedback devices to perform realtime collective transformations of audiovisual materials, opening theoretical notions of what Steven Shaviro calls “post-cinematic affect” to robustly material, media-archaeological, and aesthetic investigations.

SpeculativeData-jpg.005

These and other projects, I contend, point the way towards a truly “weird DH” that is reflexive enough to suspect its own data-driven methods but not paralyzed into inactivity.

Weird DH and/as Digital Critical (Media) Studies:

So I’m trying to position these projects as a form of weird digital critical (media) studies, designed to enact, and to reflect on (in increasingly self-reflexive ways), the use of digital tools and processes for interrogating the material, cultural, and medial parameters of life in digital environments.

SpeculativeData-jpg.006

Using digital techniques to reflect on the affordances and limitations of digital media and interfaces, these projects are close in spirit to new media art, but they are also apposite to practices and theories of “digital rhetoric,” as described by Doug Eyman, to Gregory Ulmer’s “electracy,” or to Casey Boyle’s posthuman rhetoric of multistability, which celebrates the rhetorical power of digital glitches to expose the affordances and limitations of computational media in the broader realm of an interagential relational field that includes both humans and nonhumans. In short, these projects enact what we might call, following Stanley Cavell, the “automatisms” of digital media – the generative affordances and limitations that are constantly produced, reproduced, and potentially transformed or “deformed” in creative engagements with media. Digital tools are used in such a way as to problematize their very instrumentality, hence moving towards a post-empirical or post-positivistic form of datafication as much as towards a post-instrumental digitality.

SpeculativeData-jpg.007

Algorithmic Nickelodeon / Datafied Attention:

My key example is a project tentatively called the “algorithmic nickelodeon.” Here we use consumer-grade EEG headsets to interrogate the media-technical construction and capture of human attention, and thus to complicate datafication by subjecting it to self-reflexive, speculative, and media-archaeological operations. The devices in question cost about $100 and are marketed as tools for improving concentration, attention, and memory. The headset measures a variety of brainwave activity and, by means of a proprietary algorithm, computes values for “attention” and “meditation” that can be tracked and, with the help of software applications, trained and supposedly optimized. In the S-1 Lab, we have sought to tap into these processes not simply to criticize the scientifically dubious nature of these claims but to probe and better understand the nature of the automatisms and interfaces at work here and in media of attention more generally. Specifically, we have designed a film- and media-theoretical application of the apparatus, which allows us to think early and contemporary moving images together, to conceive pre- and post-cinema in terms of their common deviations from the attention economy of classical cinema, and to reflect more broadly on the technological-material reorganizations of attention involved in media change. This is an emphatically experimental (that is, speculative, post-positivistic) application, and it involves a sort of post-cinematic reenactment of early film’s viewing situations in the context of traveling shows, vaudeville theaters, and nickelodeons. With the help of a Python script written by lab member Luke Caldwell, a group of viewers wearing the Neurosky EEG devices influence the playback of video clips in real time, for example changing the speed of a video or the size of the projected image in response to changes in attention as registered through brain-wave activity.
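Caldwell’s actual script is not reproduced here, but the core mapping it performs can be sketched in a few lines of Python. Everything in the sketch—function names, the averaging of readings across the group, and the scaling ranges for speed and zoom—is my own illustration, not the lab’s code; all that is taken from the setup described above is that each headset reports an “attention” value on a 0–100 scale, that low collective attention alters playback speed, and that high attention enlarges the projected image.

```python
def group_attention(readings):
    """Average the 0-100 'attention' values reported by each headset."""
    return sum(readings) / len(readings)

def playback_params(attention, min_rate=0.5, max_rate=2.0):
    """Map a mean attention value (0-100) to (playback_rate, zoom_factor).

    Low attention speeds playback up, like the early projectionist
    hurrying through a plodding scene; high attention enlarges the
    image, literalizing Muensterberg's close-up-as-attention.
    The ranges are illustrative assumptions, not the lab's values."""
    a = max(0.0, min(100.0, attention)) / 100.0  # clamp and normalize
    rate = max_rate - a * (max_rate - min_rate)  # attention 0 -> 2.0x speed, 100 -> 0.5x
    zoom = 0.5 + a                               # attention 0 -> half size, 100 -> 1.5x size
    return rate, zoom

# Example: three viewers with middling-to-high attention
rate, zoom = playback_params(group_attention([70, 80, 75]))  # → (0.875, 1.25)
```

In an installation, this mapping would sit inside a loop that polls the headsets and pushes the resulting rate and zoom values to the video player on each tick; the point of the sketch is only the translation step from “datafied” attention to the parameters of projection.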

At the center of the experimentation is the fact of “time-axis manipulation,” which Friedrich Kittler highlights as one of the truly novel affordances of technical media, like the phonograph and cinema, that arose around 1900 and marked, for him, a radical departure from the symbolic realms of pre-technical arts and literature. Now it became possible to inscribe “reality itself,” or to record a spectrum of frequencies (like sound and light) directly, unfiltered through alphabetic writing; and it became possible as well to manipulate the speed or even playback direction of this reality.

SpeculativeData-jpg.009

Recall that the cinema’s standard of 24 fps only solidified and became obligatory with the introduction of sound, as a solution to a concrete problem introduced by the addition of a sonic register to filmic images. Before the late 1920s, and especially in the first two decades of film, there was a great deal of variability in projection speed, and this was “a feature, not a bug” of the early cinematic setup. Kittler writes: “standardization is always upper management’s escape from technological possibilities. In serious matters such as test procedures or mass entertainment, TAM [time-axis manipulation] remains triumphant. […] frequency modulation is indeed the technological correlative of attention” (Gramophone Film Typewriter 34-35). Kittler’s pomp aside, his statement highlights a significant fact about the early film experience: Early projectionists, who were simultaneously film editors and entertainers in their own right, would modulate the speed of their hand-cranked apparatuses in response to their audience’s interest and attention. If the audience was bored by a plodding bit of exposition, the projectionist could speed it up to get to a more exciting part of the movie, for example. Crucially, though: the early projectionist could only respond to the outward signs of the audience’s interest, excitement, or attention – as embodied, for example, in a yawn, a boo, or a cheer.

SpeculativeData-jpg.010

But with the help of an EEG, we can read human attention – or some construction of “attention” – directly, even in cases where there is no outward or voluntary expression of it, and even without its conscious registration. By correlating the speed of projection to these inward and involuntary movements of the audience’s neurological apparatus, such that low attention levels cause the images to speed up or slow down, attention is rendered visible and, to a certain extent, opened to conscious and collective efforts to manipulate it and the frequency of images now indexed to it.

According to Hugo Münsterberg, who wrote one of the first book-length works of film theory in 1916, cinema’s images already embody, externalize, and make visible the faculties of human psychology; “attention,” for example, is said to be embodied by the close-up. With our EEG setup, we can literalize Münsterberg’s claim by correlating higher attention levels with a greater zoom factor applied to the projected image. If the audience pays attention, the image grows; if attention flags, the image shrinks. But this literalization raises more questions than it answers, it would seem. On the one hand, it participates in a process of “datafication,” turning brain wave patterns into a stream of data called “attention,” but whose relation to attention in ordinary senses is altogether unclear. But this datafication simultaneously opens up a space of affective or aesthetic experience in which the problematic nature of the experimental “set-up” announces itself to us in a self-reflexive doubling: we realize suddenly that “it’s a setup”; “we’ve been framed” – first by the cinema’s construction of attentive spectators and now by this digital apparatus that treats attention as an algorithmically computed value.

So in a way, the apparatus is a pedagogical/didactic tool: it not only allows us to reenact (in a highly transformed manner) the experience of early cinema, but it also helps us to think about the construction of “attention” itself in technical apparatuses both then and now. In addition to this function, it also generates a lot of data that can indeed be subjected to statistical analysis, correlation, and visualization, and that might be marshaled in arguments about the comparative medial impacts or effects of various media regimes. Our point, however, remains more critical, and highly dubious of any positivistic understanding of this data. The technocrats of the advertising industry, the true inheritors of Münsterberg the industrial psychologist, are in any case much more effective at instrumentalizing attention and reducing it to a psychotechnical variable. With a sufficiently “weird” DH approach, we hope to stimulate a more speculative, non-positivistic, and hence post-empirical relation to such datafication. Remitting contemporary attention procedures to the early establishment of what Kittler refers to as the “link between physiology and technology” (73) upon which modern entertainment media are built, this weird DH aims not only to explore the current transformations of affect, attention, and agency – that is, to study their reconfigurations – but also potentially to empower media users to influence such configuration, if only on a small scale, rather than leave it completely up to the technocrats.

Demon Debt

Leyda_Demon-Debt

I am pleased to announce that on Friday, January 17, 2014 (12:00 pm in room 609, Conti-Hochhaus), Prof. Julia Leyda from Sophia University in Tokyo will be giving a talk on “Demon Debt: Paranormal Activity as Recessionary Post-Cinematic Allegory.” The lecture will take place in the context of my seminar on 21st-century film, but attendance is open to all.

Julia Leyda has participated in the two roundtable discussions on “post-cinematic affect” that have appeared to date in the pages of La Furia Umana, and she served as respondent at SCMS 2013 on a panel that included papers by Steven Shaviro, Therese Grisham, and myself. Among other projects, she is currently collaborating with me on the preparation of an edited collection on “post-cinematic theory” that we hope to see published in 2014! (More details soon…)

Here is the abstract for her talk in January:

Demon Debt: Paranormal Activity as Recessionary Post-Cinematic Allegory

Julia Leyda

The Paranormal Activity film franchise serves as a case study in twenty-first-century neoliberal post-cinema. The demon in the Paranormal films comes to claim a debt resulting from a contract with an ancestor, who has in a sense “mortgaged” her future offspring in exchange for power and wealth; the demon here is an allegory of debt under capitalism, invisible, conveyed through digital media, and inescapable. Set entirely inside feminine spaces of the home — bedroom, kitchen, and nursery — the films construct a post-feminist narrative that reconfigures the gender politics of horror cinema. But the post-cinematic moment also demands analysis of form in addition to a thematic reading. Digital data constitutes the “film” itself in the form of video footage, like transnational finance capital and the intangible systems of consumer credit, and like the unseen and immaterial demon. The incursion of debtor capitalism and financialization into the home in these films has turned deadly. Finally, like the demonic home invasion, the financialization of private life drafts the immaterial labor of the audience into the branding of the film.