Deformative Criticism at #SCMS17


At the upcoming SCMS conference in Chicago, I will be participating in a workshop on “Deformative Criticism and Digital Experimentations in Film & Media Studies” (panel K3 on Friday, March 24, 2017 at 9:00am):

Deformative criticism has emerged as an innovative site of critical practice within media studies and digital humanities, revealing new insights into media texts by “breaking” them in controlled or chaotic ways. Deformative criticism includes a wide range of digital experiments that generate heretical and non-normative readings of media texts; because the results of these experiments are impossible to know in advance, they shift the boundaries of critical scholarship. Media scholars are particularly well situated for such experimentation, as many of our objects of study exist in digital forms that lend themselves to wide-ranging manipulation. Thus, deformative criticism offers a crucial venue for defining not only contemporary scholarly practice, but also media studies’ growing relationship to digital humanities.

Also participating in the workshop will be Jason Mittell (Middlebury College), Stephanie Boluk (UC Davis), Kevin L. Ferguson (Queens College, City University of New York), Mark Sample (Davidson College), and Virginia Kuhn (USC).

My own presentation/workshop contribution will focus on glitches and augmented reality as a deformative means of engaging with changing media-perceptual configurations, including the following case study:

Glitch, Augment, Scan

Scannable Images is a collaborative art/theory project by Karin + Shane Denson that interrogates post-cinema – its perceptual patterns, hyperinformatic simultaneities, and dispersals of attention – through an assemblage of static and animated images, databending and datamoshing techniques, and augmented reality (AR) video overlays. Viewed through the small screen of a smartphone or tablet – itself directed at a computer screen – only a small portion of the entire spectacle can be seen at once. The piece thus reflects and emulates the selective, scanning regard of post-cinematic images, confronting the viewer with the materiality of the post-cinematic media regime through the interplay of screens, pixels, people, and the physical and virtual spaces they occupy.

Post-Cinema AR


The augmented reality piece featured on the cover of Post-Cinema: Theorizing 21st-Century Film (http://reframe.sussex.ac.uk/post-cinema/), a collaborative piece made by Karin Denson and me, was displayed recently at a glitch-oriented gallery show organized by some nice people associated with Savannah College of Art and Design.

Try it out for yourself here: http://reframe.sussex.ac.uk/post-cinema/artwork/.

After.Video at Libre Graphics 2016 in London


Recently, I posted about a project called after.video, which contains an augmented reality (AR) glitch/video/image-based theory piece that Karin Denson and I collaborated on. It has now been announced that the official launch of after.video, Volume 1: Assemblages will take place at the Libre Graphics Meeting 2016 in London (Sunday, April 17th at 4:20pm). The volume is a “video book”: a paperback book plus video elements stored on a Raspberry Pi computer packaged in a VHS case, which will also be available online.

The Gnomes Are Back: Business cARd 2.0


Ever since our old AR platform was bought out and shut down by Apple, the “data gnomes” that Karin and I developed in conjunction with the Duke S-1: Speculative Sensation Lab’s “Manifest Data” project have been bumbling about in digital limbo, banished to 404 hell. So today I finally took the first steps in migrating our beloved creatures over to a new AR platform (Wikitude), where they’re starting to feel at home. While I was at it, I went ahead and reprogrammed my business card:

[Image: front of the business card, showing the QR code]

The QR code on the front now redirects the browser to shanedenson.com, while the AR content on the back side is made visible with the Wikitude app (free on iOS or Android) — just search for “Shane Denson” and point your phone/tablet’s camera at the image below:

[Image: back of the business card, showing the data portrait]

(In case you’re wondering what this is: it’s a “data portrait” generated from my Internet browsing behavior. You can make your own with the code included in the S-1 Lab’s Manifest Data kit.)
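For a sense of how such a “data portrait” might be generated, here is a minimal, hypothetical Python sketch – not the S-1 Lab’s actual Manifest Data code – in which each visited domain from a browsing log is hashed to an RGB color and the colors are tiled into a small pixel grid, written out as a plain-text PPM image:

```python
import hashlib

def data_portrait(domains, size=16):
    """Hypothetical sketch: derive an RGB color from each visited
    domain and tile the colors into a size x size pixel grid,
    returned as a plain-text (P3) PPM image string."""
    # Hash each domain and use the first three digest bytes as a color.
    colors = [tuple(hashlib.sha256(d.encode()).digest()[:3]) for d in domains]
    # Repeat the colors to fill the whole grid.
    pixels = [colors[i % len(colors)] for i in range(size * size)]
    rows = []
    for y in range(size):
        row = pixels[y * size:(y + 1) * size]
        rows.append(" ".join(f"{r} {g} {b}" for r, g, b in row))
    return f"P3\n{size} {size}\n255\n" + "\n".join(rows) + "\n"

# Demo on a few stand-in domains (any real version would read a
# browsing-history log instead).
ppm = data_portrait(["shanedenson.com", "sussex.ac.uk", "wikitude.com"])
```

Because the colors come from a cryptographic hash, the same browsing history always yields the same portrait, while even slightly different histories look entirely different.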

DEMO Video: Post-Cinema: 24fps@44100Hz

As Karin posted yesterday (and as I reblogged this morning), our collaborative artwork Post-Cinema: 24fps@44100Hz will be on display (and on sale) from January 15-23 at The Carrack Modern Art gallery in Durham, NC, as part of their annual Winter Community Show.

Exhibiting augmented reality pieces always brings with it a variety of challenges – including technical ones and, above all, the need to inform viewers about how to use the work. So, for this occasion, I’ve put together this brief demo video explaining the piece and how to view it. The video will be displayed on a digital picture frame mounted on the wall below the painting. Hopefully it will be eye-catching enough to attract passersby while effectively communicating the essential information about the process and use of the work.

Post-Cinema: 24fps@44100Hz

The New Krass


This piece, consisting of an acrylic painting (24″x24″) and an augmented reality (AR) digital-video overlay, is a collaborative artwork made together with my husband Shane Denson. It will be featured on the cover of the forthcoming book Post-Cinema: Theorizing 21st-Century Film, which Shane edited along with Julia Leyda, and it will be on display at The Carrack Modern Art gallery in Durham, NC, from January 15-23, 2016.

For the painting, I started with a photo that I took of our dog Evie lying in front of another of my paintings, Glitchesarelikewildanimals!No.1, which itself was based on a digitally glitched (databent) picture of Evie. I subjected the new photo to a similar process, first turning it into a video in a nonlinear editing program (Final Cut Pro) and then deforming the video by sonifying it in an audio editing program (Audacity).
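The sonification step described above can be sketched in code. The following is a minimal, illustrative Python example of databending – treating a file’s raw bytes as audio samples and applying an echo-like effect while leaving the header intact so the file still opens – standing in for the manual workflow of importing raw data into Audacity, applying an effect, and re-exporting. The function name and parameters are illustrative, not part of the actual project:

```python
import numpy as np

def databend(raw: bytes, header_size: int = 128,
             delay: int = 500, decay: float = 0.5) -> bytes:
    """Treat a file's raw bytes as audio samples and apply an
    echo-like effect, preserving the header so the glitched file
    remains readable as an image."""
    header = raw[:header_size]
    body = np.frombuffer(raw[header_size:], dtype=np.uint8).astype(np.float64)
    bent = body.copy()
    bent[delay:] += decay * body[:-delay]          # echo: mix in a delayed copy
    bent = np.clip(bent, 0, 255).astype(np.uint8)  # fold back into byte range
    return header + bent.tobytes()

# Demo on synthetic bytes standing in for an uncompressed image file
# (formats like BMP or TIFF survive this kind of body-only mangling).
fake_image = bytes(range(256)) * 64
glitched = databend(fake_image)
```

Headerless formats or compressed formats (JPEG, PNG) tend to break entirely under such edits, which is why databenders typically work on uncompressed files and protect the first few hundred bytes.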

You can see the…
