Posted: Sat May 05, 2012 5:57 am
Author: sixbodied (Site Admin)
Joined: 09 Jul 2005
Posts: 4048

Post subject: Mark Hansen on My Tiny Note Chart

From CRITICAL TERMS FOR MEDIA STUDIES, edited by W.J.T. Mitchell and Mark B.N. Hansen, Chicago: University of Chicago Press, 2010, pp. 173-185

Mr. Hansen was given the task of defining "new media" in the "Technology" section:

{{ The term NEW MEDIA has achieved the kind of widespread cultural dissemination that seems to strip away all specificity. "New" media is everywhere around us, in the gadgets and devices we use to keep organized, to do our work, to play, to access information, and to communicate with friends and acquaintances. At the same time, "old" new media-the zograscope, the optical telegraph, the physiognotrace-have become newly interesting thanks to recent studies of such bypassed technologies. Books have been written on new media as the convergence brought about by digital technology and, at the other extreme, on the newness that accompanies all media at the moment of their introduction. Cutting across these contemporary projects and responsible for their complementarity, if not indeed for their imbrication, is the ambivalent or double case of that central term, NEW MEDIA.

At once singular and plural, "new media" would seem to designate both a qualitatively new kind of media and a quality of all media (of every medium) at the point at which they are (it is) introduced into and disseminated across society. The plural face of new media goes hand in hand with the larger dialectic of media innovation that characterizes Western culture from antiquity onward; the singular face suggests that we may, today, have come to a moment of impasse in this very dialectic, a moment in which media may in fact be separated from the technical means by which a culture stores its knowledge and history (see chapter 13, "Hardware/Software/Wetware"). Can it be that, for the first time in our history, media (meaning the storage, dissemination, and transmission of experience) has become distinct from its own technical infrastructure, from the computational networks and machines that undergird most of what we consume as media? And if so, what are the consequences for our understanding of the future prospects for human beings and for the life of our planet? Such are the stakes bound up in the issue of the "newness" of new media.

Both faces of new media-singular and plural-arise on the basis of a common dialectic of media innovation: by changing the conditions for the production of experience, new media destabilize existing patterns of biological, psychical, and collective life even as they furnish new facilities. This convergence of privation and supplementation already informs what many critics hold to be the primal scene of media innovation in Western thought: Plato's meditation on the new medium of writing in the Phaedrus. There the issue is writing's status as a pharmakon, at once a poison and its antidote, a threat to memory and its extension. This profound ambivalence is clearly expressed in the myth of Theuth, the Egyptian God who invented writing, which Socrates recounts to Phaedrus:

[But when it came to writing Theuth said [to the Egyptian king Thamus], "Here, O king, is a branch of learning that will make the people of Egypt wiser and improve their memories; my discovery provides a recipe for memory and wisdom." But the king answered and said, "O man full of arts, to one it is given to create the things of art, and to another to judge what measure of harm and of profit they have for those that shall employ them. And so it is that you, by reason of your tender regard for the writing that is your offspring, have declared the very opposite of its true effect. If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks. What you have discovered is a recipe not for memory, but for reminder. And it is no true wisdom that you offer your disciples, but only its semblance."]

Thamus's well-reasoned reserve notwithstanding, this myth captures the fundamental duality that will drive media innovation from the invention of writing onward: to the extent that each new medium operates by exteriorizing some function of human cognition and memory, it involves both gain and loss. In this case, even if writing results in a waning of onboard memory skills, it furnishes an external supplement to internal memory that will become ever more necessary as information proliferates and life becomes more complex.

The fundamental duality that drives media innovation has often taken the form of myth. In the Protagoras, Plato himself deploys the Hesiodic myth of Prometheus and Epimetheus as a means of characterizing the singularity of the human, but also of grasping our fundamental dependence on technology. Let us recall the salient details of Plato's account: charged with the task of equipping mortal creatures with suitable powers, Epimetheus makes his distribution following the principle of compensation, giving to each creature those capacities that will insure their survival. Not being a particularly clever person, Epimetheus uses all of the available powers on the brute beasts, leaving the human race unprovided for and compelling the famous theft of fire by his perhaps justly more famous brother, Prometheus. Because of our Promethean legacy, so Plato's myth recounts, we humans have "had a share in the portion of the gods" and have distinguished ourselves from all other animals through our use of the arts of fire, which is to say, technologies. This use has resulted in the development of articulate speech and names, the invention of "houses and clothes and shoes and bedding," and the introduction of agriculture. To this Platonic list, we might add the continual development of new technologies and media, which has, in the view of many, led to our ever more powerful domination over nature and now, with the development of genetic engineering, over life itself.

That Prometheus suffered unending punishment for his theft from the Gods should not be forgotten. Indeed, it is this aspect of the myth that reappears throughout our history, at moments of large-scale technological change. To cite only one example, Mary Shelley's Frankenstein, or the Modern Prometheus casts Victor Frankenstein as a promethean scientist whose theft of the spark of life leads to disastrous consequences. Leaving aside questions concerning the allegorical scope of Shelley's tale (is it a criticism of industrialism? a plea for proper parenting?), what is important here is its invocation of Prometheus to mark the human engagement with, and ambivalence toward, new technology. Any Promethean step forward is, so it seems, necessarily accompanied by fears that we have overstepped, that we have introduced something detrimental to our "natural" life. One need only recall the anxieties that welled up around cinema at its origin, which stretched the gamut from the physiological (it would hurt our eyesight) to the moral (it would cater to our lowest impulses). Or consider the myriad anxieties that today surround genetic engineering and stem cell research. What the longevity of this mythic kernel would seem to point toward is the dialectic that surrounds adaptation to the new: to the extent that new media introduce modes of experience that challenge the familiar, they are bound to occasion anxiety, resistance, even hostility, as they make their way toward cultural acceptance or "naturalization." The Promethean dimension in this dialectic underscores the fact that such anxiety is not trivial or misguided, but is a constitutive dimension of the human experience of cultural change.

It is this dialectic of media innovation that informs the influential, though much misunderstood, work of pioneering Canadian media scholar Marshall McLuhan. In Understanding Media (1964) and various other texts, McLuhan inventoried a broad range of media, from cars to the computer, describing them as both extensions of and auto-amputations from the human body. For McLuhan, the development of media technology (up to then-contemporary electronic technologies) has operated as an externalization of the nervous system, which in today's terms we might describe as a technical distribution of human cognitive capacities into the environment. Perhaps the most crucial dimension of McLuhan's vision for the topic of new media is his concerted effort to couple media form and media use. Indeed, it is perhaps this dimension that best anticipates current developments in social networking technologies, developments dubbed Web 2.0, that have driven home the profound interdependence of content (use) and form (technology) in the wired age. Far from the technological determinist he has commonly and simplistically been held to be, McLuhan can now be seen as the keen social analyst he always was. In arguing that the "medium is the message," McLuhan certainly did not intend to advance a purely formalistic doctrine; rather, he sought to establish and to foreground the large-scale societal impact of particular media as a phenomenon distinct from their concrete deployments by individuals and groups. What McLuhan argues is that the widespread adoption of a particular medium impacts experience at a different (and higher) level of magnitude than its use to convey this or that content. Even though there is and can be no such thing as a medium without content, to reduce the social impact of a medium to the content it conveys is to overlook the profound changes that ensue from revolutions in cultural techniques of information processing and consumption.

In The Gutenberg Galaxy (1962), the study immediately preceding Understanding Media, McLuhan had focused on the transformational impact of the invention of movable type and the print revolution it catalyzed. While he did analyze the role of knowledge storage and dissemination afforded by the book (hence the title), his central focus was on the altered form of consciousness that emerged in the wake of print. According to McLuhan, the shift from manuscript culture to print culture entailed the dissolution of sensorily distributed and integrated experience in favor of the tyranny of the visual.

Other scholars have eschewed McLuhan's emphasis on the alienation of individual experience in order to concentrate on the profound material effects of the new medium of print. In her important study, The Printing Press as an Agent of Change (1979), Elizabeth Eisenstein analyzed the social and political impact of print; specifically, she studied the printing press as a form of standardization that afforded unprecedented capacities for storage and dissemination of information. Using empirical methods, Eisenstein convincingly demonstrated that the invention of movable type and the print revolution played an important role in the Protestant Reformation, the Renaissance, and the Scientific Revolution. In a sense, Eisenstein's stress on the standardization of linguistic marks that lies at the heart of the printing press anticipates the media revolution of the nineteenth century as analyzed by German media scientist Friedrich Kittler. For Kittler, the triad of gramophone, film, and typewriter differentiated the inscription, storage, and dissemination of the various sensory fluxes-aural, visual, and linguistic-in a way that expands the standardization of print to other experiential registers. Interestingly enough, for all of these scholars, otherwise so different in their methodologies and commitments, the advent of digital technology promises some form of experiential reunification, whether utopian (McLuhan) or dystopian (Kittler).

In Technics and Time: The Fault of Epimetheus, French philosopher Bernard Stiegler transforms McLuhan's vision of media into a full-fledged philosophy of technical evolution. At the heart of Stiegler's thought is the understanding that human beings, from the very origin of the species, have always been technically mediated. Stiegler's effort to overturn the repression of technics in Western philosophy follows in the wake of efforts by his teacher and mentor, Jacques Derrida, to deconstruct the metaphysics of presence by way of the essential technicity of writing and other technologies of differance. As Derrida has shown, in studies on topics ranging from the Platonic myth of writing to the logic of the supplement in Jean-Jacques Rousseau, the antecedence of writing (here understood as "arche-writing," the iterability of the mark or gramme) in relation to speech and concrete writing systems means that the origin of meaning is always given through differance, which is to say, as differing and deferred. In his own take on Derrida's crucial concept of arche-writing, Stiegler insists on the necessity of thinking a history of the supplement, such that the operation of differance is put into a functional relation with concrete technologies of storage and transmission (see Stiegler, "Derrida and Technology"). With this move, Stiegler relativizes what he calls the "quasi-transcendental" field of differance or arche-writing in relation to the material infrastructure of its appearance and efficacy in the world at any given moment in time. Thus the paradoxical anteriority or withdrawal of any moment of origin (presence) becomes tightly bound up with the technical conditions of its belated appearance.

As the subtitle of the first volume of his study Technics and Time indicates, Stiegler routes his own negotiation with the figure of paradoxical origin through the crucial but neglected figure of Epimetheus in the Hesiodic myth and its legacy. In a compelling argument, he insists that the figure of Prometheus, and the dialectic of technological change it expresses, would have no meaning without the "fault" of Epimetheus-the originary act of forgetting that left the "natural" human being naked and unprotected, in need of technical supplementation. In Stiegler's reading, what the myth expresses is the "originary technicity" of the human, the fact that human beings have always depended on and coevolved with technologies. Drawing on paleontological studies of early flint tools, Stiegler foregrounds the fundamental correlation of the organic cortex with the inorganic silex as the basic characteristic of the human: from the outset, human beings have evolved not simply genetically but culturally, which is to say, by exteriorizing their know-how and collective memory in the form of cultural artifacts and objective memory supports. This means, of course, that the evolution of the human can be characterized in terms of a long series of "new media" revolutions: what our material history teaches us is that human beings evolve in correlation with the evolution of technics; the long line of once-new new media would simply be the index of this coevolution. In light of the complex form of human evolution ensuing from our coupling to technics (a form Stiegler dubs "epiphylogenesis," meaning evolution by means other than life), it follows-and this is Stiegler's thesis-that human beings, in their developmental and genetic evolution, are "essentially" correlated with technical media. Understood broadly as the objective or exterior support for human life in its diverse sensory, perceptual, cognitive, and collective modalities, technical media on this picture are nothing less than the necessary correlates of human beings. Contemporary cognitive scientists speak of "cognitive distribution" to describe the significance of this exteriorization of know-how and memory into media, but what their claims really underscore, when viewed through the lens of media studies, is how mediation forms the very basis of human existence. Human beings literally exist in the medium of the world, which is to say, in a medium that has always been technical.

Lest this sound overly anthropocentric, as if media existed exclusively to support human evolutionary and developmental processes, it should be pointed out that media have increasingly converged with technical forms of inscription of experience and of time; as a result, they now participate in processes of technological evolution and development that, at least since the Industrial Revolution, can lay claim to some sort of qualified autonomy. More than any other critical corpus, the work of Friedrich Kittler has drawn attention to this sobering reality. In Discourse Networks 1800/1900 and in Gramophone, Film, Typewriter, Kittler has articulated a history of media that moves from the monopoly on storage long exercised by the alphabet to the media differentiation of the nineteenth century and finally to the contemporary convergence of media in the form of digital code and computer processing. At the core of his media history is an appreciation for technics as a material production (a production of the real) that is not preadapted to or constrained by the sensory and perceptual thresholds of human experience (see chapter 14, "Technology").

A glimpse of this qualified autonomy of technics can be found in techniques for sound analysis that developed out of the phonographic revolution, which is to say, in the wake of the new medium of the gramophone. While the dominant uses of the gramophone, from its invention until its recent obsolescence (and now, of course, in its afterlife), invest almost wholly in the synchrony of technical recording and human sense perception (meaning that they involve the recording and replaying of sound for human consumption), the capacity of technical sound recording to inscribe frequencies outside the range of human hearing allows for an inscription (or "symbolization") of the flux of the real that is not narrowly bound to human modes of symbolization. Sound inscription thus instances the break with natural language and alphabetic writing that characterizes technical recording as such; whereas the inscription of natural language operates on the discrete ordering of the alphabet, the inscription of sound operates on a far more fine-grained discretization of the sonic flux. One technique for such discretization, Fourier analysis, symbolizes the raw flux of sound by means of intervals that periodize nonperiodic, innumerable frequency series. According to Kittler, what is most important about these so-called Fourier intervals-and what makes them exemplary of digital signal processing per se-is the recourse to real number analysis (a mathematical technique encompassing the continua between whole numbers) they make necessary. Generalizing from the technicalities of Kittler's discussion, we can say that high-frequency analysis "symbolizes" the flux of the real on the matrix of real numbers (whereas the alphabet does it on the matrix of natural language). To say this is to suggest that the technical inscription of sound symbolizes the real for systems other than human sense perception, and indeed this is what, for Kittler, makes it exemplary of the operation of the computer as such. It is the reason why computers, as Kittler puts it, are "becoming ever more necessary" while people "are becoming ever more contingent."
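
As a concrete gloss on the passage above (my illustration, not part of Hansen's text), the following minimal Python sketch shows the kind of discretization being described: a sampled signal containing one audible and one ultrasonic component is decomposed by a discrete Fourier transform into frequency bins, which the machine registers indifferently to the thresholds of human hearing. The signal, frequencies, and sample rate are illustrative assumptions.

[code]
# Minimal sketch: Fourier analysis "symbolizes" a sampled sound flux as
# discrete frequency intervals, with no regard for human hearing limits.
import numpy as np

SAMPLE_RATE = 96_000            # Hz; high enough to capture ultrasonic content
DURATION = 1.0                  # seconds
t = np.arange(0, DURATION, 1 / SAMPLE_RATE)

# A composite flux: an audible 440 Hz tone plus a 30 kHz component that lies
# beyond the range of human hearing but not beyond the machine's.
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 30_000 * t)

# The discrete Fourier transform renders the flux as a finite set of bins.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE)

# The machine registers both components with equal indifference.
for f in (440, 30_000):
    idx = np.argmin(np.abs(freqs - f))
    print(f"{f} Hz bin magnitude: {np.abs(spectrum[idx]):.1f}")
[/code]

The point of the sketch is only that the inscription operates on a numerical matrix of frequency bins rather than on anything scaled to human perception.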

As the generalization of an operation (machinic symbolization) that could (and did) remain marginal until its widespread social proliferation, the computer marks a certain dissociation of media from technics. Arguably for the first time in history, the technical infrastructure of media is no longer homologous with its surface appearance. As distinct from phonography, where the grooves of a record graphically reproduce the frequency ranges of humanly perceivable sound, and from film, where the inscription of light on a sensitive surface reproduces what is visible to the human eye, properly computational media involve no direct correlation between technical storage and human sense perception. What we see on the computer screen (or other interface) and hear on the digital player is not related by visible or sonic analogy to the data that is processed in the computer or digital device. Indeed, as the work of some digital media artists has shown, the same digital data can be output in different registers, yielding very different media experiences. Pioneering new media theorist Lev Manovich has described this unique situation in terms of a divide between the media surface and the underlying code:

[New media in general can be thought of as consisting of two distinct layers - the "cultural layer" and the "computer layer." ... Because new media is created on computers, distributed via computers, and stored and archived on computers, the logic of a computer can be expected to significantly influence the traditional cultural logic of media; that is, ... the computer layer will affect the cultural layer. The ways in which the computer models the world, represents data, and allows us to operate on it; the key operations behind all computer programs (such as search, match, sort, and filter); the conventions of HCI [human-computer interface] - in short, what can be called the computer's ontology, epistemology, and pragmatics - influence the cultural layer of new media, its organization, its emerging genres, its contents. (2002, 46)]
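
To illustrate the layer divide Manovich describes, and the earlier point that the same digital data can be output in very different registers, here is a minimal Python sketch (mine, not from the quoted text): a single buffer of bytes is written out once as 8-bit audio and once as a grayscale image. The file names, buffer size, and sample rate are illustrative assumptions.

[code]
# Minimal sketch: one buffer of bytes, two unrelated media surfaces.
import wave
import numpy as np
from PIL import Image

data = np.random.default_rng(0).integers(0, 256, 64 * 64, dtype=np.uint8)

# Register 1: the buffer as roughly one second of 8-bit mono audio.
with wave.open("buffer_as_sound.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(1)          # 8-bit unsigned samples
    wav.setframerate(4096)       # 64*64 samples at 4096 Hz
    wav.writeframes(data.tobytes())

# Register 2: the very same buffer as a 64x64 grayscale image.
Image.frombytes("L", (64, 64), data.tobytes()).save("buffer_as_image.png")
[/code]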

Manovich situates the conjunction in computational media of surface and code as the legacy of two converging yet hitherto distinct cultural traditions: media and computation. In his telling (and it must be stressed that his Language of New Media first appeared in 1999), these two traditions are held together by the cultural dominance of the cinematic metaphor, which has largely dictated how digital data have been transposed into readily consumable media forms (one need only think of the role of cut scenes in video games or introductory pages on Web sites circa, say, 2004).

While this may be (or may have been) an appropriate analysis of the empirical deployment of computational media, it doesn't begin to tap the potential that the computer holds for fundamentally remapping our experience of space and time (see Hansen 2004). Taking stock of the expansive role played by computational processes in creating the infrastructure for experience today, it becomes difficult to ignore the reality that we depend on regimes of technical mediation, what geographer Nigel Thrift has called the "technological unconscious," that not only exceed our attention but remain fundamentally unfathomable by us. Put another way, the forms of media - visual, aural, tactile - through which we interface with our informational universe are no longer homologous with the actual materialities, the temporal fluxes, that they mediate. While these media forms may still adequately capture the flux of our experience (although recent studies in the fine-scale temporal processes of cognition suggest that they may not), they, like the experiences they inscribe, are only indirectly coupled to the underlying computational processes supporting them. In light of the disjunction of technics and media, we must differentiate and hold separate two distinct functions of media: on one hand, to exteriorize human experience in durable, repeatable, and hence transmissible form; on the other, to mediate for human experience the non- (or proto-) phenomenological, fine-scale temporal computational processes that increasingly make up the infrastructure conditioning all experience in our world today. What is mediated in both cases is, to be sure, human experience, but according to two distinct programs: for whereas media in the first, traditional sense mediates human experience itself (its content is that experience), media in the second sense mediates the technical conditions that make possible such experience - the "transcendental technicity" underlying real experience in our world today.

What has been termed Web 2.0-a blanket term encompassing the host of social networking sites and collectively produced archives (wikis) that developed in the wake of the dot-com crash of 2001-perfectly illustrates this bifurcation in the function of media, and in the process demonstrates the thoroughly social and collective dimension of "transcendental technicity." By taking full advantage of the many-to-many connectivity facilitated by the Internet, the explosion of user-generated digital "content" (blogs, discussion forums, photo-sharing, video animation, and so on) has refocused the function of computational media from storage to production, from the archiving of individual experience to the generation of collective presence and of connectivity itself. By now (2009), this refocusing has itself been commodified by a myriad of companies created to host this content, including platforms like MySpace, Flickr, Facebook, and YouTube. As attested by the massive popularity of these and similar social networking sites, what is mediated by Web 2.0 is less the content that users upload than the sheer connectivity, the simple capacity to reach myriad like-minded users, that is afforded by that act of uploading content. What is mediated here, in other words, is the technical capacity to connect on a massive, many-to-many scale, which is to say (although this dimension need not, and perhaps rarely does, come to the fore), the entire computational system that facilitates this new scale of connectivity. This is a truly McLuhanesque moment in the precise sense that, over and above any content that happens to be transmitted, what is involved in Web 2.0 is a widespread mediatic regime change-nothing less, I would suggest, than a change in the vocation of media and mediation themselves.

If new media can be held to be "new" in a new way - if digital computational media are distinctive in a qualitatively different sense than were all previous forms of media at the moment of their introduction - it is precisely because of this new vocation assumed by media in the age of networked computation. In addition to storing experience, as it has always done, media today mediates the conditions of mediation, which is to say, it brokers the experiential impact of the new computational networks that comprise today's "technological unconscious." Beyond mediating individual users' stored experience, the transmission of media - of photos on Flickr or videos on YouTube - itself mediates the situation of the user in the regime of networked computation. It mediates, in short, the new capacities for making contact that individuals acquire simply by distributing (traces of) themselves on many-to-many computational networks. To the extent that commercialized Web 2.0 technologies channel this impact exclusively toward the ends of human social networking, the "total" significance of this bifurcation, of this new vocation of media, remains obscured. For the basic reality is that the "social" or networked transmission of media by Web 2.0 sites is built atop a technical infrastructure that is and must be structurally dissociated from the form of that media. The experiences afforded by social media sites are made possible on the basis of a technical logic that operates at a temporal scale far finer than that of human sense perception (and its mediation) and with a level of complexity that defies capture in the form of (traditional) media. That the fundamental disjunction between media output and technical basis can be (and has been) sutured bears witness, as Kittler has astutely noted, to the power of economics: Web 2.0 perfectly expresses the reality that money can be made by using computation to offer connectivity. With this in mind, perhaps we could say that the commercialized Web 2.0 technologies operate precisely by collapsing the dissociation between media and technics: they give us a new functionality - massive connectivity - by transmitting familiar media forms in ways that avoid drawing attention to the new "transcendental technicity" of computational networks.

That a new technicity is nonetheless at stake in social media appears obliquely here to the precise extent that connectivity emerges as an end in itself, distinct from the actual sharing of the (traditional) media content transmitted in these networks. This disjunction of connectivity from the sharing of content provides evidence of some minimal embrace of the technical logic underlying Web 2.0 as the ground for human experience. Indeed, to the extent that massive connectivity is a new capacity for human beings, its emergence here attests to our willingness to let our experience be organized in ways that cut against the grain of what we've known hitherto and, specifically, of the function of those media forms we have developed up to this point in our history. Given the tension that exists between such emergence and the commercial profitability of social media, it is fitting that one key source of insight into the revolutionary promise of Web 2.0's technical logic would come by way of aesthetic transformations of paradigmatic social media sites. An exemplary series of such transformations can be found in artist Mario Klingemann's Flickr.com-based works: Flickeur, Clockr, Tagnautica, Picturedisco, Islands of Consciousness, and The Stake. Despite their differences, all of these works involve an effort to exploit for aesthetic ends the technical processes of massive-scale data organization and retrieval that underlie social media sites. In so doing, these works raise the possibility that the new vocation of media - to mediate our indirect relation with computational networks - might in fact go hand in hand with a new aesthetics of experience.

The first of Klingemann's transformations, Flickeur, is, as its name hints, a voyeuristic appropriation of the photo-sharing site Flickr.com. When loaded in a browser, Flickeur grabs images randomly from Flickr and strings them together using randomly selected transitions from the grammar of cinema (fade-ins and -outs, pans, jump cuts, etc.) to the accompaniment of a looping soundtrack. What results, as Flickeur continues to grab new images, is an "infinite" film that, in addition to being open onto the indeterminate future, has no internal principle of composition. The principle governing the images' selection is not the aesthetics of the human audiovisual flux (as is the case with cinema and all previous audiovisual media) but rather the capacities embedded in the computational algorithms themselves. What the human viewer encounters is, accordingly, less an internally coherent sequence than a proliferating series of discrete audiovisual events, bounded by the temporal cycling of the computer networks carrying out Klingemann's algorithms for image selection and combination. The work does not yield any stable objects and, indeed, every instance is unique: if two computers download the site simultaneously, they will grab different images and assemble them in different ways. In the place of the linear, cinematic flux of images, Flickeur engages a virtual matrix of potential image combinations that will never come close to being completely actualized, no matter how many users download the site at any given time.
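
The compositional logic described above can be sketched in a few lines of Python. This is my illustration, not Klingemann's actual code (the real work pulls its images live from Flickr): images are drawn at random from a pool and joined by randomly chosen transitions in an unbounded stream, with no internal principle of composition. The image pool and transition names are placeholders.

[code]
# Minimal sketch of an "infinite film": random images, random transitions.
import random
import itertools

IMAGE_POOL = [f"image_{n:04d}.jpg" for n in range(10_000)]   # stand-in for Flickr
TRANSITIONS = ["fade-in", "fade-out", "pan-left", "pan-right", "jump-cut"]

def infinite_film(seed=None):
    """Yield an unbounded stream of (image, transition) events."""
    rng = random.Random(seed)
    while True:                      # open onto the indeterminate future
        yield rng.choice(IMAGE_POOL), rng.choice(TRANSITIONS)

# Two simultaneous "viewers" receive different sequences drawn from the same
# virtual matrix of possible combinations.
for viewer, stream in enumerate([infinite_film(), infinite_film()]):
    print(f"viewer {viewer}:", list(itertools.islice(stream, 3)))
[/code]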

The work's aesthetic interest does not lie in its ordering of images; discrete juxtapositions may be startling or otherwise interesting, but the flux is by no means cinematic. What does become interesting, however, is the way in which the work mediates for the human perceiver the technical logic of computational networks: these discrete, randomly associated images (random, that is, from our perspective) furnish aesthetic analogs of discrete computational processes and thereby give some degree of experiential, aesthetic access to the technical logic of computation. In this way, Flickeur exposes what remains obscure in the predominant use of social networking sites for sharing media and connecting with other users: namely, the fact that the underlying organizational principles are anything but homologous with the associational networks of human cognition. But Flickeur - and this is what makes it exemplary - does more than simply expose a technicist logic; it takes seriously the idea that this logic could have affirmative aesthetic consequences. Thus at the same time that the aesthetic experience afforded by Flickeur mediates the technical processes that produce its random image transitions, the work also asks whether the technicist organization of information might not furnish new, specifically noncinematic principles for experiential synthesis. Taken seriously, Flickeur calls on us to ask what it would be like to live time, and the aesthetic content that necessarily fills time, from the standpoint of its discreteness.

In this way, Flickeur argues against any move to identify media (and mediation) narrowly with the technical. Far from being a direct consequence of a specific technology (networked computation), the medium that is Flickeur is an invention on the basis of a new technical "automatism" (Stanley Cavell). And while the specific materiality of the latter's technical logic is central here, what makes it a medium is the interface of this logic with human aesthetic experience. Nonetheless, there does seem to be something new involved in this media invention, which is precisely the opening of human experience to a form of storage and transmission that occurs in a form of embodiment (networked computation) and at a level of temporal processing radically discontinuous with human embodiment and the temporal range characteristic of media. So perhaps "new media," far from designating either one more new medium or some blanket postmedium condition, should be thought of as an expression for this newness, which is to say, an expression that indexes the changing vocation of media itself. Not simply the direct, technical consequence of digital computation, new media nonetheless concerns what is new about the widespread role of computation in our world: new technicist logics of informational organization that might also prove fruitful for our self-understanding and our understanding of (the role of) media. From this standpoint, the sheer breadth of what falls under the term "new media" might begin to make sense. And so too might the retreat from an effort to find a technical core for each new medium. For if "new media" today names a range of contemporary technical, aesthetic, and social developments, what holds them all together is not a common technical basis so much as an effort to interface the technicist logic of computation with human experience. Isn't this, ultimately, why new media can encompass new inflections of mass media (the Net and the blogosphere, transformations in the form and transmission of the news), new gadgets (iPods, digital television, Web-accessible cellular phones, GPS instruments), and new experiments with the effects of these inflections and gadgets on the senses, emotions, and perceptual, social, and imaginary experience (artworks, new forms of community, transitory and highly responsive political affiliations, cultural metaphors)?

What is new here is new in a different sense from the newness that accompanied prior media upon their introduction. To see this, we need only invoke German media theorist Wolfgang Ernst's discussion of computer emulation of other media, and specifically the example of Erich von Hornbostel's Berliner Phonogramm-Archiv, whose collection of wax-cylinder recordings of peoples threatened by extinction can be experienced by us today only because of the computer, or more specifically, because of endoscopic recording devices that can "read" the wax sound traces graphically, "re-translating them into audible sound by algorithmically transforming visual data into sound." A lost "old new medium" is thus revivified precisely because of the computer's difference from older media, which is to say, its indifference to the aesthetic and medial differences between audio and visual data; it is, specifically, this indifference to medial difference that allows the digital computer to emulate one interface through another. Again, we see clearly how media and mediation have changed vocation; no longer directed primarily toward or operating primarily at the level of human sense experience, the computer's emulation of Hornbostel's wax-cylinder recordings quite literally mediates an old new medium. That it emulates it for our sense perception, however, announces the noncontingent role of the human. Indeed, the interface to human experience is precisely what - notwithstanding its materialist indifference to medial differences - makes it media in the first place.
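
The algorithmic re-translation Ernst describes can be sketched in outline. This is a minimal Python illustration, not the archive's actual pipeline: an array of groove displacements, standing in for optically scanned trace data, is normalized, rescaled into PCM samples, and written out as sound. The trace values, sample rate, and file name are illustrative assumptions.

[code]
# Minimal sketch: turn a visually read groove trace into audible sound.
import wave
import numpy as np

def trace_to_sound(displacements, out_path="cylinder.wav", rate=16_000):
    """Convert optically read groove displacements into a 16-bit mono WAV."""
    trace = np.asarray(displacements, dtype=np.float64)
    trace -= trace.mean()                          # remove DC offset
    peak = np.abs(trace).max()
    if peak > 0:
        trace /= peak                              # normalize to [-1, 1]
    samples = (trace * 32767).astype(np.int16)     # 16-bit PCM
    with wave.open(out_path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(rate)
        wav.writeframes(samples.tobytes())

# Example: a synthetic decaying tone standing in for the scanned wax groove.
t = np.linspace(0, 1, 16_000)
trace_to_sound(np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t))
[/code]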


References and Suggested Readings

Derrida, Jacques. 1998. Of Grammatology, trans. G. Spivak. Baltimore: Johns Hopkins University Press.

Eisenstein, Elizabeth. 1979. The Printing Press as an Agent of Change. Cambridge: Cambridge University Press.

Ernst, Wolfgang. 2006. "Dis/continuities: Does the Archive Become Metaphorical in Multi-Media Space?" In New Media/Old Media: A History and Theory Reader, ed. W. Chun and T. Keenan, 105-24. New York: Routledge.

Hansen, Mark. 2004. New Philosophy for New Media. Cambridge, MA: MIT Press.

Kittler, Friedrich. 1999. Gramophone, Film, Typewriter, trans. G. Winthrop-Young and M. Wutz. Stanford, CA: Stanford University Press.

Klingemann, Mario. "Quasimondo." http://www.quasimondo.com.

Manovich, Lev. 2002. The Language of New Media. Cambridge, MA: MIT Press.

McLuhan, Marshall. 1994. Understanding Media: The Extensions of Man, ed. L. H. Lapham. Cambridge, MA: MIT Press.

Plato. 2005. Phaedrus, trans. C. Rowe. New York: Penguin.

Stiegler, Bernard. 1998. Technics and Time. Vol. 1, The Fault of Epimetheus, trans. R. Beardsworth and G. Collins. Stanford: Stanford University Press.

----. 2002. "Derrida and Technology: Fidelity at the Limits of Deconstruction and the Prosthesis of Faith." In Derrida and the Humanities: A Critical Reader, 238-70. Cambridge: Cambridge University Press. }}


Bob Dobbs
 