Remediation
OK, I know I am supposed to post this on a blog for class, but I thought I'd post it here instead, since the topic of remediation, based on the introduction (so far) of a book I'm reading for class, "Remediation: Understanding New Media" by Jay David Bolter and Richard Grusin, is something I should be thinking about in light of my own work.
1. Is remediation concerned with the retooling of existing mediation for hypermediacy? If so, what is the media equivalent of the Planck limit: the threshold at which an individual can no longer bear the onslaught of data streaming through their visual and aural systems before it is processed by the brain?
2. Remediation, which the authors agree began before the digital age, seems to denote the reworking of existing media in order to manipulate perception (their arguments about the use of lines, perspective, and illumination in medieval manuscripts). In their words, on p. 11:
"A painting by the seventeenth-century artist Pieter Saenredam, a photograph by Edward Weston, and a computer system for virtual reality are different in many important ways, but they are all attempts to achieve immediacy by ignoring or denying the presence of the medium and the act of mediation. All of them seek to put the viewer in the same space as the objects viewed. The illusionistic painter employs linear perspective and 'realistic' lighting (fig. I.10), while the computer graphics specialist mathematizes linear perspective and creates 'models' of shading and illumination (fig. I.11; plate 1). Furthermore, the goal of the computer graphics specialists is to do as well as, and eventually better than, the painter or even the photographer."

I am uncertain what the difference is between that which is mathematized and that which is not. Sure, the mathematical coding is made more visible when you have to input a list of equations to output shapes, but how does one know that 'unconscious' mathematics (or endo-mathematics) isn't taking place when the artist seemingly 'intuitively' (and what intuition really is, is another topic for discussion) makes particular selections of lines? But anyway, this does not quite address the point of the book, which is creating hypermediacy via remediation, in other words, creating a synthetic environment that is as immersive as, or even more so than, what one can experience on a jaunt through the woods, for instance.
Remediation is seen as a constant process of adding layers to the mediation of effects and affects in keeping with technological evolution and revolution, perhaps to the point at which the observer/experiencer/audience/customer is unable to detect the mechanism and instrumentation that goes into producing that incubator of auratic phenomena, and where ruptures and breaks can be smoothed over. This is where digital technology and digital graphics are supposed to come in: digital technology allows for better calculus, matrices, and vectors than analog technologies ever could. From there, we begin looking at increasingly transparent interfaces that cannot be separated from what the end user sees, because of the supposedly seamless ceding between the two. So what is important here seems to be the ability to effect continuity between the 'picture' space and the 'viewer' space.
But then, remediation fails to address a few points:
1. Epistemological construction is at once cumulative, parallel, and organic. Our ontology is created as much by conscious experience that has been processed as by that which lies beyond the Planck limit of our consciousness. Hence, if the success of a particular form of mediation rests on individual perception, how would a person unexposed to close communion with digital media feel any different?
2. Our senses are imperfect, and however real an event is, however affective and auratic its onslaught on the senses (such as the hypothetical jaunt through the woods), our nerves can only process so much information (think about the mathematics involved in the process). How would the obsession with hypermediated sensations survive the saturation points of our individual senses? Do we then need another mediation to mediate the hypermediated effects that overwhelm us? Is remediation, then, also another form of adding layers to the mediation? If so, how can we achieve transparency?
3. I suppose I could think about this against the LHC project, which I can position as a project of remediation, though in a manner different from that projected by the authors of the abovementioned book. The remediation of abstract knowledge is still in its infancy, so much so that one has to trawl the net to fit together a picture from different aspects.
4. Since all our points of reference are always undergoing remediation, the realism we are striving towards seems itself to be a kind of 'false' realism. Perhaps what is missing from the discussion is how memory comes into play in all this. The authors also do not address the chemical mediation that is important in the creation of all media arts, digital or otherwise: chemicals ranging from the natural neurotransmitters in the human body to artificially introduced ones. William Gibson seems more farsighted in addressing that in his much older work, Neuromancer.
Also, how can remediation through digital technology help augment the visualization and understanding of abstract knowledge? Making interfaces more transparent is key to that, which is what I look forward to reading more about in the rest of this book.