Thursday, August 19, 2010

An excerpt from a longer essay I wrote for class that is relevant to my own work

The confluence of science fiction, cybernetics, and scientific ontology constitutes an intersection of predictions, myth-making, hypotheses, phenomenal gestures, thought-experiments, and material realization. Science fiction is not a genre within which one can easily position textual productions and reproductions of scientific ideas, whether in sublimated or pedagogically direct forms. At the same time, the potentiality of science fiction has spurred scientists and philosophers of science to model postulations and theoretically produced predictions in the sciences on its ‘narrative,’ especially in areas that remain epistemically speculative. As Godfrey-Smith suggests, model-based science is motivated by the complex nature of the scientific target and the employment of exact methods (3). Even then, as the mathematicians will tell you, the final goal is to create a model general enough that it can serve as the public key by which one unlocks the encrypted data nature has bestowed upon us.

In the fields of particle and high-energy physics, scientists’ work is now heavily mediated by instruments that inform them, through series of bit-strings, whether they are inching nearer to their goal of confirming the symmetry breaking of the Standard Model. This reliance on instruments turns the science produced into highly logical abstractions when compared to its predecessor, the bubble chamber, as Peter Galison would tell you in his book Image and Logic: A Material Culture of Microphysics. Can we define as cyborgian the human relationship to instruments not directly attached to the flesh, even when those instruments function as sensorial augmentation units? Should the very embodiment of a cyborg be equivalent to a seamless and ‘natural’ integration of techne onto the human senses? Is the cyborgian body, whether fleshly or abstract, a body that has become heavily dependent on its technological extensions (think of the pacemaker or bodily-implanted implements), or a body that obtains agency through a symbiotic relationship? And if symbiosis, how does the body relate to and engage with technology?

I am interested in tackling these questions in relation to science fiction because I am of the opinion that science fiction provides the platform for interrogating different ideologies and events, including events that are seemingly unfeasible or difficult to replicate immediately. Science fiction allows us to test out (and stretch out) the propositions provided by social and intellectual ideologies through the construction of what-if scenarios. By ‘fictionalizing’ physical events, we can create event-worlds that are a conjugation of the ‘real’ and the ‘imaginary,’ twinning the physically feasible with the unknown, and observing the outcome (or speculating on possible outcomes). Indeed, Clark’s Natural-Born Cyborgs is replete with physical realizations of a number of these what-ifs. Radical as these realizations are, there are still no concrete suggestions as to how human agents can connect ‘intuitively’ with their knowledge-generating and knowledge-detecting machines.


Image saturation, consumption, and addiction are also integral to science studies. Images displayed on consoles and screens, constructed from data collected by detectors such as the Large Hadron Collider, a stroboscope, or an fMRI, form the augmented reality that enables a scientist to ‘read’ the goings-on in active invisible worlds and to provide phenomenological interpretations of existing theories and mathematical equations. Some of the attributes of science fiction, particularly of postmodern science fiction as discussed by Bukatman, as well as the analogy between scientific models and fiction ventured by Godfrey-Smith, provide an imaginary system by which theoretical issues can be introduced and investigated. This allows for a comparative exploration between the complex systems inhabiting the real and the theoretical models situated in the virtual. Fictive models enable the exploration of the ‘behavior’ of the imaginary system, and it is hoped that in the process one will arrive at a better understanding of the complex world systems.
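The kind of image-construction described above can be sketched in miniature: raw detector hits, arriving as decoded bit-strings might deliver them, are binned into an occupancy map that a physicist can ‘read’ at a glance. This is a toy illustration with an invented 8x8 geometry and invented names, not any experiment’s actual display code.

```python
# A minimal sketch (hypothetical geometry): binning a stream of raw
# detector hits into a 2-D occupancy map, the kind of screen image
# that mediates between bit-strings and the physicist's eye.
import random

random.seed(1)
NBINS = 8  # hypothetical 8x8 detector grid

# Simulated hit stream: (x, y) positions in [0, 1).
hits = [(random.random(), random.random()) for _ in range(500)]

# Fill the occupancy map: count hits per cell.
occupancy = [[0] * NBINS for _ in range(NBINS)]
for x, y in hits:
    occupancy[int(y * NBINS)][int(x * NBINS)] += 1

# Render a crude console "display": denser cells get darker glyphs.
glyphs = " .:*#"
for row in occupancy:
    print("".join(glyphs[min(c * len(glyphs) // 20, len(glyphs) - 1)] for c in row))
```

The point of the sketch is only that the ‘image’ a scientist reads is already an interpretive construction several steps removed from the raw signal.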

Stanislaw Lem states that it is untrue that one should presuppose that anything and everything can happen in science fiction, because even when one looks at the matter purely mathematically, infinities still exist under very different powers (and degrees). Lem also differentiates between science fiction and pseudo-science fiction, whereby the latter produces fantasy, with the focus on the content signaled and on its realistic communication, however fantastical the content may be. An example given is Kafka’s The Metamorphosis. Lem argues that science fiction that deals with unrealizable objects is merely playing an empty game, with no hidden meaning or chance of ever being realized. Hence, these games have no real relationship to the world and exist only as “logical puzzles, paradoxes, as intellectual acrobatics” (37). However, what if we were to take up Lem’s representation of fantastical science fiction as games in a different mode and state that the semantics in these games are made meaningful by the very attribution of value and meaning through the construction of imaginary and theoretical worlds that may not yet be realizable, or which may only be realizable in terms that require a paradigm shift in the ontology of facticity? Bukatman and Clark spend much of their chapters trying to explain how science fiction is already endemic to the cultural and technological landscapes they explore throughout their books, and to show that there is no stable ontology where science fiction, science models, and techno-human relationships are concerned. They are perhaps less willing than Lem (and other more conservative advocates of science fiction) to separate the virtual from the real, knowing perhaps in hindsight that technological innovations are always preceded by epistemic shifts and revolutions in epistemic cultures.
They share with Haraway the notion that the cyborg is an entanglement between lived experience and fiction, whereby the boundary between science fiction and social reality is only an “optical illusion.” After all, the cyborg is a condensation of imagination and material reality (Haraway 149-50).

Hence, I want to take up the notions of technological determinism and indeterminism that have been bouncing around in both Bukatman’s and Clark’s work to discuss briefly why I consider the study of the Large Hadron Collider and experimental particle physics as existing within the framework of a cyborgian physics. This is because of the tight-knit relationship between the organisms (humans) and the machines that are central to their work. If the 1960s and 1970s saw the construction of the theoretical framework for the Standard Model and of the Higgs boson, with the energy predicted for their manifestation deduced from existing mathematical frameworks, then the 1980s, 1990s, and the 21st century saw the building and elaboration of existing and new supercolliders that enable a phenomenological manifestation of the theories through data flows mediated by the sub-detectors within the detectors. With the exception of the theoretical physicists, who may need only their laptops, the experimental and accelerator physicists can perform their duties and create value for their work only through the work they perform in terminal space, which ranges from calibration of the instruments to validation and quality-checking of the data. Statistical interpretation and comparison with Monte Carlo results then ensue, done off-line but still within the network. Data validation is intensive and requires the collective labor of the international GRID; the work goes on day and night with the aid of the collective expertise of an international network of experimentalists. Moreover, whatever new results and insights are obtained are aggregated in no one central location but widely dispersed, to be easily called up when needed (though some purging of data still has to take place, since there is no such thing as infinite storage, and painstaking decisions have to be made to that end).
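The validation-and-comparison step described above can be caricatured in a few lines: binned event counts from a run are checked against a Monte Carlo expectation with a chi-square statistic, and the run is flagged if the agreement is poor. The bin counts and the threshold here are invented for illustration; real data-quality pipelines are vastly more elaborate and collective, as the essay goes on to argue.

```python
# A minimal sketch (invented numbers): data-quality validation as a
# chi-square comparison between observed counts and Monte Carlo expectation.
import random
import math

random.seed(0)

def chi_square(observed, expected):
    """Pearson chi-square statistic between binned data and MC expectation."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected) if e > 0)

# Hypothetical Monte Carlo expectation: event counts per detector bin.
mc_expected = [100, 250, 400, 250, 100]

# Simulated "data": Gaussian fluctuations around the expectation.
data_observed = [round(random.gauss(mu, math.sqrt(mu))) for mu in mc_expected]

chi2 = chi_square(data_observed, mc_expected)
ndf = len(mc_expected)
print(f"chi2/ndf = {chi2:.2f}/{ndf}")

# Crude validation rule (threshold invented): flag the run for
# re-validation if chi2 per degree of freedom is far from 1.
flagged = chi2 / ndf > 3.0
print("run flagged for re-validation" if flagged else "run passes validation")
```

What the sketch cannot show, and what matters for the argument, is that each such check is embedded in a round-the-clock network of human sign-offs.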
All this is done through the machine, and the machine is now the extension of both the cumulative long-term memory and the instantaneous knowledge of the physicists. In fact, so much importance is placed on building a good relationship with the machine that the user has to undergo certification in order to prove his or her affinity with it. There are experts for every section of the sub-detectors: the hardware and software physicists (who also work with other computer scientists, the ‘non-specialists’). They are expected to be on call at any time of day and to be accessible to the machine. While much work is still done manually, there are plans to increase automation (the agency as well as the transparency of the machines) so that many of the visible processes can be pushed into the background, thus freeing the physicists from having to attend to every little nerve-ending of the machine.

As PhDs are awarded for work done by graduate students to improve and increase the efficacy of the machine, much of the new knowledge is generated through an intimate and unceasing relationship with it. However, in order for real agency to be achieved in a field where data flow in by the zillions, it is no longer just about feeling for the machine but also about knowing, as Clark would say, how one can maneuver and organize the system from afar without too much ado. The system of instrumentation has to become more transparent; its opacity must be broken down.

However, big science such as the LHC experiment is built on a larger machinery of governance, a body-politic that tries to govern and control the very ambivalence and indeterminacy involved in knowledge production of such a massive kind. This collective and hierarchical human machinery decides how and when particular analyses should be released or published, and how the experiments should be presented to the interested public. This has led to much division, competition, and secrecy within the organization, with as many negative as positive repercussions. In attending some of the meetings, I noticed a high level of attention to even the minutest detail in this behemoth of an organization, and it is possible for work of interpretation and analysis to begin afresh merely because of the objections of any legitimate member of the organization. As the tunnel of the LHC curves some 100 meters underground, the movement of the beam through the different sectors straddling the Franco-Swiss border is more straightforward than the navigation of the governing body-politic, where arbitrary vetoes and decisions by the spokespersons of each of the four experiments at CERN could put an end to any democratic vote. Perhaps such a ‘hierarchy of oppression’ runs counter to making high-energy and experimental particle physics a truly cyborgian physics, where flexibility of epistemic production is needed before anything revolutionary can happen.

Finally, science fiction, terminal space, terminal identity, and the cyborg being share a similarity of fictive construction within a virtual space situated on the real. This fictive construction allows a narration of potentialities that are not evident as long as we try to see them in disjointed and abstract terms. However, it is important to bear in mind that the narrative may be more elusive and ambivalent (and sometimes even seemingly anti-narrative) than we would like, if only because phenomena do not operate within our linear time, nor do they operate at the times most conducive to us. One example: much of the downtime at CERN took place in the afternoon, while the graveyard shift always seemed to see a hive of activity.
