Tuesday, October 29, 2013

Grappling with Some Theoretical Moments in Articulating the Points of the LHC

One of the main goals of my dissertation project is to understand the Large Hadron Collider (LHC) in terms of the epistemological world it exists in, as well as its contribution to a scientific knowledge formation that grew out of the relationship between the human and the machine. This includes understanding how data collection, generation and simulation take place, dealing with the different stages of validation and interpretation of data, and working out the intricate relationship between the data and the apparatuses used to collect and generate them. However, my eventual goal is to elucidate the ontological that is hidden within, and tightly bound to, the epistemological.

At the foundation of my analysis are theories of phenomenology as developed in the tradition of critical theory, which differ from the kind of phenomenology espoused by working particle physicists and from the ‘standard models’ of philosophy of science. This phenomenology is based on Husserl’s interest in expanding the experiential as an antidote to the limitations of noema and ideation. He sees phenomenology as the collective field of experiences engendered through one’s interaction with the ontic. I am interested in how phenomenology as promulgated through the Husserlian model (and also that of Merleau-Ponty) can be put into dialectical engagement with the phenomenological analyses and interpretations made by particle physicists. In order to construct new theories from an existing pool of available theories, I will be using science fiction to create a series of thought experiments. One of my preliminary forays into this can be found in a mini graphic novel I produced in conjunction with a colleague for a seminar. The graphic novel, written in the mode of science ‘fiction,’ was intended as a storyboard demonstrating the use of visualized narratives to present the multi-faceted epistemologies involved in the discourse of the LHC, with the purpose of foregrounding ontological gaps.

Much effort goes into producing and obtaining data, but not enough research is expended on the potentiality of phenomenology and its usefulness for understanding the ontology of data. The phenomenological world of data is underdetermined by empirical techniques, because through them one can only extricate the epistemological; one cannot lay out the ontological, since the latter is so tightly enfolded within the core of the former. In other words, we will know what the data is experiencing but not the how and why of that experience. It is impossible to trace the causality of the data even if we can know its effects. Data also exists outside of the ‘certified’ data packets selected for further analysis, as part of ‘junk’ data. ‘Junk’ data is data that has been discarded because of choices made during the calibration of instruments, a calibration whose objective is to ensure that detection takes place only under certain pre-determined conditions. We need to figure out how to work around the inadequacy of existing instruments in dealing with informational flows and sources that fall outside the purview of our selected observational range.
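The selection process that produces ‘junk’ data can be sketched in code. The following toy example is my own illustration, not actual LHC trigger software: the energy threshold and the event model are invented, and stand in for whatever pre-determined conditions a real calibration encodes. It shows how a fixed detection condition partitions raw events into ‘certified’ and discarded data before any analysis begins:

```python
# Toy sketch (not actual LHC software) of how a pre-set calibration
# threshold partitions raw events into 'certified' and 'junk' data.

import random

random.seed(0)

# Hypothetical calibration choice: only events depositing more than
# this much energy (arbitrary units) count as detections.
ENERGY_THRESHOLD = 50.0

def simulate_event():
    """Generate a fake event with a random energy deposit."""
    return {"energy": random.uniform(0.0, 100.0)}

events = [simulate_event() for _ in range(10_000)]

certified = [e for e in events if e["energy"] > ENERGY_THRESHOLD]
junk = [e for e in events if e["energy"] <= ENERGY_THRESHOLD]

# Everything below the threshold never reaches further analysis:
# the calibration has already decided what may count as data.
print(len(certified), len(junk))
```

The point of the sketch is that the discarded half of the events is not recoverable downstream: the observational range was fixed before the data existed as data.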

My research parallels that done by physicists because it, too, is about constructing newer and more refined theories, except that I am constructing mine from existing philosophical concepts of science already utilized in the study of experimental work in physics. Scholars such as Karin Knorr Cetina, Peter Galison and Andrew Pickering have worked on the history and sociology of the physics of the Large Hadron Collider. Studies of the LHC and its antecedents have concentrated on the relationship between the people involved in the work and the machine, the sociology of knowledge-production, comparisons between the knowledge models and research methods applied by physicists and other scientists to their chosen subfields, the political structures of scientific epistemology, and the historical developments (in relation to the politics) that drive the direction in which certain scientific pursuits take place. There is not much focus on the analytic of the Large Hadron Collider, or research on how phenomenology can contribute to the understanding of ontological qualities in scientific knowledge formation, which also shapes the epistemic culture of abstract knowledge. My main contribution will be in working out what ontology looks like and how it operates, since that is important for understanding the structure of epistemology.

In particle physics, phenomenology is a bridge between theoretical constructs and experimental data, allowing each side to reinforce the work done by the other. The utilization of phenomenology in particle physics enables theoreticians to readjust, discard, fine-tune and reinforce their hypotheses and the predictions made from physical laws, as well as the mathematics used for reconstructing the physical world through simulations. For the experimentalists, phenomenology allows them to categorize and organize the multiplicity of data obtained and analyzed, and to use the data to piece together the dispersed fragments that may unlock the mysteries of the universe. Hence, phenomenology allows the selection of the most practically useful data, which are further refined through simulation so that it becomes possible to visualize segments of the world that exist for us only in abstraction. This is because our relationship to the information that nature provides, such as the outcome of particle collisions, is always mediated through the machine. In other words, we can only ‘experience’ the material that the machine, which is the computing Grid in this case, is able to analyze and make sense of, even if we may ‘know’ what it looks like with the aid of mathematical formalism. But it is also in our interest to attempt to experience data in mathematical formalism. The question is: how do we do so? To understand further how phenomenology works in the development of scientific knowledge, especially in physics, it is important to understand how physicists use calibration and simulations to fine-tune the equations they will use in the final analysis of the data and in the interpretations made about the Standard Model, and to understand further the intellectual directions taken by the physicists.
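The fine-tuning loop between simulation and measurement described above can be sketched as a toy example. The assumptions here are entirely mine: the bin counts, the one-parameter ‘theory,’ and the chi-square scan are invented for illustration and do not represent an actual physics workflow. The sketch shows the bare logic of adjusting a theoretical parameter until simulated predictions best match observed data:

```python
# Toy illustration of fine-tuning a theoretical parameter against data:
# scan candidate values and keep the one whose simulated prediction
# best matches the 'observed' counts (a chi-square comparison).

# Hypothetical observed event counts in three detector bins.
observed = [120.0, 80.0, 40.0]

def predict(coupling):
    """Simulated expected counts as a simple function of one parameter."""
    return [coupling * w for w in (3.0, 2.0, 1.0)]

def chi_square(coupling):
    """Goodness-of-fit between observed counts and the simulation."""
    return sum((o - p) ** 2 / p for o, p in zip(observed, predict(coupling)))

# Scan a grid of candidate couplings and pick the best fit.
candidates = [c / 10 for c in range(300, 500)]
best = min(candidates, key=chi_square)
print(best)  # the coupling whose prediction matches the data best
```

In this caricature, the ‘experience’ of the data is wholly mediated: what the physicist encounters is not the collision but the residual disagreement between a simulated world and a measured one.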
