♣ Concept of atomism and corpuscularity versus that of field. How does the duality between the two help us think about the category of media?
♣ Schelling’s concept of productivity and product, atomistic and dynamical, and the position of speculative physics in theory and the empirical.
♣ Speculative realism and post-Husserlian and post-Kantian phenomenology.
♣ Whitehead’s concept of speculative philosophy and his rethinking of the phenomena under consideration.
I am trying to identify what other key theoretical areas provide predictions or explications for possible or existing experiments, the connections between some of the interpretational aspects of quantum mechanics and field theory and the phenomenological problems in particle physics, and what drives the choice of certain theoretical applications in the experiments.
Research questions for this section:
• What are the epistemics that drive the articulation and development of theoretical models that are then used in experiments?
• How do the epistemics of theoretical formalism provide the tools for experimental analysis and in the determination of data cuts?
• What are the speculative areas of inquiry within theoretical particle physics that connect with speculative modes of experimental triggers?
• How is the formation of the theoretical discourse here useful for thinking about the formation of critical theoretical discourse cited in the philosophy section and elsewhere?
♣ Questions of the quantization process (first and second quantization of fields) in relation to the operating framework (relativistic versus non-relativistic). Quantization can occur within a classical or a quantum field model, and this influences the degrees of boundedness of the particles in the field. In the experimental sense, the decision of which to use depends on the energy level and ‘collision’ rates. Negative and positive fields stem from sign changes in the quantization process. Quantum states and Fourier modes for wave functions; the wave model has a more practicable interpretive usability in experimental analysis.
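The point above about Fourier modes and the practical usability of the wave picture can be sketched concretely. The following is my own toy illustration (not drawn from any source in these notes): a discretized Gaussian wave packet is decomposed into its Fourier (momentum-space) modes, showing that the two representations carry the same total probability and that the mode spectrum peaks at the packet’s central momentum.

```python
import numpy as np

# Toy sketch: decompose a discretized Gaussian wave packet into Fourier modes.
N = 256
x = np.linspace(-10, 10, N, endpoint=False)
dx = x[1] - x[0]

# Gaussian wave packet with central momentum k0
k0 = 2.0
psi = np.exp(-x**2 / 2) * np.exp(1j * k0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize in position space

# Momentum-space amplitudes: each Fourier mode labels one momentum state
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
k = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi
dk = k[1] - k[0]

# Parseval: total probability is preserved between the two representations
norm_x = np.sum(np.abs(psi)**2) * dx
norm_k = np.sum(np.abs(phi)**2) * dk

# The momentum distribution peaks at the grid point nearest k0
k_peak = k[np.argmax(np.abs(phi))]
```

The scaling `dx / sqrt(2*pi)` turns the discrete FFT into an approximation of the continuous Fourier transform, so the same state can be read off in either basis.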
♣ Functions – How the Feynman diagram retools and reutilizes the Lagrangian, Hamiltonian, Fourier transformations, and the Dirac and Lorentz equations. There are a number of other functions, but I am most interested in those involved in the actual quantization and transformative calculations of the particles involved, and in operators (ladder, creation and annihilation, commuting and anti-commuting, Hermitian and non-Hermitian). This section has a number of phenomenological implications in providing important frameworks for framing bounded calculations relating to the spatiality of the interacting particles in the fields (especially in the processes of ‘scattering’ and ‘collision’), and also in helping us think through some of the different characteristics connecting fermions and bosons. Since fermions have occupation numbers of either 0 or 1, stemming from the raising and lowering of their quantum numbers, there is less of an issue in dealing with normalization.
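The creation/annihilation point above, and why fermionic occupation is restricted to 0 or 1, can be made concrete with a minimal matrix sketch (my own illustration): a single fermionic mode lives in a two-dimensional space, the square of the creation operator vanishes, and the anticommutator of the ladder operators is the identity.

```python
import numpy as np

# Toy sketch: one fermionic mode spans |0> (empty) and |1> (occupied).
ket0 = np.array([1, 0])          # occupation number 0
ket1 = np.array([0, 1])          # occupation number 1

# Annihilation operator a lowers |1> -> |0>; creation a_dag raises |0> -> |1>.
a = np.array([[0, 1],
              [0, 0]])
a_dag = a.T

# Number operator: its eigenvalues are exactly 0 and 1 (Pauli exclusion).
n = a_dag @ a

# Trying to create a second quantum in the same mode annihilates the state:
double = a_dag @ a_dag @ ket0    # zero vector: occupation beyond 1 is impossible

# Anticommutator {a, a_dag} = a a_dag + a_dag a equals the identity
anticomm = a @ a_dag + a_dag @ a
```

Because `a_dag @ a_dag` is the zero matrix, no normalization factors beyond 1 ever arise, which is the “less of an issue with normalization” noted above.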
♣ Degrees of freedom available in both ordinary quantum mechanics and relativistic quantum field theory. This is possibly important when thinking about the boundedness (and the discrete settings) of the particles within classical and quantum fields (c-numbers and q-numbers). How does this connect to asymptotic freedom?
♣ Field-theoretic versus non field-theoretic, field theoretic versus particle-like. What are the preferred interpretive modes? Feynman diagrams for visualizing the space, time and positionality of a particle when virtual interactions occur.
♣ Hilbert versus Fock space (how is the Feynman diagram able to help move the idea beyond theoretical abstraction into the experimental?). Such abstract spaces are embedded in the operators that are then transformed into the vertices of the diagram. In the experimental sense, this is visualized through the fission, fusion, and decay of the different particles in the Standard Model, as well as in th
♣ Field versus quantal aspects of quantum mechanics.
♣ Positionality and momentum (this connects also to the experimental scattering process that looks at a cross-section of momentum and isospin of the interacting particles).
♣ Locality versus non-locality.
♣ Pure states versus mixed states (the theoretical breakdown of pure states provides the background for understanding mixed states, which come into play when dealing with experimental conditions).
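The pure/mixed distinction above has a standard quantitative marker, the purity Tr(ρ²) of the density matrix, which a small sketch (my own toy example) makes explicit: a pure state has purity 1, while a classical mixture falls below it.

```python
import numpy as np

# Toy sketch: purity Tr(rho^2) distinguishes pure from mixed states.
# A pure state |+> = (|0> + |1>)/sqrt(2) as a density matrix:
plus = np.array([1, 1]) / np.sqrt(2)
rho_pure = np.outer(plus, plus.conj())

# A maximally mixed state: an equal classical mixture of |0> and |1>
rho_mixed = 0.5 * np.eye(2)

purity_pure = np.trace(rho_pure @ rho_pure).real    # = 1.0
purity_mixed = np.trace(rho_mixed @ rho_mixed).real # = 0.5
```

Experimentally prepared ensembles are generically mixed, which is why the density-matrix formalism, rather than state vectors alone, carries over into the analysis of real data.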
♣ Unification, gauge theory, and renormalization, which connect to the idea of the structure of symmetry, charge-parity (CP) violations in quantum chromodynamics, and the discovery of heavy bosons that were initially theoretically predicted (J/ψ).
✓ What is the relationship between instrumental design, the interfacing between machines at the software and hardware levels (especially in terms of thinking about the GRID), and the larger concept of the Large Hadron Collider as the medium by which ‘natural’ events, in terms of particular interactions in the microstate, are amplified?
✓ How are experimental triggers decided upon and cuts made from the data generated? What defines the arguments and parameters of these cuts?
✓ What is the affective connection between the machine, data, and scientist-worker in the age of big-science groups and indirect access? (Galison has also written on the sociology of image-oriented versus object-oriented science and how that affects its relationship to the scientist, but I am more interested in understanding how much of the construction of experimental epistemics is influenced by the analytics of human-based interpretation and how much by machine-generated analytics. How do these epistemics connect with those of theory?)
♣ Calibration of instruments, luminosity of the injected bunches, and trigger decisions (the instrumental parameters of the experimental design).
♣ Measurement of the different properties of ‘particles’ produced in the process such as charge, spin, mass, total angular momentum.
♣ Interpretation of traces and tracks on the detectors by separating out background ‘noise’ from actual data collated from the center-of-mass collisions.
♣ There is no ‘free’ particle in the field, in that multiple fields are involved through the generation of a multiplicity of particles. However, it is only necessary to take a cross-section of these multiple particles and to look at the decay that matters for the ‘search’.
♣ Transitioning between cross-sections of collisions and the matrix notation of two-particle interactions in the case of inelastic collision and scattering, since inelastic scattering is important for obtaining the particles that come out of the initial interaction, especially vector-boson, gamma-gamma, leptonic, and other baryonic interactions.
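The way measured decay products are traced back to an initial interaction, as in the bullets above, runs through the invariant mass: the four-momenta of the outgoing particles are summed and the parent’s mass recovered. The sketch below is my own toy example with invented numbers (a dimuon-style, Z-like configuration), not data from any experiment.

```python
import math

# Toy sketch (made-up numbers, natural units, GeV): reconstruct the invariant
# mass of a parent particle from the four-momenta of its two decay leptons.
def invariant_mass(p1, p2):
    """p = (E, px, py, pz); m^2 = (E1+E2)^2 - |p1+p2|^2."""
    E = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Back-to-back (near-massless) leptons, 45.6 GeV each, in the parent's rest frame:
lep1 = (45.6, 0.0, 0.0, 45.6)
lep2 = (45.6, 0.0, 0.0, -45.6)
m = invariant_mass(lep1, lep2)   # -> 91.2, a Z-like resonance mass
```

A peak in the distribution of such reconstructed masses, standing above the combinatorial background, is what a ‘search’ in the sense above is looking for.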
• Gap between theory and experiment
¬ Different formalistic concentrations: experiments are interested in interpreting empirical results within existing models, or in simulating results using experimentally realizable models with hypothetical constraints (that may or may not be realizable with current technology), while theories are interested in working out various interpretive models concerning epistemic probabilities, the observability/unobservability of a phenomenon, and abstract representations of ontologies that can only be partially approximated through functions translated into experiments.
¬ Physical states: how are physical states and observability defined differently in experiments versus in theory? The question of statistical significance comes into play. Also, how do we contend with the examination of states in the vacuum or zero condition, and does this have experimental import?
¬ Spectrum of spatial probabilities that have to be rendered more numerically rigorous through a selection of Bayesian statistical methods, as systematic uncertainties have to be taken into account.
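A minimal version of the Bayesian treatment mentioned above can be sketched in a few lines. This is my own toy counting-experiment illustration, not any experiment’s actual statistical machinery: an observed event count is confronted with an uncertain background (the systematic, as a Gaussian nuisance parameter) and an unknown signal rate, and the posterior for the signal is obtained by marginalizing the background on a grid.

```python
import numpy as np
from math import lgamma

# Toy sketch: Poisson counting experiment with observed n events, uncertain
# background b (systematic), and unknown signal s. Posterior p(s | n) is
# obtained by marginalizing b on a grid (flat prior on s, Gaussian on b).
n_obs = 12
b_mean, b_sigma = 5.0, 1.0          # assumed background estimate and its error

s_grid = np.linspace(0, 20, 401)
b_grid = np.linspace(max(b_mean - 4 * b_sigma, 0.01), b_mean + 4 * b_sigma, 201)

S, B = np.meshgrid(s_grid, b_grid, indexing="ij")
mu = S + B

# log Poisson(n | mu) + log Gaussian(b | b_mean, b_sigma)
log_like = n_obs * np.log(mu) - mu - lgamma(n_obs + 1)
log_prior_b = -0.5 * ((B - b_mean) / b_sigma) ** 2

post = np.exp(log_like + log_prior_b).sum(axis=1)   # marginalize over b
post /= post.sum() * (s_grid[1] - s_grid[0])        # normalize the density

s_map = s_grid[np.argmax(post)]   # posterior mode, roughly n_obs - b_mean
```

Marginalizing the nuisance parameter is what folds the systematic uncertainty into the final statement about the signal, widening the posterior relative to a fixed-background analysis.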
¬ How is the concept of ‘new physics’ differently interpreted in theory and experiment, especially in terms of what must be done to arrive at a new framework? Since many of the experiments seem designed to confirm the Standard Model, what role can phenomenologists play in stretching the boundaries of the quantum imaginary? Should new physics be articulated within the framework of model building, or is it open to re-envisioning within a fiction-narrative framework working within the idea of the indeterminacy and incompleteness of knowledge? What other heterodox models are possible?
Speculative Ontology of Theory
• Aspects of theory-building that are speculative in nature and the points of intersection they share: the bootstrap approach, different epistemic approaches to quantum theory.
• How does the epistemic history of the theory help us construct the larger ontology that overlies it and the potentiality for going beyond it? An example case is that of the Standard Model (and the idea of Beyond the Standard Model) and gauge theory.
• What is the objective versus the subjective aspect of theory construction? Is the realism versus anti-realism standpoint important for addressing the problematics of theory construction, or for differentiating facticity from truth (within the arbitrary and relativist stance for defining ‘truth’)?
• Thinking about fiction and the modeling of theory through thought-experiments (Gedankenexperimente) and fiction-creation, such as by conceiving the application of the narrative of Schrödinger’s Cat for working through the question of uncertainty and indeterminacy when examining the story and question relating to the probability of discoveries in particle physics, by looking at the stories of the J/ψ and of the Higgs. Can a fictional-modeling standpoint be a useful tool for going beyond the realist–antirealist divide, for extrapolating the object-subject relationship within physics to other disciplines, such as for thinking about the subject-object relation of transmission and the ludic in media?
Ontology of Experimental Design
• Accelerators for creating an energetic environment for generating events arising from interactions. I will be examining here the blueprint of design for ATLAS and CMS trackers in particular, and the collaborative process involved in their design.
• Thinking about fields of interaction (in a multi-particle, multiplet environment, and the creation of ‘virtual’ particles as mediating points of that interaction) rather than quantum of interaction.
• Thinking of the connection between Monte Carlo simulations based on pre-determined parameters and constraints, and experimental triggers. How are the experiments and the instruments designed to enable those triggers to take place, particularly in relation to the Higgs, top-quark, and supersymmetric searches?
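The relation between pre-determined simulation parameters and trigger decisions noted above can be sketched with a deliberately simple toy Monte Carlo (all parameters are invented for illustration, not taken from any actual trigger menu): pseudo-events are generated from an assumed falling momentum spectrum, a threshold cut stands in for the trigger, and the surviving fraction is the trigger efficiency.

```python
import random

# Toy sketch (all parameters invented): generate pseudo-events with an
# exponentially falling transverse-momentum spectrum, apply a trigger
# threshold, and measure what fraction of events survive the cut.
random.seed(42)

N_EVENTS = 100_000
MEAN_PT = 20.0        # GeV, assumed spectrum scale
TRIGGER_PT = 30.0     # GeV, assumed trigger threshold

events = [random.expovariate(1.0 / MEAN_PT) for _ in range(N_EVENTS)]
passed = [pt for pt in events if pt > TRIGGER_PT]

efficiency = len(passed) / N_EVENTS
# For an exponential spectrum the expected efficiency is exp(-30/20), about 0.22
```

Even in this reduced form, the epistemic circularity is visible: the threshold is chosen in light of the assumed spectrum, and the spectrum is only ever observed through the threshold.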
• The sociology of the organization at CERN that drives and enables specific data and experimental choices, the publication of results, and the funding that drives the epistemic goals of ATLAS and CMS, with some comparisons to CDF and D-Zero (and possibly LHCb in terms of the quark searches).
• How does experimental design of mid-twentieth century nuclear physics influence the direction taken in the accelerators and supercolliders for particle physics?