Wednesday, December 1, 2010
How can we link 'dead' knowledge to the LHC?
This idea came to me while I was working on a project for a visual studies class that takes as its subject matter the history of Durham (see www.digitaldurham.duke.edu). I worked on the history of the Duke family, because their papers are most readily accessible from the University Archives at Duke, where I go to school. But the larger idea was the excavation of 'dead' epistemology. Working through the 'dead' histories of once 'living' objects that remain connected to living members of the community through genetic and social ties got me thinking about the sort of knowledge generated before the growth of experimental particle physics, about the kinds of knowledge that came and went before the LHC was even built, and about the knowledge produced WHEN the LHC was built (one need only go through the large archives of materials relating to the planning, organizing, and building of the LHC).

I've written extensively about the game project I did for the class, which I'd like to modify to model the "Epistemology of the LHC." I've only completed the storyboard, so it would be great to see what can be modified in the process. My more important task, however, is the back engine, which for me is the database: figuring out how one can and should hierarchize categories of information, particularly in relational databases and intelligent databases, drawing on the ontological developments in database science. How do we know which category should come first, and what are the criteria for that ordering? From there, the goal is to develop a database that can map the processes of epistemological excavation.

The late Bruce Trigger's work "Archaeology and Epistemology: Dialoguing Across the Darwinian Chasm" can probably help. If anyone reading this knows of more work out there and can provide suggestions, I look forward to hearing about it.
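To make the database question concrete, here is a minimal sketch, entirely my own illustration rather than anything from the class project, of one common way relational databases represent a hierarchy of categories: an adjacency list, where each category row points to its parent, and a recursive query walks the tree from the root down. The category names are hypothetical placeholders for an 'epistemology of the LHC' archive.

```python
# A minimal sketch (hypothetical schema and data) of hierarchized
# categories in a relational database, using SQLite's adjacency-list
# pattern plus a recursive common table expression.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE category (
        id        INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        parent_id INTEGER REFERENCES category(id)  -- NULL marks the root
    )
""")

# Hypothetical categories for illustration only.
rows = [
    (1, "LHC knowledge",       None),
    (2, "Pre-construction",    1),
    (3, "Construction-era",    1),
    (4, "Planning documents",  2),
    (5, "Engineering records", 3),
]
conn.executemany("INSERT INTO category VALUES (?, ?, ?)", rows)

# Recursive CTE: list every category with its depth in the hierarchy,
# making explicit which categories 'come first'.
query = """
    WITH RECURSIVE tree(id, name, depth) AS (
        SELECT id, name, 0 FROM category WHERE parent_id IS NULL
        UNION ALL
        SELECT c.id, c.name, t.depth + 1
        FROM category c JOIN tree t ON c.parent_id = t.id
    )
    SELECT name, depth FROM tree ORDER BY depth, id
"""
result = list(conn.execute(query))
for name, depth in result:
    print("  " * depth + name)
```

The point of the sketch is that the hierarchy is not baked into the storage order at all; it lives in the parent_id relation, so the "criteria for which category comes first" become an explicit query-time decision rather than a property of the tables themselves.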