Volume 8 Issue 2 News & Resources | February 2015
We do estimation when we cannot directly measure the things we are interested in. The natural world offers an abundance of information and data with which man can enhance his current understanding of nature’s functions and cycles. Man’s obsession with understanding the natural world is, in fact, triggered by the desire to control and exploit it. This obsession with being in control has led human beings to estimate the behavior of things and to make assumptions, which have brought us a multitude of ill-constructed hypotheses. The scarce and limited data that have been accumulated were sometimes tailored to fit certain selected arguments. Data are inherently neutral; the interpretation of data can lead one to varying conclusions and subsequent follow-up actions. Hence, data acquisition and collection is only the initial part of the overall process of strengthening accepted constructs or beliefs. The data then need to be analyzed, i.e. categorized into distinct groups, and subjected to recognition and identification. The arrays of data then need to be arranged into a convincing narrative of events. These narratives, if duly accepted, become the things we call “FACTS”.
The natural world has always surprised man with new openings and revelations. The boundaries of knowledge and understanding have always been akin to shifting, blurry lines as we progress in our explorations and in our search for new meaning in life and living. The doubts are growing louder and louder, with ever greater magnitude. With exponentially expanding access to knowledge and know-how, more and more stakeholders are giving their input on any event or incident that happens. The once-unchallenged altars of experts are being shaken by the man on the street. No longer can one give opinions and expect blind acceptance from the public. Evaluators of opinions linger more and more often along the corridors of expert knowledge: experts are being challenged, and their views are doubted first before being accepted. The “doubt” culture is here to stay, as better infrastructure has permitted better independent access to banks of knowledge all year round. In general, this is a better and much-welcomed scenario. A knowledgeable society will be able to develop itself and steer itself away from ill-considered behaviors and actions that would ruin the progress accumulated thus far.
Why common-sense (CS)?
A particular danger with processed information and data is the tendency to accept the derived conclusions whole-heartedly, with no recourse to further queries. A common-sense approach is desperately needed in such scenarios. Even some of the most thoroughly analyzed sets of information can fail when checked against the rule of common sense, and the use of technical jargon will not help to alleviate the problem. A common-sense (CS) approach is a bird’s-eye-view, pragmatic and logical approach all in one. Normally, this CS technique is the default technique of those who lack real information and knowledge: they map the current problem onto other problems they already have some grasp of, and look for similarities to the one at hand. CS should also be adopted by experts, so that their detailed perspectives can enhance the overall understanding and improve the correctness of the final conclusions. The CS approach is also the basis of the fight-or-flight response embedded in all of us: we weigh logically the probability of success or survival, and make the CS decision. In the modern world, which is full of complexities, we certainly will not survive on CS alone; but without CS, we will certainly be doomed right from the beginning.
Why we model?
The world is very complex. To ease the difficulty of deciphering natural phenomena and processes, we make models of the world: not physical models, but behavioral and mathematical ones. These derived models are normally developed from a set of fixed assumptions, which act as starting points for the model’s derivation and development. After a simple model has been developed, we can then re-evaluate the assumptions used and refine the generated model. A stable model then needs to be verified through actual experimental data gathering and analysis. The validated model will be very useful for projecting future scenarios and will assist in the analysis of future incidents. The problem with models is that people tend to accept them and not dare to question them; later, we quarrel over the decision that was made via the model. No model is fool-proof. There is no magic number for the assumptions but, logically, the more complex a situation is, the more assumptions we need to use, and more assumptions mean more room for error. Things which can be measured can be managed, monitored and verified. The less data we have on something, the more assumptions we have to make. This statement, in essence, is also an assumption.
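The cycle described above — fix an assumption, build a simple model, then verify it against fresh experimental data — can be sketched in a few lines of code. The example below is purely illustrative and not from this article: it assumes the underlying process is linear, fits that model on part of some synthetic “experimental” data, and then checks the model against held-out observations it has never seen.

```python
import random

def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b.

    The linear form itself is the model's fixed assumption:
    if the real process is not linear, no amount of fitting fixes it.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

random.seed(0)
# Synthetic "experimental" data: a hidden process y = 2x + 1, observed with noise.
xs = [x / 10 for x in range(100)]
ys = [2 * x + 1 + random.gauss(0, 0.1) for x in xs]

# Develop the model on one portion of the data...
a, b = fit_line(xs[:70], ys[:70])

# ...and verify it against held-out data: the validation step in the text.
errors = [abs((a * x + b) - y) for x, y in zip(xs[70:], ys[70:])]
print("fitted a =", a, "b =", b, "worst held-out error =", max(errors))
```

The held-out check is what separates verification from self-congratulation: a model judged only on the data used to build it will always look better than it is.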
The process of thinking and making decisions must be balanced against the amount of data that we have in store and the meaning that we make out of it. If the data are rubbish, we cannot do much to make them meaningful. High-quality, reliable data are the main factor in making decisions, or even in offering analyses. The growth of real knowledge has come from the hard work of academicians who have filtered the incoming data and, in essence, differentiated truth from falsehood. Academicians have a heavy burden to carry and deliver. We need to ensure the sanctity of the body of knowledge and not allow it to be corrupted. The truth is one, while falsehoods and deviations are many. To my fellow academicians, let us begin this fight!
“Scientific views end in awe and mystery, lost at the edge in uncertainty, but they appear to be so deep and so impressive that the theory that it is all arranged as a stage for God to watch man's struggle for good and evil seems inadequate”
Richard P. Feynman