The preceding chapter discussed the manner in which the modern scientific epistemology originating with Galileo reached a deep understanding in the first half of the Twentieth Century; however, the book on epistemology is far from closed. The epistemological challenges confronting the Twenty-first Century are the most severe since the dawn of the Seventeenth Century. They arise from a desire to model complex systems that exceed the human ability to conceptualize them. As a consequence, people resort to highly flexible mathematical structures with large numbers of parameters that can be adjusted to fit the data, the result often being models that fit the data well but lack structural representation of the phenomena and thus are not predictive outside the range of the data. The situation is exacerbated by uncertainty regarding model parameters on account of insufficient data relative to model complexity, which in effect means uncertainty regarding the models themselves. More importantly from the standpoint of epistemology, the amount of available data is often minuscule in comparison to the amount needed for validation. The desire for knowledge has far outstripped experimental and observational capability. We are starved for data.
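The failure mode described above, a flexible many-parameter model that fits the data yet predicts nothing beyond it, can be illustrated with a minimal sketch. Here a degree-9 polynomial (ten free parameters, a stand-in for any over-flexible structure) is fit to ten noisy samples of a simple underlying law, assumed for illustration to be y = sin(x); the choice of degree, sample count, and noise level are all hypothetical.

```python
import numpy as np

# Ten noisy observations of an assumed underlying law y = sin(x) on [0, 3].
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 3.0, 10)
y_train = np.sin(x_train) + rng.normal(0.0, 0.05, size=10)

# A degree-9 polynomial has as many parameters as there are data points,
# so it can be adjusted to pass (nearly) through every observation.
coeffs = np.polyfit(x_train, y_train, deg=9)

# Inside the data range the fit is excellent ...
in_sample_err = np.max(np.abs(np.polyval(coeffs, x_train) - y_train))

# ... but outside the data range, on [3, 6], the same model diverges
# wildly from the law it was supposed to capture.
x_test = np.linspace(3.0, 6.0, 10)
out_sample_err = np.max(np.abs(np.polyval(coeffs, x_test) - np.sin(x_test)))

print(in_sample_err)   # near zero
print(out_sample_err)  # enormous by comparison
```

The polynomial carries no structural representation of the phenomenon, so its in-sample success says nothing about its validity, which is precisely the epistemological trap at issue.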
With all the talk these days of “Big Data,” one must remember that bigness is relative to need. While the current amount of data may be big relative to small systems, it is paltry compared to the data required for large complex systems, especially if it is not collected with a sharp eye to the intended use, which often it is not. We need only recall the warnings of Bacon and Kant about groping in the dark. With complex systems, experimental design is even more imperative. Still, with or without experimental design, in many cases it is virtually impossible to obtain the data required for model validation.