Hickey’s METAMODEL discovery system synthesizes theory revisions. It constructed the Keynesian macroeconomic theory from U.S. statistical data available prior to 1936, the publication year of Keynes’ revolutionary General Theory of Employment, Interest and Money. The applicability of the METAMODEL to this theory revision was known in retrospect from the fact that, as 1980 Nobel-laureate econometrician Lawrence Klein wrote in his Keynesian Revolution (1947), all the important parts of Keynes’ theory can be found in the works of one or another of his predecessors. Hickey’s METAMODEL discovery system, described in his Introduction to Metascience (1976), is a mechanized generative grammar with combinatorial transition rules producing econometric models. The grammar is a finite-state generative grammar, both to satisfy the collinearity constraint on the regression-estimated equations and to satisfy the formal requirements for executable multi-equation predictive models. The system tests the equations for collinearity, statistical significance, serial correlation and goodness of fit, and tests the resulting models for accurate out-of-sample retrodictions. Simon calls this combinatorial type of system a “generate-and-test” design.
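The “generate-and-test” design can be illustrated with a brief sketch. The Python code below is not Hickey’s METAMODEL itself but an illustrative reconstruction under simplifying assumptions: it handles single-equation specifications only, it uses coefficient significance and out-of-sample accuracy as the sole tests, and the function name, thresholds and inputs are hypothetical. It enumerates combinations of candidate regressors (the “generate” step), estimates each specification by ordinary least squares, and retains only specifications that pass the tests (the “test” step).

```python
# Illustrative generate-and-test sketch (not Hickey's METAMODEL itself).
from itertools import combinations

import numpy as np
import statsmodels.api as sm


def generate_and_test(y, candidates, max_regressors=3, p_max=0.05, holdout=8):
    """y: 1-D array of the dependent variable.
    candidates: dict mapping regressor names to 1-D arrays of equal length.
    Returns surviving specifications sorted by out-of-sample error."""
    names = list(candidates)
    y_fit, y_out = y[:-holdout], y[-holdout:]      # estimation vs. retrodiction split
    survivors = []
    for k in range(1, max_regressors + 1):
        for combo in combinations(names, k):       # combinatorial "generate" step
            X = sm.add_constant(np.column_stack([candidates[n] for n in combo]))
            model = sm.OLS(y_fit, X[:-holdout]).fit()
            if (model.pvalues > p_max).any():      # "test": coefficient significance
                continue
            pred = model.predict(X[-holdout:])     # "test": out-of-sample retrodiction
            rmse = float(np.sqrt(np.mean((y_out - pred) ** 2)))
            survivors.append((combo, rmse, model.rsquared))
    return sorted(survivors, key=lambda s: s[1])
```

As the text above notes, the actual system also screens the equations for collinearity and serial correlation and assembles the survivors into executable multi-equation predictive models.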
Hickey also used his METAMODEL system in 1976 to develop a post-classical macrosociometric functionalist model of the American national society using fifty years of historical time-series data. To the shock, chagrin and dismay of academic sociologists, it is not a social-psychological theory, and four sociological journals therefore rejected Hickey’s paper describing the model and its findings about the national society’s dynamics and stability characteristics. The paper is reprinted below as “Appendix I” to BOOK VIII.
The academic sociologists’ a priori ontological commitments to romanticism and social-psychological reductionism rendered the referees invincibly obdurate. Their criticisms also betrayed their Luddite mentality toward mechanized theory development. Later, in the mid-1980s, Hickey integrated his macrosociometric model into a Keynesian macroeconometric model to produce an institutionalist macroeconometric model for the Indiana Department of Commerce, Division of Economic Analysis.
4.13 Examples of Successful Discovery Systems
There are several examples of successful discovery systems in actual use. John Sonquist developed his AID system for his Ph.D. dissertation in sociology at the University of Chicago. His dissertation was written in 1961, when William F. Ogburn was department chairman, which was before the romantics took over the University of Chicago sociology department. He described the system in his Multivariate Model Building: Validation of a Search Strategy (1970). The system has long been used at the University of Michigan Survey Research Center. Now modified as the CHAID system, which uses the χ² statistic, Sonquist’s discovery system is available commercially in the SAS and SPSS statistical software packages. Its principal commercial applications are list processing for market analysis and risk analysis, and it is also used for academic investigations in social science. It is not only the oldest mechanized discovery system but also the most widely used in practical applications to date.
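As an illustration of the χ²-based splitting idea behind CHAID-style partitioning, the sketch below selects the categorical predictor whose cross-tabulation with the dependent variable yields the most significant χ² statistic and partitions the sample on it. It is a simplified reconstruction, not Sonquist’s AID system nor the SAS/SPSS implementations; the function names, the significance threshold and the data layout are assumptions for illustration.

```python
# Illustrative chi-squared splitting step in the spirit of CHAID
# (not Sonquist's AID nor the SAS/SPSS implementations).
import pandas as pd
from scipy.stats import chi2_contingency


def best_split(df, target, predictors, alpha=0.05):
    """Return (predictor, p_value) for the predictor whose cross-tabulation
    with `target` gives the most significant chi-squared statistic,
    or None if no predictor passes the `alpha` threshold."""
    best = None
    for col in predictors:
        table = pd.crosstab(df[col], df[target])
        if table.shape[0] < 2 or table.shape[1] < 2:
            continue                                  # no variation to split on
        _, p, _, _ = chi2_contingency(table)
        if p < alpha and (best is None or p < best[1]):
            best = (col, p)
    return best


def partition(df, target, predictors, alpha=0.05):
    """One level of recursive partitioning: split the sample on the best
    predictor, returning the subsamples keyed by that predictor's categories."""
    split = best_split(df, target, predictors, alpha)
    if split is None:
        return {None: df}                             # terminal node: no significant split
    col, _ = split
    return {value: group for value, group in df.groupby(col)}
```

A full CHAID run applies this step recursively within each subsample and also merges predictor categories that the χ² test cannot distinguish.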
Robert Litterman developed his BVAR system for his Ph.D. dissertation in economics at the University of Minnesota. He described the system in his Techniques for Forecasting Using Vector Autoregressions (1984). The economists at the Federal Reserve Bank of Minneapolis have long used his system for macroeconomic and regional economic analysis. It has also been used for regional economic analysis by both the State of Connecticut and the State of Indiana.
Having previously received an M.A. degree in economics, Hickey had intended to develop his METAMODEL computerized discovery system for a Ph.D. dissertation in philosophy of science while a graduate student in the philosophy department of the University of Notre Dame, South Bend, Indiana. But the Notre Dame philosophers under the chairmanship of the Reverend Ernan McMullin were obstructionist to Hickey’s views, and Hickey dropped out. He then developed his computerized discovery system as a nondegree student at San Jose City College in San Jose, California.
For thirty years afterwards Hickey used his discovery system occupationally, working as a research econometrician in both business and government. For six of those years he used his system for institutionalist macroeconometric modeling and regional econometric modeling for the State of Indiana Department of Commerce. He also used it for econometric market analysis and risk analysis for various business corporations including USX/United States Steel Corporation, BAT(UK)/Brown and Williamson Company, Pepsi/Quaker Oats Company, Altria/Kraft Foods Company, Allstate Insurance Company, and TransUnion LLC.
In 2004 TransUnion’s Analytical Services Group purchased a perpetual license to use his METAMODEL system for their consumer credit risk analyses using their proprietary TrenData aggregated quarterly time series extracted from their large national database of consumer credit files. Hickey used the models generated by the discovery system to forecast payment delinquency rates, bankruptcy filings, average balances and other consumer borrower characteristics that constitute risk exposure for lenders, especially during the contractionary phase of the business cycle. He also used the system at Quaker Oats and Kraft Foods to discover the sociological and demographic factors responsible for the secular long-term market dynamics of food products and other nondurable consumer goods.
More about discovery systems and the evolution of computational philosophy of science is in BOOK VIII below.
4.14 Scientific Criticism
Criticism pertains to the criteria for the acceptance or rejection of theories.
The only criterion for scientific criticism that is acknowledged by the contemporary pragmatist is the empirical criterion.
The philosophical literature on scientific criticism has little to say about the specifics of experimental design. Most often, philosophical discussion of criticism pertains to the criteria for acceptance or rejection of theories and, more recently, to the decidability of empirical testing.
In earlier times, when the natural sciences were called “natural philosophy” and the social sciences were called “moral philosophy”, nonempirical criteria operated as criteria for the criticism and acceptance of descriptive narratives. Even today some philosophers and scientists use their semantical and ontological preconceptions as criteria for the criticism of scientific theories, including preconceptions about causality or about specific causal factors. Such semantical and ontological preconceptions have misled them into rejecting new but empirically superior theories. In his Against Method Feyerabend noted that the ontological preconceptions used to criticize new theories have often been the semantical and ontological claims expressed by previously accepted theories.
But what historically has separated the empirical sciences from their origins in natural and moral philosophy is the empirical criterion, which is responsible for the advancement of science and for its practicality in application. Whenever in the history of science there has been a conflict between the empirical criterion and any nonempirical criteria for the evaluation of new theories, it is ultimately the empirical criterion that decides theory selection.
Contemporary pragmatists accept scientific realism, relativized semantics and thus ontological relativity, and they therefore reject all prior semantical or ontological criteria for scientific criticism including the romantics’ mentalistic ontology requiring social-psychological reductionism.
4.15 Logic of Empirical Testing
An empirical test is a decision procedure consisting of a modus tollens deduction from a set of one or several universally quantified theory statements, expressible in a nontruth-functional hypothetical-conditional schema and proposed for testing, together with an antecedent, particularly quantified description of the initial test conditions. These statements jointly conclude to a consequent, particularly quantified description of a produced (predicted) test-outcome event, which is compared with the observed test-outcome description.
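A minimal schematic rendering of this test logic may be helpful. The notation is illustrative rather than the author’s: “T” stands for the universally quantified theory statements proposed for testing, “A(a)” for the particularly quantified description of the realized initial test conditions, and “C(a)” for the predicted test-outcome description; the arrow merely marks the hypothetical-conditional form and is not the truth-functional conditional of symbolic logic.

```latex
\[
\begin{array}{ll}
\text{Theory proposed for testing:} & T:\ \forall x\,[A(x) \Rightarrow C(x)]\\[2pt]
\text{Realized initial test conditions:} & A(a)\\[2pt]
\text{Predicted test-outcome description:} & C(a)\\[2pt]
\text{Observed test-outcome description:} & \neg C(a)\\[2pt]
\text{Modus tollens conclusion:} & \neg\,\forall x\,[A(x) \Rightarrow C(x)]
\end{array}
\]
```

The falsifying conclusion holds on the assumption that the description A(a) of the realized initial conditions is retained as true, and the inference runs only in the falsifying direction: an observed outcome agreeing with C(a) does not deductively establish the theory.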
In order to express explicitly the dependency of the produced effect upon the realized initial conditions in an empirical test, the universally quantified theory statements can be schematized as a nontruth-functional