Deductive-nomological model

Source: Wikipedia, the free encyclopedia.

The deductive-nomological model (DN model) poses scientific explanation as a
deductive structure, one where truth of its premises entails truth of its conclusion, hinged on accurate prediction or postdiction
of the phenomenon to be explained.

Because of problems concerning humans' ability to define, discover, and know it, causality was omitted in initial formulations of the DN model. Causality was thought to be incidentally approximated by realistic selection of premises that derive the phenomenon of interest from observed starting conditions plus general laws. Still, the DN model formally permitted causally irrelevant factors. Also, derivability from observations and laws sometimes yielded absurd answers.

When
logical empiricism fell out of favor in the 1960s, the DN model was widely seen as a flawed or greatly incomplete model of scientific explanation. Nonetheless, it remained an idealized version of scientific explanation, and one that was rather accurate when applied to modern physics. In the early 1980s, a revision to the DN model emphasized maximal specificity for relevance of the conditions and axioms stated. Together with Hempel's inductive-statistical model
, the DN model forms scientific explanation's covering law model, which is also termed, from a critical angle, subsumption theory.

Form

The term
nomological is derived from the Greek word νόμος or nomos, meaning "law".[1] The DN model holds to a view of scientific explanation whose conditions of adequacy (CA)—semiformal but stated classically—are derivability (CA1), lawlikeness (CA2), empirical content (CA3), and truth (CA4).[2]

In the DN model, a law is an unrestricted generalization by conditional proposition—If A, then B—and has testable empirical content.[3] A law is distinguished from a merely accidental regularity by supporting
counterfactual claims and thus suggesting what must be true,[4] while following from a scientific theory's axiomatic structure.[5]

The phenomenon to be explained is the explanandum—an event, law, or theory—whereas the premises to explain it are explanans, true or highly confirmed, containing at least one universal law, and entailing the explanandum.[6][7] Thus, given the explanans as initial, specific conditions C1, C2 . . . Cn plus general laws L1, L2 . . . Ln, the phenomenon E as explanandum is a deductive consequence, thereby scientifically explained.[6]
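The schema is naturally displayed as a deductive argument, with the explanans above the inference line and the explanandum below it (a standard rendering of the Hempel–Oppenheim layout):

```latex
\[
\begin{array}{ll}
C_1, C_2, \ldots, C_n & \text{(initial, specific conditions)} \\
L_1, L_2, \ldots, L_n & \text{(general laws)} \\
\hline
E & \text{(explanandum: the phenomenon to be explained)}
\end{array}
\]
```

The premises above the line jointly entail E; on the DN view, exhibiting such a derivation is what it is to explain E scientifically.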

Roots

Hume's empiricism, via Hume's fork and the problem of induction, had left causality unverifiable, and Hume also found the fact/value gap, as what is does not itself reveal what ought.[15]

In answer, Immanuel Kant posed the mind as innately structuring human experience of phenomena. Safeguarding metaphysics, too, Kant found the mind's constants holding also universal moral truths,[19] and launched German idealism.

Auguste Comte found the problem of induction rather irrelevant, since enumerative induction is grounded on the empiricism available, while science's point is not metaphysical truth. Comte found human knowledge had evolved from theological to metaphysical to scientific—the ultimate stage—rejecting both theology and metaphysics as asking questions unanswerable and posing answers unverifiable. Comte in the 1830s expounded positivism—the first modern philosophy of science and simultaneously a political philosophy[20]—rejecting conjectures about unobservables, thus rejecting search for causes.[21] Positivism predicts observations, confirms the predictions, and states a law, thereupon applied to benefit human society.[22] From the late 19th century into the early 20th century, the influence of positivism spanned the globe.[20] Meanwhile, evolutionary theory's natural selection brought the Copernican Revolution into biology and eventuated in the first conceptual alternative to vitalism and teleology.[8]

Growth

Whereas Comtean positivism posed science as description, logical positivism emerged in the late 1920s and posed science as explanation. After the defeat of
National Socialism with World War II's close in 1945, logical positivism shifted to a milder variant, logical empiricism.[24] All variants of the movement, which lasted until 1965, are neopositivism,[25] sharing the quest of verificationism.[26]

Neopositivists led emergence of the philosophy subdiscipline philosophy of science, researching questions and aspects of scientific theory and knowledge.[24] Scientific realism takes scientific theory's statements at face value, thus accorded either falsity or truth—probable or approximate or actual.[17] Neopositivists held scientific antirealism as instrumentalism, holding scientific theory as simply a device to predict observations and their course, while statements about nature's unobservable aspects are elliptical for, or metaphorical of, its observable aspects.[27]

DN model received its most detailed, influential statement by Carl G Hempel, first in his 1942 article "The function of general laws in history", and more explicitly with Paul Oppenheim in their 1948 article "Studies in the logic of explanation".[28][29] A leading logical empiricist, Hempel embraced the Humean empiricist view that humans observe sequences of sensory events, not cause and effect,[23] as causal relations and causal mechanisms are unobservables.[30] DN model bypasses causality beyond mere constant conjunction: first an event like A, then always an event like B.[23]

Hempel held natural laws, empirically confirmed regularities, as satisfactory, and if included realistically then approximating causality.[31] In later articles, Hempel defended DN model and proposed probabilistic explanation by inductive-statistical model (IS model).[31] DN model and IS model together form covering law model,[31] as named by a critic,
William Dray.[32] Derivation of statistical laws from other statistical laws goes to the deductive-statistical model (DS model).[31][33] Georg Henrik von Wright, another critic, named the totality subsumption theory.[34]

Decline

Amid failure of
neopositivism's fundamental tenets,[35] Hempel in 1965 abandoned verificationism, signaling neopositivism's demise.[36] From 1930 onward, Karl Popper attacked positivism, although, paradoxically, Popper was commonly mistaken for a positivist.[37][38] Even Popper's 1934 book[39] embraces DN model,[7][28] widely accepted as the model of scientific explanation for as long as physics remained the model of science examined by philosophers of science.[30][40]

In the 1940s, filling the vast observational gap between cytology[41] and biochemistry,[42] cell biology arose[43] and established existence of cell organelles besides the nucleus. Launched in the late 1930s, the molecular biology research program cracked the genetic code in the early 1960s and then converged with cell biology as cell and molecular biology, its breakthroughs and discoveries defying DN model by arriving in quest not of lawlike explanation but of causal mechanisms.[30] Biology became a new model of science, while special sciences were no longer thought defective by lacking universal laws, as borne by physics.[40]

In 1948, when explicating DN model and stating scientific explanation's semiformal conditions of adequacy, Hempel and Oppenheim acknowledged redundancy of the third, empirical content, implied by the other three—derivability, lawlikeness, and truth.[44] In the early 1980s, upon widespread view that causality ensures the explanans' relevance, Wesley Salmon called for returning cause to because, and along with
James Fetzer helped replace CA3 empirical content with CA3' strict maximal specificity.[45]

Salmon introduced causal mechanical explanation, never clarifying how it proceeds, yet reviving philosophers' interest in causal explanation.[30] Via shortcomings of Hempel's inductive-statistical model (IS model), Salmon introduced the statistical-relevance model (SR model).[7] Although DN model remained an idealized form of scientific explanation, especially in applied sciences,[7] most philosophers of science consider DN model flawed by excluding many types of explanations generally accepted as scientific.[33]

Strengths

As theory of knowledge, epistemology differs from ontology, which is a subbranch of metaphysics, theory of reality.[46] Ontology proposes categories of being—what sorts of things exist—and so, although a scientific theory's ontological commitment can be modified in light of experience, an ontological commitment inevitably precedes empirical inquiry.[46]

Whereas natural laws are epistemic, causal mechanisms are ontic. Blurring epistemic with ontic—as by incautiously presuming a natural law to refer to a causal mechanism, or to trace structures realistically during unobserved transitions, or to be true regularities always unvarying—tends to generate a category mistake.[47][48]

Discarding ontic commitments, including causality per se, DN model permits a theory's laws to be reduced to—that is, subsumed by—a more fundamental theory's laws. The higher theory's laws are explained in DN model by the lower theory's laws.

Kepler's laws of planetary motion, for instance, reduce to, and are explained by, Newton's theory of gravitation,[49] although Newton's theory invokes absolute space and absolute time.

Covering law model reflects neopositivism's vision of empirical science, a vision interpreting or presuming unity of science, whereby all empirical sciences are either fundamental science—that is,
fundamental physics—or are special sciences, whether astrophysics, chemistry, biology, geology, psychology, economics, and so on.[40][50][51] All special sciences would network via covering law model.[52] And by stating boundary conditions while supplying bridge laws, any special law would reduce to a lower special law, ultimately reducing—theoretically although generally not practically—to fundamental science.[53][54] (Boundary conditions are specified conditions whereby the phenomena of interest occur. Bridge laws translate terms in one science to terms in another science.)[53][54]
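A bridge law can be pictured as a translation function between two sciences' vocabularies. As a sketch (the example is a standard kinetic-theory identity, not drawn from this article's sources), the thermodynamic term temperature translates into the statistical-mechanical term mean molecular kinetic energy via T = (2/3)·Ē/k_B:

```python
K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by SI definition)

def temperature_from_mean_ke(mean_ke_joules: float) -> float:
    """Bridge law: translate 'mean molecular kinetic energy' (statistical
    mechanics vocabulary) into 'temperature' (thermodynamics vocabulary)."""
    return (2.0 / 3.0) * mean_ke_joules / K_B

# A mean kinetic energy of about 6.21e-21 J per molecule translates to ~300 K.
print(round(temperature_from_mean_ke(6.21e-21)))
```

Given such translations plus boundary conditions, a higher-level regularity can in principle be rederived within the lower-level theory, which is the reduction the covering law picture envisions.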

Weaknesses

By DN model, if one asks, "Why is that shadow 20 feet long?", another can answer, "Because that flagpole is 15 feet tall, the Sun is at x angle, and laws of electromagnetism".[6] Yet by problem of symmetry, if one instead asked, "Why is that flagpole 15 feet tall?", another could answer, "Because that shadow is 20 feet long, the Sun is at x angle, and laws of electromagnetism", likewise a deduction from observed conditions and scientific laws, but an answer clearly incorrect.[6] By the problem of irrelevance, if one asks, "Why did that man not get pregnant?", one could in part answer, among the explanans, "Because he took birth control pills"—if he factually took them, and the law of their preventing pregnancy—as covering law model poses no restriction to bar that observation from the explanans.
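The symmetry problem can be made concrete with elementary trigonometry standing in for the optics (a sketch; the Sun angle is simply chosen so that the article's 15-foot pole casts a 20-foot shadow): the same law derives either quantity from the other, so derivability alone cannot pick out the explanatory direction.

```python
import math

def shadow_from_height(height_ft: float, sun_elevation_deg: float) -> float:
    """Deduce shadow length from pole height plus the law of rectilinear
    light propagation (here, plain trigonometry)."""
    return height_ft / math.tan(math.radians(sun_elevation_deg))

def height_from_shadow(shadow_ft: float, sun_elevation_deg: float) -> float:
    """The same law run backward: formally just as valid a DN derivation,
    yet intuitively no explanation of the pole's height."""
    return shadow_ft * math.tan(math.radians(sun_elevation_deg))

# Elevation at which a 15 ft pole casts a 20 ft shadow (about 36.9 degrees).
angle = math.degrees(math.atan(15 / 20))

print(round(shadow_from_height(15, angle), 1))  # 20.0
print(round(height_from_shadow(20, angle), 1))  # 15.0
```

Both derivations satisfy the DN schema equally well; only the first strikes anyone as explanatory.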

Many philosophers have concluded that causality is integral to scientific explanation.[55] DN model offers a necessary condition of a causal explanation—successful prediction—but not sufficient conditions of causal explanation, as a universal regularity can include spurious relations or simple correlations, for instance Z always following Y, but not Z because of Y, instead Y and then Z as an effect of X.[55] By relating temperature, pressure, and volume of gas within a container, Boyle's law permits prediction of an unknown variable—volume, pressure, or temperature—but does not explain why to expect that unless one adds, perhaps, the kinetic theory of gases.[55][56]
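Boyle's law's predictive-but-unexplanatory character is equally concrete: at fixed temperature the product of pressure and volume is invariant, so any two measurements fix an unknown third, yet the calculation is silent on why gases behave so. A minimal sketch with illustrative numbers:

```python
def boyle_predict_pressure(p1: float, v1: float, v2: float) -> float:
    """Boyle's law, p1*v1 == p2*v2 at constant temperature: the regularity
    fixes the unknown pressure without saying why it should hold."""
    return p1 * v1 / v2

# Halving the volume of gas at 100 kPa doubles the pressure to 200 kPa --
# a successful prediction; explaining *why* requires adding, e.g., the
# kinetic theory of gases.
print(boyle_predict_pressure(100.0, 2.0, 1.0))
```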

Scientific explanations increasingly pose not determinism's universal laws, but probabilism's chance,[57] ceteris paribus laws.[58] Epidemiology, tracing the health effects of often weak associations,[59][60] employs not covering laws but
counterfactual causality.[61][62]

Covering action

Through lawlike explanation, physicists could refine regularities at instances of falsity, as when Boyle's law yielded to the ideal gas law,[65] while withholding commitment to unobservables; Ernst Mach as well as Wilhelm Ostwald thus resisted Ludwig Boltzmann's statistical mechanics, premised on the
atomic/molecular theory of matter.[66] Mach as well as Ostwald viewed matter as a variant of energy, and molecules as mathematical illusions,[66] as even Boltzmann thought possible.[67]

In 1905, via statistical mechanics, Albert Einstein predicted the phenomenon Brownian motion, and in the same year explained the photoelectric effect via light quanta.[68] Physics' fundamental theories were then two, the mechanical and the
electromagnetic,[69] whose two theories misaligned.[70] Yet belief in aether as the source of all physical phenomena was virtually unanimous.[71][72][73][74] Faced with experimental paradoxes,[75] physicists modified the aether's hypothetical properties.[76]

Finding the aether ever undetectable, physicists faced paradox until Einstein's 1905 special relativity discarded it, and Einstein's general relativity then superseded
Newton's theory, a revolution in science[83] resisted by many yet fulfilled around 1930.[84]

In 1925, Werner Heisenberg as well as Erwin Schrödinger formalized quantum mechanics (QM); in 1928, Paul Dirac's merger of QM with special relativity opened the way to quantum electrodynamics (QED), while, within the atomic nucleus, the strong nuclear force and
weak nuclear force were discovered.[91]

By 1941, quantum mechanics'
waveparticle duality had rendered atomism—indivisible particles in a void—untenable, and highlighted the very notion of discontinuous particles as self-contradictory.[94]

Meeting in 1947, physicists confronted the infinite terms that marred QED's predictions; Richard Feynman, Julian Schwinger, and
Sin-Itiro Tomonaga soon introduced renormalization, a procedure converting QED to physics' most predictively precise theory,[90][95] subsuming chemistry, optics, and statistical mechanics.[63][96] QED thus won physicists' general acceptance.[97] Paul Dirac criticized its need for renormalization as showing its unnaturalness,[97] and called for an aether.[98] In 1947, Willis Lamb had found unexpected motion of electron orbitals, shifted since the vacuum is not truly empty.[99] Yet emptiness was catchy, abolishing aether conceptually, and physics proceeded ostensibly without it,[92] even suppressing it.[98] Meanwhile, "sickened by untidy math, most philosophers of physics tend to neglect QED".[97]

Physicists have feared even mentioning aether, since any such talk within relativity theory[109] became associated with earlier theories of aether, whose word and concept became taboo.[110] Einstein explained special relativity's compatibility with an aether,[107] but the Einstein aether, too, was opposed.[100] Objects became conceived as pinned directly on space and time[111] by abstract geometric relations lacking any ghostly or fluid medium.[100][112]

By 1970, QED along with the quantum field theories of the weak and strong nuclear forces composed the Standard Model of particle physics, whose quantum vacuum, including the
Higgs field, corroborates aether,[100][115] although physics need not state or even include aether.[100] Organizing regularities of observations—as in the covering law model—physicists find superfluous the quest to discover aether.[64]

In 1905, from special relativity, Einstein had inferred the aether superfluous. And though positivism dismissed talk of unobservables as "
metaphysical research", QFTs pose particles not as existing individually, yet as excitation modes of fields,[114][121] the particles and their masses being states of aether,[92] apparently unifying all physical phenomena as the more fundamental causal reality,[101][115][116] as long ago foreseen.[73] Yet a quantum field is an intricate abstraction—a mathematical field—virtually inconceivable as a classical field's physical properties.[121] Nature's deeper aspects, still unknown, might elude any possible field theory.[114][121]

Though discovery of causality is popularly thought science's aim, search for it was shunned by the Newtonian research program and even more by Comtean positivism. It is now widely expected that nature's four
fundamental interactions would reduce to superstring theory, whereby atoms and molecules, after all, are energy vibrations holding mathematical, geometric forms.[63] Given uncertainties of scientific realism,[18] some conclude that the concept causality raises comprehensibility of scientific explanation and thus is key to folk science, but compromises precision of scientific explanation and is dropped as a science matures.[123] Even epidemiology is maturing to heed the severe difficulties with presumptions about causality.[14][57][59] Covering law model is among Carl G Hempel's admired contributions to philosophy of science.[124]


Notes

  1. ^ SEP, 2011.
  2. ^ a b James Fetzer, ch 3 "The paradoxes of Hempelian explanation", in Fetzer, ed, Science, Explanation, and Rationality (Oxford U P, 2000), p 113.
  3. ^ Montuschi, Objects in Social Science (Continuum, 2003), pp 61–62.
  4. ^ Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 2, subch "DN model of explanation and HD model of theory development", pp 25–26.
  5. ^ a b Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 2, subch "Axiomatic account of theories", pp 27–29.
  6. ^ a b c d e f g h Suppe, "Afterword—1977", "Introduction", §1 "Swan song for positivism", §1A "Explanation and intertheoretical reduction", pp 619–24, in Suppe, ed, Structure of Scientific Theories, 2nd edn (U Illinois P, 1977).
  7. ^ a b c d e Kenneth F Schaffner, "Explanation and causation in biomedical sciences", pp 79–125, in Laudan, ed, Mind and Medicine (U California P, 1983), p 81.
  8. ^ a b G Montalenti, ch 2 "From Aristotle to Democritus via Darwin", in Ayala & Dobzhansky, eds, Studies in the Philosophy of Biology (U California P, 1974).
  9. ^ Newtonian physics, reducing celestial science to terrestrial science, ejected from physics the vestige of Aristotelian metaphysics, thus disconnecting physics from alchemy/chemistry, which then followed its own course, yielding chemistry around 1800.
  10. ^ The terms Hume's fork and Hume's law were not created by Hume but by later philosophers labeling them for ease of reference.
  11. ^ By Hume's fork, truths by relations among ideas are known without experience, whereas truths by actualities' states are known only upon experience, and the two categories never cross. Any treatises containing neither can contain only "sophistry and illusion". (Flew, Dictionary, "Hume's fork", p 156.)
  12. ^ Humans routinely infer unobserved instances by enumerative induction, and justify it by presuming uniformity of nature. Humans thus attempt to justify a minor induction by adding a major induction, both logically invalid and unverified by experience—the problem of induction—how humans irrationally presume discovery of causality. (Chakraborti, Logic, p 381; Flew, Dictionary, "Hume", p 156.)
  13. ^ For causality's aspects of necessity, sufficiency, component causation, and counterfactual causation, see Rothman & Greenland, Parascandola & Weed, as well as Kundi. Following is more direct elucidation:

    A necessary cause is a causal condition required for an event to occur. A sufficient cause is a causal condition complete to produce an event. Necessary is not always sufficient, however, since other causal factors—that is, other component causes—might be required to produce the event. Conversely, a sufficient cause is not always a necessary cause, since differing sufficient causes might likewise produce the event. Strictly speaking, a sufficient cause cannot be a single factor, as any causal factor must act causally through many other factors. And although a necessary cause might exist, humans cannot verify one, since humans cannot check every possible state of affairs. (Language can state necessary causality as a synthetic statement, rather.)

    Sufficient causality is more actually sufficient component causality—a complete set of component causes interacting within a causal constellation—which, however, is beyond humans' capacity to fully discover. Yet humans tend intuitively to conceive of causality as necessary and sufficient—a single factor both required and complete—the one and only cause, the cause. One may so view flipping a light switch. The switch's flip was not sufficient cause, however, but contingent on countless factors—intact bulb, intact wiring, circuit box, bill payment, utility company, neighborhood infrastructure, engineering of technology by
    Thomas Edison and Nikola Tesla, explanation of electricity by James Clerk Maxwell, harnessing of electricity by Benjamin Franklin, metal refining, metal mining, and on and on—while, whatever the tally of events, nature's causal mechanical structure remains a mystery.

    From a Humean perspective, the light's putative inability to come on without the switch's flip is neither a logical necessity nor an empirical finding, since no experience ever reveals that the world either is or will remain universally uniform as to the aspects appearing to bind the switch's flip as the necessary event for the light's coming on. If the light comes on without switch flip, surprise will affect one's mind, but one's mind cannot know that the event violated nature. As just a mundane possibility, an activity within the wall could have connected the wires and completed the circuit without the switch's flip.

    Though apparently enjoying the scandals that trailed his own explanations, Hume was very practical, and his skepticism was quite uneven (Flew p 156). Although Hume rejected orthodox theism and sought to reject metaphysics, Hume's second definition of cause in the Enquiry is in effect counterfactual causality. Silent as to causal role—whether necessity, sufficiency, component strength, or mechanism—counterfactual causality is simply that alteration of a factor prevents or produces the event of interest.
  14. ^ PMID 16835045.
  15. ^ Moral and political philosophies often pose norms—supposedly what should be—with barely explanation. Yet such values, as in ethics or aesthetics or political philosophy, are not found true merely by stating facts: is does not itself reveal ought. Hume's law is the principle that the fact/value gap is unbridgeable—that no statements of facts can ever justify norms—although Hume himself did not state that. Rather, some later philosophers found Hume to merely stop short of stating it, but to have communicated it. Anyway, Hume found that humans acquired morality through experience by communal reinforcement. (Flew, Dictionary, "Hume's law", p 157 & "Naturalistic fallacy", pp 240–41; Wootton, Modern Political Thought, p 306.)
  16. ^ Kant inferred that the mind's constant structuring of experience upholds Newtonian physics as a priori truth.
  17. ^ a b Chakravartty, "Scientific realism", §1.2 "The three dimensions of realist commitment", in SEP, 2013: "Semantically, realism is committed to a literal interpretation of scientific claims about the world. In common parlance, realists take theoretical statements at 'face value'. According to realism, claims about scientific entities, processes, properties, and relations, whether they be observable or unobservable, should be construed literally as having truth values, whether true or false. This semantic commitment contrasts primarily with those of so-called instrumentalist epistemologies of science, which interpret descriptions of unobservables simply as instruments for the prediction of observable phenomena, or for systematizing observation reports. Traditionally, instrumentalism holds that claims about unobservable things have no literal meaning at all (though the term is often used more liberally in connection with some antirealist positions today). Some antirealists contend that claims involving unobservables should not be interpreted literally, but as elliptical for corresponding claims about observables".
  18. ^ "Newton's first law of motion (the law of inertia) requires us to imagine a body that is always at rest or else moving aimlessly in a straight line at a constant speed, even though we never see such a body, and even though according to his own theory of universal gravitation, it is impossible that there can be one. This fundamental law, then, which begins with a claim about what would happen in a situation that never exists, carries no conviction except insofar as it helps to predict observable events. Thus, despite the amazing success of Newton's laws in predicting the observed positions of the planets and other bodies, Einstein and Infeld are correct to say, in The Evolution of Physics
    , that 'we can well imagine another system, based on different assumptions, might work just as well'. Einstein and Infeld go on to assert that 'physical concepts are free creations of the human mind, and are not, however it may seem, uniquely determined by the external world'. To illustrate what they mean by this assertion, they compare the modern scientist to a man trying to understand the mechanism of a closed watch. If he is ingenious, they acknowledge, this man 'may form some picture of a mechanism which would be responsible for all the things he observes'. But they add that he 'may never quite be sure his picture is the only one which could explain his observations. He will never be able to compare his picture with the real mechanism and he cannot even imagine the possibility or the meaning of such a comparison'. In other words, modern science cannot claim, and it will never be able to claim, that it has the definite understanding of any natural phenomenon".
  19. ^ Whereas a hypothetical imperative is practical, simply what one ought to do if one seeks a particular outcome, the categorical imperative is morally universal, what everyone always ought to do.
  20. ^ a b Bourdeau, "Auguste Comte", §§ "Abstract" & "Introduction", in Zalta, ed, SEP, 2013.
  21. ^ Comte, A General View of Positivism (Trübner, 1865), pp 49–50, including the following passage: "As long as men persist in attempting to answer the insoluble questions which occupied the attention of the childhood of our race, by far the more rational plan is to do as was done then, that is, simply to give free play to the imagination. These spontaneous beliefs have gradually fallen into disuse, not because they have been disproved, but because humankind has become more enlightened as to its wants and the scope of its powers, and has gradually given an entirely new direction to its speculative efforts".
  22. ^ Flew, Dictionary (St Martin's, 1984), "Positivism", p 283.
  23. ^ SEP, 2011.
  24. ^ a b Friedman, Reconsidering Logical Positivism (Cambridge U P, 1999), p xii.
  25. ^ Neopositivism succeeded the inductivist trend from Bacon at 1620, the Newtonian research program at 1687, and Comtean positivism at 1830—a trend that continues in a vague but usually disavowed sense within popular culture and some sciences.
  26. ^ Neopositivists are sometimes called "verificationists".
  27. ^ "metaphysical dimension of realism (as in Carnap 1950)".
    • Okasha, Philosophy of Science (Oxford U P, 2002), p 62: "Strictly we should distinguish two sorts of anti-realism. According to the first sort, talk of unobservable entities is not to be understood literally at all. So when a scientist puts forward a theory about electrons, for example, we should not take him to be asserting the existence of entities called 'electrons'. Rather, his talk of electrons is metaphorical. This form of anti-realism was popular in the first half of the 20th century, but few people advocate it today. It was motivated largely by a doctrine in the philosophy of language, according to which it is not possible to make meaningful assertions about things that cannot in principle be observed, a doctrine that few contemporary philosophers accept. The second sort of anti-realism accepts that talk of unobservable entities should be taken at face value: if a theory says that electrons are negatively charged, it is true if electrons do exist and are negatively charged, but false otherwise. But we will never know which, says the anti-realist. So the correct attitude towards the claims that scientists make about unobservable reality is one of total agnosticism. They are either true or false, but we are incapable of finding out which. Most modern anti-realism is of this second sort".
  28. ^ a b Woodward, "Scientific explanation", in Zalta, ed, SEP, 2011, abstract.
  29. ^ S2CID 16924146.
  30. ^ a b c d Bechtel, Discovering Cell Mechanisms (Cambridge U P, 2006), esp pp 24–25.
  31. ^ a b Woodward, "Scientific explanation", §2 "The DN model", §2.3 "Inductive statistical explanation", in Zalta, ed, SEP, 2011.
  32. ^ von Wright, Explanation and Understanding (Cornell U P, 1971), p 11.
  33. ^ a b Stuart Glennan, "Explanation", § "Covering-law model of explanation", in Sarkar & Pfeifer, eds, Philosophy of Science (Routledge, 2006), p 276.
  34. ^ Manfred Riedel, "Causal and historical explanation", in Manninen & Tuomela, eds, Essays on Explanation and Understanding (D Reidel, 1976), pp 3–4.
  35. ^ Neopositivism's fundamental tenets were the verifiability criterion of cognitive meaningfulness, the analytic/synthetic division of statements, and the observation/theory division of terms. Meanwhile, Thomas Samuel Kuhn overthrew foundationalism, which was erroneously presumed to be a fundamental tenet of neopositivism.
  36. ^ Fetzer, "Carl Hempel", §3 "Scientific reasoning", in SEP
    , 2013: "The need to dismantle the verifiability criterion of meaningfulness together with the demise of the observational/theoretical distinction meant that logical positivism no longer represented a rationally defensible position. At least two of its defining tenets had been shown to be without merit. Since most philosophers believed that Quine had shown the analytic/synthetic distinction was also untenable, moreover, many concluded that the enterprise had been a total failure. Among the important benefits of Hempel's critique, however, was the production of more general and flexible criteria of cognitive significance in Hempel (1965b), included in a famous collection of his studies, Aspects of Scientific Explanation (1965d). There he proposed that cognitive significance could not be adequately captured by means of principles of verification or falsification, whose defects were parallel, but instead required a far more subtle and nuanced approach. Hempel suggested multiple criteria for assessing the cognitive significance of different theoretical systems, where significance is not categorical but rather a matter of degree: 'Significant systems range from those whose entire extralogical vocabulary consists of observation terms, through theories whose formulation relies heavily on theoretical constructs, on to systems with hardly any bearing on potential empirical findings' (Hempel 1965b: 117). 
The criteria Hempel offered for evaluating the 'degrees of significance' of theoretical systems (as conjunctions of hypotheses, definitions, and auxiliary claims) were (a) the clarity and precision with which they are formulated, including explicit connections to observational language; (b) the systematic—explanatory and predictive—power of such a system, in relation to observable phenomena; (c) the formal simplicity of the systems with which a certain degree of systematic power is attained; and (d) the extent to which those systems have been confirmed by experimental evidence (Hempel 1965b). The elegance of Hempel's study laid to rest any lingering aspirations for simple criteria of 'cognitive significance' and signaled the demise of logical positivism as a philosophical movement".
  37. ^ Popper, "Against big words", In Search of a Better World (Routledge, 1996), pp 89-90.
  38. ^ Hacohen, Karl Popper: The Formative Years (Cambridge U P, 2000), pp 212–13.
  39. ^ Popper's Logik der Forschung, published in Austria in 1934, was translated by Popper from German into English as The Logic of Scientific Discovery and arrived in the English-speaking world in 1959.
  40. ^ a b c d Reutlinger, Schurz & Hüttemann, "Ceteris paribus", § 1.1 "Systematic introduction", in Zalta, ed, SEP, 2011.
  41. ^ As scientific study of cells, cytology emerged in the 19th century, yet its technology and methods were insufficient to clearly visualize and establish existence of any cell organelles beyond the nucleus.
  42. ^ Biochemistry is often dated from the first identification of cell-free fermentation, Eduard Buchner's in 1897 (Morange, A History, p 11). The biochemistry discipline soon emerged, initially investigating colloids in biological systems, a "biocolloidology" (Morange p 12; Bechtel, Discovering, p 94). This yielded to macromolecular theory, the term macromolecule introduced by German chemist Hermann Staudinger in 1922 (Morange p 12).
  43. ^ Cell biology emerged principally at Rockefeller Institute through new technology (electron microscope and ultracentrifuge) and new techniques (cell fractionation and advancements in staining and fixation).
  44. ^ James Fetzer, ch 3 "The paradoxes of Hempelian explanation", in Fetzer J, ed, Science, Explanation, and Rationality (Oxford U P, 2000), pp 121–122.
  45. ^ Fetzer, ch 3 in Fetzer, ed, Science, Explanation, and Rationality (Oxford U P, 2000), p 129.
  46. ^ a b Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 1, subch "Areas of philosophy that bear on philosophy of science", § "Metaphysics", pp 8–9, § "Epistemology", p 11.
  47. ^ H Atmanspacher, R C Bishop & A Amann, "Extrinsic and intrinsic irreversibility in probabilistic dynamical laws", in Khrennikov, ed, Proceedings (World Scientific, 2001), pp 51–52.
  48. ^ On the epistemic versus the ontic: "The underlying conception is that of bringing order to our knowledge of the universe. Yet there are at least three reasons why even complete knowledge of every empirical regularity that obtains during the world's history might not afford an adequate inferential foundation for discovery of the world's laws. First, some laws might remain uninstantiated and therefore not be displayed by any regularity. Second, some regularities may be accidental and therefore not display any law of nature. And, third, in the case of probabilistic laws, some frequencies might deviate from their generating nomic probabilities 'by chance' and therefore display natural laws in ways that are unrepresentative or biased".
  49. ^ This theory reduction occurs if, and apparently only if, the Sun and one planet are modeled as a two-body system, excluding all other planets (Torretti, Philosophy of Physics, pp 60–62).
  50. ^ Spohn, Laws of Belief (Oxford U P, 2012), p 305.
  51. ^ The laws of chemistry were held exceptionless in their domains, yet were in principle reduced to fundamental physics (Feynman p 5; Schwarz, Fig 1), and so are special sciences.
  52. ^ Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 5, subch "Introduction: Relating disciplines by relating theories" pp 71–72.
  53. ^ a b Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 5, subch "Theory reduction model and the unity of science program" pp 72–76.
  54. ^ a b Bem & de Jong, Theoretical Issues (Sage, 2006), pp 45–47.
  55. ^ a b c O'Shaughnessy, Explaining Buyer Behavior (Oxford U P, 1992), pp 17–19.
  56. ^ a b Spohn, Laws of Belief (Oxford U P, 2012), p 306.
  57. ^ S2CID 24260908.
  58. ^ Bechtel, Philosophy of Science (Lawrence Erlbaum, 1988), ch 3, subch "Repudiation of DN model of explanation", pp 38–39.
  59. ^ PMID 16030331.
  60. ^ Boffetta, "Causation in the presence of weak associations", Crit Rev Food Sci Nutr, 2010; 50(S1):13–16.
  61. ^ Making no commitment as to the particular causal role—such as necessity, or sufficiency, or component strength, or mechanism—counterfactual causality is simply that alteration of a factor from its factual state prevents or produces by any which way the event of interest.
  62. ^ PMID 11707485.
  63. ^ a b c d Schwarz, "Recent developments in string theory", Proc Natl Acad Sci U S A, 1998; 95:2750–7, esp Fig 1.
  64. ^ a b Ben-Menahem, Conventionalism (Cambridge U P, 2006), p 71.
  65. ^ Instances of falsity limited Boyle's law to special cases, yielding the ideal gas law.
  66. ^ a b c d Newburgh et al, "Einstein, Perrin, and the reality of atoms" Archived 2017-08-03 at the Wayback Machine, Am J Phys, 2006, p 478.
  67. ^ J J Thomson and the distinction between electrons and nuclei. Fifth, he normally called physical atoms 'things of thought' and was very happy when Ostwald seemed to refute the reality of atoms in 1905. And sixth, after Ostwald returned to atomism in 1908, Mach continued to defend Ostwald's 'energeticist' alternative to atomism".
  68. ^ Physicists had explained the electromagnetic field's energy as mechanical energy, like an ocean wave's bodily impact, not water droplets individually showered (Grandy, Everyday Quantum Reality, pp 22–23). In the 1890s, the problem of blackbody radiation led Max Planck to posit that light's energy is emitted in discrete quanta, each sized by Planck's constant—a minimum unit of energy. The quanta were mysterious, viewed not as particles but simply as units of energy. Another paradox, however, was the photoelectric effect. As a shorter wavelength yields more waves per unit distance, shorter wavelength means higher wave frequency. Within the electromagnetic spectrum's visible portion, frequency sets the color. Light's intensity, however, is the wave's amplitude, the wave's height. In a strictly wave explanation, a greater intensity—higher wave amplitude—raises the mechanical energy delivered, namely the wave's impact, and thereby yields greater physical effect. And yet in the photoelectric effect, only a certain color and beyond—a certain frequency and higher—was found to knock electrons off a metal surface. Below that frequency or color, raising the light's intensity still knocked no electrons off. Einstein modeled Planck's quanta as particles, each with an individual energy of Planck's constant multiplied by the light wave's frequency: only at a certain frequency and beyond would each particle be energetic enough to eject an electron from its orbital. Although raising the light's intensity would deliver more energy—more total particles—each individual particle would still lack sufficient energy to dislodge an electron. Einstein's model, far more intricate, used messenger particles or force carriers, emitted and absorbed by electrons and by other particles undergoing transitions.
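The threshold argument in this note amounts to comparing a single photon's energy, Planck's constant times frequency, against the metal's work function. A minimal numerical sketch (the sodium work function used here is an assumed illustrative value, not from the source):

```python
# Photoelectric threshold: a photon frees an electron only if its
# individual energy h*f exceeds the metal's work function; raising
# intensity adds more photons but not more energy per photon.

H = 6.626e-34            # Planck's constant, J*s
EV = 1.602e-19           # joules per electronvolt
C = 2.998e8              # speed of light, m/s
WORK_FUNCTION_EV = 2.28  # assumed work function of sodium, eV

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of one photon of the given wavelength, in eV."""
    frequency = C / (wavelength_nm * 1e-9)  # shorter wavelength -> higher frequency
    return H * frequency / EV

def ejects_electron(wavelength_nm: float) -> bool:
    """True if a single photon of this wavelength can free an electron."""
    return photon_energy_ev(wavelength_nm) > WORK_FUNCTION_EV

print(ejects_electron(400))  # violet, ~3.1 eV per photon: True
print(ejects_electron(700))  # red, ~1.8 eV per photon: False at any intensity
```

Raising the intensity multiplies the number of photons, which never changes the result of `ejects_electron`: that is exactly the wave-versus-quanta distinction the note describes.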
  69. ^ Wolfson, Simply Einstein (W W Norton & Co, 2003), p 67.
  70. ^ The interferometer experiment of Michelson & Morley in 1887 revealed no apparent aether drift—light's speed apparently constant, an absolute. Thus both Newton's gravitational theory and Maxwell's electromagnetic theory each had its own relativity principle, yet the two were incompatible. For a brief summary, see Wilczek, Lightness of Being (Basic Books, 2008), pp 78–80.
  71. ^ Cordero, EPSA Philosophy of Science (Springer, 2012), pp 26–28.
  72. ^ Hooper, Aether and Gravitation (Chapman & Hall, 1903), pp 122–23.
  73. ^ .
  74. ^ mechanical philosophy's founding principle, No instant interaction at a distance (Einstein, "Ether", Sidelights (Methuen, 1922), pp 15–18).
  75. ^ Rowlands, Oliver Lodge (Liverpool U P, 1990), pp 159–60: "[The] Michelson–Morley experiment has come to be seen as the more significant of the two, and Lodge's experiment becomes something of a detail, a matter of eliminating the final, and less likely, possibility of a nonstationary, viscous, all-pervading medium. It could be argued that almost the exact opposite may have been the case. The Michelson–Morley experiment did not prove that there was no absolute motion, and it did not prove that there was no stationary ether. Its results—and the FitzGerald–Lorentz contraction—could have been predicted on Heaviside's, or even Maxwell's, theory, even if no experiment had ever taken place. The significance of the experiment, though considerable, is purely historical, and in no way factual. Lodge's experiment, on the other hand, showed that, if an ether existed, then its properties must be quite different from those imagined by mechanistic theorists. The ether which he always believed existed had to acquire entirely new properties as a result of this work".
  76. ^ electrodynamic theory and, more or less, developed the special theory of relativity before Einstein did (Ohanian, Einstein's Mistakes, pp 281–85). Yet Einstein, a freer thinker, took the next step and stated it, more elegantly, without aether (Torretti, Philosophy of Physics, p 180).
  77. ^ a b Tavel, Contemporary Physics (Rutgers U P, 2001), pp [1], 66.
  78. ^ Introduced soon after Einstein explained Brownian motion, special relativity holds only in cases of inertial motion, that is, unaccelerated motion. Inertia is the state of a body experiencing no acceleration, whether by change in speed—quickening or slowing—or by change in direction; such a body thus exhibits constant velocity, which is speed plus direction.
  79. ^ a b c Cordero, EPSA Philosophy of Science (Springer, 2012), pp 29–30.
  80. ^ dynamic states of the aether, whereas Einstein's special relativity was simply kinematic, that is, positing no causal mechanical explanation, simply describing positions, thus showing how to align measuring devices, namely, clocks and rods (Ohanian, Einstein's Mistakes, pp 281–85).
  81. ^ Ohanian, Einstein's Mistakes (W W Norton, 2008), pp 281–85.
  82. ^ Newton's theory required absolute space and time.
  83. ^ Buchen, "May 29, 1919", Wired, 2009.
    Moyer, "Revolution", in Studies in the Natural Sciences (Springer, 1979), p 55.
    Melia, Black Hole (Princeton U P, 2003), pp 83–87.
  84. ^ Crelinsten, Einstein's Jury (Princeton U P, 2006), p 28.
  85. ^ empirically, that is, when not used for interpretation, and taken as simply formalism (p xv).

    In 1941, at a party in a tavern in Princeton, Feynman learned of Dirac's use of the Lagrangian in quantum mechanics, which led him to the sum over paths or path integrals (p xv). Feynman would joke that this approach—which sums all possible paths that a particle could take, as though the particle actually takes them all, canceling one another out except for one pathway, the particle's most efficient—abolishes the uncertainty principle (p xvi). All empirically equivalent, Schrödinger's wave formalism, Heisenberg's matrix formalism, and Feynman's path integral formalism all incorporate the uncertainty principle (p xvi).

    There is no particular barrier to additional formalisms, which could be, but simply have not been, developed and widely disseminated (p xvii). In a particular physical discipline, however, and on a particular problem, one of the three formalisms might be easier than the others to operate (pp xvi–xvii). By the 1960s, path integral formalism had virtually vanished from use, while matrix formalism was "canonical" (p xvii). In the 1970s, path integral formalism made a "roaring comeback", became the predominant means of making predictions from QFT, and lent Feynman an aura of mystique (p xviii).
  86. ^ a b Cushing, Quantum Mechanics (U Chicago P, 1994), pp 113–18.
  87. ^ wave formalism.
  88. ^ Torretti, Philosophy of Physics (Cambridge U P, 1999), pp 393–95.
  89. ^ Torretti, Philosophy of Physics (Cambridge U P, 1999), p 394.
  90. ^ a b c Torretti, Philosophy of Physics (Cambridge U P, 1999), p 395.
  91. ^ radioactive fallout—of diverse health consequences.
  92. ^ a b c d e f Wilczek, "The persistence of ether", Phys Today, 1999; 52:11,13, p 13.
  93. ^ The four fundamental interactions are gravitational, electromagnetic, weak nuclear, and strong nuclear.
  94. ^ Grandy, Everyday Quantum Reality (Indiana U P, 2010), pp 24–25.
  95. ^ Schweber, QED and the Men who Made it (Princeton U P, 1994).
  96. ^ Feynman, QED (Princeton U P, 2006), p 5.
  97. ^ a b c Torretti, Philosophy of Physics, (Cambridge U P, 1999), pp 395–96.
  98. ^ a b c d Cushing, Quantum Mechanics (U Chicago P, 1994), pp 158–59.
  99. ^ Close, "Much ado about nothing", Nova, PBS/WGBH, 2012: "This new quantum mechanical view of nothing began to emerge in 1947, when Willis Lamb measured spectrum of hydrogen. The electron in a hydrogen atom cannot move wherever it pleases but instead is restricted to specific paths. This is analogous to climbing a ladder: You cannot end up at arbitrary heights above ground, only those where there are rungs to stand on. Quantum mechanics explains the spacing of the rungs on the atomic ladder and predicts the frequencies of radiation that are emitted or absorbed when an electron switches from one to another. According to the state of the art in 1947, which assumed the hydrogen atom to consist of just an electron, a proton, and an electric field, two of these rungs have identical energy. However, Lamb's measurements showed that these two rungs differ in energy by about one part in a million. What could be causing this tiny but significant difference? "When physicists drew up their simple picture of the atom, they had forgotten something: Nothing. Lamb had become the first person to observe experimentally that the vacuum is not empty, but is instead seething with ephemeral electrons and their anti-matter analogues, positrons. These electrons and positrons disappear almost instantaneously, but in their brief mayfly moment of existence they alter the shape of the atom's electromagnetic field slightly. This momentary interaction with the electron inside the hydrogen atom kicks one of the rungs of the ladder just a bit higher than it would be otherwise.
    "This is all possible because, in quantum mechanics, energy is not conserved on very short timescales, or for very short distances. Stranger still, the more precisely you attempt to look at something—or at nothing—the more dramatic these energy fluctuations become. Combine that with Einstein's E=mc2, which implies that energy can congeal in material form, and you have a recipe for particles that bubble in and out of existence even in the void. This effect allowed Lamb to literally measure something from nothing".
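The quotation's ladder analogy can be made concrete with the Bohr-model rung energies E_n = −13.6 eV / n², which also fix the frequency of light emitted when an electron drops between rungs. This is only a sketch of the "state of the art in 1947" picture; the Lamb shift itself is the far smaller QED correction the quotation describes, not captured by this model:

```python
# Bohr-model "rungs" of hydrogen: E_n = -13.6 eV / n^2. A drop from
# rung n2 to rung n1 emits a photon carrying the energy difference,
# with frequency given by E = h*f.

H_EV = 4.136e-15  # Planck's constant, eV*s

def level_ev(n: int) -> float:
    """Energy of the nth rung, in eV (negative: bound states)."""
    return -13.6 / n**2

def emitted_frequency_hz(n2: int, n1: int) -> float:
    """Frequency of the photon emitted in a drop from rung n2 to rung n1."""
    return (level_ev(n2) - level_ev(n1)) / H_EV

# The n=3 -> n=2 drop gives the red Balmer-alpha line, roughly 4.6e14 Hz.
print(emitted_frequency_hz(3, 2))
```

Lamb's 1947 measurement showed two rungs that this simple picture makes exactly degenerate in fact differ by about one part in a million, which is the vacuum effect the quoted passage explains.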
  100. ^ a b c d e
  101. ^ a b Riesselmann, "Concept of ether in explaining forces", Inquiring Minds, Fermilab, 2008.
  102. ^ Close, "Much ado about nothing", Nova, PBS/WGBH, 2012.
  103. ^ On "historical examples of empirically successful theories that later turn out to be false", Okasha, Philosophy of Science (Oxford U P, 2002), p 65, concludes, "One that remains is the wave theory of light, first put forward by Christiaan Huygens in 1690. According to this theory, light consists of wave-like vibrations in an invisible medium called the ether, which was supposed to permeate the whole universe. (The rival to the wave theory was the particle theory of light, favoured by Newton, which held that light consists of very small particles emitted by the light source.) The wave theory was not widely accepted until the French physicist Auguste Fresnel formulated a mathematical version of the theory in 1815, and used it to predict some surprising new optical phenomena. Optical experiments confirmed Fresnel's predictions, convincing many 19th-century scientists that the wave theory of light must be true. But modern physics tells us that the theory is not true: there is no such thing as the ether, so light doesn't consist of vibrations in it. Again, we have an example of a false but empirically successful theory".
  104. ^ Pigliucci, Answers for Aristotle (Basic Books, 2012), p 119: "But the antirealist will quickly point out that plenty of times in the past scientists have posited the existence of unobservables that were apparently necessary to explain a phenomenon, only to discover later on that such unobservables did not in fact exist. A classic case is the aether, a substance that was supposed by nineteenth-century physicists to permeate all space and make it possible for electromagnetic radiation (like light) to propagate. It was Einstein's special theory of relativity, proposed in 1905, that did away with the necessity of aether, and the concept has been relegated to the dustbin of scientific history ever since. The antirealists will relish pointing out that modern physics features a number of similarly unobservable entities, from quantum mechanical 'foam' to dark energy, and that the current crop of scientists seems just as confident about the latter two as their nineteenth-century counterparts were about aether".
  105. ^ Wilczek, Lightness of Being (Basic Books, 2008), pp 78–80.
  106. ^ Laughlin, A Different Universe (Basic Books, 2005), pp 120–21.
  107. ^ a b Einstein, "Ether", Sidelights (Methuen, 1922), pp 14–18.
  108. ^ waving. An unobservable, however, Einstein's aether is not a privileged reference frame—it is not to be assigned a state of absolute motion or absolute rest.
  109. ^ Relativity theory comprises both special relativity (SR) and general relativity (GR). Holding for inertial reference frames, SR is a limited case of GR, which holds for all reference frames, both inertial and accelerated. In GR, all motion—inertial, accelerated, or gravitational—is a consequence of the geometry of 3D space stretched onto the 1D axis of time. By GR, no force distinguishes acceleration from inertia. Inertial motion is a consequence simply of uniform geometry of spacetime, acceleration is a consequence simply of nonuniform geometry of spacetime, and gravitation is simply acceleration.
  110. ^ a b Laughlin, A Different Universe, (Basic Books, 2005), pp 120–21: "The word 'ether' has extremely negative connotations in theoretical physics because of its past association with opposition to relativity. This is unfortunate because, stripped of these connotations, it rather nicely captures the way most physicists actually think about the vacuum. ... Relativity actually says nothing about the existence or nonexistence of matter pervading the universe, only that any such matter must have relativistic symmetry. It turns out that such matter exists. About the time that relativity was becoming accepted, studies of radioactivity began showing that the empty vacuum of space had spectroscopic structure similar to that of ordinary quantum solids and fluids. Subsequent studies with large particle accelerators have now led us to understand that space is more like a piece of window glass than ideal Newtonian emptiness. It is filled with 'stuff' that is normally transparent but can be made visible by hitting it sufficiently hard to knock out a part. The modern concept of the vacuum of space, confirmed every day by experiment, is a relativistic ether. But we do not call it this because it is taboo".
  111. ^ contracts in the vicinity of mass or energy.
  112. ^ Torretti, Philosophy of Physics (Cambridge U P, 1999), p 180.
  113. ^ As an effective field theory, once adjusted to particular domains, Standard Model is predictively accurate until a certain, vast energy scale that is a cutoff, whereupon more fundamental phenomena—regulating the effective theory's modeled phenomena—would emerge. (Burgess & Moore, Standard Model, p xi; Wells, Effective Theories, pp 55–56).
  114. ^ a b c Torretti, Philosophy of Physics (Cambridge U P, 1999), p 396.
  115. ^ Planck length as a 'microscopic' length scale. Note that the cutoff, though very large, is in any case finite.
  116. ^ a b Wilczek, Lightness of Being (Basic Books, 2008), ch 8 "The grid (persistence of ether)", p 73: "For natural philosophy, the most important lesson we learn from QCD is that what we perceive as empty space is in reality a powerful medium whose activity molds the world. Other developments in modern physics reinforce and enrich that lesson. Later, as we explore the current frontiers, we'll see how the concept of 'empty' space as a rich, dynamic medium empowers our best thinking about how to achieve the unification of forces".
  117. ^ Mass–energy equivalence is formalized in the equation E=mc2.
  118. ^ Einstein, "Ether", Sidelights (Methuen, 1922), p 13: "[A]ccording to the special theory of relativity, both matter and radiation are but special forms of distributed energy, ponderable mass losing its isolation and appearing as a special form of energy".
  119. ^ Braibant, Giacomelli & Spurio, Particles and Fundamental Interactions (Springer, 2012), p 2: "Any particle can be created in collisions between two high energy particles thanks to a process of transformation of energy in mass".
  120. ^ By E=mc2, "energy and mass are one and the same, the combined energy of the collision can be converted into a mass, in other words, a particle, that is heavier than either of the colliding protons. The more energy is involved in the collision, the heavier the particles that might come into being" [Avent, "The Q&A", Economist, 2012].
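The arithmetic behind this note and note 117 is E=mc2 rearranged: dividing a collision's energy by c² bounds the mass of any particle it could create. A sketch with illustrative numbers (the 13 TeV figure is an assumed LHC-scale example, not from the source):

```python
# E = m*c^2 rearranged: the heaviest particle creatable from a
# collision of total energy E has mass m = E / c^2.

C = 2.998e8      # speed of light, m/s
EV = 1.602e-19   # joules per electronvolt

def max_mass_kg(collision_energy_ev: float) -> float:
    """Upper bound on the mass creatable from the given collision energy."""
    return collision_energy_ev * EV / C**2

# Sanity check: an electron's rest energy, 511 keV, maps back to
# its mass, about 9.1e-31 kg.
print(max_mass_kg(511e3))

# An assumed LHC-scale collision of 13 TeV could yield ~2.3e-23 kg,
# thousands of times a proton's mass.
print(max_mass_kg(13e12))
```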
  121. ^ a b c Kuhlmann, "Physicists debate", Sci Am, 2013.
  122. ^ Whereas Newton's Principia inferred absolute space and absolute time, omitted an aether, and, by Newton's law of universal gravitation, formalized action at a distance—a supposed force of gravitation spanning the entire universe instantly—Newton's later work Opticks introduced an aether binding bodies' matter, yet denser outside bodies, and, not uniformly distributed across all space, in some locations condensed, whereby "aethereal spirits" mediate electricity, magnetism, and gravitation. (Whittaker, A History of Theories of Aether and Electricity (Longmans, Green & Co: 1910), pp 17–18)
  123. ^ Norton, "Causation as folk science", in Price & Corry, eds, Mature Causation, Physics, and the Constitution of Reality (Oxford U P, 2007), esp p 12.
  124. ^ Fetzer, ch 3, in Fetzer, ed, Science, Explanation, and Rationality (Oxford U P, 2000), p 111.

Sources

Further reading