Reasoning from inconclusive evidence, or ‘induction’, is central to science and any applications we make of it. For that reason alone it demands the attention of philosophers of science. This Element explores the prospects of using probability theory to provide an inductive logic, a framework for representing evidential support. Constraints on the ideal evaluation of hypotheses suggest that overall support for a hypothesis is represented by its probability in light of the total evidence, and incremental support, or confirmation, is indicated by the hypothesis having a higher probability given the evidence than it has otherwise. This proposal is shown to have the capacity to reconstruct many canons of the scientific method and inductive inference. Along the way, significant objections are discussed, such as the challenge from inductive scepticism and the objection that the probabilistic approach makes evidential support arbitrary.
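In standard Bayesian notation, the two notions the abstract distinguishes come to the following (a gloss in the usual symbols, not a quotation from the Element):

```latex
% Overall support: the probability of H on the total evidence E.
% Incremental support (confirmation): E raises the probability of H.
\[
\text{overall support of } H \;=\; P(H \mid E),
\qquad
E \text{ confirms } H \iff P(H \mid E) > P(H).
\]
```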
This is part II in a series of papers outlining Abstraction Theory, a theory that I propose provides a solution to the characterisation or epistemological problem of induction. Logic is built from first principles severed from language such that there is one universal logic independent of specific logical languages. A theory of (non-linguistic) meaning is developed which provides the basis for the dissolution of the `grue' problem and problems of the non-uniqueness of probabilities in inductive logics. The problem of counterfactual conditionals is generalised to a problem of truth conditions of hypotheses and this general problem is then solved by the notion of abstractions. The probability calculus is developed with examples given. In future parts of the series the full decision theory is developed and its properties explored.
Epistemic modals have peculiar logical features that are challenging to account for in a broadly classical framework. For instance, while a sentence of the form ‘p, but it might be that not p’ appears to be a contradiction, 'might not p' does not entail 'not p', which would follow in classical logic. Likewise, the classical laws of distributivity and disjunctive syllogism fail for epistemic modals. Existing attempts to account for these facts generally either under- or over-correct. Some theories predict that 'p and might not p', a so-called epistemic contradiction, is a contradiction only in an etiolated sense, under a notion of entailment that does not allow substitution of logical equivalents; these theories underpredict the infelicity of embedded epistemic contradictions. Other theories savage classical logic, eliminating not just rules that intuitively fail, like distributivity and disjunctive syllogism, but also rules like non-contradiction, excluded middle, De Morgan’s laws, and disjunction introduction, which intuitively remain valid for epistemic modals. In this paper, we aim for a middle ground, developing a semantics and logic for epistemic modals that makes epistemic contradictions genuine contradictions and that invalidates distributivity and disjunctive syllogism but that otherwise preserves classical laws that intuitively remain valid. We start with an algebraic semantics, based on ortholattices instead of Boolean algebras, and then propose a more concrete possibility semantics, based on partial possibilities related by compatibility. Both semantics yield the same consequence relation, which we axiomatize. Then we show how to extend our semantics to explain parallel phenomena involving probabilities and conditionals. The goal throughout is to retain what is desirable about classical logic while accounting for the non-classicality of epistemic vocabulary.
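To see why distributivity must fail once epistemic contradictions are treated as genuine contradictions, consider the following reconstruction (a standard illustration in this literature, sketched here rather than quoted from the paper):

```latex
% If distributivity were valid, excluded middle would license:
\[
\Diamond p \wedge \Diamond \neg p \wedge (p \vee \neg p)
\;\vdash\;
(\Diamond p \wedge \Diamond \neg p \wedge p) \vee (\Diamond p \wedge \Diamond \neg p \wedge \neg p).
\]
% Each disjunct embeds an epistemic contradiction (p with "might not p",
% or not-p with "might p"); if those are genuine contradictions, the
% right-hand side is inconsistent, making it incoherent ever to regard
% both p and not-p as open possibilities.
```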
Choices rarely deal with certainties; and, where assertoric logic and modal logic are insufficient, those seeking to be reasonable turn to one or more things called “probability.” These things typically have a shared mathematical form, which is an arithmetic construct. The construct is often felt to be unsatisfactory for various reasons. A more general construct is that of a preordering, which may even be incomplete, allowing for cases in which there is no known probability relation between two propositions or between two events. Previous discussion of incomplete preorderings has been as if incidental, with researchers focusing upon preorderings for which quantifications are possible. This article presents formal axioms for the more general case. Challenges peculiar to some specific interpretations of the nature of probability are brought to light in the context of these propositions. A qualitative interpretation is offered for probability differences that are often taken to be quantified. A generalization of Bayesian updating is defended without dependence upon coherence. Qualitative hypothesis testing is offered as a possible alternative in cases for which quantitative hypothesis testing is shown to be unsuitable. [NB: There is a transcription error in (A6), the quantification “∀(Y,c₂)” should be applied within the large parentheses to the proposition on the third row. You have my apologies!].
This book investigates the interplay between two new and influential subdisciplines in the philosophy of science and philosophy: contemporary scientific metaphysics and the philosophy of information. Scientific metaphysics embodies various scientific realisms and has a partial intellectual heritage in some forms of neo-positivism, but is far more attuned than the latter to statistical science, theory defeasibility, scale variability, and pluralist ontological and explanatory commitments, and is averse to a-priori conceptual analysis. The philosophy of information is the combination of what has been called the informational turn in philosophy and ontology, with the informational turn in logic. The book explores the intersecting theoretical commitments and metaphysical basis of both. It applies scientific metaphysics to the philosophy of information, and also correspondingly proposes a new informational interpretation of scientific metaphysics. These are applied to numerous outstanding ontological, philosophical, and theoretical problems in the philosophy of information, scientific realism, cognitive science, theories of semantic information, and informational logic. The book investigates known and new problems, and advances debates and insights in the philosophy of information. It will be of interest to philosophers of information and science, metaphysicians, and cognitive scientists.
I argue that when we use ‘probability’ language in epistemic contexts—e.g., when we ask how probable some hypothesis is, given the evidence available to us—we are talking about degrees of support, rather than degrees of belief. The epistemic probability of A given B is the mind-independent degree to which B supports A, not the degree to which someone with B as their evidence believes A, or the degree to which someone would or should believe A if they had B as their evidence. My central argument is that the degree-of-support interpretation lets us better model good reasoning in certain cases involving old evidence. Degree-of-belief interpretations make the wrong predictions not only about whether old evidence confirms new hypotheses, but about the values of the probabilities that enter into Bayes’ Theorem when we calculate the probability of hypotheses conditional on old evidence and new background information.
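For reference, the calculation at issue is Bayes' Theorem in its standard conditional form; the comment beneath it is my gloss on the old-evidence worry the abstract describes, not a quotation from the paper:

```latex
\[
P(H \mid E \wedge K) \;=\; \frac{P(E \mid H \wedge K)\, P(H \mid K)}{P(E \mid K)}.
\]
% On a degree-of-belief reading, evidence E already learned gets
% P(E | K) = 1, so the ratio collapses to P(H | K) and E can never
% confirm H. On the degree-of-support reading, P(E | K) measures how
% strongly the background supports E and can remain below 1, so old
% evidence can still confirm a new hypothesis.
```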
As part of a book symposium on Erwin Dekker's Jan Tinbergen (1903–1994) and the Rise of Economic Expertise (2021), William Peden reflects on the views about the objectivity and nature of statistics that Tinbergen and Keynes shared, and that underlay the Tinbergen-Keynes debates.
Pondering the question of free will in the context of probability allows us to take a fresh look at a number of old problems. We are able to avoid deterministic entrapments and attempt to look at free will as an outcome of the entire decision-making system. In my paper, I will argue that free will should be considered in the context of a complex system of decisions, not individual cases. The proposed system will be probabilistic in character, so it will be embedded in the calculus of probability. To achieve the stated goal, I will refer to two areas of Carnap’s interest: the relationship between free will and determinism, and the probability-based decision-making system. First, I will present Carnap’s compatibilist position. On this basis, I will show how free will can be examined on deterministic grounds. Then I will present Carnap’s probabilistic project—the so-called logical interpretation of probability. In addition to presenting its characteristics and functionality, I will argue for its usefulness in the context of decision analysis and its immunity to problems associated with determinism. Finally, I will show how the two mentioned elements can be combined, as a result of which I will present a concept for a probabilistic analysis of free will. In this context, I will identify free will with the individual characteristics of the system. My main aim is to present the theme of free will in the light of a formal analysis based on probability rather than metaphysical assumptions.
David Stove was a philosopher strong on argument and polemic. His work on the logical interpretation of probability led to a defence of induction in The Rationality of Induction (1986). It resulted too in his denunciation of Popper, Kuhn, Lakatos and Feyerabend as irrationalists because of their "deductivism" (the thesis that the only logic is deductive logic). Stove also defended controversial views on the intelligence of women and on Darwinism. The article surveys his life and work.
We show that recurrence conditions do not yield invariant Borel probability measures in the descriptive set-theoretic milieu, in the strong sense that if a Borel action of a locally compact Polish group on a standard Borel space satisfies such a condition but does not have an orbit supporting an invariant Borel probability measure, then there is an invariant Borel set on which the action satisfies the condition but does not have an invariant Borel probability measure.
John Maynard Keynes’s A Treatise on Probability is the seminal text for the logical interpretation of probability. According to his analysis, probabilities are evidential relations between a hypothesis and some evidence, just like the relations of deductive logic. While some philosophers had suggested similar ideas prior to Keynes, it was not until his Treatise that the logical interpretation of probability was advocated in a clear, systematic and rigorous way. I trace Keynes’s influence in the philosophy of probability through a heterogeneous sample of thinkers who adopted his interpretation. This sample consists of Frederick C. Benenson, Roy Harrod, Donald C. Williams, Henry E. Kyburg and David Stove. The ideas of Keynes prove to be adaptable to their diverse theories of probability. My discussion indicates both the robustness of Keynes’s probability theory and the importance of its influence on the philosophers whom I describe. I also discuss the Problem of the Priors. I argue that none of those I discuss have obviously improved on Keynes’s theory with respect to this issue.
Epistemic theories of objective chance hold that chances are idealised epistemic probabilities of some sort. After giving a brief history of this approach to objective chance, I argue for a particular version of this view, that the chance of an event E is its epistemic probability, given maximal knowledge of the possible causes of E. The main argument for this view is the demonstration that it entails all of the commonly accepted properties of chance. For example, this analysis entails that chances supervene on the physical facts, that the chance function is an expert probability, that the existence of stable frequencies results from invariant chances in repeated experiments, and that chances are probably close to long-run relative frequencies in repeated experiments. Despite these virtues, epistemic approaches to chance have been neglected in recent decades, on account of their conflict with accepted views about closely related topics such as causation, laws of nature, and epistemic probability. However, existing views on these topics are also very problematic, and I believe that the epistemic view of chance is a key piece of this puzzle that, once in place, allows all the other pieces to fit together in a new and coherent way. I also respond to some criticisms of the epistemic theory that I favour.
In this contribution we will present a generalization of de Finetti's betting game in which a gambler is allowed to buy and sell unknown events' betting odds from more than one bookmaker. In such a framework, the coherence of the books the gambler can play with is not, on its own, sufficient, as it is in de Finetti's original framework, to bar the gambler from a sure-win opportunity. The notion of joint coherence which we will introduce in this paper characterizes those coherent books on which sure-win is impossible. Our main results provide geometric characterizations of the space of all books which are jointly coherent with a fixed one. As a consequence we will also show that joint coherence is decidable.
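The baseline notion being generalised, in its usual formulation (assumed background, not quoted from the paper): a book assigning betting quotients p_i to events E_i is coherent when no choice of stakes guarantees the gambler a sure win.

```latex
% Coherence of a single book (de Finetti): there are no stakes s_1,...,s_n
% making the gambler's net gain positive in every outcome \omega:
\[
\neg \exists s_1, \dots, s_n \ \forall \omega : \
\sum_{i=1}^{n} s_i \left( \mathbf{1}_{E_i}(\omega) - p_i \right) > 0.
\]
% Equivalently, the quotients p_i extend to a finitely additive
% probability function. The paper's point: with several bookmakers,
% each book being coherent on its own no longer rules out a sure win.
```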
Many philosophers argue that Keynes’s concept of the “weight of arguments” is an important aspect of argument appraisal. The weight of an argument is the quantity of relevant evidence cited in the premises. However, this dimension of argumentation does not have a received method for formalisation. Kyburg has suggested a measure of weight that uses the degree of imprecision in his system of “Evidential Probability” to quantify weight. I develop and defend this approach to measuring weight. I illustrate the usefulness of this measure by employing it to develop an answer to Popper’s Paradox of Ideal Evidence.
I argue that none of the usual interpretations of probability provide an adequate interpretation of probabilistic theories in science. Assuming that the aim of such theories is to capture noisy relationships in the world, I suggest that we do not have to give them classical truth-conditional content at all: their probabilities can remain uninterpreted. Indirectly, this account turns out to explain what is right about the frequency interpretation, the best-systems interpretation, and the epistemic interpretation.
Tomoji Shogenji is generally assumed to be the first author to have presented a probabilistic measure of coherence. Interestingly, Rudolf Carnap in his Logical Foundations of Probability discussed a function that is based on the very same idea, namely his well-known relevance measure. This function is largely neglected in the coherence literature because it has been proposed as a measure of evidential support and is still widely conceived as such. The aim of this paper is therefore to investigate Carnap’s measure regarding its plausibility as a candidate for a probabilistic measure of coherence by comparing it to Shogenji’s. It turns out that both measures satisfy and violate the same adequacy constraints and, despite not being ordinally equivalent, exhibit a strong correlation with each other in a Monte Carlo simulation and perform similarly in a series of test cases for probabilistic coherence measures.
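The two measures at issue, in their usual pairwise forms (standard definitions, given here for orientation):

```latex
% Shogenji's coherence measure compares P(A & B) with P(A)P(B) by ratio;
% Carnap's relevance measure compares the same two quantities by difference:
\[
C_S(A, B) = \frac{P(A \wedge B)}{P(A)\,P(B)},
\qquad
r(A, B) = P(A \wedge B) - P(A)\,P(B).
\]
% Both take probabilistic independence, where the two quantities coincide,
% as the neutral point: this is the "very same idea" the abstract notes,
% and the difference between ratio and difference is why the measures
% correlate strongly without being ordinally equivalent.
```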
B. K. Matilal, and earlier J. F. Staal, have suggested a reading of the `Nyaya five limb schema' (also sometimes referred to as the Indian Schema or Hindu Syllogism) from Gotama's Nyaya-Sutra in terms of a binary occurrence relation. In this paper we provide a rational justification of a version of this reading as Analogical Reasoning within the framework of Polyadic Pure Inductive Logic.
In a recent paper in this journal, James Hawthorne, Jürgen Landes, Christian Wallmann, and Jon Williamson (HLWW) argue that the principal principle entails the principle of indifference. In this paper, I argue that it does not. Lewis’s version of the principal principle notoriously depends on a notion of admissibility, which Lewis uses to restrict its application. HLWW base their argument on certain intuitions concerning when one proposition is admissible for another: Conditions 1 and 2. There are two ways of reading their argument, depending on how you understand the status of these conditions. Reading 1: The correct account of admissibility is determined independently of these two principles, and yet these two principles follow from that correct account. Reading 2: The correct account of admissibility is determined in part by these two principles, so that the principles follow from that account but only because the correct account is constrained so that it must satisfy them. HLWW show that, given an account of admissibility on which Conditions 1 and 2 hold, the principal principle entails the principle of indifference. I argue that, on either reading of the argument, it fails. First, I argue that there is a plausible account of admissibility on which Conditions 1 and 2 are false. That defeats Reading 1. Next, I argue that the intuitions that lead us to assent to Condition 2 also lead us to assent to other very closely related principles that are inconsistent with Condition 2. This, I claim, casts doubt on the reliability of those intuitions, and thus removes our justification for Condition 2. This defeats the second reading of the HLWW argument. Thus, the argument fails.
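For orientation, Lewis's Principal Principle in its familiar schematic form (the standard formulation, not HLWW's exact statement):

```latex
% C is a rational initial credence function, X the proposition that the
% chance of A (at a time t) is x, and E any proposition admissible for A:
\[
C(A \mid X \wedge E) = x.
\]
% Everything turns on which propositions E count as admissible;
% Conditions 1 and 2 are HLWW's constraints on that notion, and they
% are the target of the paper's two readings.
```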
This book investigates the idea of chance through its historical and conceptual development, singling out five basic and typical concepts. The analysis begins with the classical examples, Plato, Aristotle, Kant and Hegel, and reaches the contemporary context of chance, which is presented through probability theory and complexity theory. Some of the resulting concepts are formalised and have a more logical, mathematical or informational character, while others are rather physical or subject-centred, but all of them are presented within a single general historical typology. This typology also includes the paradigm against chance, which has a theoretical, cultural and even organic nature.
We investigate the relative probabilistic support afforded by the combination of two analogies based on possibly different structural similarities (as opposed to, e.g., shared predicates) within the context of Pure Inductive Logic and under the assumption of Language Invariance. We show that whilst repeated analogies grounded on the same structural similarity only strengthen the probabilistic support, this need not be the case when combining analogies based on different structural similarities. That is, two analogies may provide less support than each would individually.
Spectrum Exchangeability, Sx, is an irrelevance principle of Pure Inductive Logic, and arguably the most natural extension of Atom Exchangeability to polyadic languages. It has been shown that all probability functions which satisfy Sx are comprised of a mixture of two essential types of probability functions: heterogeneous and homogeneous functions. We determine the theory of Spectrum Exchangeability, which for a fixed language L is the set of sentences of L which must be assigned probability 1 by every probability function satisfying Sx, by examining separately the theories of heterogeneity and homogeneity. We find that the theory of Sx is equal to the theory of finite structures, i.e., those sentences true in all finite structures for L, and it emerges that Sx is inconsistent with the principle of Super-Regularity. As a further consequence we are able to characterize those probability functions which satisfy Sx and the Finite Values Property.
We propose two principles of inductive reasoning related to how observed information is handled by conditioning, and justify why they may be said to represent aspects of rational reasoning. A partial classification is given of the probability functions which satisfy these principles.
We extend the framework of Inductive Logic to Second Order languages and introduce Wilmers' Principle, a rational principle for probability functions on Second Order languages. We derive a representation theorem for functions satisfying this principle and investigate its relationship to the first order principles of Regularity and Super Regularity.
In Pure Inductive Logic, the rational principle of Predicate Exchangeability states that permuting the predicates in a given language L and replacing each occurrence of a predicate in an L-sentence phi according to this permutation should not change our belief in the truth of phi. In this paper we study when a prior probability function w on a purely unary language L satisfying Predicate Exchangeability also satisfies the principle of Unary Language Invariance.
Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution, based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods for determining these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of MAXENT. This article shows that an intuitive approach to the Judy Benjamin case supports MAXENT. This is surprising because, on the independence assumptions usually made, the anticipated result is that it would support the opponents. The article also demonstrates that opponents improperly apply independence assumptions to the problem.
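A minimal numerical sketch of the case (my reconstruction of the standard setup, not code from the article): Judy's map has three regions, Blue, Red HQ and Red Second Company, with priors 1/2, 1/4, 1/4, and she then learns only that the probability of Red HQ given Red territory is 3/4. MAXENT handles such a constraint by minimising the Kullback-Leibler divergence from the prior subject to it.

```python
# Judy Benjamin as a minimum-KL (MAXENT-style) update; an illustrative
# sketch under the assumptions stated above, not the article's own code.
import numpy as np
from scipy.optimize import minimize

prior = np.array([0.5, 0.25, 0.25])  # P(Blue), P(Red HQ), P(Red 2nd)

def kl(q, p):
    """Kullback-Leibler divergence D(q || p)."""
    q = np.clip(q, 1e-12, 1.0)
    return float(np.sum(q * np.log(q / p)))

constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},                # normalisation
    {"type": "eq", "fun": lambda q: q[1] - 0.75 * (q[1] + q[2])},  # P(HQ | Red) = 3/4
]
result = minimize(kl, prior, args=(prior,),
                  bounds=[(0.0, 1.0)] * 3, constraints=constraints)
print(result.x)  # posterior (Blue, Red HQ, Red 2nd); P(Blue) drifts
                 # away from 1/2, the feature van Fraassen found odd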
In 1977 van Fraassen showed convincingly, and in detail, how one can give a dissentive answer to the question `[a]re there necessities in nature?'. In this paper I follow his lead and show in a similar fashion and detail, how it is possible to give a dissentive answer to: Are there probabilities in nature? This is achieved by giving a partial analysis—with the aid of Kaplanian pragmatics—of objective chance in terms of that credence that is reasonable where prevailing laws and conditions exhaust one's evidence. This template belongs firmly within the established Bayesian program of analysing objective chance as ultimate belief. Its contribution to that program is the same as van Fraassen's contribution to the empiricist program of analysing physical necessity; namely, it demonstrates the logical possibility of such an analysis.
The goal of this paper is to correct a widespread misconception about the work of Robert Leslie Ellis and John Venn, namely that it can be considered as the ‘British empiricist’ reaction against the traditional theory of probability. It is argued, instead, that there was no unified ‘British school’ of frequentism during the nineteenth century. Where Ellis arrived at frequentism from a metaphysical idealist transformation of probability theory’s mathematical calculations, Venn did so on the basis of an empiricist critique of its ‘inverse application’.
This volume is a collection of cutting-edge research papers in scientifically informed metaphysics, tackling a range of philosophical puzzles which have emerged from recent work on chance and temporal asymmetry. How do the probabilities found in fundamental physics and the probabilities of the special sciences relate to one another? How can we account for the normative significance of chance? Can constraints on the initial conditions of the universe underwrite the second law of thermodynamics, and potentially also all other lawlike regularities? How does contemporary quantum theory bear on debates over the nature of chance and the arrow of time? What grounds do we have for believing in a fundamental temporal direction, or flow? And how do all these questions connect up with one another? The aim of the volume is both to survey and summarize recent debates about chance and temporal asymmetry and to push them forward. The authors bring perspectives from metaphysics, from philosophy of physics and from philosophy of probability. Mainstream approaches are subjected to searching new critiques, and bold new proposals are made concerning (inter alia) the semantics of chance-attributions, the justification of the Principal Principle connecting chance and degree of belief, the limits on inter-theoretic reduction and the source of the temporal asymmetry of human experience.
Mathematicians often speak of conjectures, yet unproved, as probable or well-confirmed by evidence. The Riemann Hypothesis, for example, is widely believed to be almost certainly true. There seems no initial reason to distinguish such probability from the same notion in empirical science. Yet it is hard to see how there could be probabilistic relations between the necessary truths of pure mathematics. The existence of such logical relations, short of certainty, is defended using the theory of logical probability (or objective Bayesianism or non-deductive logic), and some detailed examples of its use in mathematics surveyed. Examples of inductive reasoning in experimental mathematics are given and it is argued that the problem of induction is best appreciated in the mathematical case.
We propose an Analogy Principle in the context of Unary Inductive Logic and characterize the probability functions which satisfy it. In particular in the case of a language with just two predicates the probability functions satisfying this principle correspond to solutions of Skyrms' ‘Wheel of Fortune’.
This note has three goals. First, we discuss a presentation of Bertrand's paradox in a recent issue of Philosophia Mathematica, which we believe to be a subtle but important misinterpretation of the problem. We compare claims made about Bertrand with his 1889 Calcul des Probabilités. Second, we use this source to understand Bertrand's true intention in describing what we now call his paradox, comparing it both to another problem he describes in the same section and to a modern treatment. Finally, we briefly consider the importance of knowing when a specific example represents a general case.
We characterize those identities and independencies which hold for all probability functions on a unary language satisfying the Principle of Atom Exchangeability. We then show that if this is strengthened to the requirement that Johnson's Sufficientness Principle holds, thus giving Carnap's Continuum of inductive methods for languages with at least two predicates, then new and somewhat inexplicable identities and independencies emerge, the latter even in the case of Carnap's Continuum for the language with just a single predicate.
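Carnap's Continuum, which Johnson's Sufficientness Principle characterises, has a familiar closed form (standard statement, included for orientation): for a unary language whose atoms are a_1, ..., a_k,

```latex
\[
c_\lambda\big(a_j(t_{n+1}) \,\big|\, e\big) \;=\; \frac{n_j + \lambda / k}{n + \lambda},
\qquad 0 < \lambda < \infty,
\]
% where e reports n observed individuals, n_j of which satisfy the atom
% a_j, and t_{n+1} is the next individual; k is the number of atoms.
```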
This paper shows that Bertrand's proposed 'solutions' to his own question, which generates his chord paradox, are inapplicable. It uses a simple analogy with cake cutting. The problem is that none of Bertrand's solutions considers all possible cuts. This is no solace for the defenders of the principle of indifference, however, because it emerges that the paradox is harder to solve than previously anticipated.
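The divergence the paradox trades on is easy to exhibit numerically. The sketch below (my illustration, not the paper's) samples chords of the unit circle by Bertrand's three classical methods and estimates the probability that a chord is longer than the side of the inscribed equilateral triangle; the three answers famously come out near 1/3, 1/2 and 1/4.

```python
# Monte Carlo estimates for Bertrand's three chord-sampling methods;
# an illustration of the paradox, assuming the standard formulations.
import math
import random

N = 200_000
SIDE = math.sqrt(3)  # side of the equilateral triangle inscribed in the unit circle

def chord_from_endpoints():
    """Method 1: two independent uniform points on the circumference."""
    a, b = random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi)
    return 2 * abs(math.sin((a - b) / 2))

def chord_from_radius():
    """Method 2: uniform midpoint distance along a random radius."""
    d = random.uniform(0, 1)
    return 2 * math.sqrt(1 - d * d)

def chord_from_midpoint():
    """Method 3: midpoint uniform over the disc (rejection sampling)."""
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        r2 = x * x + y * y
        if r2 <= 1:
            return 2 * math.sqrt(1 - r2)

for name, draw in [("endpoints", chord_from_endpoints),
                   ("radius", chord_from_radius),
                   ("midpoint", chord_from_midpoint)]:
    p = sum(draw() > SIDE for _ in range(N)) / N
    print(f"{name}: {p:.3f}")  # approx 0.333, 0.500, 0.250 respectively
```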
This article shows that Popper’s measure of corroboration is inapplicable if, as Popper argued, the logical probability of synthetic universal statements is zero relative to any evidence that we might possess. It goes on to show that Popper’s definition of degree of testability, in terms of degree of logical content, suffers from a similar problem.
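The core difficulty, reconstructed from the abstract in ratio terms: if P(h | b) = 0 for a universal hypothesis h, the conditional probabilities that a corroboration measure needs as inputs are left undefined.

```latex
% On the ratio analysis of conditional probability:
\[
P(e \mid h \wedge b) \;=\; \frac{P(e \wedge h \mid b)}{P(h \mid b)} \;=\; \frac{0}{0},
\qquad \text{undefined when } P(h \mid b) = 0,
\]
% since e & h entails h, so P(e & h | b) = 0 as well. A corroboration
% function taking P(e | h & b) among its arguments then has no value for
% synthetic universal hypotheses; hence "inapplicable".
```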
A family of symmetries of polyadic inductive logic is described which in turn gives rise to the purportedly rational Permutation Invariance Principle, stating that a rational assignment of probabilities should respect these symmetries. An equivalent, and more practical, version of this principle is then derived.
I consider the notions of logical probability, degree of belief, and objective chance, and argue that a different formalism for conditional probability is appropriate for each.
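For orientation, the two most familiar formalisms in play (standard background, not the paper's own proposals): the ratio analysis, which leaves conditioning on probability-zero propositions undefined, and primitive two-place (Popper-style) conditional probability, which does not.

```latex
% Ratio analysis: conditional probability defined from the unconditional.
\[
P(A \mid B) = \frac{P(A \wedge B)}{P(B)}, \qquad P(B) > 0.
\]
% Popper-style: P(. | .) is taken as primitive and axiomatised directly,
% so P(A | B) can be well-defined even where the ratio is not.
```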
The objective Bayesian view of proof (or logical probability, or evidential support) is explained and defended: that the relation of evidence to hypothesis (in legal trials, science, etc.) is a strictly logical one, comparable to deductive logic. This view is distinguished from the thesis, which had some popularity in law in the 1980s, that legal evidence ought to be evaluated using numerical probabilities and formulas. While numbers are not always useful, a central role is played in uncertain reasoning by the ‘proportional syllogism’, or argument from frequencies, such as ‘nearly all aeroplane flights arrive safely, so my flight is very likely to arrive safely’. Such arguments raise the ‘problem of the reference class’, arising from the fact that an individual case may be a member of many different classes in which frequencies differ. For example, if 15 per cent of swans are black and 60 per cent of fauna in the zoo is black, what should I think about the likelihood of a swan in the zoo being black? The nature of the problem is explained, and legal cases where it arises are given. It is explained how recent work in data mining on the relevance of features for prediction provides a solution to the reference class problem.
We give a unified account of some results in the development of Polyadic Inductive Logic in the last decade with particular reference to the Principle of Spectrum Exchangeability, its consequences for Instantial Relevance, Language Invariance and Johnson's Sufficientness Principle, and the corresponding de Finetti style representation theorems.
We consider two formalizations of the notion of irrelevance as a rationality principle within the framework of (Carnapian) Inductive Logic: Johnson's Sufficientness Principle, JSP, which is classically important because it leads to Carnap's influential Continuum of Inductive Methods, and the recently proposed Weak Irrelevance Principle, WIP. We give a complete characterization of the language invariant probability functions satisfying WIP which generalizes the Nix-Paris Continuum. We argue that the derivation of two very disparate families of inductive methods from alternative perceptions of 'irrelevance' is an indication that this notion is imperfectly understood at present.
Pure Inductive Logic is the study of rational probability treated as a branch of mathematical logic. This monograph, the first devoted to this approach, brings together the key results from the past seventy years, plus the main contributions of the authors and their collaborators over the last decade, to present a comprehensive account of the discipline within a single unified context.
In this chapter we draw connections between two seemingly opposing approaches to probability and statistics: evidential probability on the one hand and objective Bayesian epistemology on the other.
Wittgenstein did not write very much on the topic of probability. The little we have comes from a few short pages of the Tractatus, some 'remarks' from the 1930s, and the informal conversations which went on during that decade with the Vienna Circle. Nevertheless, Wittgenstein's views were highly influential in the later development of the logical theory of probability. This paper will attempt to clarify and defend Wittgenstein's conception of probability against some oft-cited criticisms that stem from a misunderstanding of his views. Max Black, for instance, criticises Wittgenstein for formulating a theory of probability that is capable of being used only against the backdrop of the ideal language of the Tractatus. I argue that on the contrary, by appealing to the 'hypothetical laws of nature', Wittgenstein is able to make sense of probability statements involving propositions that have not been completely analysed. G.H. von Wright criticises Wittgenstein's characterisation of these very hypothetical laws. He argues that by introducing them Wittgenstein makes what is distinctive about his theory superfluous, for the hypothetical laws are directly inspired by statistical observations and hence these observations indirectly determine the mechanism by which the logical theory of probability operates. I argue that this is not the case at all, and that while statistical observations play a part in the formation of the hypothetical laws, these observations are only necessary, but not sufficient conditions for the introduction of these hypotheses.
This paper seeks to defend the following conclusions: The program advanced by Carnap and other necessarians for probability logic has little to recommend it except for one important point. Credal probability judgments ought to be adapted to changes in evidence or states of full belief in a principled manner in conformity with the inquirer’s confirmational commitments, except when the inquirer has good reason to modify his or her confirmational commitment. Probability logic ought to spell out the constraints on rationally coherent confirmational commitments. In the case where credal judgments are numerically determinate, confirmational commitments correspond to Carnap’s credibility functions, mathematically represented by so-called confirmation functions. Serious investigation of the conditions under which confirmational commitments should be changed ought to be a prime target for critical reflection. The necessarians were mistaken in thinking that confirmational commitments are immune to legitimate modification altogether. But their personalist or subjectivist critics went too far in suggesting that we might dispense with confirmational commitments. There is room for serious reflection on conditions under which changes in confirmational commitments may be brought under critical control. Undertaking such reflection need not become embroiled in the anti-inductivism that has characterized the work of Popper, Carnap and Jeffrey and narrowed the focus of students of logical and methodological issues pertaining to inquiry.