Stephen Read's Papers
Papers in Progress
Abstract: Bradwardine's solution to the logical paradoxes depends on the idea that every sentence signifies many things, and its truth depends on things' being wholly as it signifies. This idea is underpinned by his claim that a sentence signifies everything that follows from what it signifies. But the idea that signification is closed under entailment appears too strong, just as logical omniscience is unacceptable in the logic of knowledge. What is needed is a more restricted closure principle. A clue can be found in speech act pluralism, whereby indirect speech reports are closed under intersubstitutivity of co-referential terms.
Abstract: John Buridan's introduction of the notion of non-normal propositions (propositiones de modo loquendi inconsueto) in his theory of the syllogism is a marked example of the influence of vernacular languages on the use of Latin in medieval logic and the regimentation of the language used. Classical Latin is an SOV language, in which the word order of the simplest sentence form is subject-object-verb, in contrast to the SVO order of the vernacular languages of the later Middle Ages. Buridan's so-called non-normal propositions arise from deeming the normal order to be the SVO of the vernacular, and so taking SOV, where the object-term precedes the verb, to be non-normal. In particular, introducing O-propositions of non-normal form permits conversion of normal O-propositions, meaning that all four propositions of the traditional square of opposition can be converted, thereby adding further possibilities to the theory of the assertoric syllogism.
Abstract: Whereas his predecessors attempted to make sense of, and if necessary correct, Aristotle's theory of the modal syllogism, John Buridan starts afresh in his Treatise on Consequences, treating separately of composite and divided modals, then of syllogisms of necessity, possibility, and with mixed premises. Finally, he comes in the penultimate chapter of the treatise, Book IV ch. 3, to present a concise treatment of syllogisms with premises of contingency, that is, two-sided possibility. The previous modal syllogisms had all been taken with an affirmed mode only, since modal conversion equates negated necessity and possibility with affirmed possibility and necessity, respectively. But in his Conclusions concerning syllogisms of contingency, he also treats those with negated mode. These are the non-contingency syllogisms.
Abstract: The standard rules for identity do not seem to lie in harmony, where the elimination-rule is justified by the meaning given to a logical operator by its introduction-rule. However, harmonious rules can be given for identity, showing that identity is a logical notion even on the inferentialist conception. These new rules are shown to be sound and complete. The present paper omits §3 of the original published version, which responded to a mistaken objection by the referee, as was shown by Michael Kremer in 'Read on identity and harmony: a friendly correction and simplification', Analysis, 67, April 2007, pp. 157-59.
Abstract: Jan Łukasiewicz's treatise on Aristotle's Syllogistic, published in the 1950s, has been very influential in framing contemporary understanding of Aristotle's logical systems. However, Łukasiewicz's interpretation is based on a number of tendentious claims, not least, the claim that the syllogistic was intended to apply only to non-empty terms. I show that this interpretation is not true to Aristotle's text and that a more coherent and faithful interpretation admits empty terms while maintaining all the relations of the traditional square of opposition.
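On one standard reconstruction of the reading defended here (a hypothetical sketch, not the paper's own formalism), the affirmative propositions (A, I) carry existential import while the negatives (E, O) do not, so an O-proposition is true when its subject term is empty. A brute-force check over small models then confirms that all the relations of the traditional square survive empty terms:

```python
from itertools import product

def square_relations_hold(max_size=3):
    """Check the traditional square's relations in every model whose
    terms S and P are subsets of a small domain (including empty ones)."""
    domain = range(max_size)
    for s_bits, p_bits in product(product([0, 1], repeat=max_size), repeat=2):
        S = {x for x in domain if s_bits[x]}
        P = {x for x in domain if p_bits[x]}
        # Affirmatives read with existential import; negatives without.
        A = bool(S) and S <= P        # Every S is P
        E = not (S & P)               # No S is P
        I = bool(S & P)               # Some S is P
        O = (not S) or not (S <= P)   # Some S is not P (true if S is empty)
        checks = [
            A != O,                   # A and O are contradictories
            E != I,                   # E and I are contradictories
            not (A and E),            # contraries: never both true
            I or O,                   # subcontraries: never both false
            (not A) or I,             # subalternation: A entails I
            (not E) or O,             # subalternation: E entails O
        ]
        if not all(checks):
            return False
    return True

print(square_relations_hold())  # expect: True
```

The check is exhaustive over all assignments of S and P in a three-element domain, including the empty assignments that defeat Łukasiewicz's non-empty-terms reading.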
Abstract: Thomas Bradwardine's solution to the semantic paradoxes, presented in his Insolubilia written in Oxford in the early 1320s, turns on two main principles: that a proposition is true only if things are wholly as it signifies; and that signification is closed under consequence. After exploring the background in Walter Burley's account of the signification of propositions, the question is considered of the extent to which Bradwardine's theory is compatible with the distribution of truth over conjunction, disjunction, negation and the conditional.
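In schematic form (the symbolism is mine, offered as one common reconstruction rather than Bradwardine's own notation), the two principles read:

```latex
% (T): a proposition P is true only if everything it signifies obtains.
\[
\mathrm{Tr}(P) \rightarrow \forall s\,\bigl(\mathrm{Sig}(P,s) \rightarrow s\bigr)
\]
% (K): signification is closed under consequence.
\[
\bigl(\mathrm{Sig}(P,s) \wedge (s \Rightarrow t)\bigr) \rightarrow \mathrm{Sig}(P,t)
\]
% Applied to a Liar sentence L with Sig(L, not-Tr(L)), Bradwardine's second
% thesis yields Sig(L, Tr(L)) as well; since Tr(L) and not-Tr(L) cannot both
% obtain, L is simply false.
```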
Abstract: Michael Dummett introduced the notion of harmony in response to Arthur Prior's tonkish attack on the idea of proof-theoretic justification of logical laws (or analytic validity). But Dummett vacillated between different conceptions of harmony, in an attempt to use the idea to underpin his anti-realism. Dag Prawitz had already articulated an idea of Gerhard Gentzen's into a procedure whereby elimination-rules are in some sense functions of the corresponding introduction-rules. The resulting conception of general-elimination harmony ensures that the rules are transparent in the meaning they confer, in that the elimination-rules match the meaning the introduction-rules confer. The general-elimination rules which result may be of higher level, in that the assumptions discharged by the rule may be of (the existence of) derivations rather than just of formulae. In many cases, such higher-level rules may be "flattened" to rules discharging only formulae. However, such flattening is often only possible in the richer context of so-called "classical" or realist negation, or in a multiple-conclusion environment. In a constructivist context, the flattened rules are harmonious but not stable.
Abstract: The idea of proof-theoretic validity originated in the work of Gerhard Gentzen, when he suggested that the meaning of each logical expression was encapsulated in its introduction-rules, and that the elimination-rules were justified by the meaning so defined. It was developed by Dag Prawitz in a series of articles in the early 1970s, and by Michael Dummett in his William James lectures of 1976, later published as The Logical Basis of Metaphysics. The idea had been attacked in 1960 by Arthur Prior under the soubriquet ‘analytic validity’. Logical truths and logical consequences are deemed analytically valid by virtue of following, in a way which the present paper clarifies, from the meaning of the logical constants. But different logics are based on different rules, confer different meanings and so validate different theorems and consequences, some of which are arguably not true or valid at all. It seems to follow that some analytic statements are in fact false. The moral is that we must be careful what rules we adopt and what meanings we use our rules to determine.
Abstract: The most famous epistemic paradox is Fitch’s paradox. In it, Frederic Fitch offered a counterexample to the Principle of Knowability (PK), namely, that any true proposition can be known. His example is the proposition that some proposition is true but not known. This proposition is not paradoxical or contradictory in itself, but contradicts (PK), which many have found appealing. What is really paradoxical is any proposition which says of itself that it is true but unknown. Thomas Bradwardine, writing in the early 1320s, developed a solution to the semantic paradoxes (insolubilia) based on a closure principle for signification: every proposition signifies whatever is implied by what it signifies. In ch. 9 of his treatise, he extends his account to deal with various epistemic paradoxes. Comparison of Fitch’s paradox with one of these paradoxes, the Knower paradox (‘You do not know this proposition’) explains the puzzlement caused by Fitch’s paradox. Bradwardine argues that the Knower paradox signifies not only its own truth, but signifies also that it is not known that it is not known, and so is false, since it is known that it is not known. However, his argument is flawed and a different argument for its falsehood is required.
Abstract: In Concepts, Fodor identifies five non-negotiable constraints on any theory of concepts. These theses were all shared by the standard medieval theories of concepts. However, those theories were cognitivist, in contrast with Fodor’s: concepts are definitions, a form of natural knowledge. The medieval theories were formed under two influences, from Aristotle by way of Boethius, and from Augustine. The tension between them resulted in the Ockhamist notion of a natural language, concepts as signs. Thus conventional signs, spoken and written, signify things in the world by the mediation of concepts which themselves form a language of thought, signifying those things naturally by their similarity. Indeed, later medieval thinkers realised that everything signifies itself and what is like it naturally in a broad sense by means of the concept of its natural likeness.
Abstract: The focus of the paper is a sophism based on the proposition ‘This is Socrates’ found in a short treatise on obligational casus attributed to William Heytesbury. First, the background to the puzzle in Walter Burley’s traditional account of obligations (the responsio antiqua), and the objections and revisions made by Richard Kilvington and Roger Swyneshed, are presented. All six types of obligations described by Burley are outlined, including sit verum, the type used in the sophism. Kilvington and Swyneshed disliked the dynamic nature of the responsio antiqua, and Kilvington proposed a revision to the rules for irrelevant propositions. This allowed him to use a form of reasoning, the “disputational meta-argument”, which is incompatible with Burley’s rules. Heytesbury explicitly rejected Kilvington’s revision and the associated meta-argument. Swyneshed also revised Burley’s account of obligations, formulating the so-called responsio nova, characterised by the apparently surprising thesis that a conjunction can be denied both of whose conjuncts are granted. On closer inspection, however, his account is found to be less radical than first appears.
Abstract: One of the manuscripts of Buridan’s Summulae contains three figures, each in the form of an octagon. At each node of each octagon there are nine propositions. Buridan uses the figures to illustrate his doctrine of the syllogism, revising Aristotle's theory of the modal syllogism and adding theories of syllogisms with propositions containing oblique terms (such as ‘man’s donkey’) and with ‘propositions of non-normal construction’ (where the predicate precedes the copula). O-propositions of non-normal construction (i.e., ‘Some S (some) P is not’) allow Buridan to extend and systematize the theory of the assertoric (i.e., non-modal) syllogism. Buridan points to a revealing analogy between the three octagons. To understand their importance we need to rehearse the medieval theories of signification, supposition, truth and consequence.
Abstract: In his article "Verdades antiguas y modernas" (in the same issue, pp. 207-27), David Miller criticised Thomas Bradwardine’s theory of truth and signification and my defence of Bradwardine’s application of it to the semantic paradoxes. Much of Miller’s criticism is sympathetic and helpful in gaining a better understanding of the relationship between Bradwardine’s proposed solution to the paradoxes and Alfred Tarski’s. But some of Miller’s criticisms betray a misunderstanding of crucial aspects of Bradwardine’s account of truth and signification.
Abstract: The recovery of Aristotle’s logic during the twelfth century was a great stimulus to medieval thinkers. Among their own theories developed to explain Aristotle’s theories of valid and invalid reasoning was a theory of consequence, of what arguments were valid, and why. By the fourteenth century, two main lines of thought had developed, one at Oxford, the other at Paris. Both schools distinguished formal from material consequence, but in very different ways. In Buridan and his followers in Paris, formal consequence was that preserved under uniform substitution. In Oxford, in contrast, formal consequence included analytic consequences such as ‘If it’s a man, then it’s an animal’. Aristotle’s notion of syllogistic consequence was subsumed under the treatment of formal consequence. Buridan developed a general theory embracing the assertoric syllogism, the modal syllogism and syllogisms with oblique terms. The result was a thoroughly systematic and extensive treatment of logical theory and logical consequence which repays investigation.
Abstract: The editors invited us to write a short paper that draws together the main themes of logic in the Western tradition from the Classical Greeks to the modern period. To make it short we had to make it personal. We set out the themes that seemed to us either the deepest, or the most likely to be helpful for an Indian reader.
Abstract: Inferentialism claims that expressions are meaningful by virtue of rules governing their use. In particular, logical expressions are autonomous if given meaning by their introduction-rules, rules specifying the grounds for assertion of propositions containing them. If the elimination-rules do no more, and no less, than is justified by the introduction-rules, the rules satisfy what Prawitz, following Lorenzen, called an inversion principle. This connection between rules leads to a general form of elimination-rule, and when the rules have this form, they may be said to exhibit “general-elimination” harmony. Ge-harmony ensures that the meaning of a logical expression is clearly visible in its I-rule, and that the I- and E-rules are coherent, in encapsulating the same meaning. However, it does not ensure that the resulting logical system is normalizable, nor that it satisfies the conservative extension property, nor that it is consistent. Thus harmony should not be identified with any of these notions.
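For illustration (my example, not drawn from the paper's text), conjunction shows the general form of elimination-rule the abstract describes: where the familiar rule projects a conjunct, the general-elimination rule discharges both conjuncts as assumptions in a derivation of an arbitrary conclusion:

```latex
% Introduction rule, and the general-elimination rule it justifies:
\[
\frac{A \qquad B}{A \wedge B}\;{\wedge}\mathrm{I}
\qquad\qquad
\frac{A \wedge B \qquad
      \begin{array}{c}[A,\,B]\\ \vdots\\ C\end{array}}
     {C}\;{\wedge}\mathrm{E}
\]
% The usual projection rules are recovered by taking C := A or C := B.
```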
Abstract: What makes necessary truths true? I argue that all truth supervenes on how things are, and that necessary truths are no exception. What makes them true are proofs. But if so, the notion of proof needs to be generalized to include verification-transcendent proofs, proofs whose correctness exceeds our ability to verify it. It is incumbent on me, therefore, to show that arguments, such as Dummett's, that verification-truth is not compatible with the theory of meaning, are mistaken. The answer is that what we can conceive and construct far outstrips our actual abilities. I conclude by proposing a proof-theoretic account of modality, rejecting a claim of Armstrong's that modality can reside in non-modal truthmakers.
Abstract: Hartry Field’s revised logic for the theory of truth in his new book, Saving Truth from Paradox, seeking to preserve Tarski’s T-scheme, does not admit a full theory of negation. In response, Crispin Wright proposed that the negation of a proposition is the proposition saying that some proposition inconsistent with the first is true. For this to work, we have to show that this proposition is entailed by any proposition incompatible with the first, that is, that it is the weakest proposition incompatible with the proposition whose negation it should be. To show that his proposal gave a full intuitionist theory of negation, Wright appealed to two principles, about incompatibility and entailment, and using them Field formulated a paradox of validity (or more precisely, of inconsistency).
The medieval mathematician, theologian and logician, Thomas Bradwardine, writing in the fourteenth century, proposed a solution to the paradoxes of truth which does not require any revision of logic. The key principle behind Bradwardine’s solution is a pluralist doctrine of meaning, or signification, that propositions can mean more than they explicitly say. In particular, he proposed that signification is closed under entailment. In light of this, Bradwardine revised the truth-rules, in particular, refining the T-scheme, so that a proposition is true only if everything that it signifies obtains. Thereby, he was able to show that any proposition which signifies that it itself is false, also signifies that it is true, and consequently is false and not true. I show that Bradwardine’s solution is also able to deal with Field’s paradox and others of a similar nature. Hence Field’s logical revisions are unnecessary to save truth from paradox.
Abstract: The medieval name for paradoxes like the famous Liar Paradox (“This proposition is false”) was “insolubles” or insolubilia. From the late-twelfth century to the end of the Middle Ages and beyond, such paradoxes were discussed at length by an enormous number of authors. Yet, unlike twentieth century interest in the paradoxes, medieval interest seems not to have been prompted by any sense of theoretical “crisis.” The history of the medieval discussions can be divided into three main periods: (a) an early stage, from the late-twelfth century to the 1320s; (b) a period of especially intense and original work, during roughly the second quarter of the fourteenth century; (c) a late period, from about 1350 on.
Abstract: In recent years, speech-act theory has mooted the possibility that one utterance can signify a number of different things. This pluralist conception of signification lies at the heart of Thomas Bradwardine's solution to the insolubles, logical puzzles such as the semantic paradoxes, presented in Oxford in the early 1320s. His leading assumption was that signification is closed under consequence, that is, that a proposition signifies everything which follows from what it signifies. Then any proposition signifying its own falsity, he showed, also signifies its own truth and so, since it signifies things which cannot both obtain, it is simply false. Bradwardine himself, and his contemporaries, did not elaborate this pluralist theory, or say much in its defence. It can be shown to accord closely, however, with the prevailing conception of logical consequence in England in the fourteenth century. Recent pluralist theories of signification, such as Grice's, also endorse Bradwardine's closure postulate as a plausible constraint on signification, and so his analysis of the semantic paradoxes is seen to be both well-grounded and plausible.
Abstract: In order to explicate Gentzen's famous remark that the introduction-rules for logical constants give their meaning, the elimination-rules being simply consequences of the meaning so given, we develop natural deduction rules for Sheffer's stroke, alternative denial. The first system turns out to lack Double Negation. Strengthening the introduction-rules by allowing the introduction of Sheffer's stroke into a disjunctive context produces a complete system of classical logic, one which preserves the harmony between the rules which Gentzen wanted: all indirect proof reduces to direct proof.
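As a semantic reminder (illustrative only, and independent of the paper's proof-theoretic point), alternative denial suffices to define the other classical connectives, which a quick truth-table check confirms:

```python
from itertools import product

def nand(a, b):
    """Sheffer's stroke: 'not both a and b' (alternative denial)."""
    return not (a and b)

# The classical connectives, defined from the stroke alone.
def neg(a):      return nand(a, a)
def conj(a, b):  return nand(nand(a, b), nand(a, b))
def disj(a, b):  return nand(nand(a, a), nand(b, b))
def impl(a, b):  return nand(a, nand(b, b))

# Exhaustive check over all truth-value assignments.
for a, b in product([False, True], repeat=2):
    assert neg(a) == (not a)
    assert conj(a, b) == (a and b)
    assert disj(a, b) == (a or b)
    assert impl(a, b) == ((not a) or b)
print("all definitions check out")
```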