Stephen Read: Publications and Work in Progress
Abstract: Although the theory of the assertoric syllogism was Aristotle's great invention, one which dominated logical theory for the succeeding two millennia, accounts of the syllogism evolved and changed over that time. Indeed, in the twentieth century, doctrines were attributed to Aristotle which lost sight of what Aristotle intended. One of these mistaken doctrines was the very form of the syllogism: that a syllogism consists of three propositions containing three terms arranged in four figures. Yet another was that a syllogism is a conditional proposition deduced from a set of axioms. There is even unclarity about what the basis of syllogistic validity consists in. Returning to Aristotle's text, and reading it in the light of commentary from late antiquity and the middle ages, we find a coherent and precise theory which shows all these claims to be based on a misunderstanding and misreading.
Abstract: Anti-exceptionalism about logic is the doctrine that logic does not require its own epistemology, for its methods are continuous with those of science. Although most recently urged by Williamson, the idea goes back at least to Lakatos, who wanted to extend and adapt Popper's falsificationism not only to mathematics but to logic as well. But one needs to be careful here to distinguish the empirical from the a posteriori. Lakatos coined the term 'quasi-empirical' for the counterinstances to putative mathematical and logical theses. Mathematics and logic may both be a posteriori, but it does not follow that they are empirical. Indeed, as Williamson has demonstrated, what counts as empirical knowledge, and the role of experience in acquiring knowledge, are both unclear. Moreover, knowledge, even of necessary truths, is fallible. Nonetheless, logical consequence holds in virtue of the meaning of the logical terms, just as consequence in general holds in virtue of the meanings of the concepts involved; and so logic is both analytic and necessary. In this respect, it is exceptional. But its methodology and its epistemology are the same as those of mathematics and science in being fallibilist, and counterexamples to seemingly analytic truths are as likely as those in any scientific endeavour. What is needed is a new account of the evidential basis of knowledge, one which is, perhaps surprisingly, found in Aristotle.
Abstract: In line with the Principle of Uniform Solution, Graham Priest has challenged advocates like myself of the medieval logician Thomas Bradwardine's solution to the paradoxes of truth and knowledge to extend his account to a similar solution to the paradoxes of denotation, such as Berry's, König's and Richard's. I here rise to this challenge by showing how to adapt Bradwardine's principles of truth and signification for propositions to corresponding principles of denotation and signification for descriptive phrases, applying them to give a Bradwardinian solution to the denotational paradoxes.
Abstract: The most famous epistemic paradox is Fitch’s paradox. In it, Frederick Fitch offered a counterexample to the Principle of Knowability (PK), namely, that any true proposition can be known. His example is the proposition that some proposition is true but not known. This proposition is not paradoxical or contradictory in itself, but contradicts (PK), which many have found appealing. What is really paradoxical is any proposition which says of itself that it is true but unknown. Thomas Bradwardine, writing in the early 1320s, developed a solution to the semantic paradoxes (insolubilia) based on a closure principle for signification: every proposition signifies whatever is implied by what it signifies. In ch. 9 of his treatise, he extends his account to deal with various epistemic paradoxes. Comparison of Fitch’s paradox with one of these paradoxes, the Knower paradox (‘You do not know this proposition’) explains the puzzlement caused by Fitch’s paradox. Bradwardine argues that the Knower paradox signifies not only its own truth, but signifies also that it is not known that it is not known, and so is false, since it is known that it is not known. However, his argument is flawed and a different argument for its falsehood is required.
Abstract: In the late 1970s, Paul Spade edited three treatises, on Consequences, Insolubles and Obligations, which he attributed to an otherwise unknown fourteenth-century logician whom he named as Robert Fland. We question this reading of the name and argue that his real name was Robert Eland. Moreover, we suggest that he should be identified with Eland the dialectician, whose Sophismata is mentioned in an account book at Merton College in 1367, and whose renown as a logician was disdainfully recorded some two hundred years later by the bibliographer John Bale.
Abstract: John Buridan's introduction of the notion of non-normal propositions (propositiones de modo loquendi inconsueto) in his theory of the syllogism is a marked example of the influence of vernacular languages on the use of Latin in medieval logic and the regimentation of the language used. Classical Latin is an SOV language, in which the word order of the simplest sentence form is subject-object-verb, in contrast to the SVO order of the vernacular languages of the later Middle Ages. Buridan's so-called non-normal propositions arise from deeming the normal order to be the SVO of the vernacular, and so taking SOV, where the object-term precedes the verb, to be non-normal. In particular, introducing O-propositions of non-normal form permits conversion of normal O-propositions, meaning that all four propositions of the traditional square of opposition can be converted, thereby adding further possibilities to the theory of the assertoric syllogism.
Abstract: General-elimination harmony articulates Gentzen's idea that the elimination-rules are justified if they infer from an assertion no more than can already be inferred from the grounds for making it. Dummett described the rules as not only harmonious but stable if the E-rules allow one to infer no more and no less than the I-rules justify. Pfenning and Davies call the rules locally complete if the E-rules are strong enough to allow one to infer the original judgement. A method is given of generating harmonious general-elimination rules from a collection of I-rules. We show that the general-elimination rules satisfy Pfenning and Davies' test for local completeness, but question whether that is enough to show that they are stable. Alternative conditions for stability are considered, including equivalence between the introduction- and elimination-meanings of a connective, and recovery of the grounds for assertion, finally generalizing the notion of local completeness to capture Dummett's notion of stability satisfactorily. We show that the general-elimination rules meet the last of these conditions, and so are indeed not only harmonious but also stable.
Abstract: Ian Rumfitt has recently drawn our attention to a couple of paradoxes of signification, claiming that although Thomas Bradwardine's "multiple-meanings" account of truth and signification can solve the first of them, it cannot solve the second. The paradoxes of signification were in fact much discussed by Bradwardine's successors in the fourteenth century. Bradwardine's solution appears to turn on a distinction between the principal and the consequential signification of an utterance. However, although such a distinction played an important role in his successors’ theories, it is shown that Bradwardine’s account of signification does not admit any such distinction, no part being prior to the others. Accordingly his solution, unlike those of his successors, does not fall prey to Rumfitt's paradoxes.
Abstract: Logic in the 14th century was largely dominated by the rather different English and Parisian traditions. Towards the end of the previous century, modism had gained the ascendancy in Paris, and the study of logic there declined, kept alive in England until Walter Burley, trained in Oxford, reintroduced it to Paris. Himself a realist, Burley’s logical theories provided the tools by which the terminist approach came to fruition in the nominalist works of William Ockham (largely based in London) and John Buridan (writing and teaching in Paris for some forty years). Besides Buridan’s writings and the usual commentaries on Aristotle’s logical works, however, Burley’s legacy was little pursued in Paris before the second half of the century. In the first half, major innovations were made in Oxford, with an eruption of separate treatises on consequences, obligations and insolubles by the likes of Thomas Bradwardine, Roger Swyneshed and William Heytesbury, which continued later in the century in, inter alia, Ralph Strode, John Hunter and John Wyclif, and culminated in the massive survey of English logic in Paul of Venice’s Logica Magna, probably composed in Oxford in the 1390s. In Paris, logical developments were as much an heir to Albert of Saxony, himself following the English tradition, as to Buridan, in such authors as Marsilius of Inghen and Peter of Ailly, the two streams being brought together in John Dorp’s Perutile Compendium totius logicae Joannis Buridani of around 1393. The century ended with an explosion of logic throughout Europe with the rise of the new provincial universities, and the introduction of English logic into Italy through Peter of Mantua and others.
Abstract: This volume, the first dedicated and comprehensive Companion to Medieval Logic, covers both the Latin and the Arabic traditions and shows that they were in fact sister traditions, which both arose against the background of a Hellenistic heritage and which influenced one another over the centuries. A series of chapters by both established and younger scholars covers the whole period including early and late developments, and offers new insights into this extremely rich period in the history of logic. The volume is divided into two parts, ‘Periods and Traditions’ and ‘Themes’, allowing readers to engage with the subject from both historical and more systematic perspectives. It will be a must-read for students and scholars of medieval philosophy, the history of logic, and the history of ideas.
Cover picture: Giorgione (Giorgio Barbarelli da Castelfranco; c. 1477/8–1510), The Three Philosophers. Kunsthistorisches Museum, Vienna. Reproduced with permission. Although recent research suggests that the three figures are Pythagoras with his teachers Thales and Pherecydes, historically it has been interpreted as representing the transmission of knowledge from the Greeks through the Arabs to modern times.
Abstract: Inferentialism claims that the rules for the use of an expression express its meaning without any need to invoke meanings or denotations for them. Logical inferentialism endorses inferentialism specifically for the logical constants. Harmonic inferentialism, as the term is introduced here, usually but not necessarily a subbranch of logical inferentialism, follows Gentzen in proposing that it is the introduction-rules which give expressions their meaning and the elimination-rules should accord harmoniously with the meaning so given. It is proposed here that the logical expressions are those which can be given schematic rules that lie in a specific sort of harmony, general-elimination (ge) harmony, resulting from applying the ge-procedure to produce ge-rules in accord with the meaning defined by the I-rules. Griffiths (2014) claims that identity cannot be given such rules, concluding that logical inferentialists are committed to ruling identity a non-logical expression. It is shown that the schematic rules for identity given in Read (2004) are indeed ge-harmonious, so confirming that identity is a logical notion.
Abstract: Logical inferentialism claims that the meaning of the logical constants should be given, not model-theoretically, but by the rules of inference of a suitable calculus. It has been claimed that certain proof-theoretical systems, most particularly, labelled deductive systems for modal logic, are unsuitable, on the grounds that they are semantically polluted and suffer from an untoward intrusion of semantics into syntax. The charge is shown to be mistaken. It is argued on inferentialist grounds that labelled deductive systems are as syntactically pure as any formal system in which the rules define the meanings of the logical constants.
Abstract: Kretzmann and Spade were led by Richard Kilvington's proposed revisions to the rules of obligations in his discussion of the 47th sophism in his Sophismata to claim that the purpose of obligational disputations was the same as that of counterfactual reasoning. Angel d'Ors challenged this interpretation, realising that the reason for Kilvington's revision was precisely that he found the art of obligation unsuited to the kind of reasoning which lay at the heart of the sophismatic argument. In his criticism, Kilvington focussed on a technique used by Walter Burley to force a respondent to grant an arbitrary falsehood and similar to Lewis and Langford’s famous defence of ex impossibili quodlibet. Kilvington observes that just as in obligational disputation, one may be obliged to grant a false proposition and deny a true one, so in counterfactual reasoning one may be obliged to doubt a proposition whose truth or falsity one knows, on pain of contradiction.
Abstract: Jan Łukasiewicz's treatise on Aristotle's Syllogistic, published in the 1950s, has been very influential in framing contemporary understanding of Aristotle's logical systems. However, Łukasiewicz's interpretation is based on a number of tendentious claims, not least, the claim that the syllogistic was intended to apply only to non-empty terms. I show that this interpretation is not true to Aristotle's text and that a more coherent and faithful interpretation admits empty terms while maintaining all the relations of the traditional square of opposition.
Abstract: Thomas Bradwardine's solution to the semantic paradoxes, presented in his Insolubilia written in Oxford in the early 1320s, turns on two main principles: that a proposition is true only if things are wholly as it signifies; and that signification is closed under consequence. After exploring the background in Walter Burley's account of the signification of propositions, the question is considered of the extent to which Bradwardine's theory is compatible with the distribution of truth over conjunction, disjunction, negation and the conditional.
Abstract: The idea of proof-theoretic validity originated in the work of Gerhard Gentzen, when he suggested that the meaning of each logical expression was encapsulated in its introduction-rules, and that the elimination-rules were justified by the meaning so defined. It was developed by Dag Prawitz in a series of articles in the early 1970s, and by Michael Dummett in his William James lectures of 1976, later published as The Logical Basis of Metaphysics. The idea had been attacked in 1960 by Arthur Prior under the soubriquet ‘analytic validity’. Logical truths and logical consequences are deemed analytically valid by virtue of following, in a way which the present paper clarifies, from the meaning of the logical constants. But different logics are based on different rules, confer different meanings and so validate different theorems and consequences, some of which are arguably not true or valid at all. It seems to follow that some analytic statements are in fact false. The moral is that we must be careful what rules we adopt and what meanings we use our rules to determine.
Abstract: Bradwardine’s solution to the logical paradoxes depends on the idea that every sentence signifies many things, and its truth depends on things’ being wholly as it signifies. This idea is underpinned by his claim that a sentence signifies everything that follows from what it signifies. But the idea that signification is closed under entailment appears too strong, just as logical omniscience is unacceptable in the logic of knowledge. What is needed is a more restricted closure principle. A clue can be found in speech act pluralism, whereby indirect speech reports are closed under intersubstitutivity of co-referential terms. The conclusion is that solving the semantic paradoxes does not require revision of logic, thus saving logic from paradox.
Abstract: Michael Dummett introduced the notion of harmony in response to Arthur Prior's tonkish attack on the idea of proof-theoretic justification of logical laws (or analytic validity). But Dummett vacillated between different conceptions of harmony, in an attempt to use the idea to underpin his anti-realism. Dag Prawitz had already articulated an idea of Gerhard Gentzen's into a procedure whereby elimination-rules are in some sense functions of the corresponding introduction-rules. The resulting conception of general-elimination harmony ensures that the rules are transparent in the meaning they confer, in that the elimination-rules match the meaning the introduction-rules confer. The general-elimination rules which result may be of higher level, in that the assumptions discharged by the rule may be of (the existence of) derivations rather than just of formulae. In many cases, such higher-level rules may be "flattened" to rules discharging only formulae. However, such flattening is often only possible in the richer context of so-called ''classical" or realist negation, or in a multiple-conclusion environment. In a constructivist context, the flattened rules are harmonious but not stable.
Abstract: Whereas his predecessors attempted to make sense of, and if necessary correct, Aristotle's theory of the modal syllogism, John Buridan starts afresh in his Treatise on Consequences, treating separately of composite and divided modals, then of syllogisms of necessity, possibility, and with mixed premises. Finally, he comes in the penultimate chapter of the treatise, Book IV ch. 3, to present a concise treatment of syllogisms with premises of contingency, that is, two-sided possibility. The previous modal syllogisms had all been taken with an affirmed mode only, since modal conversion equates negated necessity and possibility with affirmed possibility and necessity, respectively. But in his Conclusions concerning syllogisms of contingency, he also treats those with negated mode. These are the non-contingency syllogisms.
Abstract: In Concepts, Fodor identifies five non-negotiable constraints on any theory of concepts. These theses were all shared by the standard medieval theories of concepts. However, those theories were cognitivist, in contrast with Fodor’s: concepts are definitions, a form of natural knowledge. The medieval theories were formed under two influences, from Aristotle by way of Boethius, and from Augustine. The tension between them resulted in the Ockhamist notion of a natural language, concepts as signs. Thus conventional signs, spoken and written, signify things in the world by the mediation of concepts which themselves form a language of thought, signifying those things naturally by their similarity. Indeed, later medieval thinkers realised that everything signifies itself and what is like it naturally in a broad sense by means of the concept of its natural likeness.
The rediscovery of Aristotle in the late twelfth century led to a fresh development of logical theory, culminating in Buridan’s crucial comprehensive treatment in the Treatise on Consequences. Buridan’s novel treatment of the categorical syllogism laid the basis for the study of logic in succeeding centuries.
This new translation offers a clear and accurate rendering of Buridan’s text. It is prefaced by a substantial Introduction that outlines the work’s context and explains its argument in detail. Also included is a translation of the Introduction (in French) to the 1976 edition of the Latin text by Hubert Hubien.
Abstract: The focus of the paper is a sophism based on the proposition ‘This is Socrates’ found in a short treatise on obligational casus attributed to William Heytesbury. First, the background to the puzzle in Walter Burley’s traditional account of obligations (the responsio antiqua), and the objections and revisions made by Richard Kilvington and Roger Swyneshed, are presented. All six types of obligations described by Burley are outlined, including sit verum, the type used in the sophism. Kilvington and Swyneshed disliked the dynamic nature of the responsio antiqua, and Kilvington proposed a revision to the rules for irrelevant propositions. This allowed him to use a form of reasoning, the “disputational meta-argument”, which is incompatible with Burley’s rules. Heytesbury explicitly rejected Kilvington’s revision and the associated meta-argument. Swyneshed also revised Burley’s account of obligations, formulating the so-called responsio nova, characterised by the apparently surprising thesis that a conjunction can be denied both of whose conjuncts are granted. On closer inspection, however, his account is found to be less radical than first appears.
Abstract: One of the manuscripts of Buridan’s Summulae contains three figures, each in the form of an octagon. At each node of each octagon there are nine propositions. Buridan uses the figures to illustrate his doctrine of the syllogism, revising Aristotle's theory of the modal syllogism and adding theories of syllogisms with propositions containing oblique terms (such as ‘man’s donkey’) and with ‘propositions of non-normal construction’ (where the predicate precedes the copula). O-propositions of non-normal construction (i.e., ‘Some S (some) P is not’) allow Buridan to extend and systematize the theory of the assertoric (i.e., non-modal) syllogism. Buridan points to a revealing analogy between the three octagons. To understand their importance we need to rehearse the medieval theories of signification, supposition, truth and consequence.
Abstract: In his article "Verdades antiguas y modernas" (in the same issue, pp. 207-27), David Miller criticised Thomas Bradwardine’s theory of truth and signification and my defence of Bradwardine’s application of it to the semantic paradoxes. Much of Miller’s criticism is sympathetic and helpful in gaining a better understanding of the relationship between Bradwardine’s proposed solution to the paradoxes and Alfred Tarski’s. But some of Miller’s criticisms betray a misunderstanding of crucial aspects of Bradwardine’s account of truth and signification.
Abstract: The recovery of Aristotle’s logic during the twelfth century was a great stimulus to medieval thinkers. Among their own theories developed to explain Aristotle’s theories of valid and invalid reasoning was a theory of consequence, of what arguments were valid, and why. By the fourteenth century, two main lines of thought had developed, one at Oxford, the other at Paris. Both schools distinguished formal from material consequence, but in very different ways. In Buridan and his followers in Paris, formal consequence was that preserved under uniform substitution. In Oxford, in contrast, formal consequence included analytic consequences such as ‘If it’s a man, then it’s an animal’. Aristotle’s notion of syllogistic consequence was subsumed under the treatment of formal consequence. Buridan developed a general theory embracing the assertoric syllogism, the modal syllogism and syllogisms with oblique terms. The result was a thoroughly systematic and extensive treatment of logical theory and logical consequence which repays investigation.
Abstract: The editors invited us to write a short paper that draws together the main themes of logic in the Western tradition from the Classical Greeks to the modern period. To make it short we had to make it personal. We set out the themes that seemed to us either the deepest, or the most likely to be helpful for an Indian reader.
The fourteenth-century thinker Thomas Bradwardine is well known in both the history of science and the history of theology. The first of the Merton Calculators (mathematical physicists) and passionate defender of the Augustinian doctrine of salvation through grace alone, he was briefly archbishop of Canterbury before succumbing to the Black Death in 1349. This new edition of his Insolubilia, made from all thirteen known manuscripts, shows that he was also a logician of the first rank. The edition is accompanied by a full English translation. In the treatise, Bradwardine considers and rejects the theories of his contemporaries about the logical puzzles known as "insolubles," and sets out his own solution at length and in detail. In a substantial introduction, Stephen Read describes Bradwardine's analysis, compares it with other more recent theories, and places it in its historical context. The text is accompanied by three appendices, the first of which is an extra chapter found in two manuscripts (and partly in a third) that appears to contain further thoughts by Bradwardine himself. The second contains an extract from Ralph Strode's Insolubilia, composed in the 1360s, repeating and enlarging on Bradwardine's text; and the third consists of an anonymous text that applies Bradwardine's solution to a succession of different insolubles.
Here is a list of corrections and improvements.
Abstract: Inferentialism claims that expressions are meaningful by virtue of rules governing their use. In particular, logical expressions are autonomous if given meaning by their introduction-rules, rules specifying the grounds for assertion of propositions containing them. If the elimination-rules do no more, and no less, than is justified by the introduction-rules, the rules satisfy what Prawitz, following Lorenzen, called an inversion principle. This connection between rules leads to a general form of elimination-rule, and when the rules have this form, they may be said to exhibit “general-elimination” harmony. Ge-harmony ensures that the meaning of a logical expression is clearly visible in its I-rule, and that the I- and E-rules are coherent, in encapsulating the same meaning. However, it does not ensure that the resulting logical system is normalizable, nor that it satisfies the conservative extension property, nor that it is consistent. Thus harmony should not be identified with any of these notions.
Abstract: What makes necessary truths true? I argue that all truth supervenes on how things are, and that necessary truths are no exception. What makes them true are proofs. But if so, the notion of proof needs to be generalized to include verification-transcendent proofs, proofs whose correctness exceeds our ability to verify it. It is incumbent on me, therefore, to show that arguments, such as Dummett's, that verification-transcendent truth is not compatible with the theory of meaning, are mistaken. The answer is that what we can conceive and construct far outstrips our actual abilities. I conclude by proposing a proof-theoretic account of modality, rejecting a claim of Armstrong's that modality can reside in non-modal truthmakers.
Abstract: Hartry Field’s revised logic for the theory of truth in his new book, Saving Truth from Paradox, seeking to preserve Tarski’s T-scheme, does not admit a full theory of negation. In response, Crispin Wright proposed that the negation of a proposition is the proposition saying that some proposition inconsistent with the first is true. For this to work, we have to show that this proposition is entailed by any proposition incompatible with the first, that is, that it is the weakest proposition incompatible with the proposition whose negation it should be. To show that his proposal gave a full intuitionist theory of negation, Wright appealed to two principles, about incompatibility and entailment, and using them Field formulated a paradox of validity (or more precisely, of inconsistency).
The medieval mathematician, theologian and logician, Thomas Bradwardine, writing in the fourteenth century, proposed a solution to the paradoxes of truth which does not require any revision of logic. The key principle behind Bradwardine’s solution is a pluralist doctrine of meaning, or signification, that propositions can mean more than they explicitly say. In particular, he proposed that signification is closed under entailment. In light of this, Bradwardine revised the truth-rules, in particular, refining the T-scheme, so that a proposition is true only if everything that it signifies obtains. Thereby, he was able to show that any proposition which signifies that it itself is false, also signifies that it is true, and consequently is false and not true. I show that Bradwardine’s solution is also able to deal with Field’s paradox and others of a similar nature. Hence Field’s logical revisions are unnecessary to save truth from paradox.
Abstract: The medieval name for paradoxes like the famous Liar Paradox (“This proposition is false”) was “insolubles” or insolubilia. From the late-twelfth century to the end of the Middle Ages and beyond, such paradoxes were discussed at length by an enormous number of authors. Yet, unlike twentieth-century interest in the paradoxes, medieval interest seems not to have been prompted by any sense of theoretical “crisis.” The history of the medieval discussions can be divided into three main periods: (a) an early stage, from the late-twelfth century to the 1320s; (b) a period of especially intense and original work, during roughly the second quarter of the fourteenth century; (c) a late period, from about 1350 on.
Abstract: In recent years, speech-act theory has mooted the possibility that one utterance can signify a number of different things. This pluralist conception of signification lies at the heart of Thomas Bradwardine's solution to the insolubles, logical puzzles such as the semantic paradoxes, presented in Oxford in the early 1320s. His leading assumption was that signification is closed under consequence, that is, that a proposition signifies everything which follows from what it signifies. Then any proposition signifying its own falsity, he showed, also signifies its own truth and so, since it signifies things which cannot both obtain, it is simply false. Bradwardine himself, and his contemporaries, did not elaborate this pluralist theory, or say much in its defence. It can be shown to accord closely, however, with the prevailing conception of logical consequence in England in the fourteenth century. Recent pluralist theories of signification, such as Grice's, also endorse Bradwardine's closure postulate as a plausible constraint on signification, and so his analysis of the semantic paradoxes is seen to be both well-grounded and plausible.
Abstract: The standard rules for identity do not seem to lie in harmony, where the elimination-rule is justified by the meaning given to a logical operator by its introduction-rule. However, harmonious rules can be given for identity, showing that identity is a logical notion even on the inferentialist conception. These new rules are shown to be sound and complete. The revised paper omits §3 of the original published version, which responded to an objection by the referee, shown to be mistaken by Michael Kremer in 'Read on identity and harmony: a friendly correction and simplification', Analysis, 67, April 2007, pp. 157-59.
Abstract: In order to explicate Gentzen's famous remark that the introduction-rules for logical constants give their meaning, the elimination-rules being simply consequences of the meaning so given, we develop natural deduction rules for Sheffer's stroke, alternative denial. The first system turns out to lack Double Negation. Strengthening the introduction-rules by allowing the introduction of Sheffer's stroke into a disjunctive context produces a complete system of classical logic, one which preserves the harmony between the rules which Gentzen wanted: all indirect proof reduces to direct proof.