Logic
Logic (from the Ancient Greek: λογική, logike)^{[1]} is the branch of philosophy concerned with the use and study of valid reasoning.^{[2]}^{[3]} The study of logic also features prominently in mathematics and computer science.
Logic was studied in several ancient civilizations, including India,^{[4]} China,^{[5]} Persia and Greece. In the West, logic was established as a formal discipline by Aristotle, who gave it a fundamental place in philosophy. The study of logic was part of the classical trivium, which also included grammar and rhetoric. Logic was further extended by Al-Farabi, who categorized it into two separate groups (idea and proof). Later, Avicenna revived the study of logic and developed a relationship between the temporalis and the implication. In the East, logic was developed by Hindus, Buddhists and Jains.
Logic is often divided into three parts: inductive reasoning, abductive reasoning, and deductive reasoning.
Contents

1 The study of logic
 1.1 Logical form
 1.2 Deductive and inductive reasoning, and abductive inference
 1.3 Consistency, validity, soundness, and completeness
 1.4 Rival conceptions of logic
2 History
3 Types of logic
 3.1 Syllogistic logic
 3.2 Propositional logic (sentential logic)
 3.3 Predicate logic
 3.4 Modal logic
 3.5 Informal reasoning
 3.6 Mathematical logic
 3.7 Philosophical logic
 3.8 Computational logic
 3.9 Bivalence and the law of the excluded middle; non-classical logics
 3.10 "Is logic empirical?"
 3.11 Implication: strict or material?
 3.12 Tolerating the impossible
 3.13 Rejection of logical truth
4 See also
5 Notes and references
6 Bibliography
7 External links
The study of logic
“  Upon this first, and in one sense this sole, rule of reason, that in order to learn you must desire to learn, and in so desiring not be satisfied with what you already incline to think, there follows one corollary which itself deserves to be inscribed upon every wall of the city of philosophy: Do not block the way of inquiry.  ” 
— Charles Sanders Peirce, "First Rule of Logic"

The concept of logical form is central to logic, it being held that the validity of an argument is determined by its logical form, not by its content. Traditional Aristotelian syllogistic logic and modern symbolic logic are examples of formal logics.
 Informal logic is the study of natural language arguments. The study of fallacies is an especially important branch of informal logic. The dialogues of Plato^{[6]} are good examples of informal logic.
 Formal logic is the study of inference with purely formal content. An inference possesses a purely formal content if it can be expressed as a particular application of a wholly abstract rule, that is, a rule that is not about any particular thing or property. The works of Aristotle contain the earliest known formal study of logic. Modern formal logic follows and expands on Aristotle.^{[7]} In many definitions of logic, logical inference and inference with purely formal content are the same. This does not render the notion of informal logic vacuous, because no formal logic captures all of the nuances of natural language.
 Symbolic logic is the study of symbolic abstractions that capture the formal features of logical inference.^{[8]}^{[9]} Symbolic logic is often divided into two branches: propositional logic and predicate logic.
 Mathematical logic is an extension of symbolic logic into other areas, in particular to the study of model theory, proof theory, set theory, and recursion theory.
However, agreement on what logic is has remained elusive, and although the field of universal logic has studied the common structure of logics, in 2007 Mossakowski et al. commented that "it is embarrassing that there is no widely acceptable formal definition of 'a logic'".^{[10]}
Logical form
Logic is generally considered formal when it analyzes and represents the form of any valid argument type. The form of an argument is displayed by representing its sentences in the formal grammar and symbolism of a logical language to make its content usable in formal inference. If one considers the notion of form too philosophically loaded, one could say that formalizing simply means translating English sentences into the language of logic.
This is called showing the logical form of the argument. It is necessary because indicative sentences of ordinary language show a considerable variety of form and complexity that makes their use in inference impractical. It requires, first, ignoring those grammatical features irrelevant to logic (such as gender and declension, if the argument is in Latin), replacing conjunctions irrelevant to logic (such as "but") with logical conjunctions like "and" and replacing ambiguous, or alternative logical expressions ("any", "every", etc.) with expressions of a standard type (such as "all", or the universal quantifier ∀).
Second, certain parts of the sentence must be replaced with schematic letters. Thus, for example, the expression "all As are Bs" shows the logical form common to the sentences "all men are mortals", "all cats are carnivores", "all Greeks are philosophers", and so on.
That the concept of form is fundamental to logic was already recognized in ancient times. Aristotle uses variable letters to represent valid inferences in Prior Analytics, leading Jan Łukasiewicz to say that the introduction of variables was "one of Aristotle's greatest inventions".^{[11]} According to the followers of Aristotle (such as Ammonius), only the logical principles stated in schematic terms belong to logic, not those given in concrete terms. The concrete terms "man", "mortal", etc., are analogous to the substitution values of the schematic placeholders A, B, C, which were called the "matter" (Greek hyle) of the inference.
The fundamental difference between modern formal logic and traditional, or Aristotelian logic, lies in their differing analysis of the logical form of the sentences they treat.
 In the traditional view, the form of the sentence consists of (1) a subject (e.g., "man") plus a sign of quantity ("all" or "some" or "no"); (2) the copula, which is of the form "is" or "is not"; (3) a predicate (e.g., "mortal"). Thus: all men are mortal. The logical constants such as "all", "no" and so on, plus sentential connectives such as "and" and "or" were called syncategorematic terms (from the Greek kategorei – to predicate, and syn – together with). This is a fixed scheme, where each judgment has an identified quantity and copula, determining the logical form of the sentence.
 According to the modern view, the fundamental form of a simple sentence is given by a recursive schema, involving logical connectives, such as a quantifier with its bound variable, which are joined by juxtaposition to other sentences, which in turn may have logical structure.
 The modern view is more complex, since a single judgement of Aristotle's system involves two or more logical connectives. For example, the sentence "All men are mortal" involves, in term logic, two non-logical terms "is a man" (here M) and "is mortal" (here D): the sentence is given by the judgement A(M,D). In predicate logic, the sentence involves the same two non-logical concepts, here analyzed as m(x) and d(x), and the sentence is given by ∀x. (m(x) → d(x)), involving the logical connectives for universal quantification and implication.
 But equally, the modern view is more powerful. Medieval logicians recognized the problem of multiple generality, where Aristotelian logic is unable to satisfactorily render such sentences as "Some guys have all the luck", because both quantities "all" and "some" may be relevant in an inference, but the fixed scheme that Aristotle used allows only one to govern the inference. Just as linguists recognize recursive structure in natural languages, it appears that logic needs recursive structure.
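The modern formalization can be made concrete by evaluating it over a small finite domain. The following Python sketch is illustrative only; the domain and the predicates m and d are invented stand-ins for "is a man" and "is mortal":

```python
# Evaluate "All men are mortal" -- forall x. (m(x) -> d(x)) -- over a finite domain.
# The domain and both predicates are illustrative, not a general framework.

domain = ["socrates", "plato", "fido"]

def m(x):          # "x is a man"
    return x in {"socrates", "plato"}

def d(x):          # "x is mortal"
    return True    # everything in this toy domain happens to be mortal

def implies(p, q):
    # material implication: p -> q is (not p) or q
    return (not p) or q

# forall x. (m(x) -> d(x))
all_men_are_mortal = all(implies(m(x), d(x)) for x in domain)
print(all_men_are_mortal)  # True
```

Substituting different concrete predicates for the schematic letters changes the "matter" of the inference while the form stays fixed.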
Deductive and inductive reasoning, and abductive inference
Deductive reasoning concerns what follows necessarily from given premises (if a, then b). However, inductive reasoning—the process of deriving a reliable inference from observations—is often included in the study of logic. Similarly, it is important to distinguish deductive validity and inductive validity (called "strength"). An inference is deductively valid if and only if there is no possible situation in which all the premises are true but the conclusion false. An inference is inductively strong if and only if its premises give some degree of probability to its conclusion.
The notion of deductive validity can be rigorously stated for systems of formal logic in terms of the well-understood notions of semantics. Inductive validity, on the other hand, requires us to define a reliable generalization of some set of observations. The task of providing this definition may be approached in various ways, some less formal than others; some of these definitions may use mathematical models of probability. For the most part this discussion of logic deals only with deductive logic.
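The definition of deductive validity above suggests a brute-force check for propositional inferences: enumerate every possible situation (truth assignment) and look for one in which all the premises are true but the conclusion is false. A minimal Python sketch, with invented helper names:

```python
from itertools import product

def is_valid(premises, conclusion, n_vars):
    """An inference is deductively valid iff no assignment makes
    every premise true while the conclusion is false."""
    for vals in product([False, True], repeat=n_vars):
        if all(p(*vals) for p in premises) and not conclusion(*vals):
            return False  # found a counterexample situation
    return True

# Modus ponens: from p and (p -> q), infer q.
premise1 = lambda p, q: p
premise2 = lambda p, q: (not p) or q
concl    = lambda p, q: q
print(is_valid([premise1, premise2], concl, 2))  # True

# Affirming the consequent: from q and (p -> q), infer p -- invalid.
print(is_valid([lambda p, q: q, premise2], lambda p, q: p, 2))  # False
```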
Abduction^{[12]} is a form of logical inference that goes from observation to a hypothesis that accounts for the reliable data (observation) and seeks to explain relevant evidence. The American philosopher Charles Sanders Peirce (1839–1914) first introduced the term as "guessing".^{[13]} Peirce said that to abduce a hypothetical explanation a from an observed surprising circumstance b is to surmise that a may be true because then b would be a matter of course.^{[14]} Thus, to abduce a from b involves determining that a is sufficient (or nearly sufficient), but not necessary, for b.
Consistency, validity, soundness, and completeness
Among the important properties that logical systems can have are:
 Consistency, which means that no theorem of the system contradicts another.^{[15]}
 Validity, which means that the system's rules of proof never allow a false inference from true premises. A logical system has the property of soundness when the logical system has the property of validity and uses only premises that prove true (or, in the case of axioms, are true by definition).^{[15]}
 Completeness, which means that if a formula is true, it can be proven (if it is true, it is a theorem of the system).
 Soundness, which has multiple separate meanings, creating a bit of confusion throughout the literature. Most commonly, soundness refers to logical systems, which means that if some formula can be proven in a system, then it is true in the relevant model/structure (if A is a theorem, it is true). This is the converse of completeness. A distinct, peripheral use of soundness refers to arguments, which means that the premises of a valid argument are true in the actual world.
Some logical systems do not have all four properties. As an example, Kurt Gödel's incompleteness theorems show that sufficiently complex formal systems of arithmetic cannot be both consistent and complete;^{[9]} however, first-order predicate logics not extended by specific axioms to be arithmetic formal systems with equality can be both complete and consistent.^{[16]}
Rival conceptions of logic
Logic arose (see below) from a concern with correctness of argumentation. Modern logicians usually wish to ensure that logic studies just those arguments that arise from appropriately general forms of inference. For example, Thomas Hofweber writes in the Stanford Encyclopedia of Philosophy that logic "does not, however, cover good reasoning as a whole. That is the job of the theory of rationality. Rather it deals with inferences whose validity can be traced back to the formal features of the representations that are involved in that inference, be they linguistic, mental, or other representations".^{[17]}
By contrast, Immanuel Kant argued that logic should be conceived as the science of judgement, an idea taken up in Gottlob Frege's logical and philosophical work. But Frege's work is ambiguous in the sense that it is both concerned with the "laws of thought" as well as with the "laws of truth", i.e. it both treats logic in the context of a theory of the mind, and treats logic as the study of abstract formal structures.
History
In Europe, logic was first developed by Aristotle.^{[18]} Aristotelian logic became widely accepted in science and mathematics and remained in wide use in the West until the early 19th century.^{[19]} Aristotle's system of logic was responsible for the introduction of hypothetical syllogism,^{[20]} temporal modal logic,^{[21]}^{[22]} and inductive logic,^{[23]} as well as influential vocabulary such as terms, predicables, syllogisms and propositions. In Europe during the later medieval period, major efforts were made to show that Aristotle's ideas were compatible with Christian faith. During the High Middle Ages, logic became a main focus of philosophers, who would engage in critical logical analyses of philosophical arguments, often using variations of the methodology of scholasticism. In 1323, William of Ockham's influential Summa Logicae was released. By the 18th century, the structured approach to arguments had degenerated and fallen out of favour, as depicted in Holberg's satirical play Erasmus Montanus.
The Chinese logical philosopher Gongsun Long (c. 325–250 BCE) proposed the paradox "One and one cannot become two, since neither becomes two."^{[24]} In China, the tradition of scholarly investigation into logic, however, was repressed by the Qin dynasty following the legalist philosophy of Han Feizi.
In India, innovations in the scholastic school, called Nyaya, continued from ancient times into the early 18th century.
Types of logic
Syllogistic logic
Aristotle's body of work on logic, with the Prior Analytics constituting the first explicit work in formal logic, introduces the syllogistic.^{[27]} The parts of syllogistic logic, also known by the name term logic, are the analysis of judgements into propositions consisting of two terms that are related by one of a fixed number of relations, and the expression of inferences by means of syllogisms that consist of two propositions sharing a common term as premise, and a conclusion that is a proposition involving the two unrelated terms from the premises.
Aristotle's work was regarded in classical times and from medieval times in Europe and the Middle East as the very picture of a fully worked out system. However, it was not alone: the Stoics proposed a system of propositional logic that was studied by medieval logicians. Also, the problem of multiple generality was recognized in medieval times. Nonetheless, problems with syllogistic logic were not seen as being in need of revolutionary solutions.
Today, some academics claim that Aristotle's system is generally seen as having little more than historical value (though there is some current interest in extending term logics), regarded as made obsolete by the advent of propositional logic and the predicate calculus. Others use Aristotle in argumentation theory to help develop and critically question argumentation schemes that are used in artificial intelligence and legal arguments.
Propositional logic (sentential logic)
A propositional calculus or logic (also a sentential calculus) is a formal system in which formulae representing propositions can be formed by combining atomic propositions using logical connectives, and in which a system of formal proof rules establishes certain formulae as "theorems".
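In a sound and complete propositional calculus, the theorems coincide with the tautologies, i.e. the formulae true under every assignment of truth values to their atomic propositions. A Python sketch of checking tautologyhood by exhaustive enumeration (the helper name is invented):

```python
from itertools import product

def is_tautology(formula, n_vars):
    """A propositional formula is a tautology iff it comes out true
    under every assignment of truth values to its atoms."""
    return all(formula(*vals) for vals in product([False, True], repeat=n_vars))

# (p -> q) v (q -> p): a classical tautology, even for unrelated p and q.
print(is_tautology(lambda p, q: ((not p) or q) or ((not q) or p), 2))  # True

# p ^ ~p: a contradiction, true under no assignment.
print(is_tautology(lambda p: p and (not p), 1))  # False
```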
Predicate logic
Predicate logic is the generic term for symbolic formal systems such as first-order logic, second-order logic, many-sorted logic, and infinitary logic.
Predicate logic provides an account of quantifiers general enough to express a wide set of arguments occurring in natural language. Aristotelian syllogistic logic specifies a small number of forms that the relevant part of the involved judgements may take. Predicate logic allows sentences to be analysed into subject and argument in several additional ways—allowing predicate logic to solve the problem of multiple generality that had perplexed medieval logicians.
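The problem of multiple generality turns on the order of quantifiers: ∃x∀y L(x,y) ("some guy has all the luck") is stronger than ∀y∃x L(x,y) ("every kind of luck belongs to someone"). A Python sketch over an invented finite domain makes the difference visible:

```python
# Illustrative only: the domain, the kinds of luck, and the relation
# "lucky" are all invented to show the two quantifier orders coming apart.

guys = ["alice", "bob"]
lucks = ["cards", "love"]
lucky = {("alice", "cards"), ("bob", "love")}  # L(x, y): x has luck y

# exists x. forall y. L(x, y) -- some one guy has every kind of luck
some_guy_has_all = any(all((x, y) in lucky for y in lucks) for x in guys)

# forall y. exists x. L(x, y) -- every kind of luck is had by someone
each_luck_has_someone = all(any((x, y) in lucky for x in guys) for y in lucks)

print(some_guy_has_all)        # False: nobody has both kinds
print(each_luck_has_someone)   # True: each kind belongs to somebody
```

Aristotle's fixed subject-copula-predicate scheme can express only one of these quantities at a time, which is exactly the limitation the medieval logicians noticed.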
The development of predicate logic is usually attributed to Gottlob Frege, who is also credited as one of the founders of analytical philosophy, but the formulation of predicate logic most often used today is the first-order logic presented in Principles of Mathematical Logic by David Hilbert and Wilhelm Ackermann in 1928. The analytical generality of predicate logic allowed the formalization of mathematics, drove the investigation of set theory, and allowed the development of Alfred Tarski's approach to model theory. It provides the foundation of modern mathematical logic.
Frege's original system of predicate logic was second-order, rather than first-order; second-order logic is most prominently defended by George Boolos and Stewart Shapiro.
Modal logic
In languages, modality deals with the phenomenon that subparts of a sentence may have their semantics modified by special verbs or modal particles. For example, "We go to the games" can be modified to give "We should go to the games", and "We can go to the games" and perhaps "We will go to the games". More abstractly, we might say that modality affects the circumstances in which we take an assertion to be satisfied.
Aristotle's logic is in large part concerned with the theory of non-modalized logic. Although there are passages in his work, such as the famous sea-battle argument in De Interpretatione § 9, that are now seen as anticipations of modal logic and its connection with potentiality and time, the earliest formal system of modal logic was developed by Avicenna, who ultimately developed a theory of "temporally modalized" syllogistic.^{[28]}
While the study of necessity and possibility remained important to philosophers, little logical innovation happened until the landmark investigations of Clarence Irving Lewis in 1918, who formulated a family of rival axiomatizations of the alethic modalities. His work unleashed a torrent of new work on the topic, expanding the kinds of modality treated to include deontic logic and epistemic logic. The seminal work of Arthur Prior applied the same formal language to treat temporal logic and paved the way for the marriage of the two subjects. Saul Kripke discovered (contemporaneously with rivals) his theory of frame semantics, which revolutionized the formal technology available to modal logicians and gave a new graphtheoretic way of looking at modality that has driven many applications in computational linguistics and computer science, such as dynamic logic.
Informal reasoning
The motivation for the study of logic in ancient times was clear: it is so that one may learn to distinguish good from bad arguments, and so become more effective in argument and oratory, and perhaps also to become a better person. Half of the works of Aristotle's Organon treat inference as it occurs in an informal setting, side by side with the development of the syllogistic, and in the Aristotelian school these works were seen as complementary to Aristotle's treatment of rhetoric.
This ancient motivation is still alive, although it no longer takes centre stage in the picture of logic; typically dialectical logic forms the heart of a course in critical thinking, a compulsory course at many universities.
Argumentation theory is the study and research of informal logic, fallacies, and critical questions as they relate to every day and practical situations. Specific types of dialogue can be analyzed and questioned to reveal premises, conclusions, and fallacies. Argumentation theory is now applied in artificial intelligence and law.
Mathematical logic
Mathematical logic really refers to two distinct areas of research: the first is the application of the techniques of formal logic to mathematics and mathematical reasoning, and the second, in the other direction, the application of mathematical techniques to the representation and analysis of formal logic.^{[29]}
The earliest use of mathematics and geometry in relation to logic and philosophy goes back to the ancient Greeks such as Euclid, Plato, and Aristotle.^{[30]} Many other ancient and medieval philosophers applied mathematical ideas and methods to their philosophical claims.^{[31]}
One of the boldest attempts to apply logic to mathematics was undoubtedly the logicism pioneered by philosopher-logicians such as Gottlob Frege and Bertrand Russell: the idea was that mathematical theories were logical tautologies, and the programme was to show this by means of a reduction of mathematics to logic.^{[8]} The various attempts to carry this out met with a series of failures, from the crippling of Frege's project in his Grundgesetze by Russell's paradox, to the defeat of Hilbert's program by Gödel's incompleteness theorems.
Both the statement of Hilbert's program and its refutation by Gödel depended upon their work establishing the second area of mathematical logic, the application of mathematics to logic in the form of proof theory.^{[32]} Despite the negative nature of the incompleteness theorems, Gödel's completeness theorem, a result in model theory and another application of mathematics to logic, can be understood as showing how close logicism came to being true: every rigorously defined mathematical theory can be exactly captured by a firstorder logical theory; Frege's proof calculus is enough to describe the whole of mathematics, though not equivalent to it.
If proof theory and model theory have been the foundation of mathematical logic, they have been but two of the four pillars of the subject. Set theory originated in the study of the infinite by Georg Cantor, and it has been the source of many of the most challenging and important issues in mathematical logic, from Cantor's theorem, through the status of the Axiom of Choice and the question of the independence of the continuum hypothesis, to the modern debate on large cardinal axioms.
Recursion theory captures the idea of computation in logical and arithmetic terms; its most classical achievements are the undecidability of the Entscheidungsproblem by Alan Turing, and his presentation of the Church–Turing thesis.^{[33]} Today recursion theory is mostly concerned with the more refined problem of complexity classes—when is a problem efficiently solvable?—and the classification of degrees of unsolvability.^{[34]}
Philosophical logic
Philosophical logic deals with formal descriptions of ordinary, non-specialist ("natural") language. Most philosophers assume that the bulk of everyday reasoning can be captured in logic if a method or methods to translate ordinary language into that logic can be found. Philosophical logic is essentially a continuation of the traditional discipline called "logic" before the invention of mathematical logic. Philosophical logic has a much greater concern with the connection between natural language and logic. As a result, philosophical logicians have contributed a great deal to the development of non-standard logics (e.g. free logics, tense logics) as well as various extensions of classical logic (e.g. modal logics) and non-standard semantics for such logics (e.g. Kripke's technique of supervaluations in the semantics of logic).
Logic and the philosophy of language are closely related. Philosophy of language has to do with the study of how our language engages and interacts with our thinking. Logic has an immediate impact on other areas of study. Studying logic and the relationship between logic and ordinary speech can help a person better structure his own arguments and critique the arguments of others. Many popular arguments are filled with errors because so many people are untrained in logic and unaware of how to formulate an argument correctly.^{[35]}^{[36]}
Computational logic
Logic cut to the heart of computer science as it emerged as a discipline: Alan Turing's work on the Entscheidungsproblem followed from Kurt Gödel's work on the incompleteness theorems. The notion of the general purpose computer that came from this work was of fundamental importance to the designers of the computer machinery in the 1940s.
In the 1950s and 1960s, researchers predicted that when human knowledge could be expressed using logic with mathematical notation, it would be possible to create a machine that reasons, or artificial intelligence. This was more difficult than expected because of the complexity of human reasoning. In logic programming, a program consists of a set of axioms and rules. Logic programming systems such as Prolog compute the consequences of the axioms and rules in order to answer a query.
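The query-answering behaviour described above can be sketched for the propositional case: forward chaining derives every consequence of a set of facts and Horn rules. This toy Python sketch is far simpler than a real Prolog system (which works with terms and unification), and all names in it are invented:

```python
# A minimal forward-chaining sketch for propositional Horn clauses.
# Each rule is (body, head): once every atom in body is derived, derive head.

def consequences(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return derived

facts = {"man(socrates)"}
rules = [({"man(socrates)"}, "mortal(socrates)")]

# Answering the query "mortal(socrates)?" amounts to membership in the
# set of derivable consequences.
print("mortal(socrates)" in consequences(facts, rules))  # True
```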
Today, logic is extensively applied in the fields of artificial intelligence and computer science, and these fields provide a rich source of problems in formal and informal logic. Argumentation theory is one good example of how logic is being applied to artificial intelligence. The ACM Computing Classification System in particular regards:
 Section F.3 on Logics and meanings of programs and F.4 on Mathematical logic and formal languages as part of the theory of computer science: this work covers formal semantics of programming languages, as well as work of formal methods such as Hoare logic;
 Boolean logic as fundamental to computer hardware: particularly, the system's section B.2 on Arithmetic and logic structures, relating to the operators AND, NOT, and OR;
 Many fundamental logical formalisms are essential to section I.2 on artificial intelligence, for example modal logic and default logic in Knowledge representation formalisms and methods, Horn clauses in logic programming, and description logic.
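The point about Boolean logic and hardware can be illustrated by composing gates: other connectives can be built from AND, OR, and NOT alone. A small Python sketch (the gate names are illustrative, not a hardware description):

```python
# Building XOR from the three basic gates:
# XOR(a, b) = (a OR b) AND NOT (a AND b)

def AND(a, b): return a and b
def OR(a, b):  return a or b
def NOT(a):    return not a

def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

# Full truth table, in the order (F,F), (F,T), (T,F), (T,T):
print([XOR(a, b) for a in (False, True) for b in (False, True)])
# [False, True, True, False]
```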
Furthermore, computers can be used as tools for logicians. For example, in symbolic logic and mathematical logic, proofs by humans can be computerassisted. Using automated theorem proving the machines can find and check proofs, as well as work with proofs too lengthy to write out by hand.
Bivalence and the law of the excluded middle; nonclassical logics
The logics discussed above are all "bivalent" or "two-valued"; that is, they are most naturally understood as dividing propositions into true and false propositions. Non-classical logics are those systems that reject bivalence.
Hegel developed his own dialectic logic that extended Kant's transcendental logic but also brought it back to ground by assuring us that "neither in heaven nor in earth, neither in the world of mind nor of nature, is there anywhere such an abstract 'either–or' as the understanding maintains. Whatever exists is concrete, with difference and opposition in itself".^{[37]}
In 1910, Nicolai A. Vasiliev extended the law of excluded middle and the law of contradiction and proposed the law of excluded fourth and logic tolerant to contradiction.^{[38]} In the early 20th century Jan Łukasiewicz investigated the extension of the traditional true/false values to include a third value, "possible", so inventing ternary logic, the first multi-valued logic.^{[39]}
Logics such as fuzzy logic have since been devised with an infinite number of "degrees of truth", represented by a real number between 0 and 1.^{[40]}
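One common (though not the only) choice of connectives for such logics takes conjunction as minimum, disjunction as maximum, and negation as complement; restricted to {0, 0.5, 1} this gives Łukasiewicz-style ternary tables, and over all of [0, 1] a simple fuzzy logic. A Python sketch with invented function names:

```python
# Degrees of truth as reals in [0, 1]: classical logic uses {0, 1},
# ternary logic adds 0.5 ("possible"), fuzzy logic allows the whole interval.

def f_not(a):    return 1 - a
def f_and(a, b): return min(a, b)
def f_or(a, b):  return max(a, b)

print(f_and(1, 0.5))   # 0.5 -- "true AND possible" is "possible"
print(f_or(0.3, 0.8))  # 0.8
print(f_not(0.5))      # 0.5

# The law of the excluded middle fails at intermediate values:
print(f_or(0.5, f_not(0.5)))  # 0.5, not 1
```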
Intuitionistic logic was proposed by L.E.J. Brouwer as the correct logic for reasoning about mathematics, based upon his rejection of the law of the excluded middle as part of his intuitionism. Brouwer rejected formalization in mathematics, but his student Arend Heyting studied intuitionistic logic formally, as did Gerhard Gentzen. Intuitionistic logic is of great interest to computer scientists, as it is a constructive logic and can be applied for extracting verified programs from proofs.
Modal logic is not truth-conditional, and so it has often been proposed as a non-classical logic. However, modal logic is normally formalized with the principle of the excluded middle, and its relational semantics is bivalent, so this inclusion is disputable.
"Is logic empirical?"
What is the epistemological status of the laws of logic? What sort of argument is appropriate for criticizing purported principles of logic? In an influential paper entitled "Is logic empirical?"^{[41]} Hilary Putnam, building on a suggestion of W. V. Quine, argued that in general the facts of propositional logic have a similar epistemological status as facts about the physical universe, for example as the laws of mechanics or of general relativity, and in particular that what physicists have learned about quantum mechanics provides a compelling case for abandoning certain familiar principles of classical logic: if we want to be realists about the physical phenomena described by quantum theory, then we should abandon the principle of distributivity, substituting for classical logic the quantum logic proposed by Garrett Birkhoff and John von Neumann.^{[42]}
Another paper of the same name by Sir Michael Dummett argues that Putnam's desire for realism mandates the law of distributivity.^{[43]} Distributivity of logic is essential for the realist's understanding of how propositions are true of the world in just the same way as he has argued the principle of bivalence is. In this way, the question, "Is logic empirical?" can be seen to lead naturally into the fundamental controversy in metaphysics on realism versus antirealism.
Implication: strict or material?
The notion of implication formalized in classical logic does not comfortably translate into natural language by means of "if ... then ...", due to a number of problems called the paradoxes of material implication.
The first class of paradoxes involves counterfactuals, such as If the moon is made of green cheese, then 2+2=5, which are puzzling because natural language does not support the principle of explosion. Eliminating this class of paradoxes was the reason for C. I. Lewis's formulation of strict implication, which eventually led to more radically revisionist logics such as relevance logic.
The second class of paradoxes involves redundant premises, falsely suggesting that we know the succedent because of the antecedent: thus "if that man gets elected, granny will die" is materially true since granny is mortal, regardless of the man's election prospects. Such sentences violate the Gricean maxim of relevance, and can be modelled by logics that reject the principle of monotonicity of entailment, such as relevance logic.
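Both classes of paradox flow directly from the truth table of material implication, under which "p → q" is simply "(not p) or q". A Python sketch:

```python
# Material implication is true whenever the antecedent is false
# or the consequent is true -- which is where the paradoxes come from.

def implies(p, q):
    return (not p) or q

# "If the moon is made of green cheese, then 2+2=5":
# materially true, because the antecedent is false.
print(implies(False, False))  # True

# "If that man gets elected, granny will die":
# materially true whenever granny is mortal (consequent true),
# regardless of the election.
print(implies(True, True), implies(False, True))  # True True
```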
Tolerating the impossible
Hegel was deeply critical of any simplified notion of the Law of NonContradiction. It was based on Leibniz's idea that this law of logic also requires a sufficient ground to specify from what point of view (or time) one says that something cannot contradict itself. A building, for example, both moves and does not move; the ground for the first is our solar system and for the second the earth. In Hegelian dialectic, the law of noncontradiction, of identity, itself relies upon difference and so is not independently assertable.
Closely related to questions arising from the paradoxes of implication comes the suggestion that logic ought to tolerate inconsistency. Relevance logic and paraconsistent logic are the most important approaches here, though the concerns are different: a key consequence of classical logic and some of its rivals, such as intuitionistic logic, is that they respect the principle of explosion, which means that the logic collapses if it is capable of deriving a contradiction. Graham Priest, the main proponent of dialetheism, has argued for paraconsistency on the grounds that there are in fact, true contradictions.^{[44]}
Rejection of logical truth
The philosophical vein of various kinds of skepticism contains many kinds of doubt and rejection of the various bases on which logic rests, such as the idea of logical form, correct inference, or meaning, typically leading to the conclusion that there are no logical truths. Observe that this is opposite to the usual views in philosophical skepticism, where logic directs skeptical enquiry to doubt received wisdoms, as in the work of Sextus Empiricus.
Friedrich Nietzsche provides a strong example of the rejection of the usual basis of logic: his radical rejection of idealization led him to reject truth as a "... mobile army of metaphors, metonyms, and anthropomorphisms—in short ... metaphors which are worn out and without sensuous power; coins which have lost their pictures and now matter only as metal, no longer as coins."^{[45]} His rejection of truth did not lead him to reject the idea of either inference or logic completely, but rather suggested that "logic [came] into existence in man's head [out] of illogic, whose realm originally must have been immense. Innumerable beings who made inferences in a way different from ours perished".^{[46]} Thus there is the idea that logical inference has a use as a tool for human survival, but that its existence does not support the existence of truth, nor does it have a reality beyond the instrumental: "Logic, too, also rests on assumptions that do not correspond to anything in the real world".^{[47]}
This position held by Nietzsche, however, has come under scrutiny for several reasons. He fails to demonstrate the validity of his claims and merely asserts them rhetorically; yet since he is criticising the established criteria of validity themselves, this does not necessarily undermine his position, for one could argue that any demonstration of validity offered in the name of logic is just as rhetorically based. Some philosophers, such as Jürgen Habermas, claim his position is self-refuting,^{[48]} and Georg Lukács argued that it amounts to a chaos of arbitrary and incompatible assertions.^{[49]} Still, in this respect his "theory" would be a much better depiction of a confused and chaotic reality than any consistent and compatible theory. Bertrand Russell described Nietzsche's claims by noting that "He is fond of expressing himself paradoxically and with a view to shocking conventional readers" in his book A History of Western Philosophy.^{[50]}
See also
Notes and references
 ^ "possessed of reason, intellectual, dialectical, argumentative", also related to λόγος (logos), "word, thought, idea, argument, account, reason, or principle" (Liddell & Scott 1999; Online Etymology Dictionary 2001).
 ^ Richard Henry Popkin; Avrum Stroll (1 July 1993). Philosophy Made Simple. Random House Digital, Inc. p. 238.
 ^ Jacquette, D. (2002). A Companion to Philosophical Logic. Wiley Online Library. p. 2.
 ^ For example, Nyaya (syllogistic recursion) dates back about 1,900 years.
 ^ Mohists and the School of Names date back about 2,200 years.
 ^
 ^
 ^ ^{a} ^{b} ^{c}
 ^ ^{a} ^{b} For a more modern treatment, see Hamilton, A. G. (1980). Logic for Mathematicians. Cambridge University Press.
 ^ T. Mossakowski, J. A. Goguen, R. Diaconescu, A. Tarlecki, "What is a Logic?", Logica Universalis 2007 Birkhauser, pp. 113–133.
 ^

^
 Magnani, L. "Abduction, Reason, and Science: Processes of Discovery and Explanation". Kluwer Academic Plenum Publishers, New York, 2001. xvii. 205 pages. Hard cover, ISBN 0306465140.
 R. Josephson, J. & G. Josephson, S. "Abductive Inference: Computation, Philosophy, Technology" Cambridge University Press, New York & Cambridge (U.K.). viii. 306 pages. Hard cover (1994), ISBN 0521434610, Paperback (1996), ISBN 0521575451.
 Bunt, H. & Black, W. "Abduction, Belief and Context in Dialogue: Studies in Computational Pragmatics" (Natural Language Processing, 1.) John Benjamins, Amsterdam & Philadelphia, 2000. vi. 471 pages. Hard cover, ISBN 9027249830 (Europe),

^ Peirce, C. S.
 "On the Logic of drawing History from Ancient Documents especially from Testimonies" (1901), Collected Papers v. 7, paragraph 219.
 "PAP" ["Prolegomena to an Apology for Pragmatism"], MS 293 c. 1906, New Elements of Mathematics v. 4, pp. 319320.
 A Letter to F. A. Woods (1913), Collected Papers v. 8, paragraphs 385388.
 ^ Peirce, C. S. (1903), Harvard lectures on pragmatism, Collected Papers v. 5, paragraphs 188–189.
 ^ ^{a} ^{b} Bergmann, Merrie; Moor, James; Nelson, Jack (2009). The Logic Book (Fifth ed.). New York, NY: McGraw-Hill.
 ^ Mendelson, Elliott (1964). "Quantification Theory: Completeness Theorems". Introduction to Mathematical Logic. Van Nostrand.
 ^ Hofweber, T. (2004). "Logic and Ontology". In Zalta, Edward N. The Stanford Encyclopedia of Philosophy.
 ^ E.g., Kline (1972, p. 53) wrote "A major achievement of Aristotle was the founding of the science of logic".
 ^ "Aristotle", MTU Department of Chemistry.
 ^ Jonathan Lear (1986). "Aristotle and Logical Theory". Cambridge University Press. p. 34. ISBN 0521311780
 ^ Simo Knuuttila (1981). "Reforging the great chain of being: studies of the history of modal theories". Springer Science & Business. p. 71. ISBN 9027711259
 ^ Michael Fisher, Dov M. Gabbay, Lluís Vila (2005). "Handbook of temporal reasoning in artificial intelligence". Elsevier. p. 119. ISBN 0444514937
 ^ Harold Joseph Berman (1983). "Law and revolution: the formation of the Western legal tradition". Harvard University Press. p. 133. ISBN 0674517768
 ^ The four Catuṣkoṭi logical divisions are formally very close to the four opposed propositions of the Greek tetralemma, which in turn are analogous to the four truth values of modern relevance logic. Cf. Belnap (1977); Jayatilleke, K. N. (1967), "The logic of four alternatives", Philosophy East and West, University of Hawaii Press.
 ^ Kisor Kumar Chakrabarti (June 1976). "Some Comparisons Between Frege's Logic and Navya-Nyaya Logic". Philosophy and Phenomenological Research (International Phenomenological Society) 36 (4): 554–563.
 ^ Jonardon Ganeri (2001). Indian logic: a reader.
 ^ "Aristotle".
 ^ "History of logic: Arabic logic".
 ^ Stolyar, Abram A. (1983). Introduction to Elementary Mathematical Logic. Dover Publications. p. 3.
 ^ Barnes, Jonathan (1995). The Cambridge Companion to Aristotle. Cambridge University Press. p. 27.
 ^
 ^ Mendelson, Elliott (1964). "Formal Number Theory: Gödel's Incompleteness Theorem". Introduction to Mathematical Logic. Monterey, Calif.: Wadsworth & Brooks/Cole Advanced Books & Software.
 ^ Brookshear, J. Glenn (1989). "Computability: Foundations of Recursive Function Theory". Theory of computation: formal languages, automata, and complexity. Redwood City, Calif.: Benjamin/Cummings Pub. Co.
 ^ Brookshear, J. Glenn (1989). "Complexity". Theory of computation: formal languages, automata, and complexity. Redwood City, Calif.: Benjamin/Cummings Pub. Co.
 ^ Goldman, Alvin I. (1986), Epistemology and Cognition, Harvard University Press, p. 293.
 ^ Demetriou, A.; Efklides, A., eds. (1994), Intelligence, Mind, and Reasoning: Structure and Development, Advances in Psychology 106, Elsevier, p. 194.
 ^
 ^ Joseph E. Brenner (3 August 2008). Logic in Reality. Springer. pp. 28–30.
 ^ Zegarelli, Mark (2010), Logic For Dummies, John Wiley & Sons, p. 30.
 ^
 ^
 ^
 ^
 ^
 ^ Nietzsche, 1873, On Truth and Lies in a Nonmoral Sense.
 ^ Nietzsche, 1882, The Gay Science.
 ^ Nietzsche, 1878, Human, All Too Human.
 ^ Babette Babich, Habermas, Nietzsche, and Critical Theory
 ^ Georg Lukács. "The Destruction of Reason by Georg Lukács 1952". Marxists.org. Retrieved 2013-06-16.
 ^ Russell, Bertrand (1945), A History of Western Philosophy And Its Connection with Political and Social Circumstances from the Earliest Times to the Present Day (PDF), Simon and Schuster, p. 762
Bibliography
 Nuel Belnap (1977). "A useful four-valued logic". In Dunn & Eppstein, Modern uses of multiple-valued logic. Reidel: Boston.
 Józef Maria Bocheński (1959). A précis of mathematical logic. Translated from the French and German editions by Otto Bird. D. Reidel, Dordrecht, South Holland.
 Józef Maria Bocheński, (1970). A history of formal logic. 2nd Edition. Translated and edited from the German edition by Ivo Thomas. Chelsea Publishing, New York.
 Brookshear, J. Glenn (1989). Theory of computation: formal languages, automata, and complexity. Redwood City, Calif.: Benjamin/Cummings Pub. Co.
 Cohen, R.S., and Wartofsky, M.W. (1974). Logical and Epistemological Studies in Contemporary Physics. Boston Studies in the Philosophy of Science. D. Reidel Publishing Company: Dordrecht, Netherlands. ISBN 9027703779.
 Finkelstein, D. (1969). "Matter, Space, and Logic". in R.S. Cohen and M.W. Wartofsky (eds. 1974).
 Gabbay, D.M., and Guenthner, F. (eds., 2001–2005). Handbook of Philosophical Logic. 13 vols., 2nd edition. Kluwer Publishers: Dordrecht.
 Hilbert, D., and Ackermann, W. (1928). Grundzüge der theoretischen Logik (Principles of Mathematical Logic). Springer-Verlag. OCLC 2085765
 Susan Haack, (1996). Deviant Logic, Fuzzy Logic: Beyond the Formalism, University of Chicago Press.
 Hodges, W., (2001). Logic. An introduction to Elementary Logic, Penguin Books.
 Hofweber, T., (2004), Logic and Ontology. Stanford Encyclopedia of Philosophy. Edward N. Zalta (ed.).
 Hughes, R.I.G., (1993, ed.). A Philosophical Companion to First-Order Logic. Hackett Publishing.
 Kline, Morris (1972). Mathematical Thought From Ancient to Modern Times. Oxford University Press.
 Kneale, William, and Kneale, Martha, (1962). The Development of Logic. Oxford University Press, London, UK.
 Mendelson, Elliott, (1964). Introduction to Mathematical Logic. Wadsworth & Brooks/Cole Advanced Books & Software: Monterey, Calif. OCLC 13580200
 Harper, Robert (2001). "Logic".
 Smith, B., (1989). "Logic and the Sachverhalt". The Monist 72(1):52–69.
 Whitehead, Alfred North and Bertrand Russell, (1910). Principia Mathematica. Cambridge University Press: Cambridge, England. OCLC 1041146
External links
 Logic at PhilPapers
 Logic at the Indiana Philosophy Ontology Project
 Logic entry in the Internet Encyclopedia of Philosophy
 Hazewinkel, Michiel, ed. (2001), "Logical calculus", Encyclopedia of Mathematics, Springer.
 An Outline for Verbal Logic

Introductions and tutorials
 An Introduction to Philosophical Logic, by Paul Newall, aimed at beginners.
 forall x: an introduction to formal logic, by P.D. Magnus, covers sentential and quantified logic.

Logic Self-Taught: A Workbook (originally prepared for online logic instruction).
 Nicholas Rescher. (1964). Introduction to Logic, St. Martin's Press.

Essays
 "Symbolic Logic" and "The Game of Logic", Lewis Carroll, 1896.
 Math & Logic: The history of formal mathematical, logical, linguistic and methodological ideas. In The Dictionary of the History of Ideas.

Online Tools
 Interactive Syllogistic Machine: a web-based syllogistic machine for exploring fallacies, figures, terms, and modes of syllogisms.

Reference material
 Translation Tips, by Peter Suber, for translating from English into logical notation.
 Ontology and History of Logic. An Introduction with an annotated bibliography.

Reading lists

The London Philosophy Study Guide offers many suggestions on what to read, depending on the student's familiarity with the subject:
 Logic & Metaphysics
 Set Theory and Further Logic
 Mathematical Logic

The development of logic since Frege, Russell, and Wittgenstein had a profound influence on the practice of philosophy and on the perceived nature of philosophical problems (see analytic philosophy and philosophy of mathematics). Logic, especially sentential logic, is implemented in computer logic circuits and is fundamental to computer science. Logic is commonly taught by university philosophy departments, often as a compulsory discipline.
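As an illustrative sketch of the connection between sentential logic and logic circuits (the function names and the definition of material implication below are assumptions for exposition, not drawn from the article), the truth-functional connectives can be modelled as Boolean functions, mirroring hardware gates:

```python
# Sentential connectives as Boolean functions: each corresponds to a
# hardware logic gate (NOT, AND, OR).
def NOT(a: bool) -> bool:
    return not a

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def implies(p: bool, q: bool) -> bool:
    """Material implication p -> q, defined as (not p) or q."""
    return OR(NOT(p), q)

# Enumerate the truth table for p -> q over all assignments.
for p in (False, True):
    for q in (False, True):
        print(f"{p!s:5} {q!s:5} {implies(p, q)}")
```

Because every connective is truth-functional, any sentential formula can be compiled into such a network of gates, which is what "implemented in computer logic circuits" amounts to.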
In 1854, George Boole published An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities, introducing symbolic logic and the principles of what is now known as Boolean logic. In 1879, Gottlob Frege published Begriffsschrift, which inaugurated modern logic with the invention of quantifier notation. From 1910 to 1913, Alfred North Whitehead and Bertrand Russell published Principia Mathematica^{[8]} on the foundations of mathematics, attempting to derive mathematical truths from axioms and inference rules in symbolic logic. In 1931, Gödel's incompleteness theorems raised serious problems with the foundationalist program, and logic ceased to focus on such issues.
In the 20th century, Western philosophers such as Stanislaw Schayer and Klaus Glashoff explored Indian logic more extensively.^{[26]}