
Valid and Invalid Argument Forms[]

As we have seen, we can create Truth Tables to assess argument forms for validity. However, there are cases where a truth table becomes unwieldy. Consider the example of a complex argument form given by Copi and Cohen:

A ⊃ B
B ⊃ C
C ⊃ D
~D
A v E
(Therefore) E

To test the validity of such a monster, we would need a truth table with 32 rows (!) for the five statements involved. However, there are two simpler ways to undertake such a task: the first is through a formal proof, where elementary valid argument forms are identified in the argument, to verify that the entire argument is valid. Here is an example:

1. A ⊃ B
2. B ⊃ C
3. C ⊃ D
4. ~D
5. A v E / E
6. A ⊃ C (From lines 1 and 2, using a Hypothetical Syllogism)
7. A ⊃ D (From lines 6 and 3, using a Hypothetical Syllogism)
8. ~A (From lines 7 and 4, using Modus Tollens)
9. E (From lines 5 and 8, using a Disjunctive Syllogism)

As Copi and Cohen write: "We define a formal proof that a given argument is valid as a sequence of statements each of which is either a premise of that argument or follows from preceding statements of the sequence by an elementary valid argument, such that the last statement in the sequence is the conclusion of the argument whose validity is being proven. And we define an elementary valid argument as any argument that is a substitution instance of an elementary valid argument form." What this means, simply enough, is that we can show an argument is valid by deriving its conclusion from its premises through a chain of steps, each of which is itself an instance of an elementary valid argument form. So let's go through the example again, and ensure we have an understanding of how a formal proof works:

1. A ⊃ B
2. B ⊃ C
3. C ⊃ D
4. ~D
5. A v E / E (All steps 1-5 do is re-present our argument. All formal proofs begin this way.)
6. A ⊃ C (From lines 1 and 2, using a Hypothetical Syllogism. Step 6 combines premises 1 and 2 into a new, derived statement.)
7. A ⊃ D (From lines 6 and 3, using a Hypothetical Syllogism. Step 7 brings premise 3 into play: we can chain the result of step 6 with premise 3 in another hypothetical syllogism - if A leads to C, and C leads to D, we know that A leads to D.)
8. ~A (From lines 7 and 4, using Modus Tollens)

Now we move on to work with premise 4. We know from the valid argument form Modus Tollens that if D is false, then A must be false. Recall that the argument form for modus tollens is: If A, then B; not B; therefore not A. Well, we know from step seven that we have the statement "If A, then D", and combining this with premise 4 (~D), we have modus tollens, which gives us ~A.

9. E (From lines 5 and 8, using a Disjunctive Syllogism)

We know that in premise 5, we have a disjunction: either A or E. We have just ascertained in step 8 that A is false. Ergo, E must be true.

QED ("quod erat demonstrandum": "that which was to be demonstrated")
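Incidentally, the brute-force truth-table check that this proof lets us avoid is easy to mechanize. Here is a minimal Python sketch (not from Copi and Cohen; the helper names implies and is_valid are just illustrative choices) that enumerates all 32 rows of the argument above and looks for a row with true premises and a false conclusion:

from itertools import product

def implies(p, q):
    # Material conditional: p > q is false only when p is true and q is false
    return (not p) or q

def is_valid(premises, conclusion, num_vars):
    # Valid means: no assignment makes every premise true while the conclusion is false
    for row in product([True, False], repeat=num_vars):
        if all(premise(*row) for premise in premises) and not conclusion(*row):
            return False  # found a counterexample row
    return True

# A > B, B > C, C > D, ~D, A v E / E  (five variables, so 32 rows)
premises = [
    lambda a, b, c, d, e: implies(a, b),
    lambda a, b, c, d, e: implies(b, c),
    lambda a, b, c, d, e: implies(c, d),
    lambda a, b, c, d, e: not d,
    lambda a, b, c, d, e: a or e,
]
conclusion = lambda a, b, c, d, e: e

print(is_valid(premises, conclusion, 5))  # prints True: the argument is valid

It prints True, agreeing with the formal proof above.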

Now that we know that we can use elementary valid argument forms to provide formal proofs for more complex arguments, all we need is a reliable list of already proven elementary forms. As you may guess, such lists exist. We call such a list the:

Rules of Inference[]

The argument forms presented in this list run from the most basic - the prima facie obvious - to the more complex. Some seem dead obvious, but that's how logic works - for as Rene Descartes said, "If we wish to have firm and constant knowledge, we must start from the very beginning." And, as I will point out in the final entry to this page, we tend to forget even the simplest points of logic when they may point to stressful realities about ourselves.

So let's take a look at the list. If you've read through the entire site, you ought to recognize many of these forms from our discussion of truth tables:

http://upload.wikimedia.org/wikipedia/commons/3/34/Propositional_Logic.png

I will now provide examples of the valid and invalid forms.

Valid Forms[]

Simplification[]

p . q
p

Example: Both Sheila and Sue are here. Therefore, Sheila is here.

Addition[]

p
p v q

Example: Pete is here. Therefore, either Pete or Quincy is here.

Conjunction[]

p
q
p . q

Example: Pete is here. Quincy is here. Therefore, both Pete and Quincy are here.

Disjunctive Syllogism[]

p v q
~ p
q

Example: Either we left the iron on, or we turned it off. It's not on; therefore, it is off.

Hypothetical Syllogism[]

p > q
q > r
p > r

Example: If you get good grades, you'll go to college. If you go to college, you'll get a good job. So, if you get good grades, you'll get a good job.

Modus Ponens[]

p > q
p
q

Example: If you ring that bell, this dog will salivate. You rang the bell. Therefore, the dog will salivate.

Modus Tollens[]

p > q
~q
~p

Example: If you try, you will succeed. You didn't succeed. Therefore, you didn't try.

Constructive Dilemma[]

(p > q) . (r > s)
p v r
q v s

Example: If you try, you will succeed, AND if you sow, then you will reap. Either you tried, or you sowed. Therefore, you will either succeed or reap.

Destructive Dilemma[]

(p > q) . (r > s)
~q v ~ s
~p v ~ r

Example: If you try, then you will succeed, and if you sow, then you shall reap. You didn't succeed or you didn't reap. Therefore, either you didn't try or you didn't sow.
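Each of these valid forms can also be verified mechanically, by checking every row of its truth table for a case with true premises and a false conclusion. Here is a minimal Python sketch for Modus Tollens (the variable names are just illustrative; the same loop, with different premise and conclusion expressions, works for any of the forms above):

from itertools import product

# Truth table check for Modus Tollens: p > q, ~q, therefore ~p
for p, q in product([True, False], repeat=2):
    premise_1 = (not p) or q   # p > q
    premise_2 = not q          # ~q
    conclusion = not p         # ~p
    bad_row = premise_1 and premise_2 and not conclusion
    print(p, q, premise_1, premise_2, conclusion, "<- counterexample" if bad_row else "")

No row is flagged as a counterexample, so the form is valid.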


Invalid Forms[]

All invalid argument forms are known as Non Sequiturs. Aristotle held that the basis for all formal fallacies was the non sequitur, which is why the term is also known by the Latin name ignoratio elenchi - an ignorance of the refutation.

Fallacy of affirming the consequent[]

p > q
q
p

Example: If you win the lottery, then you will have a lot of money. You have a lot of money, therefore you won the lottery.

Reason: Affirming the Consequent is an invalid form of argument in propositional logic. For instance, let "p" be false and "q" be true: the conditional premise is then true and so is the premise q, yet the conclusion p is false - true premises with a false conclusion. In our example, a person could have money from a source other than lottery winnings.


Fallacy of denying the antecedent[]

p > q
~p
~q

Example: If you win the lottery, then you will have a lot of money. You didn't win the lottery. Therefore, you don't have a lot of money.

Reason: Together with Affirming the Consequent, this is a fallacy which involves either confusion about the direction of a conditional relation, or a conflation of a conditional with a biconditional proposition. In our example, again, it is possible to have money from other sources.

Fallacy of Denying the Alternative[]

p v q
p
~ q

This argument reads: "Either p is true or q is true. p is true, ergo q is not true." This fallacy is also known as affirming a disjunct. However, if you recall our discussion of disjunctions in the previous section, you will remember that it IS possible for both p and q to be true in a disjunction. So, using our inclusive sense of the (v) disjunction, it is possible for both p and q to be true, meaning that there is a case here where the premises are true while the conclusion is false.
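The same row-by-row idea can be used to exhibit the counterexamples described above. Here is a minimal Python sketch (the function name counterexample is just an illustrative choice) that searches for truth values making the premises true and the conclusion false for each of the three fallacies:

from itertools import product

def counterexample(premises, conclusion):
    # Look for truth values of p and q making every premise true and the conclusion false
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            return (p, q)
    return None

implies = lambda p, q: (not p) or q

# Affirming the consequent: p > q, q / p
print(counterexample([lambda p, q: implies(p, q), lambda p, q: q],
                     lambda p, q: p))          # (False, True)

# Denying the antecedent: p > q, ~p / ~q
print(counterexample([lambda p, q: implies(p, q), lambda p, q: not p],
                     lambda p, q: not q))      # (False, True)

# Denying the alternative (affirming a disjunct): p v q, p / ~q
print(counterexample([lambda p, q: p or q, lambda p, q: p],
                     lambda p, q: not q))      # (True, True)

The assignments it finds correspond to the informal counterexamples given above: money from another source, and an inclusive disjunction with both disjuncts true.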


There are many valid truth-functional arguments whose validity cannot be proved using only the nine rules of inference. Copi and Cohen give the following example:

A ⊃ B
C ⊃ ~B
Therefore: A ⊃ ~C

However, we know that in any truth-functional compound statement, if a component of that statement is replaced by another statement with the same truth value, the truth value of the compound statement will remain the same. For this reason, by the Rule of Replacement, we may substitute either one of a pair of logically equivalent expressions for the other wherever it occurs in a proof. Let's now add these ten rules of replacement to our rules of inference:

Rules of Replacement[]

Valid argument forms may be used as rules of inference in natural deduction. Forms of logical equivalence may also be used as rules of inference. Every substitution instance of a form of logical equivalence is a logical equivalence; moreover, either one of a pair of logically equivalent expressions may be substituted for the other in a proof without loss of validity.

Tautology[]

p is equivalent to (p v p)
p is equivalent to (p . p)

DeMorgan's Rule[]

~(p . q) is equivalent to (~p v ~q)
~(p v q) is equivalent to (~p . ~q)

Commutativity[]

(p v q) is equivalent to (q v p)
(p . q) is equivalent to (q . p)

Association[]

[p v (q v r)] is equivalent to [(p v q) v r]
[p . (q . r)] is equivalent to [(p . q) . r]

Distribution[]

[p . (q v r)] is equivalent to [(p . q) v (p . r)]
[p v (q . r)] is equivalent to [(p v q) . (p v r)]

Double Negation[]

p is equivalent to ~~p

Transposition[]

(p > q) is equivalent to (~q > ~p)

(Note the order carefully: (p > q) is NOT equivalent to (~p > ~q).)

Material Implication[]

(p > q) is equivalent to (~p v q)

Material Equivalence[]

(p = q) is equivalent to [(p > q) . (q > p)]
(p = q) is equivalent to [(p . q) v (~p . ~q)]

Exportation[]

[(p . q) > r] is equivalent to [p > (q > r)]
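Each of these equivalences can be confirmed the same mechanical way: two statement forms are logically equivalent when they have the same truth value in every row of the truth table. Here is a minimal Python sketch (the function name equivalent is just an illustrative choice) checking De Morgan's Rule, Material Equivalence, and Exportation:

from itertools import product

def equivalent(f, g, num_vars):
    # Two statement forms are logically equivalent if they agree in every row
    return all(f(*row) == g(*row) for row in product([True, False], repeat=num_vars))

# De Morgan's Rule: ~(p . q) is equivalent to (~p v ~q)
print(equivalent(lambda p, q: not (p and q),
                 lambda p, q: (not p) or (not q), 2))                  # True

# Material Equivalence: (p = q) is equivalent to [(p > q) . (q > p)]
print(equivalent(lambda p, q: p == q,
                 lambda p, q: ((not p) or q) and ((not q) or p), 2))   # True

# Exportation: [(p . q) > r] is equivalent to [p > (q > r)]
print(equivalent(lambda p, q, r: (not (p and q)) or r,
                 lambda p, q, r: (not p) or ((not q) or r), 3))        # True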

How to Use the Rules of Replacement[]

Now, if we look carefully at that list, we see we have an answer to our problem from above:

P1. A ⊃ B
P2. C ⊃ ~ B
Therefore: A ⊃ ~C

We see, by the rule of replacement of Transposition (followed by Double Negation, since transposition of "C ⊃ ~B" first gives "~~B ⊃ ~C"), that "C ⊃ ~B" is logically equivalent to "B ⊃ ~C".

If we replace premise 2 with "B ⊃ ~ C", we get:

P1. A ⊃ B
P2. B ⊃ ~ C
Therefore: A ⊃ ~C

And it should be clear at this point that we have a valid argument form: the hypothetical syllogism (see the above rules of inference).
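Written out in the same step-by-step format as the earlier formal proof, one way to set out the full derivation is:

1. A ⊃ B
2. C ⊃ ~B / A ⊃ ~C
3. ~~B ⊃ ~C (From 2, by Transposition)
4. B ⊃ ~C (From 3, by Double Negation)
5. A ⊃ ~C (From 1 and 4, by Hypothetical Syllogism)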


Abbreviated (or Indirect) Truth Tables[]

There is a second way to assess an argument for validity or invalidity: through an abbreviated truth table. Recall from our earlier example of truth tables that one checks for validity by looking for a row in which all the premises are true and the conclusion is false. Seeing that all we really want to know is whether such a row exists, we can set up an abbreviated truth table by simply beginning with the assumption that the conclusion is false, and attempting to see if it is possible, without contradiction, to make all the premises true, given that the conclusion is false!

Consider this argument:
p ⊃ q
p
therefore: q


p   q   |   p ⊃ q   |   p   |   q (conclusion)
F   F   |     T     |   F   |   F

We immediately run into a problem trying to make the premises true. Since q MUST be false (it is our conclusion), we must make p false in order to make the first premise, p ⊃ q, true. BUT if we do this, we force the second premise to be false, since the second premise is simply p. Since there is no way to make both premises true while the conclusion is false, this argument is valid. This should be no surprise to a careful reader - for this is an example of modus ponens.
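For readers who like to see the method mechanized, here is a minimal Python sketch of the same idea (the function name is just an illustrative choice): fix the conclusion at false and search only for a way to make every premise true.

from itertools import product

def false_conclusion_counterexample(premises, conclusion, num_vars):
    # Abbreviated-truth-table idea: only look at rows where the conclusion is false,
    # and try to make every premise true.
    for row in product([True, False], repeat=num_vars):
        if not conclusion(*row) and all(f(*row) for f in premises):
            return row       # such a row would show the argument is invalid
    return None              # no such row exists: the argument is valid

# p > q, p / q  (modus ponens)
print(false_conclusion_counterexample(
    [lambda p, q: (not p) or q, lambda p, q: p],
    lambda p, q: q, 2))      # None: no counterexample, so the argument is valid

Since it returns None, there is no counterexample, which is exactly what the abbreviated table above showed.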

Inconsistency[]

One supposedly strange occurrence in propositional logic is that any argument with inconsistent premises - that is, self-contradictory premises - is valid, regardless of what its conclusion might be. Such a claim seems spurious to me, because the conclusion of a valid but self-contradictory argument is necessarily a non sequitur: it is not contained within the premises, so the premises cannot actually support that specific conclusion. In fact, a contradiction implies that everything is true, a problem known as the principle of explosion:

The principle of explosion is the rule of classical logic that states that anything follows from a contradiction.

Here is the proof:

(1) P . ~P (The contradictory premises, taken as an assumption)
(2) P (From (1), by conjunction elimination - i.e. Simplification)
(3) P v A (From (2), by disjunction introduction - i.e. Addition)
(4) ~P (From (1), by conjunction elimination - i.e. Simplification)
(5) A (From (3) and (4), by Disjunctive Syllogism)

Supporters of paraconsistent logic reject one or more of the steps in this proof - typically the disjunction introduction in (3) or the disjunctive syllogism in (5) - holding that there are contradictory theories (such as some found in quantum mechanics) that need not "explode".
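The same conclusion falls out of the truth-table definition of validity: premises that include both P and ~P are never all true in any row, so there can be no row with true premises and a false conclusion, and the argument counts as (vacuously) valid whatever the conclusion says. A minimal Python sketch of this check (illustrative names again):

from itertools import product

def is_valid(premises, conclusion, num_vars):
    # Valid means: no row makes every premise true and the conclusion false
    return not any(
        all(f(*row) for f in premises) and not conclusion(*row)
        for row in product([True, False], repeat=num_vars)
    )

# Premises P and ~P can never both be true, so no counterexample row can exist,
# and the argument is (vacuously) valid no matter what the conclusion A says.
print(is_valid([lambda p, a: p, lambda p, a: not p],
               lambda p, a: a, 2))    # True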

Those following the Course in Logic 101 will proceed to the next section, where they will learn the basics of Predicate Logic.



References[]

  • Copi, I. M. and Cohen, C. (2001). Introduction to Logic, 11th Edition.
  • Hurley, P. J. (2000). A Concise Introduction to Logic, 7th Edition.