SL Trees

So far we have learned one way to evaluate SL arguments for validity: an argument is valid just in case no interpretation satisfies the premises but falsifies the conclusion. We can check whether an argument is valid by constructing a truth table, representing all of the possible interpretations, and checking whether any of them satisfies the premises while falsifying the conclusion. This method has two main advantages: one is that it can be done in a formal and formulaic way --- it requires no particular rational insight, just a straightforward application of the rules of SL. Another is that, assuming you follow the rules correctly, it will always succeed in identifying whether or not the argument is valid.

The truth-table method also has a significant disadvantage, however: it rapidly becomes extremely cumbersome for evaluating arguments using more than two or three atomic sentence-letters. To evaluate an argument like this one, you'd need to consider four rows of a truth-table:

P ≡ Q

¬Q

¬Q ∨ (P⊃¬Q)

¬P ⊃ Q

But change just two of the atoms to new letters, and the required table will grow exponentially. For this argument, you'd need sixteen rows.

P ≡ Q

¬A

¬Q ∨ (A⊃¬B)

¬P ⊃ B

For this one, you'd need two hundred fifty-six!

A ≡ B

¬B ⊃ (C ∨ D)

E ⊃ ¬C

(D & F) ∨ G

¬A & E

H ∨ G

So it is useful to have alternate ways of proving entailments. In this chapter we will introduce a proof system for SL. The proofs we will construct in this chapter are called analytic tableaux. Another name for them is trees, because of the way they 'branch' out to represent possibilities. In Chapter 7 we will examine a different proof system.

5.1 Satisfiability and entailment

The method of trees is most directly a way to test for satisfiability of a set of SL sentences. Since validity is definable in terms of satisfiability, it also gives a way to test for validity.

Recall from section 4.3 the definition of entailment in SL: 𝒳 ⊨ Φ means that there is no interpretation that satisfies every sentence in 𝒳 but falsifies Φ. (Remember, '𝒳' can stand for any set of SL sentences.) This in turn is equivalent to the claim that there is no interpretation that satisfies 𝒳 together with ¬Φ. Unsatisfiability can be represented with a double turnstile with '⊥' on the right-hand side.

In general, 𝒳 ⊨ Φ is equivalent to 𝒳, ¬Φ ⊨ ⊥.

So if you want to tell whether an argument in SL is valid, check to see whether the premises, along with the negation of the conclusion, are jointly satisfiable. If they are, the argument is invalid. If they're not, the argument is valid. This is the basic idea of the tree method. We'll write down the sentences we're attempting to satisfy, and then we'll work through their consequences, in order to see whether it's possible to complete the task.

Let's begin by working through an example that illustrates the general idea, then come back to give precise rules for the tree method.

5.2 An example: proving validity

Let's try to determine whether this argument is valid in SL:

A & B

¬(C ∨ D)

(¬B ∨ C) ∨ E

E

To evaluate this argument, we consider whether it's possible to satisfy the three premises along with the negation of the conclusion. We begin by writing down the sentences we're going to attempt to satisfy.

Notice that we're putting the negation of the conclusion here in line (4), since we're looking to see whether it's possible to satisfy the premises and falsify the conclusion. The sentences we write down at the beginning of the tree, we call the root. Trees are designed to show whether the root is satisfiable, and if so, how.

With proof trees, the idea is to keep writing down what our previous lines already commit us to. Consider the first line, A & B. For this conjunction to be true, both conjuncts must be true. So any interpretation that satisfies the top four lines must also satisfy both A and B; the truth of those sentences is a commitment of what we already have. Each follows from what is already written down. We represent this by continuing the tree with those sentences:
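  5.  A
  6.  B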

In addition to introducing lines (5) and (6), we also add a check mark on (1), to indicate that we've now considered it and represented its consequences. Think of the check mark as saying that we've extended the tree in a way that encodes the information from this line.

Now consider line (2). This is a negated disjunction. Disjunctions are true any time either disjunct is true, so this negated disjunction can only be true if both disjuncts are false. So we include new lines for ¬C and ¬D, adding a check mark on line (2):
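  7.  ¬C
  8.  ¬D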

Line (3) is a disjunction. Unlike the previous two cases, it doesn't tell us what categorically must be the case; it says that one of two possibilities must be met. We represent this in our tree by branching into two different columns:
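  9.  ¬B ∨ C          E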

Think of these two branches as representing the idea that the root implies that at least one of these ways of continuing the tree must be possible.

So now we have two branches to consider. Start by examining the right branch. It gives an atomic sentence, E. Notice, however, that line (4) was ¬E. So if we're looking for a way to satisfy (1)-(4), this right branch isn't a possible interpretation. It requires E to be true, and it also requires ¬E to be true. If a branch contains any sentence and also contains its negation, we know that branch doesn't correspond to a possible interpretation. We'll consider this branch closed, and mark this with an ×.

The left branch is another disjunction. It too branches out into its two disjuncts:
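  1.  A & B ✓
  2.  ¬(C ∨ D) ✓
  3.  (¬B ∨ C) ∨ E ✓
  4.  ¬E
  5.  A
  6.  B
  7.  ¬C
  8.  ¬D
  9.  ¬B ∨ C ✓        E
                      ×
 10.  ¬B       C
      ×        ×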

Both of these disjuncts also result in closed branches. The left branch at (10) is the negation of (6), and the right branch contradicts (7). Now every branch in this tree is closed. This corresponds to the idea that every possible way there could be to satisfy (1)-(4) ended up requiring a contradiction. There is no interpretation that satisfies (1)-(4). In other words, the argument we began with was valid.

In the previous chapter we used the double turnstile, '⊨', to represent entailment. In this chapter we will use a different but related symbol, the single turnstile, which looks like this: '⊢'. Think of this symbol as representing provability. So in proving that E follows from the premises above, we're showing that {A & B, ¬(C ∨ D), (¬B ∨ C) ∨ E} ⊢ E.

Unsurprisingly, provability and entailment will turn out to be closely connected; we'll consider the connection in detail in Chapter 6.

5.3 An example: proving invalidity

Let's work through another example to further illustrate the idea. In 5.4 we'll learn the formal rules for trees in SL. Consider this argument form:

(D ∨ A) & ¬N

N ∨ ¬A

¬N & A

As before, we'll construct a tree that includes the premises and the negation of the conclusion at the top, and we'll attempt to find an interpretation satisfying those three lines. By convention, we write the claim we're attempting to prove at the top of the tree. We begin by processing line (1), which is a conjunction; its conjuncts are given in (4) and (5). Line (6) processes the disjunction in line (2), and the left branch closes.

Line (3) is a negated conjunction. A conjunction is true if and only if both conjuncts are true, so it is false if at least one conjunct is false. So to resolve (3), line (7) branches into the negation of each conjunct, ¬¬N and ¬A. The former closes because it's the negation of line (5). The final sentence requiring resolution is the disjunction on line (4); it branches, and one branch closes.
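Sketched in plain text, the completed tree looks like this:

  1.  (D ∨ A) & ¬N ✓
  2.  N ∨ ¬A ✓
  3.  ¬(¬N & A) ✓
  4.  D ∨ A ✓
  5.  ¬N
  6.  N          ¬A
      ×
  7.        ¬¬N        ¬A
             ×
  8.               D         A
                   ↑         ×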

The '↑' indicates that the open branch ending in D is a completed branch. (We'll precisely define 'completion' in Section 5.6.) What this means is that this branch represents a way to satisfy all three sentences at the root of the tree. In other words, this argument form is not valid in SL. Furthermore, examining the open branch demonstrates an interpretation that satisfies the root. The branch includes three wffs that are either atoms or negated atoms: ¬A, D, and ¬N. So it suggests this interpretation:

  A    D    N
  0    1    0

You can verify this result by evaluating the premises and the conclusion of the argument we began with on this interpretation. Since there is an interpretation that satisfies the premises and falsifies the conclusion, our tree system proves the argument invalid.

5.4 Resolution rules for SL trees

Hopefully the examples we've worked through have given you a decent intuitive sense of the tree method for SL. Now let's get more precise about it, by giving formal rules for trees. You should be able to recognize the following rules as a generalization of the steps of the proofs given above.

We begin with resolution rules. These rules identify ways that tree branches can be extended, given sentences that are already in that branch. The rules depend on the main connective of the sentence in question. (If the main connective is negation, then they also depend on the main connective of the negand.)

Conjunction

The rule for conjunction is that if you have a conjunction in a branch, you may make a linear extension of the branch that includes each conjunct, adding a check mark next to the conjunction. Using our Greek symbols once again as variables to stand in for any sentence of SL, any time you have this,

Φ&Ψ

you may extend each open branch of the tree to this:
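  Φ & Ψ ✓
    Φ
    Ψ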

It is important to remember once again that Φ and Ψ here can stand for any sentences of SL, including complex ones.

If you have extended a tree branch using the conjunction rule, we say that you have resolved the conjunction. When sentences are resolved, they are marked with a check. In stating the resolution rules, we'll typically omit the check mark; here is the conjunction rule:
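  Φ & Ψ
    Φ
    Ψ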

Negated conjunction

If you have a negated conjunction, you may resolve it by branching into the negation of each conjunct. This makes sense because there are two ways for a negated conjunction to be true --- either conjunct can be false.
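Schematically:

   ¬(Φ & Ψ)
  ¬Φ       ¬Ψ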

Disjunction

Disjunctions branch into each disjunct.
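In schematic form:

    Φ ∨ Ψ
   Φ     Ψ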

Negated disjunction

Since disjunctions are true any time either disjunct is true, they are only false if both disjuncts are false. So negated disjunctions are resolved with a linear development containing the negation of each disjunct.
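Schematically:

  ¬(Φ ∨ Ψ)
     ¬Φ
     ¬Ψ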

Conditional

Recall the characteristic truth table for the conditional in SL:

Φ   Ψ   Φ ⊃ Ψ
1   1   1
1   0   0
0   1   1
0   0   1

Conditionals are true any time the antecedent is false, and also any time the consequent is true. We can represent this with a branching tree development, similar to a disjunction.
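Schematically:

    Φ ⊃ Ψ
  ¬Φ      Ψ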

Negated conditional

As in the case of disjunction, the negation of a conditional is a relatively strong claim --- the only way for a conditional to be false is for its antecedent to be true and for its consequent to be false. We represent this with a linear development for negated conditionals:
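  ¬(Φ ⊃ Ψ)
      Φ
     ¬Ψ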

Biconditional

Biconditionals require a slightly different structure. A biconditional says that two sentences have the same truth value: either both are true, or both are false. Since these represent two different ways a biconditional can be true, our tree rules will develop biconditionals into two new branches. But unlike in the case of our other rules, each new branch will contain two sentences. One branch will have both sides of the biconditional; the other will have both sides' negations:
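     Φ ≡ Ψ
   Φ       ¬Φ
   Ψ       ¬Ψ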

Negated biconditional

Negated biconditionals yield the same structure, but instead of new branches where each subsentence has the same truth value, they yield branches in which the subsentences have opposite values.
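Schematically:

    ¬(Φ ≡ Ψ)
   Φ        ¬Φ
  ¬Ψ         Ψ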

Double negation

There is one more resolution rule, for double-negated sentences. One resolves a double negation by developing a linear branch with the negand of the negand --- i.e., the wff obtained by removing both negations from the left of the double-negation.
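Schematically:

  ¬¬Φ
    Φ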

Resolution rules summary

These nine resolution rules describe, in purely formal terms, how to resolve most sentences of SL. The only exceptions are atomic sentences, and atomic sentences with a single negation sign in front of them. In our formal proof system, atoms and negated atoms have a special role to play; they are not resolved. In developing a tree proof, you may apply any of these rules to any unresolved sentence (other than an atom or negated atom) in the branch. Mark resolved sentences with a check.
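To emphasize how formal and mechanical these rules are, here is a minimal sketch of them in Python. Everything about the encoding is an illustrative assumption rather than part of the official system: atoms are represented as strings, and compound sentences as tuples whose first element names the main connective, for example ('and', 'A', 'B'), ('or', 'P', 'Q'), ('if', 'P', 'Q'), ('iff', 'P', 'Q'), and ('not', ('or', 'C', 'D')).

    def is_literal(sentence):
        """Atoms and negated atoms have no resolution rule."""
        return isinstance(sentence, str) or (
            sentence[0] == 'not' and isinstance(sentence[1], str))

    def resolve(sentence):
        """Apply the resolution rule for a non-literal sentence.
        Returns a list of branches; each branch is the list of sentences
        the rule adds.  One branch means a linear rule; two branches
        means a branching rule."""
        op = sentence[0]
        if op == 'and':    # conjunction: linear development with both conjuncts
            return [[sentence[1], sentence[2]]]
        if op == 'or':     # disjunction: branch into the disjuncts
            return [[sentence[1]], [sentence[2]]]
        if op == 'if':     # conditional: branch into negated antecedent / consequent
            return [[('not', sentence[1])], [sentence[2]]]
        if op == 'iff':    # biconditional: two branches of two sentences each
            return [[sentence[1], sentence[2]],
                    [('not', sentence[1]), ('not', sentence[2])]]
        negand = sentence[1]           # main connective is negation
        if negand[0] == 'not':         # double negation: linear
            return [[negand[1]]]
        if negand[0] == 'and':         # negated conjunction: branch
            return [[('not', negand[1])], [('not', negand[2])]]
        if negand[0] == 'or':          # negated disjunction: linear
            return [[('not', negand[1]), ('not', negand[2])]]
        if negand[0] == 'if':          # negated conditional: linear
            return [[negand[1], ('not', negand[2])]]
        if negand[0] == 'iff':         # negated biconditional: branch
            return [[negand[1], ('not', negand[2])],
                    [('not', negand[1]), negand[2]]]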

Note that it is not a rule that sentences must be resolved in the order in which they appear in the tree; one can save sentences and come back later to resolve them. However, if you are resolving a sentence after the tree has already done some branching, you must perform the resolution rule under each open branch that includes that sentence. Here is a tree illustrating the issue.

This tree has two disjunctions in the root, so it will involve branching multiple times. We could start with either one, but this tree resolved line (1) first, branching into the two disjuncts at line (3). The '1 ∨' to the right at that line is the explanation for what is happening there: it indicates that the disjunction rule was applied to line (1). When the second disjunction is processed at line (4), it needs to be done on both branches, since both branches include line (2). That's why our two branches split into four at line (4). One branch closes at that point. At line (5), the conjunction at line (3) is processed. This needs to be done on both branches that include that conjunction, but not the right-most branch, which does not include it. Resolving a sentence affects everything below it in the tree in its own descendant branches, but it doesn't affect branches that are off to the side.

Besides the tree resolution rules, there are two more rules to our proof system.

5.5 Branch closure rules

In the tree examples above, we closed branches when they contained sentences along with their negations. Here is our general rule for tree closure:

If any branch contains some sentence Φ and also contains its negation ¬Φ, that branch is closed. Any branch that is not closed is open. We mark closed branches with the '×' symbol.
A tree is closed if and only if every branch in that tree is closed.
A tree is open if and only if it is not closed.

If every branch is closed, the method proves that the root is unsatisfiable.

Note that 'Φ' stands in for any sentence of SL; it is not part of the closure rule that one must have an atomic sentence and its negation. Consider for example this tree:
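(The sentences in this example are illustrative stand-ins with the relevant shape; the atoms A, B, C, and D are placeholders.)

  1.  (A & B) ⊃ (C ≡ D) ✓
  2.  A & B
  3.  ¬(C ≡ D)
  4.  ¬(A & B)        C ≡ D
         ×              ×
       2, 4           3, 4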

In this tree, we only resolved the first sentence, using the conditional rule, and the tree immediately closed. The line numbers under the closure symbols specify exactly why the branches close: the left branch developed into the negation of (2), and the right branch is contradicted by (3). Notice that we could have developed lines (2) and (3), using the conjunction rule or the negated biconditional rule, respectively, but this would have resulted in a more complicated tree that ultimately yielded the same result. Similarly, if we hadn't noticed that both branches close at line (4), we could have continued processing the tree, developing the left branch using the negated conjunction rule on (4), and developing the right branch using the biconditional rule. But this too would have involved needless complication for no benefit. (Eventually, the tree would have closed with simpler sentences too.) You can save yourself a lot of work by noticing when branches can be marked as closed, and by thinking a bit strategically about which sentences to resolve first.

5.6 Branch completion rules

Here is the final piece of our formal proof system. In our examples above we have used '↑' to indicate that an open branch is complete. We need a formal characterization of branch completion.

In SL, a resolvable wff is a wff for which we have a resolution rule. That is, anything other than an atom or a negated atom.

A branch is complete if and only if either it is closed, or every resolvable sentence in that branch has been resolved.
A tree is complete if and only if every branch in that tree is complete.

If there is at least one open branch in a completed tree, the tree method indicates that that branch corresponds to an interpretation that satisfies the root.

We use the '⊢' symbol to indicate tree closure. '𝒳 ⊢ ⊥' means that a tree with root 𝒳 closes. We use '𝒳 ⊬ ⊥' to indicate that a completed tree with root 𝒳 remains open.

We will also define our single turnstile symbol so that 𝒳 ⊢ Φ is equivalent to 𝒳, ¬Φ ⊢ ⊥. So '𝒳 ⊢ Φ' is a way of saying that the tree method shows that the argument from 𝒳 to Φ is valid.
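Putting the pieces together, here is a sketch of the whole procedure in Python, continuing the illustration from section 5.4 (it reuses is_literal and resolve from there; the representation and names remain illustrative assumptions, not part of the official system). It develops a tree from a given root and returns either None, meaning every branch closes, or the literals of a completed open branch.

    def closed(branch):
        """A branch closes if it contains some sentence and its negation."""
        return any(('not', s) in branch for s in branch)

    def develop(branch):
        """Develop a tree whose root is the given list of sentences.
        Returns the literals of a completed open branch, or None if
        every branch closes (i.e. the root is unsatisfiable)."""
        if closed(branch):
            return None
        for i, s in enumerate(branch):
            if not is_literal(s):
                rest = branch[:i] + branch[i + 1:]   # treat s as resolved
                for extension in resolve(s):         # extend under each new branch
                    result = develop(rest + extension)
                    if result is not None:
                        return result                # found an open, complete branch
                return None                          # every branch under s closed
        return branch   # only literals remain, no contradiction: open and complete

    # The root from section 5.2: premises plus the negated conclusion.
    root = [('and', 'A', 'B'),
            ('not', ('or', 'C', 'D')),
            ('or', ('or', ('not', 'B'), 'C'), 'E'),
            ('not', 'E')]
    print(develop(root))   # None: the tree closes, so the argument is valid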

This completes the introduction of the formal rules for the SL tree proof system. In the remainder of this chapter, we'll work through some more examples and provide some advice for working through trees more efficiently. In Chapter 6 we'll explain why the tree method works, and prove that it is a good one.

5.7 Resolution order

The resolution rules do not specify the order in which you should resolve sentences in a tree; you are free to resolve in any order you like. But some ways of processing trees are more efficient than others. Here is an example to illustrate. Suppose we want to consider whether {¬(C & A), D ≡ C, A ∨ B, ¬B} ⊢ ⊥. So we put those sentences in the root of a tree. Here's what happens if we resolve the sentences in the order in which they're given:
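  1.  ¬(C & A) ✓
  2.  D ≡ C ✓
  3.  A ∨ B ✓
  4.  ¬B
  5.         ¬C                    ¬A                 1 ¬&
  6.      D      ¬D            D        ¬D            2 ≡
  7.      C      ¬C            C        ¬C            2 ≡
          ×
  8.            A   B        A   B    A   B           3 ∨
                ↑   ×        ×   ×    ×   ×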

Again, the justifications for particular steps are added to the right of the tree for clarity. For example, '1 ¬&' indicates that line (5) was produced by performing the negated conjunction resolution rule on line (1). This tree remains open, offering this interpretation that satisfies the root:

  A    B    C    D
  1    0    0    0

This tree reminds us again that if a sentence in a tree has multiple open branches below it, then resolving that sentence requires that the resolution be performed in each open branch below. That's why, for example, resolving line (3) at line (8) requires three different new branchings. (We do not resolve under the left-most column, because it is already closed.) So when there are more open branches, resolving sentences requires more work on the tree. Consequently, it is sometimes a good idea to think a bit strategically, and close off parts of the tree earlier, to save yourself some work.

The example above is a perfectly fine instance of a completed tree, and it does yield the correct answer. However, it's possible to get there with less work, by choosing to resolve sentences that will close off branches right away. Here is another way of developing a tree with the same root as in the previous example. This version begins by resolving line (3), because doing so will close off one of its new branches immediately. It then continues to resolve in the most efficient order:
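  1.  ¬(C & A) ✓
  2.  D ≡ C ✓
  3.  A ∨ B ✓
  4.  ¬B
  5.        A            B              3 ∨
                         ×
  6.    ¬C      ¬A                      1 ¬&
                 ×
  7.   D    ¬D                          2 ≡
  8.   C    ¬C                          2 ≡
       ×     ↑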

This tree gets us to the same result much more quickly, pointing to the same interpretation that satisfies the root. As a general rule of thumb, it's good advice to look a step ahead, and resolve sentences that won't lead to new open branches, before resolving ones that will. For just the same reason, it is usually more efficient to resolve those sentences that have linear rules before those that have branching rules.
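One way to encode that rule of thumb, continuing the Python sketch from earlier (is_literal and resolve as before; the helper name is just an illustration), is to pick the next sentence to resolve by checking whether its rule branches:

    # Prefer sentences whose rule is linear (one resulting branch) over
    # those whose rule branches (two resulting branches).
    def pick_next(branch):
        unresolved = [s for s in branch if not is_literal(s)]
        if not unresolved:
            return None
        return min(unresolved, key=lambda s: len(resolve(s)))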

5.8 Choosing the right root

Trees are used to indicate whether the root is satisfiable. We test arguments for validity with a tree by putting their premises along with the negation of their conclusions in the root; if the tree remains open, indicating that the set is satisfiable, that means it's possible to satisfy the premises while falsifying the conclusion, which means the argument is invalid. If the tree closes, the root is unsatisfiable, which means the argument is valid.

But trees can be used to evaluate many other kinds of questions, aside from validity. Any question that can be converted into a question about satisfiability can be answered with a tree. For example, if you want to find out whether a set of sentences is consistent, put that set of sentences into the tree. A set of sentences is consistent if and only if there is an interpretation satisfying it, so you can use a tree to find out.

What if you want to find out whether a sentence is a tautology? A tautology is a sentence that is satisfied by every interpretation. To test whether a sentence is a tautology with a tree, we attempt to find an interpretation that falsifies it, by putting its negation in the root. If this tree remains open, it shows us how to satisfy the negation, which means that the sentence is not a tautology. If it closes, then the negation is unsatisfiable, which means the sentence is a tautology.

To find out whether a sentence is a contradiction, check and see whether it is possible to satisfy it, by putting it in the root of the tree. If the tree remains open, the sentence is satisfiable, which means it's not a contradiction. If the tree closes, then it is a contradiction.

To use a tree to answer a question, the first step is always to convert the question into a question about whether some sentence, or set of sentences, is satisfiable.
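In terms of the earlier Python sketch (reusing develop from above; the particular sentences here are just illustrations), each of these questions becomes a different choice of root:

    # 'None' means the tree closes; anything else is an open branch's literals.

    # Consistency: is {¬(C & A), D ≡ C, A ∨ B, ¬B} satisfiable?
    print(develop([('not', ('and', 'C', 'A')), ('iff', 'D', 'C'),
                   ('or', 'A', 'B'), ('not', 'B')]))    # open branch: consistent

    # Tautology: put the negation of the sentence in the root.
    print(develop([('not', ('or', 'P', ('not', 'P')))]))   # None: P ∨ ¬P is a tautology

    # Contradiction: put the sentence itself in the root.
    print(develop([('and', 'P', ('not', 'P'))]))           # None: P & ¬P is a contradiction

    # Validity: premises plus the negated conclusion.
    print(develop([('if', 'P', 'Q'), 'P', ('not', 'Q')]))  # None: P ⊃ Q, P ⊢ Q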

5.9 Practice Exercises


If you want additional practice, you can construct trees for any of the SL arguments and entailment claims in the exercises for the previous two chapters.

Part A To evaluate each of the following claims with a tree, (a) what would you put in the root of the tree? And (b) if the tree closes, does that show that the claim is true or false?

  1. {P, P ⊃ Q, Q ⊃ ¬P} ⊢ ⊥

  2. (P ∨ Q) ≡ (Q ∨ P) is a tautology.

  3. The argument with premises P&Q, ¬R ⊃ ¬Q and conclusion P&R is valid.

  4. There is no interpretation that satisfies A ∨ B, B ⊃ C, and A ≡ C without also satisfying C.

  5. A ≡ ¬A is a contradiction.

  6. Every interpretation satisfying P, P ⊃ Q, and ¬Q also satisfies A.

  7. There is at least one interpretation that satisfies P ⊃ Q, ¬P ∨ ¬Q, and Q ⊃ P.

Part B Evaluate this argument from the beginning of the chapter by constructing a tree.

A ≡ B

¬B ⊃ (C ∨ D)

E ⊃ ¬C

(D & F) ∨ G

¬A & E

H ∨ G

If it is invalid, give an interpretation that demonstrates its invalidity.

Part C  Evaluate each claim from Part A by constructing a tree. If applicable, give the interpretation that demonstrates your answer.

Part D  Recall the discussion of the Sheffer stroke in Chapter 3. That connective, if added to SL, would have this characteristic truth table:

Φ   Ψ   Φ | Ψ
1   1   0
1   0   1
0   1   1
0   0   1

What would be appropriate tree resolution rules for the Sheffer stroke, or the negated Sheffer stroke?