
Reply Mon 23 Jun, 2008 05:32 pm
So in order to make it simpler, we have systematized natural deductive logic.



Truth Functional Logic Proof Structuring

Before we go any further, it might be best to discuss how to go about proof structuring. THIS IS ESSENTIAL TO KNOW IF YOU WISH TO DO PROPOSITIONAL LOGIC PROOFS!!! It is not difficult to do at all. Your past experiences with truth tables will help you somewhat, but this is a little different from what we have previously done.



http://i26.tinypic.com/14j9oph.jpg

This example of a logical proof looks complex at first, but it really isn't once you understand what each section of the proof is for and how they all fit together.



http://i27.tinypic.com/33msq34.jpg



http://i27.tinypic.com/154en1v.jpg

Basic Structure of a Proof
The basic structure of a proof is essentially a giant "plus." So when you first draw out a proof, you can simply draw a big "plus" to start with. It is ESSENTIAL to keep to this format, as it separates the four quadrants you will need to complete the proof. The sections not only mark off where each part of the proof goes, but also reflect the order of steps you take when you begin to write a proof. We will get more in-depth into this once we start doing actual proofs, but this is just to familiarize you with the proof itself and the way to approach it.

Section 1 - Section 1 contains the line numbers of the argument you are evaluating in your proof. The line numbering continues into the 3rd section, but just remember that this section is specifically for the initial lines of your argument.

Section 2 - Section 2 contains the argument itself in logical form. Each premise has its own line. But remember that the conclusion does not have its own line!!! The conclusion does not get its own line because what we are attempting to do with the proof is show that the premises can be worked in such a way that they end with the conclusion. The reason we include the conclusion at the end [separated by the / (slash)] is so that we are reminded of what the conclusion actually is.

Section 3 - Section 3 continues the line numbering from Section 1. These numbers label each new line you derive in Section 4.

Section 4 - This is the heart of your proof. This section contains everything you will need to develop your deductive argument to prove the conclusion you have been given. This section has two parts. One part is the derivations from previous lines. The other part is the citations of the inference rules and of the lines from which each derivation was made (see the sketch after the recap below).

RECAP!!!!

A proof contains four sections: line numbers (sections 1 and 3) that tell us which line we are on, an argument with its conclusion (section 2), and the proof itself (section 4).
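If it helps to picture Section 4 concretely, here is a rough Python sketch of the idea (the little modus ponens proof in it is made up purely for illustration; it is not one of the proofs we will do later). Each line carries a number, a formula, and a citation of the rule and the earlier lines it came from:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ProofLine:
    number: int                        # sections 1 and 3: the line number
    formula: str                       # sections 2 and 4: the sentence on that line
    justification: str                 # section 4: the rule cited ("premise" for section 2 lines)
    cited_lines: Tuple[int, ...] = ()  # section 4: which earlier lines were used

# A tiny made-up modus ponens proof, just to show the shape of the data.
proof = [
    ProofLine(1, "P --> Q", "premise"),
    ProofLine(2, "P",       "premise"),
    ProofLine(3, "Q",       "Modus Ponens", (1, 2)),
]

for line in proof:
    cites = ",".join(str(n) for n in line.cited_lines)
    print(f"{line.number}. {line.formula:10} {line.justification} {cites}")
```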

HOW TO APPROACH A PROOF
Now that we know what the sections are for and what the basic format of the proof looks like, I'll explain what to do in order to get to the final step: actually solving the proof (which we will need the inference and replacement rules to do).

http://i32.tinypic.com/9k796o.jpg





INFERENCE RULES

At the beginning, we talked about recognized deductive techniques. Those recognized argumentative methods are embodied in inference rules, which means that for each inference a specific conclusion can be inferred when certain premises are given.

There are 4 primary inference rules to begin with (i.e. Disjunctive Syllogism, Hypothetical Syllogism, Modus Ponens, and Modus Tollens). You have to be very familiar with these rules to move on to the next four inference rules and the two "short cut" proof methods.
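If you like checking these mechanically, here is a rough Python sketch (just a brute-force truth table; it is an illustration, not part of the proof method itself) confirming that each of the four rules never takes true premises to a false conclusion:

```python
from itertools import product

def implies(a, b):
    # Material conditional: a -> b is false only when a is true and b is false.
    return (not a) or b

def valid(rule, num_vars):
    """True if the conclusion is true in every row where all the premises are true."""
    for values in product([True, False], repeat=num_vars):
        premises, conclusion = rule(*values)
        if all(premises) and not conclusion:
            return False
    return True

# Each rule maps an assignment of truth values to (premises, conclusion).
modus_ponens      = lambda p, q:    ((implies(p, q), p),             q)
modus_tollens     = lambda p, q:    ((implies(p, q), not q),         not p)
disjunctive_syll  = lambda p, q:    ((p or q, not p),                q)
hypothetical_syll = lambda p, q, r: ((implies(p, q), implies(q, r)), implies(p, r))

print(valid(modus_ponens, 2))       # True
print(valid(modus_tollens, 2))      # True
print(valid(disjunctive_syll, 2))   # True
print(valid(hypothetical_syll, 3))  # True
```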

The best way to cover inference rules would be to introduce a typical argument. Then we can reduce that argument to logical elements, and then reveal the inference rule in detail.


For all intents and purposes, I'm going to break the four major rules down into two symposiums.

That's pretty much it for the introduction to proofs and inference rules. As always, if you have any questions, don't hesitate to ask because I am happy to respond. Also, if it seems vague, it probably is, so please inform me if I could clarify it a bit more... or if I'm wrong.
 
VideCorSpoon
 
Reply Wed 25 Jun, 2008 11:32 am
@VideCorSpoon,
Conclusion Function
When setting up a proof, we put down the argument in the second section. Remember that we also put the conclusion there, but we did not give it its own line. Why? Because the entire proof is meant to prove the conclusion; simply put, the conclusion cannot be part of the solution. We put the conclusion at the end of the last argument line to remind us of what we have to prove. We know we have successfully completed the proof when we can finally derive a line that matches the desired conclusion in the second section.

http://i25.tinypic.com/20qj053.jpg

In the example above, we see where the second-section conclusion reminder is, and how the final derivation line contains the same sentence (i.e. ~H); we got that final derivation by working through the inference and replacement rules.

It dawns on me that people may not understand why we do proofs in the first place. This is probably the most useful answer. A proof can (usually) be solved in any number of ways to get a conclusion. But the simpler the proof (i.e. the shorter the proof length), the more coherent the argument. If you have to do all sorts of inferences and whatnot to derive a conclusion from 40-something lines, the argument probably isn't that coherent. There's more to it, but that is just one advantage of these proofs.
 
Arjen
 
Reply Sat 28 Jun, 2008 02:25 pm
@VideCorSpoon,
This is an interesting post. I am used to very different notations. It gives me a chance to see which is more clear, or which I like best. Smile
 
VideCorSpoon
 
Reply Sat 28 Jun, 2008 05:20 pm
@Arjen,
I'm glad you find it useful. Actually, if you feel like it, could you post the different notations you are familiar with? I think that is one aspect I have not documented: the different notations for the connectives, the different proof formats, and why you find them easier. I think others will find it especially valuable.

Personally, for the biconditional, I like the tri-equal sign best... but I cannot symbolize it easily in Word.
 
Zetetic11235
 
Reply Tue 22 Jul, 2008 08:54 pm
@VideCorSpoon,
What are these names for the inference rules? I know of inference from Quine's Methods of Logic as: (i) any schema implies itself; (ii) if one schema implies a second and the second implies a third, then the first implies the third; (iii) an inconsistent schema implies every schema and is implied only by inconsistent schemata; (iv) a valid schema is implied by every schema and implies only valid schemata.

1. A v B
2. ~A
3. B --> D
4. D --> ~E
5. H --> E   / ~H
Are these assumed true? If so, can they be represented as equivalent to

AvB.~A.B->D.D->~E.H-> E implies ~H?

AvB.~A.B->D.D->~E.H->E:-> ~H
Side 1 is true if all the conjuncts are true, and side 2 is true only if side 1 is true, correct? Or am I off? For side 1 to be true, the truth values must be: A=F, B=T, D=T, E=F, H=F, and the same is true of side 2. For it to be false, any of these will do: (A=T & B=T v F, D=T v F, E=T v F). Since the truth output for side 1 only comes from one set of truth values and every other combination comes out false, the question is whether H could still be true given the structure D->~E . H->E. Here's a truth table for that:

D H E | D -> ~E | H -> E | (D -> ~E) . (H -> E)
T T T |    F    |   T    |          F
T T F |    T    |   F    |          F
T F T |    F    |   T    |          F
T F F |    T    |   T    |          T
F T T |    T    |   T    |          T   **
F T F |    T    |   F    |          F
*The rest of the truth table is not needed, since H is false in the last two cases. The important thing is the indexed case ** where the proposition comes out true when H is true. For that to hold, D must be false and E true; but for that to hold, line 3 would have to falsify line 1, so this case is eliminated by the other lines. I assume by the above reasoning that my questions about the form of the proof are going to be answered yes. I am not totally sure, however, so any help would be appreciated.
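In case it helps, here is a rough Python sketch (just brute force over the five letters; the variable names are mine) that checks whether lines 1-5 jointly force ~H, and which assignments make all five premises true:

```python
from itertools import product

def implies(a, b):
    return (not a) or b   # material conditional

satisfying = []
entails_not_h = True
for a, b, d, e, h in product([True, False], repeat=5):
    premises = [
        a or b,             # 1. A v B
        not a,              # 2. ~A
        implies(b, d),      # 3. B --> D
        implies(d, not e),  # 4. D --> ~E
        implies(h, e),      # 5. H --> E
    ]
    if all(premises):
        satisfying.append({"A": a, "B": b, "D": d, "E": e, "H": h})
        if h:               # a row with H true would be a counterexample to ~H
            entails_not_h = False

print(satisfying)      # [{'A': False, 'B': True, 'D': True, 'E': False, 'H': False}]
print(entails_not_h)   # True
```

Running it, the only assignment that satisfies all five lines is A=F, B=T, D=T, E=F, H=F, and there is no satisfying row with H true, which seems to back up the reading above.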
 
VideCorSpoon
 
Reply Tue 22 Jul, 2008 11:03 pm
@Zetetic11235,
The inference rules (modus ponens, modus tollens, disjunctive syllogism, hypothetical syllogism, simplification, conjunction, addition, constructive dilemma, and the indirect and conditional nested proofs) and the replacement rules (commutation, association, double negation, De Morgan, distribution, transposition, implication, exportation, tautology, and equivalence) start in symposium 8 (although you have to look for it at this point in time because it is not "stickied" to the rest of the symposiums). I'm getting there. It's been my summer mission to put down a simplified account of propositional logic.

It's funny how we have different accounts of the same things in logic. It's not a very streamlined system. I had just gone through a previous thread today talking with protoman about the different labels for the same inference.

But to tell the truth, I'm not quite sure what you are trying to state in your post. Are you referring to the general consensus on truth functional logic? Are you reviewing a specific inference rule? Or just the given proof? Etc. It seems like you are grasping at a few things here, but it sounds like you have a valid point to make. I'll be able to answer when you clarify.
 
Zetetic11235
 
Reply Sat 9 Aug, 2008 05:08 pm
@VideCorSpoon,
Quine's method makes use of a completely different approach, using EI (existential instantiation), UI (universal instantiation), etc., with little to no appeal to the method you cite.

I don't know if this is because it is a theory of quantification. He doesn't cover proofs aside from this.
 
Arjen
 
Reply Sun 10 Aug, 2008 01:12 am
@Zetetic11235,
Quine speaks of predicate logic. In predicate logic quantifiers are present: the existential and the universal quantifiers.
Perhaps a new topic would be suited for this discussion? It seems quite off-topic.
 
VideCorSpoon
 
Reply Sun 10 Aug, 2008 10:34 am
@Arjen,
Arjen is right on the money with his comment. Quine indeed speaks in terms of predicate logic, using universal and existential quantifiers. It is sometimes called quantificational logic or whatever have you. It is a more abstract method, like "there exists some x where" or "there exists some x where." Like propositional but more hypothetical. But still the method here applies to propositional logic.
 
Arjen
 
Reply Sun 10 Aug, 2008 01:14 pm
@VideCorSpoon,
If you ask me, predicate logic is where the fun begins. I would like to correct Vide, by the way. I'm sure he meant to say it, but he made an error in his typing:
VideCorSpoon wrote:

It is a more abstract method, like "there exists some x where" or "there exists some x where." Like propositional but more hypothetical.
The existential quantifier means "there exists some x where" and the universal quantifier means "for all x it holds that". Logic gets a lot more complicated here, but it can also far more accurately say what you want it to. In predicate logic I can see the basis of all languages coming into view.
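For anyone less used to the symbols, here is a rough Python analogy (over a finite, made-up domain only, so it illustrates the reading of the quantifiers, not the full logic): the universal quantifier behaves like all() and the existential quantifier like any():

```python
# A small made-up domain of objects, just to illustrate how the quantifiers read.
domain = [1, 2, 3, 4]

def is_even(x):
    return x % 2 == 0

# Universal quantifier: "for all x, x is even"
print(all(is_even(x) for x in domain))   # False: 1 and 3 are counterexamples

# Existential quantifier: "there exists some x such that x is even"
print(any(is_even(x) for x in domain))   # True: 2 (and 4) are witnesses
```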

I was meaning to ask you, by the way, Vide: are you going to do an introduction to predicate logic as well?
 
Zetetic11235
 
Reply Sun 10 Aug, 2008 02:29 pm
@Arjen,
Ah, I study predicate logic, as my interest lies in formal language, mathematics, and analytic philosophy. I am not sure what propositional logic is used for? I always assumed it was a less developed form of the predicate calculus? Or is it a method of parsing natural language, as opposed to creating a more precise formal language?
 
VideCorSpoon
 
Reply Sun 10 Aug, 2008 04:36 pm
@Zetetic11235,
Arjen,

HA! You're right! Yeah, predicate logic uses two general syntactical structures, universal and existential general sentences. Universal quantifiers make a claim about all members of a group, whereas existential quantifiers make a claim about some members of a group. Sorry about that, I was in an existential mood.

As for an introduction to predicate logic, sure! I have a few more posts in the propositional logic series, but I'll do a predicate introduction if anyone wants it. It does get fun once you get into predicate logic. And the best thing about it is that it is really not that difficult to get down if you understand the propositional elements in it. There are also monadic and modal logics, which follow directly after predicate logic and which I have always found interesting.

Zentetic,

Well, I can assume that propositional logic is used for argumentative evaluation. At least that's why I use it; I found propositional logic especially useful as a proverbial "argument check." I'm sure there is a more fantastic and eloquent modus operandi out there, but that seems to fit with what I use it for anyway. From how I understand logic, propositional is as integral to predicate as predicate is integral to propositional. It's not that propositional is less developed; it's just that there are more advanced systems we can use to do roughly the same thing. This even applies within propositional logic, with basic truth tables and the more thorough proofs. Simply put, there are limits to every system.
 
Arjen
 
Reply Sun 10 Aug, 2008 10:32 pm
@VideCorSpoon,
*orders some logic for future helpings*
:a-ok:
 
bettydlgc
 
Reply Wed 10 Dec, 2008 10:29 pm
@VideCorSpoon,
can you help me solve the following?

-(-P v -Q) therefore (P & Q)
 
hammersklavier
 
Reply Sat 24 Jan, 2009 02:35 pm
@VideCorSpoon,
Let me give a whack at it. I'm out of practice and I need to take the class again, so it's not going to be in the formal form.

1. ~ ( ~P v ~Q) / (P & Q)
2. (P v Q) -- as I recall I can use Negation Elimination to distribute the negative outside the parentheses throughout the system (and thereby eliminate the negative expressions within the parentheses).
3. P 4. Q -- I need to use Disjunction Elimination to separate the two variables. I can't remember what it looks like either.
5. (P & Q) -- having separated the premises I can now use Conjunction Introduction to put them back together and come up with my proof. (&i 3,4)

I know this is untidy, and I'm way out of practice, but I think I've gotten the basic idea of what I have to do to prove this argument. Could someone put it in formal order?
 
VideCorSpoon
 
Reply Sat 24 Jan, 2009 10:10 pm
@hammersklavier,
It's very close. I put your proof into formal form to work it out.
http://i44.tinypic.com/15834tv.jpg
The initial problem that I see is that P v Q cannot be inferred from ~(~P v ~Q) by negation elimination. Negation elimination (also called double negation) states that ~~P can replace or be replaced by P. It's like saying P is the same as not not P. The main problem is that even if you negate 1, you are still left with the same formula, only with redundant negations tacked on. For 3 and 4, disjunction elimination requires 3 elements to derive a result. There needs to be:

1. P --> Q
2. R --> S
3. P v R
__________
Q v S

If you are looking at the Wikipedia article on disjunction elimination for the definition, it is both right and wrong. Yes, it does follow the same structure as constructive dilemma (the term I use for the same inference), but they infer that from the conclusions of lines 2 and 3 you can get a single variable (i.e. C). This is not the case and it is very misleading. You must infer a disjunction of the conclusion of the second line and the third line to derive the full inference (i.e. Q v S).

In order to separate out a variable the way you want to in line #2, it must be a conjunction rather than a disjunction. Otherwise you must have the negation of one of the disjuncts to set up the inference (i.e. it must follow the form of a disjunctive syllogism).

As for solving the proof, it is difficult to do because you have a completely negated statement. You are basically saying, "It is not the case that either Paul is not swimming in the pool or Quincy is not swimming in the pool." Actually, it is quite paradoxical if you think about it; in a way it seems like a variation on the liar's paradox. If it is not the case that either of them is not in the pool, that means they are actually both swimming in the pool. If you had more premises in your argument, you could give it a whirl. By itself, the only way I could see it being solved is by either indirect or conditional proof. Essentially, you would have to begin the proof with an assumed premise: either assume ~P, derive a contradiction, and end the indentation by asserting P, or construct a conditional sentence by assuming P, deriving Q, and asserting P-->Q. It's either those two inference rules or go into quantifier logic, which does not really apply well in this case. But indirect proof would probably be your best bet. You can assume ~P, add ~P v ~Q, but then that is where you will get hung up.
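For what it's worth, a quick brute-force check (a rough Python sketch, not a substitute for the indirect proof itself) at least confirms the argument is valid, so an indirect or De Morgan-style derivation ought to exist:

```python
from itertools import product

valid = True
for p, q in product([True, False], repeat=2):
    premise = not ((not p) or (not q))   # ~(~P v ~Q)
    conclusion = p and q                 # P & Q
    if premise and not conclusion:
        valid = False

print(valid)   # True: the only row making ~(~P v ~Q) true is P=T, Q=T, where P & Q holds
```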
 
hammersklavier
 
Reply Sun 25 Jan, 2009 12:14 pm
@VideCorSpoon,
Thanks! I got a D in that course because of a distressing tendency to sleep through it (it was an early-morning class), and I need to retake it, either during the summer or next fall. What I was trying to say is that I remember I can use negation elimination to get rid of all the "~"s, but I couldn't remember how many lines it took or what formal structure it had.
 
ali ant207
 
Reply Thu 30 Apr, 2009 05:35 am
@VideCorSpoon,
Does anyone know how I'd solve P -> Q, therefore ~(P ^ ~Q)? I have no idea whatsoever!
 
VideCorSpoon
 
Reply Thu 30 Apr, 2009 09:05 am
@ali ant207,
In my mind, the only way you can successfully solve the proof is with an indirect proof, essentially finding a contradiction and inferring the answer based on that contradiction. Any other way would be nearly impossible. It would be relatively simple if the negation on the Q in ~(P & ~Q) were not there; you could just infer that from ~P v ~Q by De Morgan's. But that's not the case.

This is my solution to the problem.

http://i40.tinypic.com/2hzr4nt.jpg
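As a sanity check on that solution, here is a rough Python sketch (just brute force over the two letters, purely illustrative) confirming the argument is valid, i.e. ~(P & ~Q) is true in every row where P --> Q is:

```python
from itertools import product

valid = True
for p, q in product([True, False], repeat=2):
    premise = (not p) or q               # P --> Q as a material conditional
    conclusion = not (p and (not q))     # ~(P & ~Q)
    if premise and not conclusion:
        valid = False

print(valid)   # True: ~(P & ~Q) holds in every row where P --> Q does
```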
 
ali ant207
 
Reply Mon 4 May, 2009 05:01 am
@VideCorSpoon,
Thank you for that answer, but I'm still confused about how you would start this problem?? Is there a particular order the rules have to be used in? To me, it just looks like you could get a million answers, but there's clearly one certain way of doing it! Eeep!
 
 

 