Proto-Logic


Reply Sun 18 Apr, 2010 08:02 pm
I'm more than a little interested in the structure of human thought. I feel that the TLP is concerned with what I call "proto-logic." An equally good term might be "transcendental logic."

Two little points I'll kick off with. Consider the word "what" (as an instance among other words of this type). "What now?" "What" seems to function like a variable, like an x. This seems related to the existential quantifier in formal logic and the variable in math. Note: I see "proto-logic" as foundational to language, formal logic, mathematics, algorithms...

Another word is "not." The thought of negation. It seems built in. "He's not here" is the negation of "he's here." A sort of minus sign on a statement. Of course we have this in formal logic, and we have the negative sign in math.
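Here's a rough sketch of that analogy in Python (just my own illustration, nothing canonical):
Code:
    # Logical "not" toggles a truth value, much as the minus sign flips a magnitude.
    p = True
    print(not p)          # False
    print(not (not p))    # True: double negation restores the original

    x = 5
    print(-x)             # -5
    print(-(-x))          # 5: double minus likewise restores the original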

For me, this is the core of philosophy, this searching after the proto-logic.
Thoughts?
 
mister kitten
 
Reply Sun 18 Apr, 2010 08:04 pm
@Reconstructo,
The negative sign also reads 'is opposite to' if you did not already know.
 
Reconstructo
 
Reply Sun 18 Apr, 2010 08:17 pm
@mister kitten,
mister kitten;153806 wrote:
The negative sign also reads 'is opposite to' if you did not already know.


Right on! And this ties into the dimension theme in the thread "2". Logic is a binary number system. It doesn't matter if its digits are named T and F. Digit names are utterly contingent. (My opinion, and it's a charming metaphor for essence and accident.) And shall we not add that zero is the fulcrum of the subtraction/addition spectrum, just as one is the center of the positives where multiplication/division is concerned?

I used to program in Pascal, which has a Boolean variable type. Quite useful and efficient, since each value carries just one bit of information. One could build a maze array out of False for empty space and True for a wall. Ordinary numeric types require more memory, and arrays are memory hogs already.
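Something like this quick sketch, in Python rather than Pascal (the maze layout here is just made up for illustration):
Code:
    # A tiny maze as a grid of booleans: True = wall, False = empty space.
    maze = [
        [True,  True,  True,  True],
        [True,  False, False, True],
        [True,  False, True,  True],
        [True,  True,  True,  True],
    ]

    def is_wall(row, col):
        # Look up whether the cell at (row, col) is a wall.
        return maze[row][col]

    print(is_wall(1, 1))  # False: open space
    print(is_wall(2, 2))  # True: wall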
The 2 thread is a proto-logical investigation.
 
Reconstructo
 
Reply Mon 19 Apr, 2010 12:45 am
@Reconstructo,
If we translate from English to German, for instance, we are going to lose some information, aren't we? Subtle associations. That sort of thing.

But do we lose any information if we go from hexadecimal to binary? Or is quantity such an absolute abstraction that the number system involved is completely a matter of convenience? I think it is. I see a number like "134" and I now "see" those digits as just one aspect of something with an infinity of aspects. And yet we are talking about the same magnitude/quantity. Not only can we write a number in any base we please, we can also write an expression, even a definite integral, and we still just have a magnitude before us. We can also conceive of this "magnitude" simply as a position on a line (one dimension).
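A quick Python illustration of what I mean by nothing being lost (134 is just an arbitrary example):
Code:
    # The same magnitude written in several bases; nothing is lost in translation.
    n = 134
    print(bin(n))   # 0b10000110
    print(oct(n))   # 0o206
    print(hex(n))   # 0x86
    # Converting back from any of these recovers exactly the same quantity.
    print(int("10000110", 2) == int("86", 16) == 134)  # True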

It seems that our words like "good, better, best" and "bad, worse, worst" are similarly arranged. "Who is the fastest?" Value, quality, etc.
 
jgweed
 
Reply Mon 19 Apr, 2010 06:43 am
@Reconstructo,
Computer machine language requires a 1 or a 0 to be dropped into a bucket and programming languages are translations of orders about what to drop in which bucket. But if we cannot assign a numerical quantity to such terms in a series such as good/better/best, we cannot use it in a programme.
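Roughly like this, say (a Python sketch of my own; the numerical values are entirely arbitrary, which is rather the point):
Code:
    # An imposed ordinal encoding; the numbers are a convention, not a discovery.
    quality = {"good": 1, "better": 2, "best": 3}

    def is_higher(a, b):
        # Compare two quality terms only via the numbers we assigned them.
        return quality[a] > quality[b]

    print(is_higher("best", "good"))  # True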

And isn't it the case that the series is a model, but the values can change according to the subject to which it is applied, can change depending on who applies them, and can be discarded when a more detailed series is appropriate (think of compass directions, for example; NSEW is fine for some things, NNW for others)?
 
kennethamy
 
Reply Mon 19 Apr, 2010 07:08 am
@Reconstructo,
Reconstructo;153804 wrote:
I'm more than a little interested in the structure of human thought. I feel that the TLP is concerned with what I call "proto-logic." An equally good term might be "transcendental logic."

Two little points I'll kick off with. Consider the word "what" (as an instance among other words of this type). "What now?" "What" seems to function like a variable, like an x. This seems related to the existential quantifier in formal logic and the variable in math. Note: I see "proto-logic" as foundational to language, formal logic, mathematics, algorithms...

Another word is "not." The thought of negation. It seems built in. "He's not here" is the negation of "he's here." A sort of minus sign on a statement. Of course we have this in formal logic, and we have the negative sign in math.

For me, this is the core of philosophy, this searching after the proto-logic.
Thoughts?


"What" is an interrogative pronoun. "Not" is the negative operator. It reverses the truth value of the proposition it operates on. It is not a "thought" of negation. It is a formal operator. You tell stories where there are no stories. What you seem to mean by "proto-logic" is the depth logic of language. Or, what Chomsky calls, "depth grammar". Bertrand Russell is a fine source for that. See his theory of descriptions.
 
Reconstructo
 
Reply Mon 19 Apr, 2010 03:39 pm
@jgweed,
jgweed;153945 wrote:
Computer machine language requires a 1 or a 0 to be dropped into a bucket and programming languages are translations of orders about what to drop in which bucket. But if we cannot assign a numerical quantity to such terms in a series such as good/better/best, we cannot use it in a programme.

Well, I would never argue that the two were one and the same, but only that they have a shared core. In both cases we are dealing with a spectrum. From glancing at it, it seems that fuzzy logic gives spectrum values to propositions rather than binary values. And this is obviously a move in the math direction. Words can never be calculated as numbers can. I propose that number is a subset of word: strictly and systematically quantized magnitude, plus symbols of relation.
Yes, it's quite true. We would have to turn good-better-best into numbers.
I'm trying to point at the shared core, though. The unity element. Objects are singular. If we name a plurality of objects, still this plurality is singular, just as the word "plurality" is singular. Considered as a plurality, they are being considered as a sum. If we speak of all cats, we speak of a sum. If we speak of our pet cat Fluffy, we speak of a sum, the sum of our experiences of "Fluffy." The Being of beings is perhaps unity.

---------- Post added 04-19-2010 at 04:47 PM ----------

jgweed;153945 wrote:

And isn't it the case that the series is a model, but the values can change according to the subject to which it is applied, can change depending on who applies them, and can be discarded when a more detailed series is appropriate (think of compass directions, for example; NSEW is fine for some things, NNW for others)?

I'm thinking of the sigma sign in math. Sum 2n (for n = natural numbers 1 to 10). It looks better when written in math scribble. 3 to the power of 3 is just an abbreviated way to order a multiplication, and multiplication is just addition abbreviated, etc. And positional notation is just unary abbreviated. Of course all this depends on an established system of notation. Still, it surpasses our much more complex word use in its clarity.
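In code, the same point looks like this (a trivial Python check, nothing more):
Code:
    # The sigma sum: 2n for n running over the natural numbers 1 to 10.
    print(sum(2 * n for n in range(1, 11)))  # 110

    # Exponentiation is abbreviated multiplication...
    print(3 ** 3 == 3 * 3 * 3)               # True
    # ...and multiplication is abbreviated addition.
    print(3 * 3 == 3 + 3 + 3)                # True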

I agree that various levels of precision are desired according to the situation. We are still on a one-dimensional spectrum, though, which allows us two directions. I like that you mention NSEW. Polar coordinates are great. And the compass and the clock are polar: angle and radius.
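For instance, a point given as radius and angle converts straight back into the usual x/y picture (a quick sketch using Python's math module; the numbers are just an example):
Code:
    import math

    # A point in polar form: radius 1, angle 90 degrees (pi/2 radians).
    r, theta = 1.0, math.pi / 2
    x, y = r * math.cos(theta), r * math.sin(theta)
    print(round(x, 10), round(y, 10))  # 0.0 1.0 ("north" on the compass)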

It may be a failure on my part to express myself, but I do feel that you don't see what I'm driving at. It's a strange thought (my idea), and perhaps in no way useful.
 
Reconstructo
 
Reply Wed 21 Apr, 2010 08:14 pm
@Reconstructo,
This is great. Don't tell me, friends, that math isn't sculpture.

Grandi's series - Wikipedia, the free encyclopedia
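For anyone who hasn't clicked through: the partial sums of 1 - 1 + 1 - 1 + ... never settle down, they just flip between 1 and 0. A couple of lines of Python show it:
Code:
    # Partial sums of Grandi's series 1 - 1 + 1 - 1 + ...
    total, sums = 0, []
    for k in range(8):
        total += (-1) ** k
        sums.append(total)
    print(sums)  # [1, 0, 1, 0, 1, 0, 1, 0]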
 
kennethamy
 
Reply Wed 21 Apr, 2010 08:17 pm
@Reconstructo,
Reconstructo;155078 wrote:
This is great. Don't tell me, friends, that math isn't sculpture.

Grandi's series - Wikipedia, the free encyclopedia


Never, never, never. It would be too obviously true to say. It would be like telling you that a raven was not a writing desk. Why would I think that anyone believed it was?
 
Reconstructo
 
Reply Wed 21 Apr, 2010 09:47 pm
@Reconstructo,
The number i is a nice little thought machine. Using just this one number we can rotate by the equivalent of pi/2, or 90 degrees, per self-multiplication. If positive one is forward one, and negative one is backward one, then i is lateral one, or better yet spinning one.

Raise 1 to higher and higher powers and it just sits there, static. Raise -1 and it keeps inverting itself, flipping between -1 and 1. Raise i and it rotates through i, -1, -i, and 1, over and over.
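You can watch the rotation in a line or two of Python (where the imaginary unit is written 1j):
Code:
    # Successive powers of i cycle with period four: i, -1, -i, 1, then back to i.
    i = 1j
    print(i ** 1 == 1j, i ** 2 == -1, i ** 3 == -1j, i ** 4 == 1)  # True True True True

    # Powers of 1 stay put; powers of -1 flip back and forth.
    print([(-1) ** n for n in range(1, 5)])  # [-1, 1, -1, 1]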

It's off the o.p., but maybe not that far off.
 
GoshisDead
 
Reply Thu 22 Apr, 2010 11:20 am
@Reconstructo,
However, one also has to think about culturally transitional grammar schemes. These things aren't universally fixed. In English, due to specific historical prescriptive language reforms, if I were to say 'he is not not here', he would be here (the double negative cancels out). But in most other languages a double negative is just the emphasis of the original negative, which is why in most colloquial dialects of English one can say 'no he ain't here'. That is just one little anecdotal example of why using language to get at proto-logic is so very difficult. But it is correct, I would suppose, to consider that every language, as it is processed by the brain, has a functional negativizer category that fuzzes out at the edges.
 
kennethamy
 
Reply Thu 22 Apr, 2010 12:10 pm
@GoshisDead,
GoshisDead;155240 wrote:
However, one also has to think about culturally transitional grammar schemes. These things aren't universally fixed. In English, due to specific historical prescriptive language reforms, if I were to say 'he is not not here', he would be here (the double negative cancels out). But in most other languages a double negative is just the emphasis of the original negative, which is why in most colloquial dialects of English one can say 'no he ain't here'. That is just one little anecdotal example of why using language to get at proto-logic is so very difficult. But it is correct, I would suppose, to consider that every language, as it is processed by the brain, has a functional negativizer category that fuzzes out at the edges.


Chomsky's famous example of depth grammar as contrasted with surface grammar is the pair of sentences, "John is eager to please" and "John is easy to please". Those sentences have the same surface grammar, but very different depth grammar. In the former, "John" is the subject of the sentence. In the second, "John" is really the object of the sentence. This is a quite easy example.
 
GoshisDead
 
Reply Thu 22 Apr, 2010 12:33 pm
@kennethamy,
And your point is Ken?
 
kennethamy
 
Reply Thu 22 Apr, 2010 01:18 pm
@GoshisDead,
GoshisDead;155253 wrote:
And your point is Ken?


That protologic (or deep logic) is not something impossibly hard to do.
 
GoshisDead
 
Reply Thu 22 Apr, 2010 02:02 pm
@Reconstructo,
Chomsky never referred to deep logic in his linguistic work; he referred to deep structure, which is a transformational grammar term from the period when linguists were still attempting to build a system that could explain Universal Grammar. UG was supposed to get at the human ability to use language: the idea that somewhere in the brain there is a universal method of creating language, with a set of rules that explains linguistic diversity. The issues with Chomsky's deep structure are numerous, not the least of which is its anglocentrism, which shows up in the actual syntactic transformation trees and in the rules used in deep structure transformations. The rules necessarily allow a researcher to transform the syntax from language to language, which was initially promising but frustrated by the fact that transforming from language to language is a lateral move and could never really get at a real proto-logic.
 
kennethamy
 
Reply Thu 22 Apr, 2010 03:25 pm
@GoshisDead,
GoshisDead;155283 wrote:
Chomsky never referred to deep logic in his linguistic work; he referred to deep structure, which is a transformational grammar term from the period when linguists were still attempting to build a system that could explain Universal Grammar. UG was supposed to get at the human ability to use language: the idea that somewhere in the brain there is a universal method of creating language, with a set of rules that explains linguistic diversity. The issues with Chomsky's deep structure are numerous, not the least of which is its anglocentrism, which shows up in the actual syntactic transformation trees and in the rules used in deep structure transformations. The rules necessarily allow a researcher to transform the syntax from language to language, which was initially promising but frustrated by the fact that transforming from language to language is a lateral move and could never really get at a real proto-logic.


I never used the term "deep logic". I used the term "deep grammar", which I think Wittgenstein used. I understand that Chomsky's transformational grammar has issues. But it is still true that my examples illustrate the general point of the distinction between surface and deep structure. An illustration from philosophy is that although the term "exists" appears to be a property term, it isn't really a property term.
 
GoshisDead
 
Reply Thu 22 Apr, 2010 03:56 pm
@kennethamy,
Wittgenstein's use of "deep grammar" was in reference to psychological causation of the meaning of surface grammar. It was an attempt to internalize the language process and eject the standard object/referent models. It also suffers the same problems that Chomsky's does: it is a series of transformations that lead nowhere. Except in Wittgenstein's case the transformation is due to external forces applied, acquired, then transformed. But it still completely skips the actual mechanism of language. It does not address the actual cognitive process underlying language.
 
Reconstructo
 
Reply Thu 22 Apr, 2010 04:33 pm
@GoshisDead,
GoshisDead;155240 wrote:
However, one also has to think about culturally transitional grammar schemes. These things aren't universally fixed. In English, due to specific historical prescriptive language reforms, if I were to say 'he is not not here', he would be here (the double negative cancels out). But in most other languages a double negative is just the emphasis of the original negative, which is why in most colloquial dialects of English one can say 'no he ain't here'. That is just one little anecdotal example of why using language to get at proto-logic is so very difficult. But it is correct, I would suppose, to consider that every language, as it is processed by the brain, has a functional negativizer category that fuzzes out at the edges.


I agree about the double negative often being used as if it were a stressed single negative. And yes, the proto-logic is quite a fox to hunt. I don't truly see how any presentation of a proto-logic can be proven. What is proof, after all, if not persuasion? I doubt Einstein troubled himself to read the Principia Mathematica. We trust certain tools and use them, because we trust them.

I've just got Dantzig's book, the one Einstein praised. It seems like a good one. Apparently Leibniz was quite excited about binary/base 2, and couldn't help finding some religious meaning in it. Also Cantor wanted just two transfinite sets, aleph-0 and aleph-1. How strange that even Mr. Transfinite himself still found himself echoing the 1/0 fascination. His grand obsession was to prove that infinity came in two and only two flavors.

This is why the NOT seems so important to me. It's a toggle switch, an inverter. Why was Nietzsche accused of inverting Plato, and not just rotating him 60 degrees? Why do we conceive of the number line as bi-directional? Functions have inverses, but not rotations. It's possible that someone has messed with rotations, but this would be an afterthought. Just as tone-row music is an afterthought, and probably generated by boredom or the anxiety of influence. :)
 
kennethamy
 
Reply Thu 22 Apr, 2010 06:27 pm
@GoshisDead,
GoshisDead;155315 wrote:
Wittgenstein's use of "deep grammar" was in reference to psychological causation of the meaning of surface grammar. It was an attempt to internalize the language process and eject the standard object/referent models. It also suffers the same problems that Chomsky's does: it is a series of transformations that lead nowhere. Except in Wittgenstein's case the transformation is due to external forces applied, acquired, then transformed. But it still completely skips the actual mechanism of language. It does not address the actual cognitive process underlying language.


Oh, I think you are quite wrong about what Wittgenstein meant by "deep grammar". It had absolutely nothing whatever to do with psychology. Deep grammar has to do with the underlying logical structure of language. Wittgenstein could not care less about psycholinguistics. He is interested in the logic of language. That is why he makes the remark, "Theology as grammar".
 
jack phil
 
Reply Sat 24 Apr, 2010 10:31 pm
@Reconstructo,
Reconstructo;155116 wrote:
The number i is a nice little thought machine. Using just this one number we can rotate by the equivalent of pi/2, or 90 degrees, per self-multiplication. If positive one is forward one, and negative one is backward one, then i is lateral one, or better yet spinning one.

Raise 1 to higher and higher powers and it just sits there, static. Raise -1 and it keeps inverting itself, flipping between -1 and 1. Raise i and it rotates through i, -1, -i, and 1, over and over.

It's off the o.p., but maybe not that far off.


Isn't -1 inverted -1?
 