Methods on Meaning: Cybersemiotics and Evolutionary Continuity


Reply Mon 9 Feb, 2009 01:33 am
I know this might be dense and obscure, but hopefully you can help me think through it:

"."

- Soren Brier, Cybersemiotics: Why Information Is Not Enough

In Peirce's semiotic triad, the symbol is primary: the representation of an object. As one assumes this primacy, the open, chaotic ontological "system" becomes a closed system under evaluative parameters. Within this closed system, a triad can be developed between the representation, what it represents, and the interpretation. The object it represents is, however, in many ways outside the triad. I think the object must take on a purely denotative ("topographical") role, not an ontological one. So most of the object becomes lost in the system. At another corner, the interpreter takes on a role in which certain aspects of it become lost in the evaluative importance of the symbol. The symbol becomes full, while the others become partial. The system already contains within it the possibilities of other systems, as well as the possibility of merging those systems through cybersemiotics.

The interesting thing about systems, I've found, is that they are analogous to different types of lenses, from the lens of the eye to the "lens" of an electron microscope. As with our eyes, when we focus on a single point in space, that point takes primacy and everything else becomes peripheral. Focal points and periphery seem to be the general problem of all human enterprises. Stereoscopic vision is unattainable. It can only be attained through transrational means, where the focal points merge with the periphery in equilibrium. The only way to eliminate evaluations, isolated systems, high and low probability, plausibility and implausibility is to eliminate the current paradigm of rationality and go beyond it, but not without it. That is, if you're not shooting at pragmatic targets, but at a chaotic ontological "totality".

Ontology will never be total, though, as it will always be reduced to mnemonic metaphors.

Sub-symbolic and pre-logical neural networks - in plainer terms, the organic matter of the brain - are what I think our linguistic and logical (meaningful) faculties emerge from. These neural networks are physical structures in a (cybernetic) autopoietic (self-producing) feedback loop with perception. They are considered foundational to perception, even. For instance, the nerve endings in the eye (the retina) process light into perceptual information by transforming it into digestible, nerve-compatible signals, which the optic nerve sends to nine nuclei that relay this information to the visual cortex, which actually makes the initial signals from the optic nerve more complex. This example, I think, illustrates how the nervous digestion and processing of light is an increasingly complex process that is pre-logical and sub-symbolic - a material process. After the additive complexity within the primary visual cortex, it becomes even more complex as the brain, through neural communication and organic mutation, processes the perception with hyper-complex (mathematically unpredictable) logical and symbolic faculties. This hyper-complex structure of "buzzing" neural networks becomes meaningful only to the extent that "difference makes a difference". What this means, I think, is that meaning is not something that neurologists will find in the brain. It is emergent from neural networks - but these neural networks are a historically continuous process, meaning that there is never a physical gap in the evolution of bodies (and brains).
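If it helps to make the "difference makes a difference" point concrete, here is a toy sketch of my own (not Brier's model; the stage names and numbers are invented): a layered, sub-symbolic pipeline in which each stage merely re-transforms a signal, and activity only shows up where the input changes.

import numpy as np

# Toy "retina": raw light intensities across a strip of receptors.
light = np.array([0.2, 0.2, 0.2, 0.9, 0.9, 0.2, 0.2])

def transduce(signal):
    # Receptor stage: turn light into bounded, nerve-compatible activity.
    return np.clip(signal * 1.5, 0.0, 1.0)

def relay(signal):
    # Relay stage: pass the activity along, lightly reshaped.
    return signal ** 2

def cortical(signal):
    # Cortical stand-in: respond to local differences, not absolute levels.
    return np.abs(np.diff(signal))

activity = cortical(relay(transduce(light)))
print(activity)  # nonzero only where the input changes - a "difference that makes a difference"

The point of the toy is only that no single stage "contains" meaning; each is a material transformation, and what survives at the end is a pattern of differences that some further system has to take as significant.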

Evolution is a continuity that never ceases, in other words. Negentropy is an explanatory tool that combines thermodynamic entropy with informational entropy in the hope of creating a new evolutionary theory uniting matter, energy, and information (Soren Brier). Meaning, then, under this theory is approached through a methodological combination of cybernetics, neurology, thermodynamics, linguistics (particularly Wittgenstein's language games), and semiotics, in a field called Cybersemiotics.
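For the informational half of that, here is a small sketch of one common reading of negentropy, assuming the standard Shannon formula (my own illustration, not Brier's formalism): order measured as the distance of a distribution from maximum entropy.

import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    # Distance from maximum disorder (the uniform distribution over the same outcomes).
    h_max = math.log2(len(probs))
    return h_max - shannon_entropy(probs)

print(negentropy([0.25, 0.25, 0.25, 0.25]))  # 0.0 bits: no structure at all
print(negentropy([0.97, 0.01, 0.01, 0.01]))  # about 1.76 bits: highly ordered

The thermodynamic half would come in through Boltzmann's S = k ln W; the hope, as I understand it, is to relate these two senses of entropy within one evolutionary picture of matter, energy, and information.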

Long-winded, yes. But hopefully you can find something interesting to talk about.

Thanks.
 
Bones-O
 
Reply Mon 9 Feb, 2009 11:50 am
@cypressmoon,
cypressmoon wrote:
I think the object must take on a purely denotative ("topographical") role, not an ontological one. So most of the object becomes lost in the system. At another corner, the interpreter takes on a role in which certain aspects of it become lost in the evaluative importance of the symbol. The symbol becomes full, while the others become partial. The system already contains within it the possibilities of other systems, as well as the possibility of merging those systems through cybersemiotics.

I'm not sure I understand this point correctly. I'm interpreting it thus: the interpreter places disproportionate importance on the symbol rather than on what it represents or what it means, so the symbol begins to take on the quality of an object. Is that right?
 
cypressmoon
 
Reply Mon 9 Feb, 2009 12:26 pm
@cypressmoon,
Thanks for the response, Bones. I think, though, that you might be interpreting it as though it's an epistemic model, and I don't think that's what Peirce's semiotics was. The way I see it, it is a semiotic system, not an epistemological subject/object relationship. By semiotic system, I mean that its rational parameters (the triangle) are designed to give insights into the symbol in relation to the object(s) it represents and the interpreter. It is a three-way relationship in which the symbol takes primary importance, rather than having the interpreter (a subject) take primary importance, which would be more of an epistemic evaluation.

This method, then, places more significance on the symbol and less on the object and the interpreter. It is not an epistemic model, though; it's a semiotic model. The shifting of values from one corner to another creates whole new closed systems to analyse, I'd say. As a counter-example, physics places importance on the object, making it Firstness, where the interpreter is only slightly significant, coming into play when problems of method arise.
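A very rough way to picture that "shifting of values", just as an illustrative sketch (nothing in Peirce or Brier is written like this, and the field names are mine): treat the triad as a little record in which one corner is marked primary, so the same three corners yield different closed systems depending on where primacy sits.

from dataclasses import dataclass

@dataclass
class Triad:
    symbol: str       # the representation
    obj: str          # what it represents
    interpreter: str  # whoever or whatever does the interpreting
    primary: str      # the corner the closed system is built around

# Same three corners, different primacy, hence a different system of evaluation:
semiotic_view = Triad("the word 'oak'", "an oak tree", "a reader", primary="symbol")
physical_view = Triad("a measurement", "an oak tree", "a physicist", primary="obj")

print(semiotic_view.primary, "vs", physical_view.primary)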

That's my take on certain aspects of systems anyhow.
 
Bones-O
 
Reply Mon 9 Feb, 2009 01:34 pm
@cypressmoon,
Ah, I see. Sorry, I misinterpreted the symbol 'it'. :)

cypressmoon wrote:

In Peirce's semiotic triad, the symbol is primary: the representation of an object. As one assumes this primacy, the open, chaotic ontological "system" becomes a closed system under evaluative parameters.

So once we focus on the symbol rather than what it means or represents, the open-endedness of what is real and what reality means becomes closed insofar as symbols are inherently and unambiguously comprehensible?

cypressmoon wrote:

The object it represents is, however, in many ways outside the triad. I think the object must take on a purely denotative ("topographical") role, not an ontological one.

What is denotative that is not already symbolic?

cypressmoon wrote:

So, most of the object becomes lost in the system.

i.e. Not all of the object is represented? True enough if that's what you mean.

cypressmoon wrote:

At another corner, the interpreter takes on a role in which certain aspects of it become lost in the evaluative importance of the symbol.

This I struggle with, since it reads as if interpretation is only partially lost when we make the symbol primary - unless you are saying this is a necessary side-effect of interpretation, since we must evaluate the symbol first.

cypressmoon wrote:

The system already contains within it the possibilities of other systems, as well as the possibility of merging those systems through cybersemiotics.

Can you expand on this further?

cypressmoon wrote:

Stereoscopic vision is unattainable. It can only be attained through transrational means, where the focal points merge with the periphery in equilibrium.

And this too if you don't mind. Which sorts of transrational processes do you have in mind, and what does it mean when focal points and periphery attain equilibrium? Ta.

cypressmoon wrote:

What this means, I think, is that meaning is not something that neurologists will find in the brain. It is emergent from neural networks - but these neural networks are a historically continuous process, meaning that there is never a physical gap in the evolution of bodies (and brains).

I certainly agree that what we assign as meaning is relational rather than intrinsic, and that the brain is dynamic in this sense. Are you using evolution here in its strict Darwinian sense? Because I don't see the link.

cypressmoon wrote:

Meaning, then, under this theory is approached through a methodological combination of cybernetics, neurology, thermodynamics, linguistics (particularly Wittgenstein's language games), and semiotics, in a field called Cybersemiotics.

That seems unfair. It should be called Cyberneuroentrolinguisemiotics. :)
 
cypressmoon
 
Reply Thu 9 Apr, 2009 08:38 pm
@cypressmoon,
MJ,

Long time no read. How's the cult over at PF?

What do you think about "otherness"?
 
 

 