Alan McDougall;72567 wrote:
Like germinating seeds, this global brain continues to evolve and as some forward-thinkers believe, will not stop until it develops feelings and achieves consciousness.
Your post brings up a lot of issues that need to be discussed. I will try to focus on not too many at a time.
First of all let me say this:
Any calculation of how long it will take an entity to become conscious is simply ridiculous as long as there is no valid definition of consciousness (nothing against you, but against that 'forward-thinker' who came up with this hilarious number of 2030).
Let me share an important experience: in the early nineties I heard computer scientists say on the radio that computers would be able to talk like humans within a couple of years (meaning long before 2000). This assumption was based on the continuously accelerating doubling rate of processing power (Moore's law about that doubling rate had long since been outpaced; the actual development was even faster than what Moore predicted).
In simpler words, computer speed increased at such impressive rates that people thought, "Look, in a couple of years computers will talk just like you and I do."
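It is easy to see why steady doubling seduces people into such predictions. A toy sketch (the doubling period and the numbers are purely illustrative, not historical data):

```python
# Toy illustration of the exponential growth behind such predictions:
# if capacity doubles every two years, a decade multiplies it by 2**5 = 32.
# Nothing here is measured data - it only shows how fast steady doubling runs.

def growth_factor(years, doubling_period=2.0):
    """Factor by which capacity grows after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # 32.0  - one decade
print(growth_factor(20))  # 1024.0 - two decades
```

Watching curves like that, the jump to "and soon they will talk" felt natural. The mistake was assuming language sits on the same curve.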
What they didn't see is that computer scientists tend to be half blindfolded when they make predictions.
Nowadays we have proof of how ridiculous this idea was, because computers still cannot understand language well enough even to translate a text from one language into another halfway decently. Have you ever tried it? Try it; the result will even rescue a party that is starting to get boring.
The prediction those 'scientists' made has proven wrong, but it could actually have been refuted back then already, had they taken the more philosophical aspects of language into account.
John Searle had already collected some very important points about this problem in his Chinese Room thought experiment.
Language, according to his conclusion, is a major component of 'mind'.
Using language requires an extremely highly developed level of 'understanding'.
Computer guys, of course, don't usually deal with such esoteric terms as 'understanding'.
I have talked to several people who can certainly be considered 'intellectual' and who were absolutely convinced that a dictionary combined with the rules of grammar fully enables a computer to speak (that is, to respond reasonably).
If the computer won't speak, they assume, the program just needs to become more complex.
They consider the substance of words combined with the logic of grammar to be the basis of language. This is a typical 'age of enlightenment' world view, in line with the mechanistic picture that is still much more present in critical minds than we expect.
From my point of view it looks like this: grammar is a horizontal logic that can be applied to the substance of words, but it is only functional together with the vertical logic of semantics, which also needs to be applied.
And moving from the horizontal logic of grammar to the vertical logic of semantics adds a new dimension, which increases the complexity of the system just as when you change your perspective from a two-dimensional to a three-dimensional system.
It's an explosion.
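To make that concrete, here is a minimal sketch (with an invented toy vocabulary of my own) of what 'dictionary plus grammar' actually buys you: every output is syntactically well-formed, yet the program has no way whatsoever to tell sense from nonsense:

```python
import random

# Minimal sketch: a dictionary plus one grammar rule (S -> Det Adj N V Adv)
# produces only syntactically well-formed strings. Whether the result
# *means* anything is invisible to the program; the word lists are invented.
random.seed(0)

lexicon = {
    "Det": ["the", "a"],
    "Adj": ["colorless", "green", "rapid"],
    "N":   ["ideas", "tables", "storms"],
    "V":   ["sleep", "argue", "melt"],
    "Adv": ["furiously", "quietly"],
}

def generate(rule=("Det", "Adj", "N", "V", "Adv")):
    """Apply the grammar rule by picking a random word per category."""
    return " ".join(random.choice(lexicon[cat]) for cat in rule)

print(generate())  # grammatical every time - but is it meaningful?
```

The horizontal logic is fully satisfied; the vertical logic of semantics never enters the program at all.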
Why am i talking about language, if your question was not about language at all?
Look, it's the same with consciousness. The guy who wrote that sermon you posted seems to believe, just like the guys who expected computers to talk, that an increase in processed information will cause consciousness to arise.
This is ridiculous. Honestly, I do not believe that consciousness is a gift from God, but I do see that consciousness is way too complex to be accidentally created by the mix of information, opinions, services and porn that we find on the internet.
It's the same as the language case.
People see (actually existing) parallels between the way neurons function and the way search engines develop their (neuronal) weights.
Neural nets, however, are not anything mystical; they are used as computer programs for particular problems.
But they are not going to become conscious, any more than computers talk like humans.
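For anyone curious what those 'weights' are in the engineering sense, here is a minimal sketch of a single artificial neuron learning an AND gate with the classic perceptron rule (the training data and numbers are my own, chosen for brevity); nothing in it is more mysterious than arithmetic:

```python
# A "neural" unit in the engineering sense: a weighted sum pushed through
# a threshold, with weights nudged by a simple learning rule. The AND gate
# is chosen only as a minimal example - the point is that the "learning"
# is plain arithmetic, nothing mystical.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Classic perceptron rule: nudge weights toward correct outputs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

and_gate = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_gate)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in and_gate])  # [0, 0, 0, 1]
```

It 'learns', in a narrow and useful sense, and such programs genuinely solve particular problems. But there is no step in that loop where understanding could sneak in.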
How conscious is the computer that says "Press one if you want to talk to a human person"?
How much does it understand of what it says?
As long as you cannot have a reasonable conversation with a computer, you should laugh at anyone who tries to sell you a calculation of how long it will take for computers to become conscious.
Those people are not even able to define what they mean when they say 'consciousness'.