#251
March 24th, 2003, 05:41 AM
QuarianRex
Sergeant
Join Date: Feb 2002
Location: Canada
Posts: 346
Re: [OT] Plato's Pub and Philosophical Society

Quote:
Originally posted by Chronon:
Also, I think the human mind is more a pattern of energy than strictly a chemical process. And this pattern is extremely complex and ever-changing.
Actually, the energy component is relatively short-term, about 20 min or so. It is during these 20 min that the looping energy pattern that will become the memory acts as a template for the growth of dendritic spines on the neurons. Further stimulation (through dreams) can lead to growth of actual dendritic branches. Once the growth is finished you have yourself a new memory. Also, the more that memory is activated, the more the dendrites grow, and so the easier it is to access that memory.

Why do we structurally store memories? Because the energy version is too damn fragile. One good bump and it's gone. That is why those who have had head trauma, electric shock, etc. have memory loss (usually of the preceding 20 min).

Also, realize that energy is not transmitted within our brains through electrons. It is done through chemical ions (calcium, potassium, etc.). That is one of the reasons that computers are so fast. They operate through much faster electrons. Ironically, this may be a strike against them ever becoming conscious. Consciousness is very much a temporal event. In fact it only emerges with specific timing (20 msec) and complexity. Computers just might be too fast to gain consciousness.

Thinking of the brain as some kind of crude anchor to which the energy of our mind is tied is somewhat baseless. The structure of our brain is our mind.
__________________
I do not know with what weapons World War III will be fought, but I know that World War IV will be fought with sticks and stones.
-Albert Einstein
#252
March 24th, 2003, 05:51 AM
Kamog
Lieutenant General
Join Date: Nov 2002
Posts: 2,903
Re: [OT] Plato's Pub and Philosophical Society

Hmm, can you explain this a bit more? I didn't quite understand how faster speed can prevent consciousness... If the computer is fast, wouldn't it just have thoughts that happen faster?
#253
March 24th, 2003, 05:58 AM
Suicide Junkie
Shrapnel Fanatic
Join Date: Feb 2001
Location: Waterloo, Ontario, Canada
Posts: 11,451
Re: [OT] Plato's Pub and Philosophical Society

Quote:
Consciousness is very much a temporal event. In fact it only emerges with specific timing (20 msec) and complexity.
That's an interesting statement... Where did you get that from?
#254
March 24th, 2003, 09:08 AM
QuarianRex
Sergeant
Join Date: Feb 2002
Location: Canada
Posts: 346
Re: [OT] Plato's Pub and Philosophical Society

Suicide Junkie:
It was something that was discussed in my neuroscience class last semester. To get much more specific I would have to dig through my notes to find the relevant data (perhaps I'll get around to doing that later, if I have time).

Kamog:
One way to put it is that consciousness is something that has to have the time to consider itself. If a complex task is performed too quickly it doesn't have enough time to display the emergent property of consciousness. It may be that computers are finishing their tasks before they have a chance to be more than their task. A lot of our consciousness is a result of the lingering effects (often self-perpetuated) of a stimulus rather than specifically due to any given stimulus. Sort of like the example I gave of how we encode memory. A stimulus sets up a feedback loop that stimulates the growth of memory. This is further reinforced by dreaming to consolidate the memory. It is from this entire process (and many others) that we derive consciousness. A computer can store data and be done with it, no further activity needed.

This isn't to say that an actual AI is impossible though. Merely that it would be qualitatively different from ours, if possible at all.

Interesting fact: the brain works in binary. Don't believe me? Ok. Each neuron transmits info through pulses down its axon. At any point on the axon there are two possibilities: that it is "spiking" (passing an ion charge) or that it is not. The refractory period of the axon (the minimum time a point takes to "reset" after a burst) is 1 msec. Therefore in 5 msec there are 2^5 possible combinations of 1 and 0. In one sec there are 2^1000. And that is only for a single axon. Multiply that by the number of neurons in the brain and you get an idea of its actual computational power.
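To put rough numbers on that, here is a quick Python sketch (the 1 msec refractory window is the figure above; the neuron count at the end is just a hypothetical ballpark, not a measured value):

Code:
# Quick check of the single-axon arithmetic above, assuming a 1 msec
# refractory period so each millisecond is one binary "slot".

def spike_patterns(window_ms: int) -> int:
    """Distinct on/off firing patterns one axon can produce in window_ms."""
    return 2 ** window_ms

print(spike_patterns(5))               # 2^5 = 32 patterns in 5 msec
print(len(str(spike_patterns(1000))))  # 2^1000 in one second: a 302-digit number

# Independent axons combine multiplicatively, which means the exponents add:
# N axons give (2^1000)^N = 2^(1000 * N) patterns per second.
N = 100_000_000_000  # hypothetical ballpark neuron count, not a measured figure
print(f"roughly 2^{1000 * N} patterns per second across {N} axons")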

The latest generation of computer processors is getting up there. In fact, the tendency of Pentiums and higher to randomly(?) take an odd action, or otherwise show the odd unexplainable bug, may be a precursor to something like an AI awakening.

I have often wondered at how (if) humans and AI's would be able to understand each other. It seems as if we would reach consciousness from opposite ends of the spectrum. Our brains (organic animals) developed as a capacity for action and then eventually evolved memory. AI's would have started as pure memory and then developed the capacity for action. What differences would there be between the products of such different origins? Would we be able to reconcile such differences? Sometimes such questions keep me up at night.
__________________
I do not know with what weapons World War III will be fought, but I know that World War IV will be fought with sticks and stones.
-Albert Einstein
#255
March 24th, 2003, 01:30 PM
dogscoff
General
Join Date: Mar 2001
Location: UK
Posts: 4,245
Re: [OT] Plato's Pub and Philosophical Society

Quote:
If humans have free will, then do dogs have free will? How about fish? Insects? Bacteria? Where do we draw the line? In my opinion, if we say that humans have free will, then all life forms must have it also.
I think there is a line somewhere, albeit a very fuzzy one. It all comes down to the complexity of the brain. Chimps, elephants, dolphins, dogs, cats... these animals and others have a lot more 'human' qualities than are generally ascribed to them. For example, I think some animals definitely have a sense of humour and capacity for human-like emotion. I also think that complexity isn't enough- brain complexity is only potential intelligence/ free-will/ sentience/ self awareness/ whatever we're calling it. You need to fill that complexity with experience and memory for it to become self aware. For that reason I don't think all dogs are necessarily sentient- just the ones which have had sufficient mental stimulation and interaction to become self-aware. Likewise a new-born baby is not a sentient creature- it's just a potentially sentient one. (That doesn't mean I value babies any less than anyone else does, though.) At some point in their development they cross the barrier and become self-aware.

Quote:
If it is possible to arrange a collection of atoms in such a way as to have free will (as in a human brain), then in theory it must also be possible to construct a machine that has free will.
I think we will see AIs in our lifetimes. I agree with Quarian in that they will have to operate in vastly different ways to us. The difficulty will not be building the "brain" - the potential intelligence - but in filling that brain with useful experience and interaction. Until we can build a viable robotic body for an AI, it will exist inside some static mainframe-box and will have to get all its experiences second hand (through encyclopedias, the web, media, conversation with humans, etc.) or via virtual simulations. Either way, it will be very difficult for the emergent AI to relate to humans, because their store of experiences will be so different to ours. Also, early AI will not have any need for motivation and so will not be given any- this will make them even more alien to creatures like us that are driven by biological and societal motivations. Later AIs, especially more mobile ones, will be given desires and drives- self preservation, empathy, the desire to achieve, to learn, etc.

This will eventually make them easier for us to accept, but the first few years of human/ AI relations will be very difficult. People will fear AIs as a threat (I can see the "Frankenstein" headlines in parts of the British press now), and their initial alien-ness will mean people either refuse to accept their intelligence and treat them as dumb machines (effectively consigning them to slavery) or block further development in AI tech, or both.

I would like to see human rights organisations pre-empt AI technology by defining NOW what constitutes an artificial intelligence for the purposes of assigning it certain rights and protections. Unfortunately I don't think this is likely to happen, and AIs will be used as cheap labour, no doubt programmed to obey (like in Asimov's second law of robotics: "A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.")

Once we are able to put AIs in human-like bodies, they will be able to gather their experiences in much the same way that we do; they will be much easier for us to relate to, and (some) humans will be able to accept their status and sympathise with them. Then the struggle for AI rights will begin, with economic interests trying to keep them in chains. However, I doubt this will manifest itself in the kind of Terminator 2-style apocalypse postulated by the likes of blatant self-publicist Kevin Warwick, because AIs will be fundamentally safe: although Asimov's positronic brain and three laws are really pure technobabble, I'm sure human fears will ensure that some kind of coded inhibition against anti-social behaviour is implemented.

Which brings us back around to free will...

[ March 24, 2003, 11:35: Message edited by: dogscoff ]
#256
March 24th, 2003, 03:55 PM
Suicide Junkie
Shrapnel Fanatic
Join Date: Feb 2001
Location: Waterloo, Ontario, Canada
Posts: 11,451
Re: [OT] Plato's Pub and Philosophical Society

Quote:
One way to put it is that consciousness is something that has to have the time to consider itself. If a complex task is performed too quickly it doesn't have enough time to display the emergent property of consciousness. It may be that computers are finishing their tasks before they have a chance to be more than their task.
It sounds to me like the speed is not the problem here. Doing it faster means more time left over. The problem would be the narrow-mindedness, where the computer immediately stops thinking about all of the information it just processed in order to concentrate 100% on the next problem. Even if that problem is just watching the clock and waiting for input.

I'm not sure how you'd go about coding it to think about what it's doing... some set of parallel processors inspecting the incoming codes, perhaps an evolutionary programming system where it takes the majority decision of the currently top-ranked algorithms. (Ranked via various needs sensors, and perhaps a pair of "good bot"/"bad bot" social buttons on the front)
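Something along these lines, maybe (a toy Python sketch of just the ranked-majority-vote part; every name here is invented, the "needs sensors" collapse into a single +1/-1 signal, and the evolutionary step of breeding new candidate algorithms is left out entirely):

Code:
import random

# Toy sketch of the "majority decision of the currently top-ranked algorithms"
# idea: a pool of candidate routines, ranked by feedback, with each decision
# taken by a weighted vote of the current top performers.

def cautious(obs):
    return "wait"

def eager(obs):
    return "act"

def contrary(obs):
    return "act" if obs < 0 else "wait"

# Candidate routine -> current rank score (all hypothetical).
ranks = {cautious: 1.0, eager: 1.0, contrary: 1.0}

def decide(obs, top_k=2):
    """Weighted majority vote among the top_k ranked candidates."""
    voters = sorted(ranks, key=ranks.get, reverse=True)[:top_k]
    votes = {}
    for algo in voters:
        choice = algo(obs)
        votes[choice] = votes.get(choice, 0.0) + ranks[algo]
    return max(votes, key=votes.get), voters

def feedback(voters, signal):
    """Stand-in for the 'good bot' (+1) / 'bad bot' (-1) buttons:
    nudge the rank of every routine that took part in the vote."""
    for algo in voters:
        ranks[algo] = max(0.1, ranks[algo] + 0.2 * signal)

obs = random.uniform(-1.0, 1.0)
action, voters = decide(obs)
feedback(voters, +1)  # pretend someone pressed "good bot"
print(action, {f.__name__: round(s, 2) for f, s in ranks.items()})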
#257
March 24th, 2003, 10:49 PM
QuarianRex
Sergeant
Join Date: Feb 2002
Location: Canada
Posts: 346
Re: [OT] Plato's Pub and Philosophical Society

Quote:
Originally posted by dogscoff:
I think there is a line somewhere, albeit a very fuzzy one. It all comes down to the complexity of the brain. Chimps, elephants, dolphins, dogs, cats... these animals and others have a lot more 'human' qualities than are generally ascribed to them. For example, I think some animals definitely have a sense of humour and capacity for human-like emotion. I also think that complexity isn't enough- brain complexity is only potential intelligence/ free-will/ sentience/ self awareness/ whatever we're calling it. You need to fill that complexity with experience and memory for it to become self aware. For that reason I don't think all dogs are necessarily sentient- just the ones which have had sufficient mental stimulation and interaction to become self-aware. Likewise a new-born baby is not a sentient creature- it's just a potentially sentient one. (That doesn't mean I value babies any less than anyone else does, though.) At some point in their development they cross the barrier and become self-aware.
Here's an interesting note on self-awareness. Self-awareness seems to be tied to longevity. People who are mentally challenged tend to score very low on scales of self-awareness. This seems to be tied to a lack of fourth and fifth order dendritic arborization (dendrite branches growing on branches, growing on branches, growing on branches, etc.). The interesting correlate here is that they tend to have shortened lifespans, typically dying in their late forties or fifties.

This is interestingly mirrored in the case of feral children (those raised by animals). Ferals tend to show a similar deficiency in self-awareness and lack of fourth and fifth order dendrite growth, even after extensive cultural assimilation and education. And guess what? Ferals tend to die in their forties and fifties as well.

Interesting, is it not?
__________________
I do not know with what weapons World War III will be fought, but I know that World War IV will be fought with sticks and stones.
-Albert Einstein
#258
March 24th, 2003, 11:06 PM
QuarianRex
Sergeant
Join Date: Feb 2002
Location: Canada
Posts: 346
Re: [OT] Plato's Pub and Philosophical Society

Quote:
Originally posted by Suicide Junkie:
quote:
One way to put it is that consciousness is something that has to have the time to consider itself. If a complex task is performed too quickly it doesn't have enough time to display the emergent property of consciousness. It may be that computers are finishing their tasks before they have a chance to be more than their task.
It sounds to me like the speed is not the problem here. Doing it faster means more time left over. The problem would be the narrow-mindedness, where the computer immediately stops thinking about all of the information it just processed in order to concentrate 100% on the next problem. Even if that problem is just watching the clock and waiting for input.

I'm not sure how you'd go about coding it to think about what it's doing... some set of parallel processors inspecting the incoming codes, perhaps an evolutionary programming system where it takes the majority decision of the currently top-ranked algorithms. (Ranked via various needs sensors, and perhaps a pair of "good bot"/"bad bot" social buttons on the front)

I think it has a lot to do with the way in which we each encode information (humans and computers). Humans don't just store information, we store our interpretation of information. That filtering process is part of what gives us consciousness. Computers can just store data whole cloth, no need for interpretation. I don't think that finding a way for computers to mimic our encoding process is the answer to creating an AI. For AI's, an entirely different process would have to be discovered, one taking into account such fundamental differences.
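Roughly the difference between these two toy stores (a made-up Python sketch for illustration only, not a model of brains or of real databases):

Code:
# Toy contrast between storing data "whole cloth" and storing an
# interpretation of it. The interpretation is filtered through prior state,
# so two observers fed the same input end up with different memories.

from dataclasses import dataclass, field

@dataclass
class VerbatimStore:
    records: list = field(default_factory=list)

    def store(self, event: str):
        self.records.append(event)          # exact copy, no processing

@dataclass
class InterpretingStore:
    context: str                            # prior state colours the encoding
    traces: list = field(default_factory=list)

    def store(self, event: str):
        keywords = [w for w in event.split() if len(w) > 4]   # crude filter
        self.traces.append((self.context, keywords))          # lossy trace

disk = VerbatimStore()
optimist = InterpretingStore(context="optimist")
pessimist = InterpretingStore(context="pessimist")

event = "the experiment failed but produced surprising partial results"
for store in (disk, optimist, pessimist):
    store.store(event)

print(disk.records)        # the full sentence, byte for byte
print(optimist.traces)     # lossy, tagged with the observer's prior context
print(pessimist.traces)    # same input, different context attached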

As for the 20 msec time frame, we have processes within us that happen both faster and slower, but it is only those that occur at ~20 msec that produce/are a part of/define consciousness. If computers can achieve consciousness it will most likely be in a very different timeframe, perhaps one in which we will be unable to recognize their awakening.
__________________
I do not know with what weapons World War III will be fought, but I know that World War IV will be fought with sticks and stones.
-Albert Einstein
#259
March 24th, 2003, 11:33 PM
Fyron
Shrapnel Fanatic
Join Date: Jul 2001
Location: Southern CA, USA
Posts: 18,394
Re: [OT] Plato's Pub and Philosophical Society

Quote:
and perhaps a pair of "good bot"/"bad bot" social buttons on the front)
ROFLOL!
__________________
It's not whether you win or lose that counts: it's how much pain you inflict along the way.
--- SpaceEmpires.net --- RSS --- SEnet ModWorks --- SEIV Modding 101 Tutorial
--- Join us in the #SpaceEmpires IRC channel on the Freenode IRC network.
--- Due to restrictively low sig limits, you must visit this link to view the rest of my signature.
#260
March 25th, 2003, 08:21 PM
QuarianRex
Sergeant
Join Date: Feb 2002
Location: Canada
Posts: 346
Re: [OT] Plato's Pub and Philosophical Society

Has anyone ever read the Hyperion series by Dan Simmons? It has an interesting account of the development of AI's, especially in the last two books (Endymion and Rise of Endymion). It sees them basically as viruses that gained sentience through parasitic consumption of their brethren. This had some interesting implications for their group psychology and for their interaction with humans.
__________________
I do not know with what weapons World War III will be fought, but I know that World War IV will be fought with sticks and stones.
-Albert Einstein