
Consciousness

jewelessien
Posts: 155
3/6/2014 3:53:05 AM
Posted: 2 years ago
Once again I'm tired of evolution and intelligent design and waiting to see how fast a science thread breaks down into an argument on religion.

So here is a question from Iredia: How does a fully physical, jellied mushroom in your head think or feel?

Electrochemistry? Emergent properties? Mind over matter? Anyone?
Everything is up for questioning. If it won't defend itself, then how do we know it can?
chui
Posts: 511
3/6/2014 6:33:57 AM
Posted: 2 years ago
Perhaps consciousness is an illusion? On the other hand it may be purely a product of the biochemistry of the brain. A third option is something 'other' that science cannot talk about because it lacks the tools. To get completely weird, maybe we live in a virtual universe and everything is an illusion?
jewelessien
Posts: 155
3/6/2014 7:48:03 AM
Posted: 2 years ago
At 3/6/2014 6:33:57 AM, chui wrote:
Perhaps consciousness is an illusion? On the other hand it may be purely a product of the biochemistry of the brain. A third option is something 'other' that science cannot talk about because it lacks the tools. To get completely weird, maybe we live in a virtual universe and everything is an illusion?

That last option reminds me of one of the stories in David Eagleman's Sum. In it, humans are actually the nodes - sort of like transistors - in this giant galactic supercomputer: we pass data along through random "unconscious" tics - tapping a foot, scratching an arm, blinking. But the Programmers are confused because somehow we think we're the bees' knees and we actually try to keep ourselves alive for them. Lol.

Seriously though, consciousness could be a by-product of biochemistry so complex that it can't help thinking about itself...
Everything is up for questioning. If it won't defend itself, then how do we know it can?
Skepticalone
Posts: 6,132
3/6/2014 8:47:11 AM
Posted: 2 years ago
At 3/6/2014 3:53:05 AM, jewelessien wrote:
Once again I'm tired of evolution and intelligent design and waiting to see how fast a science thread breaks down into an argument on religion.

So here is a question from Iredia: How does a fully physical, jellied mushroom in your head think or feel?

Electrochemistry? Emergent properties? Mind over matter? Anyone?

I will go out on a limb and say consciousness is the product of biochemistry and an emergent property of the physical brain. We can show that consciousness can be affected by mind-altering drugs and brain damage. *notices thread may be fraying*
This thread is like eavesdropping on a conversation in a mental asylum. - Bulproof

You can call your invisible friends whatever you like. - Desmac

What the hell kind of coked up sideshow has this thread turned into. - Casten
jewelessien
Posts: 155
3/6/2014 9:05:05 AM
Posted: 2 years ago
At 3/6/2014 8:47:11 AM, Skepticalone wrote:
At 3/6/2014 3:53:05 AM, jewelessien wrote:
Once again I'm tired of evolution and intelligent design and waiting to see how fast a science thread breaks down into an argument on religion.

So here is a question from Iredia: How does a fully physical, jellied mushroom in your head think or feel?

Electrochemistry? Emergent properties? Mind over matter? Anyone?

I will go out on a limb and say consciousness is the product of biochemistry and an emergent property of the physical brain. We can show that consciousness can be affected by mind-altering drugs and brain damage. *notices thread may be fraying*

Extreme analogy: we can show the working of a computer processor to be affected by dunking it in hot coffee. Does that necessarily mean that hot coffee affects the algorithms by which a processor works?
And mostly it is subconscious processes that are affected by mind-altering drugs and stuff.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
Skepticalone
Posts: 6,132
3/6/2014 9:14:59 AM
Posted: 2 years ago
At 3/6/2014 9:05:05 AM, jewelessien wrote:
At 3/6/2014 8:47:11 AM, Skepticalone wrote:
At 3/6/2014 3:53:05 AM, jewelessien wrote:
Once again I'm tired of evolution and intelligent design and waiting to see how fast a science thread breaks down into an argument on religion.

So here is a question from Iredia: How does a fully physical, jellied mushroom in your head think or feel?

Electrochemistry? Emergent properties? Mind over matter? Anyone?

I will go out on a limb and say consciousness is the product of biochemistry and an emergent property of the physical brain. We can show that consciousness can be affected by mind-altering drugs and brain damage. *notices thread may be fraying*

Extreme analogy: we can show the working of a computer processor to be affected by dunking it in hot coffee. Does that necessarily mean that hot coffee affects the algorithms by which a processor works?
And mostly it is subconscious processes that are affected by mind-altering drugs and stuff.

I'm not sure I understand how the coffee analogy applies to consciousness. I'm not trying to be difficult - I have a lot of ignorance on this subject. Also, would the subconsciousness be just another 'processor' in the brain? 'Splain it to me, Jewel!
This thread is like eavesdropping on a conversation in a mental asylum. - Bulproof

You can call your invisible friends whatever you like. - Desmac

What the hell kind of coked up sideshow has this thread turned into. - Casten
jewelessien
Posts: 155
3/6/2014 9:22:10 AM
Posted: 2 years ago
At 3/6/2014 9:14:59 AM, Skepticalone wrote:
At 3/6/2014 9:05:05 AM, jewelessien wrote:
Extreme analogy: we can show the working of a computer processor to be affected by dunking it in hot coffee. Does that necessarily mean that hot coffee affects the algorithms by which a processor works?
And mostly it is subconscious processes that are affected by mind-altering drugs and stuff.

I'm not sure I understand how the coffee analogy applies to consciousness. I'm not trying to be difficult - I have a lot of ignorance on this subject. Also, would the subconsciousness be just another 'processor' in the brain? 'Splain it to me, Jewel!

*cracks knuckles*
Basically, I meant that the processor - the brain - is affected by its physical environment. But the algorithms and assembly code by which the processor works - the unconscious mind - are not the product of or particularly influenced by physical environments, though they are the product of the processor's wiring. The conscious mind is akin to an operating system written in a high level programming language. Its design and driving principles have almost nothing to do with hardware, but it only exists through the medium of the processor, and so it's indirectly affected by the state of the processor. But it would be absurd to conclude that the operating system or even the assembly code arose as a result of the processor's wiring.
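
A minimal sketch of that layering in Python (purely illustrative; the greet function is a made-up example, and nothing here is a claim about how brains actually work). The high-level source plays the role of the "design and driving principles"; dis shows the lower-level, processor-facing instructions it actually runs as.

# Illustrative only: a high-level function versus the lower-level,
# "assembly-like" bytecode it compiles down to.
import dis

def greet(name):
    # High-level intent: build a greeting. Nothing here mentions registers,
    # stack slots, or silicon - the hardware is abstracted away.
    return "Hello, " + name

print(greet("world"))   # the behaviour we actually care about
dis.dis(greet)          # the lower-level instruction stream underneath it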
Everything is up for questioning. If it won't defend itself, then how do we know it can?
v3nesl
Posts: 4,505
3/6/2014 9:25:20 AM
Posted: 2 years ago
At 3/6/2014 3:53:05 AM, jewelessien wrote:
Once again I'm tired of evolution and intelligent design and waiting to see how fast a science thread breaks down into an argument on religion.

So here is a question from Iredia: How does a fully physical, jellied mushroom in your head think or feel?

Electrochemistry? Emergent properties? Mind over matter? Anyone?

I think thought is something non-physical. It is hosted by the physical in our case. Our brains are the medium, but not the source.

I learned the basic philosophic argument from C.S. Lewis, at the beginning of his book "Miracles". He notes how we dismiss ideas that are thought to have an identifiable cause: "You only say that because you're a girl". "You only believe that because you grew up in church". So we have the idea that valid thought is something apart from strict material cause and effect. A physical brain is just a machine, something different from what we think thought is.

People will bring up quantum mechanics as a source for independent thought, the seemingly random effects of QM. But that doesn't really work either - thoughts without any basis are insanity, so that's not what we're talking about either. Thought must have a basis, just not exclusively a material one. We think that we can think *about* the material world, and have true thoughts.

And btw, saying thought is an illusion is a Möbius strip. If it's an ill-lusion, that presumes we can have true perception. So we can have thought, but it's false thoughts about the ability to have true thoughts.... It goes nowhere, that idea.
This space for rent.
v3nesl
Posts: 4,505
3/6/2014 9:29:35 AM
Posted: 2 years ago
At 3/6/2014 9:22:10 AM, jewelessien wrote:
At 3/6/2014 9:14:59 AM, Skepticalone wrote:
At 3/6/2014 9:05:05 AM, jewelessien wrote:
Extreme analogy: we can show the working of a computer processor to be affected by dunking it in hot coffee. Does that necessarily mean that hot coffee affects the algorithms by which a processor works?
And mostly it is subconscious processes that are affected by mind-altering drugs and stuff.

I'm not sure I understand how the coffee analogy applies to consciousness. I'm not trying to be difficult - I have a lot of ignorance on this subject. Also, would the subconsciousness be just another 'processor' in the brain? 'Splain it to me, Jewel!

*cracks knuckles*
Basically, I meant that the processor - the brain - is affected by its physical environment. But the algorithms and assembly code by which the processor works - the unconscious mind - are not the product of or particularly influenced by physical environments, though they are the product of the processor's wiring. The conscious mind is akin to an operating system written in a high level programming language. Its design and driving principles have almost nothing to do with hardware, but it only exists through the medium of the processor, and so it's indirectly affected by the state of the processor. But it would be absurd to conclude that the operating system or even the assembly code arose as a result of the processor's wiring.

Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. And that's where my head explodes.
This space for rent.
chui
Posts: 511
3/6/2014 10:30:12 AM
Posted: 2 years ago
At 3/6/2014 7:48:03 AM, jewelessien wrote:
At 3/6/2014 6:33:57 AM, chui wrote:
Perhaps consciousness is an illusion? On the other hand it may be purely a product of the biochemistry of the brain. A third option is something 'other' that science cannot talk about because it lacks the tools. To get completely weird, maybe we live in a virtual universe and everything is an illusion?

That last option reminds me of one of the stories in David Eagleman's Sum. In it, humans are actually the nodes - sort of like transistors - in this giant galactic supercomputer: we pass data along through random "unconscious" tics - tapping a foot, scratching an arm, blinking. But the Programmers are confused because somehow we think we're the bees' knees and we actually try to keep ourselves alive for them. Lol.

Seriously though, consciousness could be a by-product of biochemistry so complex that it can't help thinking about itself...

I remember reading in New Scientist an article about a researcher who was actually looking for evidence that the universe is artificial! It had something to do with high energy cosmic rays showing some sort of symmetry breaking.

Being somewhat of a materialist I am drawn to the idea of consciousness being illusory, that there is nothing beyond perception and memory. However it is difficult to dismiss consciousness altogether when confronted with the amazing intellectual achievements of humanity.

The brain appears to be constructed on relatively simple terms as just a collection of neurons, or am I being ignorant here? However, chaos theory has shown how remarkable complexity can come from the relatively simple, e.g. the Mandelbrot set.

I also find it interesting to consider the idea of non-human consciousness. Are other animals conscious and if so are there levels of consciousness or is it discrete as in conscious or not conscious? My instincts tell me some are and to various degrees of consciousness but I have no evidence to base that on.

With advances in neuroscience would it ever be possible in the future to connect one person directly to another so that we could experience life from their view? Would that be desirable or ethical?
Skepticalone
Posts: 6,132
3/6/2014 11:00:01 PM
Posted: 2 years ago
At 3/6/2014 9:22:10 AM, jewelessien wrote:
At 3/6/2014 9:14:59 AM, Skepticalone wrote:
At 3/6/2014 9:05:05 AM, jewelessien wrote:
Extreme analogy: we can show the working of a computer processor to be affected by dunking it in hot coffee. Does that necessarily mean that hot coffee affects the algorithms by which a processor works?
And mostly it is subconscious processes that are affected by mind-altering drugs and stuff.

I'm not sure I understand how the coffee analogy applies to consciousness. I'm not trying to be difficult - I have a lot of ignorance on this subject. Also, would the subconsciousness be just another 'processor' in the brain? 'Splain it to me, Jewel!

*cracks knuckles*
Basically, I meant that the processor - the brain - is affected by its physical environment. But the algorithms and assembly code by which the processor works - the unconscious mind - are not the product of or particularly influenced by physical environments, though they are the product of the processor's wiring. The conscious mind is akin to an operating system written in a high level programming language. Its design and driving principles have almost nothing to do with hardware, but it only exists through the medium of the processor, and so it's indirectly affected by the state of the processor. But it would be absurd to conclude that the operating system or even the assembly code arose as a result of the processor's wiring.

Okay, first let me clarify my understanding of your position (remember my ignorance). You believe the mind is not physical even though it is linked to the physical brain?
This thread is like eavesdropping on a conversation in a mental asylum. - Bulproof

You can call your invisible friends whatever you like. - Desmac

What the hell kind of coked up sideshow has this thread turned into. - Casten
jewelessien
Posts: 155
3/7/2014 3:29:05 AM
Posted: 2 years ago
At 3/6/2014 11:00:01 PM, Skepticalone wrote:
At 3/6/2014 9:22:10 AM, jewelessien wrote:
Basically, I meant that the processor - the brain - is affected by its physical environment. But the algorithms and assembly code by which the processor works - the unconscious mind - are not the product of or particularly influenced by physical environments, though they are the product of the processor's wiring. The conscious mind is akin to an operating system written in a high level programming language. Its design and driving principles have almost nothing to do with hardware, but it only exists through the medium of the processor, and so it's indirectly affected by the state of the processor. But it would be absurd to conclude that the operating system or even the assembly code arose as a result of the processor's wiring.

Okay, first let me clarify my understanding of your position (remember my ignorance). You believe the mind is not physical even though it is linked to the physical brain?

That depends on how you define "physical", and what the alternative to "physical" is.

To continue with my analogy: is the operating system a "physical" thing in the sense of a visible, tactile thing? No, it's not. But you could hardly call it "supernatural". I think consciousness (and subconsciousness to a degree) is something intangible, although it is expressed in (and is also the expression of) tangible electrochemical processes in the brain.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
jewelessien
Posts: 155
3/7/2014 3:29:22 AM
Posted: 2 years ago
At 3/6/2014 9:29:35 AM, v3nesl wrote:
Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. At that's where my head explodes.

Actually the consciousness would be the operating system, riding on the CPU (processor, internal memory, arithmetic and logic unit, etc corresponding to different brain areas). Our bodies would be the rest of the computer: "sensory" input from the keyboard/touchpad/webcam, "motor" output to the screen/printers, and so on. A computer can function (albeit with very tight limits) without an operating system - we can run on our subconscious, although the experience is very far from what we'd want.
Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)
Everything is up for questioning. If it won't defend itself, then how do we know it can?
jewelessien
Posts: 155
3/7/2014 3:29:27 AM
Posted: 2 years ago
At 3/6/2014 10:30:12 AM, chui wrote:
I remember reading in New Scientist an article about a researcher who was actually looking for evidence that the universe is artificial! It had something to do with high energy cosmic rays showing some sort of symmetry breaking.

What, like we're actually just a giant Sim City in some multiverse computer?
OH GOD THAT'S WHAT HAPPENED TO THE DINOSAURS SOMEONE PULLED THE PLUG BY MISTAKE

Being somewhat of a materialist I am drawn to the idea of consciousness being illusory, that there is nothing beyond perception and memory. However it is difficult to dismiss consciousness altogether when confronted with the amazing intellectual achievements of humanity.

Beyond perception and memory, there is logical reasoning - if A then B - as well as the ability to comprehend abstract concepts and connect two superficially dissimilar things (intelligence, basically). When you toss those into the mix, it's not surprising that consciousness, however illusory, would arise.

The brain appears to be constructed on relatively simple terms as just a collection of neurons, or am I being ignorant here? However, chaos theory has shown how remarkable complexity can come from the relatively simple, e.g. the Mandelbrot set.

No actually, the brain is much more than a collection of neurons. They're connected in a 3D network so complex that dozens of supercomputers working in tandem are unable to replicate it - and the brain manages to run that in the relatively tiny space of your skull while using less power than the average light bulb. The associated glial cells, from microglia to astrocytes, have functions that neurology hasn't even scratched the surface of. And the neuron itself is several orders of magnitude more efficient and advanced than the basic transistor, IMHO.

I also find it interesting to consider the idea of non-human consciousness. Are other animals conscious and if so are there levels of consciousness or is it discrete as in conscious or not conscious? My instincts tell me some are and to various degrees of consciousness but I have no evidence to base that on.

I think consciousness and conscious thought run on a spectrum - there is a huge range between thinking of yourself in the abstract and spinning out works like Plato's or Socrates'. And I'm a firm believer in non-human consciousness. Here's a thread I started on that (specifically addressing dolphins and other cetaceans): http://www.debate.org...
Just read through, no need to resurrect it!

With advances in neuroscience would it ever be possible in the future to connect one person directly to another so that we could experience life from their view? Would that be desirable or ethical?

To continue with my computer analogy: once it has been installed, the operating system becomes so deeply married to that particular computer system that they become practically non-portable. It's why you can't just copy the Windows folder from your friend's laptop and expect Windows to run properly on yours. That having been said, if you take out the entire hard disk and mount it in another, compatible laptop (same processor architecture, etc) the operating system would run as normal.
Now consider that the consciousness isn't installed on a "hard disk", but on the actual freaking processor. I don't know about you, but I'm not very open to people taking out crucial portions of my brain and mounting them in another person's head.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
v3nesl
Posts: 4,505
3/7/2014 8:21:06 AM
Posted: 2 years ago
At 3/7/2014 3:29:22 AM, jewelessien wrote:
At 3/6/2014 9:29:35 AM, v3nesl wrote:
Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. And that's where my head explodes.

Actually the consciousness would be the operating system, riding on the CPU

Yeah, but the analogy falls down in that an operating system is not aware. An OS programmer isn't even generally aware of all the computers his creation is copied onto. So the answer doesn't lie down this path. It's a great analogy of how intelligence can be distinct from medium, but doesn't help us know how our particular sort of intelligence interacts with our brains.

Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)

Eh, there's lots of clues from computer science, but they are quite different sorts of machines. Neurons are quite different from transistors. And, as I say, computers offer zero insight into consciousness, in my opinion. They are, at the end of the day, just machines, no closer to sentience than the bi-metallic gadgets that 'know' how to turn your blinkers on and off when you want to change lanes (and you all use your blankety-blank blinkers when changing lanes, right? You're in a tin can going 70 miles an hour, signal your intentions, dammit!)
This space for rent.
jewelessien
Posts: 155
3/7/2014 8:33:35 AM
Posted: 2 years ago
At 3/7/2014 8:21:06 AM, v3nesl wrote:
At 3/7/2014 3:29:22 AM, jewelessien wrote:
Actually the consciousness would be the operating system, riding on the CPU

Yeah, but the analogy falls down in that an operating system is not aware. An OS programmer isn't even generally aware of all the computers his creation is copied onto. So the answer doesn't lie down this path. It's a great analogy of how intelligence can be distinct from medium, but doesn't help us know how our particular sort of intelligence interacts with our brains.

The analogy follows because computers were modeled on human brains. Already we are taking baby steps towards true AI - take Siri, for example, or Deep Blue (the supercomputer that's a Master at chess) - and I don't think it will take so many decades before we can develop computer systems that are aware of themselves (though that would open the door for Armageddon).

Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)

Eh, there's lots of clues from computer science, but they are quite different sorts of machines. Neurons are quite different from transistors. And, as I say, computers offer zero insight into consciousness, in my opinion. They are, at the end of the day, just machines, no closer to sentience than the bi-metallic gadgets that 'know' how to turn your blinkers on and off when you want to change lanes (and you all use your blankety-blank blinkers when changing lanes, right? You're in a tin can going 70 miles an hour, signal your intentions, dammit!)

Neurons are what transistors wish they could be. They work on the same basic principle. As I said, computers were modeled on humans. And if neuroscientists took their insights into the brain and put that to work on computer science - organic computing, to be specific - perhaps we'd make important breakthroughs in both fields much earlier.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
slo1
Posts: 4,361
3/7/2014 8:36:38 AM
Posted: 2 years ago
At 3/7/2014 3:29:22 AM, jewelessien wrote:
At 3/6/2014 9:29:35 AM, v3nesl wrote:
Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. And that's where my head explodes.

Actually the consciousness would be the operating system, riding on the CPU (processor, internal memory, arithmetic and logic unit, etc corresponding to different brain areas). Our bodies would be the rest of the computer: "sensory" input from the keyboard/touchpad/webcam, "motor" output to the screen/printers, and so on. A computer can function (albeit with very tight limits) without an operating system - we can run on our subconscious, although the experience is very far from what we'd want.
Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)

Nice and tidy indeed, but the computer analogy is just that, and maybe nothing like how a brain functions. There are more effects happening in the brain than what is going on in a computer. The challenge for computer science is to match the brain's computing power, which it is far behind. I would conclude as a result that computer folk should study the brain. Not to be difficult or anything.
v3nesl
Posts: 4,505
3/7/2014 9:00:32 AM
Posted: 2 years ago
At 3/7/2014 8:33:35 AM, jewelessien wrote:
At 3/7/2014 8:21:06 AM, v3nesl wrote:
At 3/7/2014 3:29:22 AM, jewelessien wrote:
Actually the consciousness would be the operating system, riding on the CPU

Yeah, but the analogy falls down in that an operating system is not aware. An OS programmer isn't even generally aware of all the computers his creation is copied onto. So the answer doesn't lie down this path. It's a great analogy of how intelligence can be distinct from medium, but doesn't help us know how our particular sort of intelligence interacts with our brains.

The analogy follows because computers were modeled on human brains.

Well, no, they weren't, at all. They were fancy abacuses at first, AI is an afterthought. ENIAC was a military targeting calculator, if I remember correctly.

Already we are taking baby steps towards true AI - take Siri, for example, or Deep Blue (the supercomputer that's a Master at chess) - and I don't think it will take so many decades before we can develop computer systems that are aware of themselves (though that would open the door for Armageddon).

I don't think computers will EVER be aware of themselves. This is the point of discussion here; I don't understand why people take these fantastic leaps of assumption. "Artificial Intelligence" is a bit of a misleading term: Yes, it's very cool that computers can act like humans, but so can a parrot, or VHS tape. We can no more simply assume that AI is actually intelligent than we can assume the announcer on a screen actually knows what he's saying. It just mimics human responses, and conditional response is a very, VERY, different thing from intelligence.


Neurons are what transistors wish they could be.

I know you're being colorful, but we should avoid this kind of talk in this context. Transistors are just bits of silicon where the conductivity can be altered with electricity or light or heat. They don't wish anything, they're just devices.

They work on the same basic principle.

I think you kind of contradicted this yourself, didn't you, in your post above? Neurons do have electrical properties, but also chemical and physical properties that have no equivalent in transistors. I'm not saying that's relevant to consciousness, just that they are very different sorts of devices.

As I said, computers were modeled on humans. And if neuroscientists took their insights into the brain and put that to work on computer science - organic computing, to be specific - perhaps we'd make important breakthroughs in both fields much earlier.

They have, actually, but it didn't turn out to be as great as hoped. The Japanese were big on 'fuzzy computing' 20 years ago or so, and there was also a lot of interest in 'neural networks'. They both had uses, but the 'zillions of simple switches' computer is still where all the action is.
This space for rent.
jewelessien
Posts: 155
3/7/2014 9:08:36 AM
Posted: 2 years ago
At 3/7/2014 8:36:38 AM, slo1 wrote:
At 3/7/2014 3:29:22 AM, jewelessien wrote:
At 3/6/2014 9:29:35 AM, v3nesl wrote:
Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. And that's where my head explodes.

Actually the consciousness would be the operating system, riding on the CPU (processor, internal memory, arithmetic and logic unit, etc corresponding to different brain areas). Our bodies would be the rest of the computer: "sensory" input from the keyboard/touchpad/webcam, "motor" output to the screen/printers, and so on. A computer can function (albeit with very tight limits) without an operating system - we can run on our subconscious, although the experience is very far from what we'd want.
Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)

Nice and tidy indeed, but the computer analogy is just that, and maybe nothing like how a brain functions. There are more effects happening in the brain than what is going on in a computer. The challenge for computer science is to match the brain's computing power, which it is far behind. I would conclude as a result that computer folk should study the brain. Not to be difficult or anything.

They are different kinds of effect: one is inorganic, one is organic. Let's start from there.

As I've said, the computer was modeled on the human brain, and so it's a perfectly valid analogy.

Computing power? IMHO computers have done that and more, to a reasonable extent. I read somewhere, can't remember, that the brain's processing speed is two million bits per second - 2MHz. I'm sure you'll agree that's way behind modern processors. Even if I'm remembering wrong and it was two billion, I'm typing this on a 2010 laptop with a 2.4GHz dual-core processor. Computers complete logic and calculating tasks and process sensory input much faster than humans. I suppose the case you're making is for efficiency, not brute processing power, and for that you can hardly fault the computers. Neurons are just more efficient because they're organic: they repair themselves, shuffle themselves round to make new connections, can "learn" responses, make fairly complex decisions (as a cell), and have lots of little helpers (the glial cells). Plus they're packed in a 3D network, which is kind of hard to replicate on a flat surface. Transistors can't run on low-power sodium-potassium pumps or facilitate processes/increase longevity by utilizing chemicals across synapses. Sigh.
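
For what it's worth, the more telling comparison is probably work per joule rather than clock rate. A rough sketch in Python with loudly assumed round numbers (none of these figures come from the thread; brain wattage, event rate, CPU wattage and instruction rate are all illustrative guesses):

# Rough energy-efficiency comparison. Every figure below is an assumed,
# round illustrative number, not a measurement.
brain_watts = 20.0             # assumed brain power draw
brain_events_per_s = 1e13      # assumed synaptic events per second
cpu_watts = 65.0               # assumed desktop CPU power draw
cpu_ops_per_s = 1e10           # assumed instructions per second

brain_per_joule = brain_events_per_s / brain_watts
cpu_per_joule = cpu_ops_per_s / cpu_watts

print(f"Brain: {brain_per_joule:.1e} events per joule")              # ~5e11
print(f"CPU:   {cpu_per_joule:.1e} instructions per joule")          # ~1.5e8
print(f"Efficiency ratio: {brain_per_joule / cpu_per_joule:.0f}x")   # ~3000x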
Everything is up for questioning. If it won't defend itself, then how do we know it can?
slo1
Posts: 4,361
3/7/2014 9:41:19 AM
Posted: 2 years ago
At 3/7/2014 9:08:36 AM, jewelessien wrote:
At 3/7/2014 8:36:38 AM, slo1 wrote:
At 3/7/2014 3:29:22 AM, jewelessien wrote:
At 3/6/2014 9:29:35 AM, v3nesl wrote:
Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. And that's where my head explodes.

Actually the consciousness would be the operating system, riding on the CPU (processor, internal memory, arithmetic and logic unit, etc corresponding to different brain areas). Our bodies would be the rest of the computer: "sensory" input from the keyboard/touchpad/webcam, "motor" output to the screen/printers, and so on. A computer can function (albeit with very tight limits) without an operating system - we can run on our subconscious, although the experience is very far from what we'd want.
Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)

Nice and tidy indeed, but the computer analogy is just that, and maybe nothing like how a brain functions. There are more effects happening in the brain than what is going on in a computer. The challenge for computer science is to match the brain's computing power, which it is far behind. I would conclude as a result that computer folk should study the brain. Not to be difficult or anything.


They are different kinds of effect: one is inorganic, one is organic. Let's start from there.

As I've said, the computer was modeled on the human brain, and so it's a perfectly valid analogy.

Computing power? IMHO computers have done that and more, to a reasonable extent. I read somewhere, can't remember, that the brain's processing speed is two million bits per second - 2MHz. I'm sure you'll agree that's way behind modern processors. Even if I'm remembering wrong and it was two billion, I'm typing this on a 2010 laptop with a 2.4GHz dual-core processor. Computers complete logic and calculating tasks and process sensory input much faster than humans. I suppose the case you're making is for efficiency, not brute processing power, and for that you can hardly fault the computers. Neurons are just more efficient because they're organic: they repair themselves, shuffle themselves round to make new connections, can "learn" responses, make fairly complex decisions (as a cell), and have lots of little helpers (the glial cells). Plus they're packed in a 3D network, which is kind of hard to replicate on a flat surface. Transistors can't run on low-power sodium-potassium pumps or facilitate processes/increase longevity by utilizing chemicals across synapses. Sigh.

I disagree that it is behind modern processors. It is hard to compare the two because a computer processor is so well defined and its operation is deterministic, so it does not have the width and breadth of functionality a brain has.

The computing power can be calculated simply by the physical make up of the processor and how many of them there are.

The brain, on the other hand, runs so many concurrent processes and is not well understood, so it is unknown how powerful the entire brain is from a computing standpoint.

If we try to model the brain on a computer, with a process for each neuron, we can only match a very small portion (or duration) of the brain's processing power, and at huge energy costs, as you mention. http://www.extremetech.com...

If you just go by pure processing power and don't care that a regular computer is deterministic and predefined to only do certain things then there are some super computers more powerful, but our desktops are certainly not even close.
http://www.scientificamerican.com...
jewelessien
Posts: 155
3/7/2014 9:54:31 AM
Posted: 2 years ago
At 3/7/2014 9:00:32 AM, v3nesl wrote:
At 3/7/2014 8:33:35 AM, jewelessien wrote:
The analogy follows because computers were modeled on human brains.

Well, no, they weren't, at all. They were fancy abacuses at first, AI is an afterthought. ENIAC was a military targeting calculator, if I remember correctly.

Ever read anything by Ada Lovelace or Charles Babbage (way back in the ole nineteenth century) about computing? "Arithmetic and logic unit", "memory", "processor" - do these vaguely correspond to certain brain regions to you? Even the transistor itself is basically an oversimplified neuron. AI could have been an afterthought, but I'm pretty sure consciousness was also an afterthought in the development of the brain.

I don't think computers will EVER be aware of themselves. This is the point of discussion here; I don't understand why people take these fantastic leaps of assumption. "Artificial Intelligence" is a bit of a misleading term: Yes, it's very cool that computers can act like humans, but so can a parrot, or VHS tape. We can no more simply assume that AI is actually intelligent than we can assume the announcer on a screen actually knows what he's saying. It just mimics human responses, and conditional response is a very, VERY, different thing from intelligence.

Computers can become aware of themselves, yes. Anything that breaks a certain intelligence barrier will eventually begin to ponder itself with its spare processing power. It's why consciousness is described as an emergent property.
And as for "mimicking humans", that's massive anthropocentrism at play. What makes your brand of intelligence different from "conditioned responses"?

Neurons are what transistors wish they could be.
I know you're being colorful, but we should avoid this kind of talk in this context. Transistors are just bits of silicon where the conductivity can be altered with electricity or light or heat. They don't wish anything, they're just devices.

I fail to see how a simple literary device is ruining the context.
Fine: neurons are what electronics engineers wish they could make transistors like.

They work on the same basic principle.
I think you kind of contradicted this yourself, didn't you, in your post above? Neurons do have electrical properties, but also chemical and physical properties that have no equivalent in transistors. I'm not saying that's relevant to consciousness, just that they are very different sorts of devices.

Uh, no. They share the same basic principle: receive an electrical input and direct it to another neuron/transistor based on prewired pathways. But of course, a transistor isn't a freaking complete factory on pseudopods, so they're oooobviously different sorts of devices.
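
A minimal sketch of that shared principle in Python (a toy threshold unit with arbitrary illustrative weights - not a model of a real neuron or a real transistor): sum the weighted inputs and pass a signal on only if a threshold is crossed.

# Toy "fire if the weighted input crosses a threshold" unit - the shared
# principle described above, heavily simplified. Weights and threshold are
# arbitrary illustrative numbers.
def threshold_unit(inputs, weights, threshold):
    """Return 1 (pass the signal on) if the weighted sum reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Two input "pathways", one excitatory and one inhibitory.
print(threshold_unit([1, 0], weights=[0.8, -0.5], threshold=0.5))  # -> 1
print(threshold_unit([1, 1], weights=[0.8, -0.5], threshold=0.5))  # -> 0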

As I said, computers were modeled on humans. And if neuroscientists took their insights into the brain and put that to work on computer science - organic computing, to be specific - perhaps we'd make important breakthroughs in both fields much earlier.

They have, actually, but it didn't turn out to be as great as hoped. The Japanese were big on 'fuzzy computing' 20 years ago or so, and there was also a lot of interest in 'neural networks'. They both had uses, but the 'zillions of simple switches' computer is still where all the action is.

See, neurons are basically zillions of little switches, albeit little switches that are entire organisms in themselves - lots of little switches within a little switch, if you will, on an impossibly tiny scale. And no, organic computing is a fairly new field, and one that is directed towards software, not hardware. I think everyone has already accepted that hardware for hardware, we're not going to develop the tech to compete with brain cells. Organic computing is directed at developing the traits that make neurons so badass through the software, not the hardware, which is where I say neuroscientists' expertise would be highly welcome.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
jewelessien
Posts: 155
3/7/2014 9:57:05 AM
Posted: 2 years ago
At 3/7/2014 9:41:19 AM, slo1 wrote:
I disagree that it is behind modern processors. It is hard to compare the two because a computer processor is so well defined and its operation is deterministic, so it does not have the width and breadth of functionality a brain has.

The computing power can be calculated simply by the physical make up of the processor and how many of them there are.

The brain, on the other hand, runs so many concurrent processes and is not well understood, so it is unknown how powerful the entire brain is from a computing standpoint.

If we try to model the brain on a computer, with a process for each neuron, we can only match a very small portion (or duration) of the brain's processing power, and at huge energy costs, as you mention. http://www.extremetech.com...

If you just go by pure processing power and don't care that a regular computer is deterministic and predefined to only do certain things then there are some super computers more powerful, but our desktops are certainly not even close.
http://www.scientificamerican.com...

A processor isn't deterministic per se: its assembly code and software make it deterministic. The brain is lucky (?) to have the ultimate in operating systems running it. But brute force for brute force, yeah, processors kick us out the window.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
v3nesl
Posts: 4,505
3/7/2014 10:11:51 AM
Posted: 2 years ago
At 3/7/2014 9:54:31 AM, jewelessien wrote:
At 3/7/2014 9:00:32 AM, v3nesl wrote:
At 3/7/2014 8:33:35 AM, jewelessien wrote:
The analogy follows because computers were modeled on human brains.

Well, no, they weren't, at all. They were fancy abacuses at first, AI is an afterthought. ENIAC was a military targeting calculator, if I remember correctly.

Ever read anything by Ada Lovelace or Charles Babbage (way back in the ole nineteenth century) about computing? "Arithmetic and logic unit", "memory", "processor" - do these vaguely correspond to certain brain regions to you? Even the transistor itself is basically an oversimplified neuron.

Well, ok, I don't know that this is all that germane anyway. It's just that it's my field, I design embedded processors for a living, and the history as I know it is much more blue collar, if you will. The transistor was a better vacuum tube, the basis for an amplifier, is how I understand its development. It was developed for the telephone industry, and was an analog amp rather than a switch originally.

AI could have been an afterthought, but I'm pretty sure consciousness was also an afterthought in the development of the brain.


Pretty sure? I can't imagine how you could be. You're just assuming a story line here, I get the feeling you're not really thinking about it very deeply.

I don't think computers will EVER be aware of themselves. This is the point of discussion here; I don't understand why people take these fantastic leaps of assumption. "Artificial Intelligence" is a bit of a misleading term: Yes, it's very cool that computers can act like humans, but so can a parrot, or VHS tape. We can no more simply assume that AI is actually intelligent than we can assume the announcer on a screen actually knows what he's saying. It just mimics human responses, and conditional response is a very, VERY, different thing from intelligence.

Computers can become aware of themselves, yes. Anything that breaks a certain intelligence barrier will eventually begin to ponder itself with its spare processing power.

I find it bizarre to assume such a thing. This is acid trip talk, not philosophy or science.

It's why consciousness is described as an emergent property.

I thought we weren't going to do evolution? That's where this story line comes from, not from any examination of thought or the brain itself.

And as for "mimicking humans", that's massive anthropocentrism at play. What makes your brand of intelligence different from "conditioned responses"?


Please see my first post. Seriously, that's my premise of all this, that we think thought to be something other than the deterministic effect of physical causes.


Fine: neurons are what electronics engineers wish they could make transistors like.


I'm afraid I have to disagree with this too, as an engineer, from this perspective: The thing that makes the brain so awesome is also what makes it unreliable. We forget, we assume, etc. A computer, again, is a machine, and that's what makes the computer so awesome. We don't want neurons in a computer, we want binary switches that are super fast and super low power and super small. And cheap.

There's a science fiction story running through all this, and I do think it's all part of the modern creation myth that is evolution. Sorry, but that's where it comes from, so it has to be dealt with. There's this desire to explain mind as an elaborate accident, and I'm afraid that keeps the evo from being a good philosopher, from thinking clearly about thought.
This space for rent.
v3nesl
Posts: 4,505
3/7/2014 10:15:21 AM
Posted: 2 years ago
At 3/7/2014 9:41:19 AM, slo1 wrote:
At 3/7/2014 9:08:36 AM, jewelessien wrote:
At 3/7/2014 8:36:38 AM, slo1 wrote:
At 3/7/2014 3:29:22 AM, jewelessien wrote:
At 3/6/2014 9:29:35 AM, v3nesl wrote:
Great analogy!

The thing I still puzzle over: I can handle the idea that consciousness is sort of riding on the brain, and the conscious brain is its interface, the screen, if you will. But what about the keyboard? If consciousness is something in essence non-brain, can it steer the brain? If so, what would the physical interface be? It would have to mean that the brain could go from state A to state B with no identifiable physical cause, it seems. And that's where my head explodes.

Actually the consciousness would be the operating system, riding on the CPU (processor, internal memory, arithmetic and logic unit, etc corresponding to different brain areas). Our bodies would be the rest of the computer: "sensory" input from the keyboard/touchpad/webcam, "motor" output to the screen/printers, and so on. A computer can function (albeit with very tight limits) without an operating system - we can run on our subconscious, although the experience is very far from what we'd want.
Honestly, I think neuroscientists should stop messing around with lumps of jelly and just study computer science instead :)

Nice and tidy indeed, but the computer analogy is just that, and maybe nothing like how a brain functions. There are more effects happening in the brain than what is going on in a computer. The challenge for computer science is to match the brain's computing power, which it is far behind. I would conclude as a result that computer folk should study the brain. Not to be difficult or anything.


They are different kinds of effect: one is inorganic, one is organic. Let's start from there.

As I've said, the computer was modeled on the human brain, and so it's a perfectly valid analogy.

Computing power? IMHO computers have done that and more, to a reasonable extent. I read somewhere, can't remember, that the brain's processing speed is two million bits per second - 2MHz. I'm sure you'll agree that's way behind modern processors. Even if I'm remembering wrong and it was two billion, I'm typing this on a 2010 laptop with a 2.4GHz dual-core processor. Computers complete logic and calculating tasks and process sensory input much faster than humans. I suppose the case you're making is for efficiency, not brute processing power, and for that you can hardly fault the computers. Neurons are just more efficient because they're organic: they repair themselves, shuffle themselves round to make new connections, can "learn" responses, make fairly complex decisions (as a cell), and have lots of little helpers (the glial cells). Plus they're packed in a 3D network, which is kind of hard to replicate on a flat surface. Transistors can't run on low-power sodium-potassium pumps or facilitate processes/increase longevity by utilizing chemicals across synapses. Sigh.

I disagree that it is behind modern processors. It is hard to compare the two because a computer processor is so well defined and its operation is deterministic, so it does not have the width and breadth of functionality a brain has.


Yeah, in computer science we'd say that the brain is "massively parallel". The neurons are radically slower than gigahertz transistors, but are all working at once, whereas a CPU is pretty much one instruction at a time.
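
Rough back-of-the-envelope arithmetic for that point, with loudly assumed round numbers (neuron count, firing rate and instruction rate below are illustrative guesses, not measurements):

# "Many slow units in parallel" versus "one fast unit in series".
# All figures are rough assumptions for illustration only.
neurons = 1e11             # assumed neuron count
neuron_rate_hz = 100.0     # assumed firing/update rate per neuron
cpu_rate_hz = 3e9          # assumed instructions per second for one core

brain_events_per_s = neurons * neuron_rate_hz
print(f"Parallel 'events' per second: {brain_events_per_s:.1e}")   # ~1e13
print(f"Serial instructions per second: {cpu_rate_hz:.1e}")        # ~3e9
print(f"Ratio: {brain_events_per_s / cpu_rate_hz:.0f}x")           # ~3000x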
This space for rent.
chui
Posts: 511
3/7/2014 10:58:24 AM
Posted: 2 years ago
At 3/7/2014 3:29:27 AM, jewelessien wrote:
At 3/6/2014 10:30:12 AM, chui wrote:


No actually, the brain is much more than a collection of neurons. They're connected in a 3D network so complex that dozens of supercomputers working in tandem are unable to replicate it - and the brain manages to run that in the relatively tiny space of your skull while using less power than the average light bulb. The associated glial cells, from microglia to astrocytes, have functions that neurology hasn't even scratched the surface of. And the neuron itself is several orders of magnitude more efficient and advanced than the basic transistor, IMHO.



To continue with my computer analogy: once it has been installed, the operating system becomes so deeply married to that particular computer system that they become practically non-portable. It's why you can't just copy the Windows folder from your friend's laptop and expect Windows to run properly on yours. That having been said, if you take out the entire hard disk and mount it in another, compatible laptop (same processor architecture, etc) the operating system would run as normal.
Now consider that the consciousness isn't installed on a "hard disk", but on the actual freaking processor. I don't know about you, but I'm not very open to people taking out crucial portions of my brain and mounting them in another person's head.

I agree the connection of the neurons forms a neural network but I disagree that we can model the function of the brain by analogy to a typical computer. The architecture of the two systems is very different I thought. Computers are separated into different functional areas such as different types of memory, processors and connecting buses. Software is then added on in layers, lowest layers doing very simple basic tasks such as storing a number to a hard disc, higher layers being user applications and user interfaces. The operating system tends to sit between these layers saving the user from needing to know about low level operations such as what is happening in the various processors present in a modern computer. As far as I am aware there are no analogous structures or systems in the brain: no processors, no software, no operating system. I thought neural networks were essentially hardwired in that an input leads to an output through a network of connections. Learning and memory occur through changing the wiring of the network not a software update.
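
A minimal sketch of that last point in Python - learning as a change of connection strengths rather than a software update - using a classic perceptron-style rule on made-up data (the AND examples and learning rate are arbitrary illustrative choices):

# Minimal perceptron-style learning: the code never changes, only the
# connection weights and bias do.
def predict(weights, bias, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias >= 0 else 0

# Learn a simple AND-like rule from examples.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                     # a few passes over the data
    for x, target in data:
        error = target - predict(weights, bias, x)
        weights = [w + lr * error * xi for w, xi in zip(weights, x)]
        bias += lr * error              # adjust the "wiring", not the program

print(weights, bias)                                  # e.g. [0.2, 0.1] and a negative bias
print([predict(weights, bias, x) for x, _ in data])   # -> [0, 0, 0, 1]

Note that predict itself never changes across training; only the weights and bias do, which is the "rewiring rather than a software update" idea.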
Iredia
Posts: 1,608
3/7/2014 2:40:20 PM
Posted: 2 years ago
@ jewelessien: Great. I am engaged atm but will be back sooner or later. That said, I don't think there is a "how" to consciousness. My approach is weird: I think that the brain both causes and doesn't cause consciousness. I say that because clearly science and experience show that consciousness is affected by the brain. But how you get from physical systems in the brain to subjective consciousness is simply mysterious, even if organic computers succeed in that quest. The reason is that consciousness is an emergent property that IS NOT physical.
Porn babes be distracting me. Dudes be stealing me stuff. I'm all about the cash from now. I'm not playing Jesus anymore.
jewelessien
Posts: 155
3/8/2014 2:34:16 AM
Posted: 2 years ago
At 3/7/2014 10:58:24 AM, chui wrote:
I agree the connection of the neurons forms a neural network but I disagree that we can model the function of the brain by analogy to a typical computer. The architecture of the two systems is very different I thought. Computers are separated into different functional areas such as different types of memory, processors and connecting buses. Software is then added on in layers, lowest layers doing very simple basic tasks such as storing a number to a hard disc, higher layers being user applications and user interfaces. The operating system tends to sit between these layers saving the user from needing to know about low level operations such as what is happening in the various processors present in a modern computer. As far as I am aware there are no analogous structures or systems in the brain: no processors, no software, no operating system. I thought neural networks were essentially hardwired in that an input leads to an output through a network of connections. Learning and memory occur through changing the wiring of the network not a software update.

Processors = areas of the brain concerned with decision-making. Memory = hippocampus, cortex, etc. Operating system/software/data = everything you load onto your brain from when you are a fetus, where the operating system/higher software would be your "mind".
However, neurons being cells are not limited to a single function like hardware components are. Therefore the same basic neuron is a processor, a memory cell, an XOR gate, etc, which is what makes the brain so efficient.
And learning/memory do not change the wiring of the network, per se. Your neurons don't particularly migrate around in your head, neither do axons swing round. Then again, we've never actually seen a brain at work...
Everything is up for questioning. If it won't defend itself, then how do we know it can?
jewelessien
Posts: 155
3/8/2014 2:34:19 AM
Posted: 2 years ago
At 3/7/2014 10:15:21 AM, v3nesl wrote:
Yeah, in computer science we'd say that the brain is "massively parallel". The neurons are radically slower than gigahertz transistors, but are all working at once, whereas a CPU is pretty much one instruction at a time.

A processor on its own isn't one instruction at a time - it handles dozens of running services at the same time. But so far all the software we've been able to develop is one instruction at a time.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
jewelessien
Posts: 155
3/8/2014 2:34:58 AM
Posted: 2 years ago
At 3/7/2014 10:11:51 AM, v3nesl wrote:
Well, ok, I don't know that this is all that germane anyway. It's just that it's my field, I design embedded processors for a living, and the history as I know it is much more blue collar, if you will. The transistor was a better vacuum tube, the basis for an amplifier, is how I understand its development. It was developed for the telephone industry, and was an analog amp rather than a switch originally.

Yes, and I'm also sure that the people who design planes weren't/aren't directly drawing on birds. The unconscious association remains, though.

Pretty sure? I can't imagine how you could be. You're just assuming a story line here, I get the feeling you're not really thinking about it very deeply.

I'm not assuming a story line. I'm looking at the basic fact that of all the organisms on this planet with a brain, only a handful have anything that we could objectively call consciousness, and they were among the last to develop.

Computers can become aware of themselves, yes. Anything that breaks a certain intelligence barrier will eventually begin to ponder itself with its spare processing power.

I find it bizarre to assume such a thing. This is acid trip talk, not philosophy or science.

It's why consciousness is described as an emergent property.

I thought we weren't going to do evolution? That's where this story line comes from, not from any examination of thought or the brain itself.

No, emergent properties don't come from evolution, and they are not a story line. Emergent properties are properties in a system that only arise when the system is connected as a whole - none of the individual components have any such property or anything that hints at such a property. Take airplanes for example - no-one has yet been able to conclusively describe how they fly (yeah, that stuff with Bernoulli's principle is so much crap when you look at it critically).
And the people who proposed the idea of consciousness as an emergent property are people whose job involves studying the brain, so yeah I think it's safe for me to assume they did some actual examination. It's rather too long to go into here, but I'd recommend you read David Eagleman's Incognito: The Secret Lives of the Brain (if you like I can email it to you).
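
A small, concrete illustration of emergence in that sense, using toy threshold units with hand-picked weights (illustrative only): no single such unit can compute XOR, but three of them wired together can, so the XOR behaviour belongs to the network rather than to any one unit.

# Emergence in miniature: a single threshold unit cannot compute XOR,
# but a small network of them can. Weights/thresholds are hand-picked.
def unit(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def xor(a, b):
    h1 = unit([a, b], [1, 1], threshold=1)       # fires on OR
    h2 = unit([a, b], [1, 1], threshold=2)       # fires on AND
    return unit([h1, h2], [1, -1], threshold=1)  # OR and not AND = XOR

print([xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # -> [0, 1, 1, 0]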

And as for "mimicking humans", that's massive anthropocentrism at play. What makes your brand of intelligence different from "conditioned responses"?

Please see my first post. Seriously, that's my premise of all this, that we think thought to be something other than the deterministic effect of physical causes.

Even if thought is metaphysical, it doesn't remain the exclusive property of humans.

Fine: neurons are what electronics engineers wish they could make transistors like.

I'm afraid I have to disagree with this too, as an engineer, from this perspective: The thing that makes the brain so awesome is also what makes it unreliable. We forget, we assume, etc. A computer, again, is a machine, and that's what makes the computer so awesome. We don't want neurons in a computer, we want binary switches that are super fast and super low power and super small. And cheap.

I think that's where your loss is, and why computing will probably never be developed to its full potential.
A computer that could forget irrelevant data, that could assume without expressly being told what to do by the programmer, would be far superior to anything we have right now. And neurons are basically binary switches, albeit binary switches that are super fast and super low power and super cheap, binary switches that are self-modifying, and binary switches that have multiple input/output links. Tell me that wouldn't revolutionize computers as we know them.
The one shortcoming of a neuron is that it dies, which I don't think would be a problem with an inorganic hardware component. And no, we don't "forget": information never actually leaves our brains.

There's a science fiction story running through all this, and I do think it's all part of the modern creation myth that is evolution. Sorry, but that's where it comes from, so it has to be dealt with. There's this desire to explain mind as an elaborate accident, and I'm afraid that keeps the evo from being a good philosopher, from thinking clearly about thought.

Evolution is a "modern creation myth"? Awesome.
It's not a desire. And the objective evidence points towards the mind actually being an elaborate accident (read Eagleman's book). The discovery of X-rays was also an elaborate accident, but that doesn't make them any less useful or fascinating.
Everything is up for questioning. If it won't defend itself, then how do we know it can?
jewelessien
Posts: 155
3/8/2014 2:35:36 AM
Posted: 2 years ago
At 3/7/2014 2:40:20 PM, Iredia wrote:
@ jewelessien: Great. I am engaged atm but will be back sooner or later. That said, I don't think there is a "how" to consciousness. My approach is weird: I think that the brain both causes and doesn't cause consciousness. I say that because clearly science and experience show that consciousness is affected by the brain. But how you get from physical systems in the brain to subjective consciousness is simply mysterious, even if organic computers succeed in that quest. The reason is that consciousness is an emergent property that IS NOT physical.

Yup. That's what I think: it's an emergent property that is not tangible. But it's so intimately married to a very tangible organ that they're basically inseparable.
Everything is up for questioning. If it won't defend itself, then how do we know it can?