Can a machine be conscious?

Please compose all posts in Emacs.


hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Can a machine be conscious?

Postby hammerkrieg » Fri Jun 26, 2009 9:07 pm UTC

Based on looking around on these fora, I think this thread will probably become, well, an echo chamber, but I'd like to start it anyway for two reasons:

  • It's a topic I've been bringing up with people a lot recently to troll them. People tend to want to believe they're at the tippy top of nature, so much so that I hear them making bald assertions about things they have little to no experience with. I need more fodder for my arguments.
  • I'm interested in hearing arguments that machines cannot be conscious, if only to rebut them. :D

My own position, as you might have already guessed, is that 100 billion neurons are 100 billion neurons no matter what they're made of. Granted, there are "weird" things about consciousness (qualia), but Occam's razor tells us that a machine could "tap into" whatever makes them work, so they're not really important in this discussion as far as I'm concerned.

stephentyrone
Posts: 778
Joined: Mon Aug 11, 2008 10:58 pm UTC
Location: Palo Alto, CA

Re: Can a machine be conscious?

Postby stephentyrone » Fri Jun 26, 2009 9:21 pm UTC

First, define consciousness. Next, under your definition, is it even clear that humans are conscious?
GENERATION -16 + 31i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Fri Jun 26, 2009 9:25 pm UTC

stephentyrone wrote:First, define consciousness. Next, under your definition, is it even clear that humans are conscious?


I would define it as awareness of self and surroundings.

By this definition, I'm conscious, dunno about you.

samk
Posts: 54
Joined: Mon Feb 09, 2009 12:33 pm UTC

Re: Can a machine be conscious?

Postby samk » Fri Jun 26, 2009 9:59 pm UTC

Define awareness.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Fri Jun 26, 2009 10:07 pm UTC

samk wrote:Define awareness.


Well, I don't subscribe to behaviorism per se, which rejects the idea of "thoughts" entirely, but I feel the need, or at least the desire, to stick to what is empirically testable, and so I would define awareness as changes of internal state in an object mediated by some sensory apparatus.

stephentyrone
Posts: 778
Joined: Mon Aug 11, 2008 10:58 pm UTC
Location: Palo Alto, CA

Re: Can a machine be conscious?

Postby stephentyrone » Fri Jun 26, 2009 10:10 pm UTC

hammerkrieg wrote:
samk wrote:Define awareness.


Well, I don't subscribe to behaviorism per se, which rejects the idea of "thoughts" entirely, but I feel the need, or at least the desire, to stick to what is empirically testable, and so I would define awareness as changes of internal state in an object mediated by some sensory apparatus.


So bacteria are aware? Are they conscious?
GENERATION -16 + 31i: The first time you see this, copy it into your sig on any forum. Square it, and then add i to the generation.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Fri Jun 26, 2009 10:18 pm UTC

stephentyrone wrote:
hammerkrieg wrote:
samk wrote:Define awareness.


Well, I don't subscribe to behaviorism per se, which rejects the idea of "thoughts" entirely, but I feel the need, or at least the desire, to stick to what is empirically testable, and so I would define awareness as changes of internal state in an object mediated by some sensory apparatus.


So bacteria are aware? Are they conscious?


They could be, to some degree.

Which is why I entertained the idea of "other kingdoms" (in the taxonomical sense) in the OP.

Indeed, my reverie has sometimes led me to believe that even the tiniest subatomic particles are conscious to some very small degree.

Berengal
Superabacus Mystic of the First Rank
Posts: 2707
Joined: Thu May 24, 2007 5:51 am UTC
Location: Bergen, Norway

Re: Can a machine be conscious?

Postby Berengal » Fri Jun 26, 2009 10:35 pm UTC

hammerkrieg wrote:Indeed, my reverie has sometimes led me to believe that even the tiniest subatomic particles are conscious to some very small degree.
Then your definition of consciousness is pretty useless.
It is practically impossible to teach good programming to students who are motivated by money: As potential programmers they are mentally mutilated beyond hope of regeneration.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Fri Jun 26, 2009 10:41 pm UTC

Berengal wrote:
hammerkrieg wrote:Indeed, my reverie has sometimes led me to believe that even the tiniest subatomic particles are conscious to some very small degree.
Then your definition of consciousness is pretty useless.


What makes it useless?

Consider what John McCarthy has to say about a thermostat: "My thermostat has three beliefs: too cold, too hot, and just right."

So you see, it's a matter of degree. This explanation is much simpler than arbitrary and ostensibly incorrect classifications into "conscious" and "not conscious", which typically rely on question-begging.
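Taken literally, McCarthy's thermostat is just a three-state machine. A rough Python sketch (the setpoint and tolerance are numbers I made up, purely for illustration):

def thermostat_belief(temperature_c, setpoint_c=21.0, tolerance_c=1.0):
    # Map a temperature reading onto one of McCarthy's three "beliefs".
    if temperature_c < setpoint_c - tolerance_c:
        return "too cold"    # would switch the furnace on
    if temperature_c > setpoint_c + tolerance_c:
        return "too hot"     # would switch the furnace off
    return "just right"      # does nothing

for reading in (18.0, 21.3, 24.5):
    print(reading, "->", thermostat_belief(reading))

Whether you call those return values "beliefs" or just states is exactly the question.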

samk
Posts: 54
Joined: Mon Feb 09, 2009 12:33 pm UTC

Re: Can a machine be conscious?

Postby samk » Fri Jun 26, 2009 11:23 pm UTC

It makes sense for it to be a matter of degree, but that raises the question of how consciousness could be measured. Bits stored or processed might be a start, but that still doesn't account for how the information is processed.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Fri Jun 26, 2009 11:27 pm UTC

samk wrote:It makes sense for it to be a matter of degree, but that raises the question of how consciousness could be measured. Bits stored or processed might be a start, but that still doesn't account for how the information is processed.


Now that would be a truly monumental task.

Strange, I'm reading On Intelligence right now, haha.

Amnesiasoft
Posts: 2573
Joined: Tue May 15, 2007 4:28 am UTC
Location: Colorado

Re: Can a machine be conscious?

Postby Amnesiasoft » Sat Jun 27, 2009 12:14 am UTC

hammerkrieg wrote:I would define it as awareness of self and surroundings.

My computer is aware of how much memory it has, what hardware it has, and many other things about itself. I can hook up a GPS, a thermometer, and a camera to it. That doesn't make it conscious, even though it would still be aware of "itself" and the surroundings. It certainly wouldn't be Johnny 5.
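(For instance, a few lines of Python against nothing but the standard library will have it rattle off some of that "self-knowledge". A rough sketch; the memory query is POSIX-only and the whole thing is just an illustration:)

import os
import platform
import shutil

print("OS:", platform.platform())
print("CPU architecture:", platform.machine())
print("Logical CPUs:", os.cpu_count())
print("Free disk space (GB):", shutil.disk_usage("/").free // 2**30)

# Physical memory via sysconf -- available on Linux and most POSIX systems only.
if hasattr(os, "sysconf") and "SC_PHYS_PAGES" in os.sysconf_names:
    ram_bytes = os.sysconf("SC_PHYS_PAGES") * os.sysconf("SC_PAGE_SIZE")
    print("RAM (GB):", ram_bytes // 2**30)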

You're going to have to come up with a better definition than that if you want a serious answer.

phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada

Re: Can a machine be conscious?

Postby phillipsjk » Sat Jun 27, 2009 6:57 am UTC

I've said it before: The only reason the (story in the) "Terminator" series of movies can't happen in real life is that time travel doesn't work!

We have the technology to simulate a single neuron. I'm not sure how much computing power is needed for a real-time simulation. If computing capacity keeps increasing at an exponential rate, brute force will be enough for computer consciousness (by simulating a (possibly customized) human brain).

Looked through the forums: looks like a neuron can be run faster than real-time:
Gannier » Mon Oct 27, 2008 11:13 am wrote:under Windows XP Neuron takes 153 seconds to simulate 1 hour, and 308 seconds under Linux.
- "speed of Neuron under Win vs Linux", The NEURON Users' Group Forum (Making and using models with NEURON ‹ Getting started)
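To give a feel for what "simulating a single neuron" means computationally, here is a toy leaky integrate-and-fire model in Python. This is not the NEURON simulator quoted above, and every constant is a made-up illustrative value:

def simulate_lif(duration_s=1.0, dt=0.0001, tau=0.02, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, drive_mv=20.0):
    # Euler-integrate dV/dt = (v_rest - V + drive) / tau and count spikes.
    v = v_rest
    spikes = 0
    for _ in range(int(duration_s / dt)):
        v += dt * (v_rest - v + drive_mv) / tau
        if v >= v_thresh:   # threshold crossed: register a spike...
            spikes += 1
            v = v_reset     # ...and reset the membrane potential
    return spikes

print(simulate_lif(), "spikes in one simulated second")

A real simulator like NEURON solves much richer compartment models, but the basic workload is the same: a small state update repeated at a tiny timestep.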
Did you get the number on that truck?

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Sat Jun 27, 2009 7:19 am UTC

Neurons aren't that fast, actually. Each one fires about 10 times a second.

Amnesiasoft wrote:
hammerkrieg wrote:I would define it as awareness of self and surroundings.

My computer is aware of how much memory it has, what hardware it has, and many other things about itself. I can hook up a GPS, a thermometer, and a camera to it. That doesn't make it conscious, even though it would still be aware of "itself" and the surroundings. It certainly wouldn't be Johnny 5.

You're going to have to come up with a better definition than that if you want a serious answer.


I told you consciousness is on a sliding scale. Your computer is just a tiny bit conscious.

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Can a machine be conscious?

Postby 0xBADFEED » Sat Jun 27, 2009 3:57 pm UTC

If you're really serious about this question, I suggest you look at Hofstadter's I Am a Strange Loop (if you haven't already).

It's a fantastic book that is largely concerned with exploring the question you just asked, namely, "What is consciousness and what types of systems can possess it?" Also, the man is a genius and has been thinking about the question for longer than most posters in this forum have been alive. You may not agree with all of his conclusions, but they are insightful and interesting.

I would say that my personal views on consciousness are largely compatible with those of Hofstadter. So, yes, a machine can be conscious, or at least as conscious as a human, in that we are merely biological machines with rather complex feedback loops and powerful pattern-matching capabilities (a gross simplification, I know). If those same capabilities could be replicated in a non-biological substrate, I would have no problem calling the result conscious.
Last edited by 0xBADFEED on Sat Jun 27, 2009 10:48 pm UTC, edited 4 times in total.

Gaydar2000SE
Posts: 210
Joined: Sun Jun 21, 2009 1:43 am UTC

Re: Can a machine be conscious?

Postby Gaydar2000SE » Sat Jun 27, 2009 4:19 pm UTC

Point is:

A: The laws of physics as currently formulated offer no reason to believe that things can just be 'conscious'; in fact, they have no concept of it. It's like some sort of magic to them.
B: The theory of evolution does not require humans to be conscious; it can just be said that they are sophisticated automated machines that came to be via trial and error.
C: Humans may claim that they are conscious, but they cannot prove it; this can just be explained by saying that their programming instructs them to claim that they are while they really aren't.

So, with Occam's razor applied, it seems like a really far-fetched and strange assumption to make that humans are conscious. That doesn't mean assuming that they are not conscious either. It's just a strange, out-of-the-blue assumption to say that they have some strange and vague property that cannot be explained by the laws of physics; it doesn't solve any open questions, and only creates more. It's like saying a random rock has some magical and unexplained property that allows it to serve as the nexus of time-causality or whatever's vague.

Surely, any objective outsider observing humans, even from outside this universe, would not conclude that we are conscious, and most likely wouldn't even have considered the possibility or be focused upon that concept and what it could be.
^ :/

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Can a machine be conscious?

Postby 0xBADFEED » Sat Jun 27, 2009 5:06 pm UTC

hammerkrieg wrote: Your computer is just a tiny bit conscious.

Sorry, I cannot agree with this.
I would say that consciousness is a property of software, not hardware. It makes no more sense to say "a computer is conscious" than it does to say "a brain is conscious". The brain may have a potential for consciousness, but the virtue of being a brain does not imbue it with consciousness automatically. I'm sure you can think of lots of examples of human brains that are definitely not conscious. Consciousness of a brain is dependent on having a perception/reasoning loop active in the brain. We don't usually separate the idea of "software" and "hardware" in the brain. But I think the distinction is important. Surely the "hardware" of my brain is incredibly similar to the "hardware" of your brain. Yet, we can have vastly different "consciousnesses". That is why I say that consciousness is largely a property of software, not hardware, and it is meaningless to ascribe consciousness to inanimate hardware without also talking about the software.

I wouldn't classify any of today's software as being "conscious". It doesn't reach the level of self-modification and abstraction power that I would say is necessary to really call it conscious, for any interesting definition of "conscious". There is no evidence that our brains are doing magical things that could not be replicated on a computer. Our brains just do it on a massively parallel scale and have incredible coordination and processing power.
hammerkrieg wrote:Indeed, my reverie has sometimes led me to believe that even the tiniest subatomic particles are conscious to some very small degree.

This is "New-Agey", magical thinking that has no basis or evidence. You should disabuse yourself of it.
Gaydar2000SE wrote:Point is:
A: The laws of physics as currently formulated offer no reason to believe that things can just be 'conscious'; in fact, they have no concept of it. It's like some sort of magic to them.
B: The theory of evolution does not require humans to be conscious; it can just be said that they are sophisticated automated machines that came to be via trial and error.
C: Humans may claim that they are conscious, but they cannot prove it; this can just be explained by saying that their programming instructs them to claim that they are while they really aren't.
...snip...

I think that you're conflating the ideas of "free will" and "consciousness".

I would say those are two very different questions. The "free will" problem is extremely tricky when you take a strictly reductionist view. It's very difficult to see at what point "free will" would be injected into the system. Consciousness seems rather more constrained. We can say that most higher animals have varying degrees of consciousness, with humans at the top of the heap.

If we define (in a very loose and off-the-cuff sense) consciousness as the ability to:
* Perceive one's environment
* Make decisions based on those perceptions (whether by "free will" or by programming)
* Remember cause and effect relationships based on the perception->decision (i.e. learn)
* Abstract these relationships to create general propositions about our environment
* Maintain an internal model to predict future cause and effect relationships

It's clear that humans possess these abilities in spades and that the higher animals possess them to varying degrees. I would also say that these abilities largely encompass what we tend to think about as "intelligence". Though evolution (I'm not even sure why you brought evolution into it) doesn't require the development of these abilities, it does seem to reward them in many cases.

I would say it's silly to argue that humans aren't conscious. Whether or not we possess "free will" is more of an open question.

Amnesiasoft
Posts: 2573
Joined: Tue May 15, 2007 4:28 am UTC
Location: Colorado

Re: Can a machine be conscious?

Postby Amnesiasoft » Sun Jun 28, 2009 1:16 am UTC

hammerkrieg wrote:I told you consciousness is on a sliding scale. Your computer is just a tiny bit conscious.

That's low on a sliding scale? I can't walk into a room and go "Hey! It's 22.6 degrees Celsius!" or get my geographic coordinates to within a meter (however accurate GPS systems are currently; I don't actually know). At what point will it qualify as "more conscious than a human"? 'Cause I can find a lot of USB gadgets.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Sun Jun 28, 2009 2:10 am UTC

0xBADFEED wrote:I would say that consciousness is a property of software, not hardware. It makes no more sense to say "a computer is conscious" than it does to say "a brain is conscious". The brain may have a potential for consciousness, but the virtue of being a brain does not imbue it with consciousness automatically. I'm sure you can think of lots of examples of human brains that are definitely not conscious. Consciousness of a brain is dependent on having a perception/reasoning loop active in the brain. We don't usually separate the idea of "software" and "hardware" in the brain. But I think the distinction is important. Surely the "hardware" of my brain is incredibly similar to the "hardware" of your brain. Yet, we can have vastly different "consciousnesses". That is why I say that consciousness is largely a property of software, not hardware, and it is meaningless to ascribe consciousness to inanimate hardware without also talking about the software.


The distinction between "hardware" and "software" in the brain is not clear at all, which is why most neuroscientists would reject the idea of drawing that distinction.

0xBADFEED wrote:This is "New-Agey", magical thinking that has no basis or evidence. You should disabuse yourself of it.


Not really. I simply took something said by John McCarthy to its logical conclusion.

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Can a machine be conscious?

Postby 0xBADFEED » Sun Jun 28, 2009 2:44 am UTC

hammerkrieg wrote:The distinction between "hardware" and "software" in the brain is not clear at all, which is why most neuroscientists would reject the idea of drawing that distinction.

The hardware/software analogy I was making probably requires further explanation.

The main distinction I was trying to make is between the potential to house consciousness versus actual consciousness. Substrate vs. Processes. A brain and a computer (I believe) both have the potential to house consciousness. That doesn't make the substrate conscious.

The "software" I was referring to is the process in the physical substrate (biological or otherwise) that is responsible for interpreting perceptions, making decisions, and establishing cause/effect relationships.

Surely there exists a level of abstraction above the physicality of neurons and neurotransmitters in which one could talk about these processes. We just don't yet grasp it. This is the brain "software" to which I alluded. If we could grasp this level of abstraction we could perhaps map it to another medium. That the specifics of this layer of abstraction are not yet clear to neuroscientists has no bearing on whether or not it exists. Many believe it does. I find it unlikely that the brain's structures are so special-purpose that its processes could not be mapped to another medium (i.e. the software separated from the hardware).

The definition you have given, which comes down to "perception of environment", so dilutes the meaning of "consciousness" that it becomes rather pointless to even talk about.
hammerkrieg wrote:Not really. I simply took something said by John McCarthy to its logical conclusion.

I don't think that's the logical conclusion at all. If I toss a leaf in a stream, is it conscious? It moves. It seems to pick a path and follow it. It seems to have a mind of its own. But surely you wouldn't say it's conscious. It's just being pushed along by the current. It isn't aware of its path. Subatomic particles are the same way. They can't perceive their surroundings. Even by your rather weak definition of consciousness they fail.
Last edited by 0xBADFEED on Sun Jun 28, 2009 4:05 pm UTC, edited 2 times in total.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Sun Jun 28, 2009 5:26 am UTC

0xBADFEED wrote:I don't think that's the logical conclusion at all. If I toss a leaf in a stream, is it conscious? It moves. It seems to pick a path and follow it. It seems to have a mind of its own. But surely you wouldn't say it's conscious. It's just being pushed along by the current. It isn't aware of its path. Subatomic particles are the same way. They can't perceive their surroundings. Even by your rather weak definition of consciousness they fail.


First of all, do you know what the original quote was?

Gaydar2000SE
Posts: 210
Joined: Sun Jun 21, 2009 1:43 am UTC

Re: Can a machine be conscious?

Postby Gaydar2000SE » Sun Jun 28, 2009 1:07 pm UTC

0xBADFEED wrote:I think that you're conflating the ideas of "free will" and "consciousness".

I would say those are two very different questions. The "free will" problem is extremely tricky when you take a strictly reductionist view. It's very difficult to see at what point "free will" would be injected into the system. Consciousness seems rather more constrained. We can say that most higher animals have varying degrees of consciousness, with humans at the top of the heap.

If we define (in a very loose and off-the-cuff sense) consciousness as the ability to:
* Perceive one's environment
* Make decisions based on those perceptions (whether by "free will" or by programming)
* Remember cause and effect relationships based on the perception->decision (i.e. learn)
* Abstract these relationships to create general propositions about our environment
* Maintain an internal model to predict future cause and effect relationships

It's clear that humans possess these abilities in spades and that the higher animals possess them to varying degrees. I would also say that these abilities largely encompass what we tend to think about as "intelligence". Though evolution (I'm not even sure why you brought evolution into it) doesn't require the development of these abilities, it does seem to reward them in many cases.

I would say it's silly to argue that humans aren't conscious. Whether or not we possess "free will" is more of an open question.
No, not really. And I didn't argue that humans are not conscious, as in making that an axiom; I simply don't make the axiom that they are conscious either. Just making no axiom about it at all. What I mean is that any objective third-party observer would conclude that humans are no more conscious than a very elaborate flowchart being executed by the laws of physics, without really having an 'understanding' or 'awareness' of what it is doing. Simply evolved like that, like a virus, but a lot more sophisticated.

Point is, it may be tempting to say 'but wait, we know that we are conscious', but we don't; we only know that we say that we are, and we cannot look inside a human mind. There is, for you, no way to see if others are conscious, and no way for others to see it of you, and seeing that science is to be objective and all scientists must come to the same conclusion, clearly, by reductio ad absurdum, introspection is not a valid tool to determine if one is conscious. The experiment is not reproducible.

So, counter-intuitive as it is, for the time being, from a scientific perspective, to say that humans are conscious is as abstract as saying that stones are portals to other dimensions. That's not to say that it is theoretically impossible, but simply that currently there is no reason why they would be; to say that they are does not solve any questions, and just creates a whole lot more. In the future, perhaps physical theories can incorporate self-awareness.
^ :/

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Can a machine be conscious?

Postby 0xBADFEED » Sun Jun 28, 2009 3:42 pm UTC

hammerkrieg wrote:First of all, do you know what the original quote was?

There's only one John McCarthy reference in this thread, the one detailing perceptive powers of a thermostat. I assumed that was the quote to which you were referring. If it was not, please share with the rest of the class. I'm interested what could prompt such a line of thought.
Gaydar2000SE wrote:No, not really. And I didn't argue that humans are not conscious, as in making that an axiom; I simply don't make the axiom that they are conscious either. Just making no axiom about it at all. What I mean is that any objective third-party observer would conclude that humans are no more conscious than a very elaborate flowchart being executed by the laws of physics, without really having an 'understanding' or 'awareness' of what it is doing. Simply evolved like that, like a virus, but a lot more sophisticated.

I really think you are conflating free-will and consciousness. You seem to be implying that consciousness requires some kind of freedom of choice, that if something is making decisions mechanically by following an arbitrarily complex formula it cannot also be conscious. Please detail what your requirements for "consciousness" are so that I might better understand where you're coming from.

Gaydar2000SE wrote:Point is, it may be tempting to say 'but wait, we know that we are conscious', but we don't; we only know that we say that we are, and we cannot look inside a human mind. There is, for you, no way to see if others are conscious, and no way for others to see it of you, and seeing that science is to be objective and all scientists must come to the same conclusion, clearly, by reductio ad absurdum, introspection is not a valid tool to determine if one is conscious. The experiment is not reproducible.

This argument has been made many times and it is thoroughly silly. My brain is much the same as yours. They both have all the same processes and major components. It is utter silliness to believe that my internal experience of consciousness is vastly different from yours due to some unseen, unexplained, apparently non-physical phenomenon (aka magic).

For example, imagine a set of identical twins A and B who both have what anyone would consider "normal" brains. By A's account he is conscious. He can perceive, react, remember, and learn about his environment to an extraordinary degree. It's silliness for A to conclude that B, who is at many levels almost physically identical to A, may somehow only be pretending to be conscious; that somehow A is really conscious and B is not, even though they exhibit all the same abilities. What magical property is it that would make A's experience so different from B's?
Gaydar2000SE wrote:So, counter-intuitive as it is, for the time being, from a scientific perspective, to say that humans are conscious is as abstract as saying that stones are portals to other dimensions. That's not to say that it is theoretically impossible, but simply that currently there is no reason why they would be; to say that they are does not solve any questions, and just creates a whole lot more. In the future, perhaps physical theories can incorporate self-awareness.

No, it's really not. What you're saying is the very antithesis of science. Let's start with the line of logic of A regarding B:
The logical conclusion for A runs along these lines:
1) I believe I am conscious
I think this is rather solid; what point or sense is there in believing that one is not conscious?
2) B acts in much the same manner as me and B seems to have all the requisite abilities for consciousness
A can see this from observation. To all outside observers B appears to exhibit all the abilities that A would consider requisite for conscious behavior.
3) B is virtually the same, physically, as me
This is especially true of the twins but also true for any two "normal" humans.
4) B must also be conscious
If B is not also conscious, or is not at least as conscious as A, then what is the physical phenomenon that renders B's consciousness a mere imitation? Why should we have any reason to believe B is not also conscious?

Do you think the laws of physics are different on distant planets?
Maybe things fall up in other galaxies.
Maybe the speed of light is only 40 mph.
We can't (yet) test the laws of physics on distant planets. That doesn't mean assuming they're different is a reasonable course of action. We have no evidence that they are different at all. The most reasonable course of action is to hold they're the same and in line with things that we've already observed until evidence to the contrary presents itself.

Now you can, as you point out, remain completely agnostic on the proposition. But the proposition is not a 50/50 split. There is overwhelming evidence that, if I am conscious, then you are also conscious. Believing the contrary requires some kind of magical difference between my brain and all other brains somehow not yet perceived by modern science that renders me special and all other brains mere pretenders.

Gaydar2000SE
Posts: 210
Joined: Sun Jun 21, 2009 1:43 am UTC

Re: Can a machine be conscious?

Postby Gaydar2000SE » Sun Jun 28, 2009 4:11 pm UTC

0xBADFEED wrote:I really think you are conflating free-will and consciousness. You seem to be implying that consciousness requires some kind of freedom of choice, that if something is making decisions mechanically by following an arbitrarily complex formula it cannot also be conscious. Please detail what your requirements for "consciousness" are so that I might better understand where you're coming from.
Hmm, not per se a choice. But that the intelligence has an 'awareness' of what it is doing. A perspective, instead of being a machine which can do complex tasks simply because the laws of physics force it to do them. A conscious machine also has a sense of introspection and experience about those tasks and realizes that it is doing them, regardless of whether there is a choice or it is deterministically bound by physical laws.

0xBADFEED wrote:This argument has been made many times and it is thoroughly silly. My brain is much the same as yours. They both have all the same processes and major components. It is utter silliness to believe that my internal experience of consciousness is vastly different from yours due to some unseen, unexplained, apparently non-physical phenomenon (aka magic).
That's a bold statement coming from a member of a species that has such limited understanding of its own brain. Also, mind that to say that only your brain is conscious by the definition above and the rest are not is, by current scientific understanding, equally as absurd as saying that any are, or all are. According to our understanding of how our brains work as a swarm intelligence of neurons, individual neurons shouldn't have an 'awareness' of what exactly they are doing, and how a swarm intelligence grouped together out of them somehow has it as a unitary vehicle is even more at odds with the current laws of electricity and quantum chemistry that physics has to offer.

Physics can neither construct the property of 'experience' and 'first-person perspective' nor even define it, and to say that humans or anything else have such a thing does not solve any open questions in physics, nor is it needed to make any model work; it only puts more questions forth. The elementary conclusion is then that humans are not conscious at all. Or rather, not to make such a bold and strange assumption that they are in the first place, and just not think about it.

After all, to suppose that humans' feeling that they are conscious makes an argument would mean that a machine simply programmed to follow correct scientific rigour, which thus would not conclude that humans are conscious since it has no notion of that, comes to a different answer, and all scientists must come to the same conclusion. Thus a method of gathering information that lies inside the mind is not valid evidence. After all, I could just say right now 'I have no experience of what I am doing, I am not conscious, I just type this because the laws of physics ultimately make me do it, in a sense the same as blocks falling on my keyboard to produce this very sentence by chance; your brain is the same, thereto, you aren't conscious either,' and it's your word against mine with no way to objectively test who is right.

0xBADFEED wrote:For example, given a set of identical twins A and B who both have what anyone would consider "normal" brains. By A's account he is conscious. He can perceive, react, remember, and learn about his environment to an extraordinary degree. It's silliness for A to conclude that B, who is at many levels almost physically identical to A, may somehow only be pretending to be conscious; that somehow A is really conscious and B is not, even though they exhibit all the same abilities. What magical property is it that would make A's experience so different from B's?
And neither can he prove it to an outside observer with a remarkably different brain structure. Thus to the outside observer, it is as weird as assuming that a piece of paper is conscious because it has the text 'I am conscious' written on it. It's the difference between 'knowing to be right' and 'proving to be right', the latter being one of the pillars of science. As I cannot prove that I am conscious, the scientific conclusion for the time being is that I am not, until I can prove it.
0xBADFEED wrote:No, it's really not. What you're saying is the very antithesis of science. Let's start with the line of logic of A regarding B:
The logical conclusion for A runs along these lines:
1) I believe I am conscious
I think this is rather solid; what point or sense is there in believing that one is not conscious?
2) B acts in much the same manner as me and B seems to have all the requisite abilities for consciousness
A can see this from observation. To all outside observers B appears to exhibit all the abilities that A would consider requisite for conscious behavior.
3) B is virtually the same, physically, as me
This is especially true of the twins but also true for any two "normal" humans.
4) B must also be conscious
If B is not also conscious, or is not at least as conscious as A, then what is the physical phenomenon that renders B's consciousness a mere imitation? Why should we have any reason to believe B is not also conscious?
The difference is that 'I believe I am conscious' is not an empirically testable fact; it is in fact a vague construct having no meaning in science. However, 'I say that I am conscious' is indeed empirically testable as true or false. Your argument is the same as:

A stone believes he is conscious
...
...
All stones are.

Of course, then you're going to say, 'Well, but stones don't believe that,' and then I'll say, 'And neither do you.' It cannot be proven that you believe you are, only that you say you are.

Please tell me, how is your saying a stone is not conscious any more valid than my saying, likewise, that you are not?

Ultimately, your saying 'I am conscious because I feel it' is as arbitrary as my saying 'A piece of paper is conscious, because it feels it'; one must prove either to feel it, something that can hardly be done.

0xBADFEED wrote:Now you can, as you point out, remain completely agnostic on the proposition. But the proposition is not a 50/50 split. There is overwhelming evidence that, if I am conscious, then you are also conscious. Believing the contrary requires some kind of magical difference between my brain and all other brains somehow not yet perceived by modern science that renders me special and all other brains mere pretenders.
There is no overwhelming evidence, far from it; there is only humans continually claiming that they can feel and have emotions and senses and perception, but that's it. And that can easily be explained by evolution having shaped them to claim that, because other, likewise evolved humans then do things they want. All other physical theories strongly indicate it's implausible for a human central nervous system to have any sense of 'perception' or 'feeling' or 'emotion'.
^ :/

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Sun Jun 28, 2009 7:19 pm UTC

0xBADFEED wrote:There's only one John McCarthy reference in this thread, the one detailing perceptive powers of a thermostat. I assumed that was the quote to which you were referring.


Alright, so, if the thermostat has beliefs corresponding to its changes of state, what else does?

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Can a machine be conscious?

Postby 0xBADFEED » Mon Jun 29, 2009 2:36 am UTC

Gaydar2000SE wrote:Hmm, not per se a choice. But that the intelligence has an 'awareness' of what it is doing. A perspective, instead of being a machine which can do complex tasks simply because the laws of physics force it to do them. A conscious machine also has a sense of introspection and experience about those tasks and realizes that it is doing them, regardless of whether there is a choice or it is deterministically bound by physical laws.


OK, so if I'm following you correctly, you require that something that is conscious has an interior perspective, a sense of 'I'. Basically what people usually refer to as being "self-aware". I don't consider this anything special. Self-awareness arises naturally from external awareness once external awareness becomes sufficiently rich. I restate my requirements for consciousness as the following:
Something is "conscious" if it exhibits the following abilities (I'll call these C-Abilities from now on for brevity):
* It can perceive its environment
* It can make decisions based on its perceptions (whether by "free will" or by programming)
* It can remember cause and effect relationships based on the perception->decision (i.e. learn)
* It can abstract these relationships to create general propositions about its environment
* It can maintain an internal model to predict future cause and effect relationships

All of these can be readily observed by an outsider and require no sharing of an internal perspective. Self-awareness emerges from these C-Abilities quite naturally. For an entity possessing these C-Abilities the entity itself is the only constant amid all of its numerous experiences. Naturally, the entity will form many propositions and relationships which center on itself. The aggregation of all of these relationships and propositions is the entity's sense of "self".
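Just to make the list concrete, here it is written down as a bare interface. This is a sketch of my own, the method names are arbitrary, and it is obviously not a claim that implementing such a class would produce consciousness:

from abc import ABC, abstractmethod
from typing import Any, Dict, List

class CAbilityAgent(ABC):
    # The five C-Abilities as an interface; nothing here is a real implementation.

    @abstractmethod
    def perceive(self) -> Dict[str, Any]:
        """Sample the environment through some sensory apparatus."""

    @abstractmethod
    def decide(self, percept: Dict[str, Any]) -> str:
        """Pick an action based on the current percept."""

    @abstractmethod
    def remember(self, percept: Dict[str, Any], action: str, outcome: Any) -> None:
        """Store the observed cause/effect relationship (i.e. learn)."""

    @abstractmethod
    def abstract(self) -> List[str]:
        """Generalize stored relationships into propositions about the environment."""

    @abstractmethod
    def predict(self, hypothetical_action: str) -> Any:
        """Use an internal model to predict future cause/effect relationships."""

The interesting arguments are all about what has to sit behind those method signatures, not the signatures themselves.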

Further, I don't think consciousness is an all-or-nothing proposition as you seem to think. I would say that a chimp is conscious. It is conscious to a lesser degree than a human because its C-Abilities are much less developed. Likewise a chimp is much more conscious than say a goldfish.

The crux of your argument seems to be: "If we have something that seems to exhibit all of these C-Abilities, how do we know it has a sense of 'self'?" How do we know it is "really" conscious and not a mere automaton acting like it is conscious? How do we prove beyond a shadow of a doubt that it is not just following a program that just happens to always give the appearance of having C-Abilities?

Strictly speaking (according to your definition), we can't.

We can't "prove" that something is or isn't conscious. Is this the point you're trying to make?

However, there is no evidence that other humans have a non-self-aware interior experience, and believing that they do is simply misplaced skepticism. Absence of proof has no bearing on a proposition's likelihood of truth or falsehood. Even if something is unproven, we can make conjectures about its likelihood. For instance, most computer scientists believe that P != NP, and it's a good working assumption even though it's unproven. I can't prove that there aren't argon-breathing unicorns on Jupiter. That says nothing about the likelihood of their existence.

Gaydar2000SE wrote:According to our understanding of how our brains work as a swarm intelligence of neurons, individual neurons shouldn't have an 'awareness' of what exactly they are doing, and how a swarm intelligence grouped together out of them somehow has it as a unitary vehicle is even more at odds with the current laws of electricity and quantum chemistry that physics has to offer.


I have never heard anyone of note claim our brains break the laws of physics or chemistry. Citation please.

Most people don't think that our brains work as a "swarm intelligence". "Swarm intelligence" implies a measure of variety and randomness that our neurons just don't have. No more than the billions of transistors in a CPU constitute a "swarm intelligence". Each transistor is nothing but a mechanism. Properly arranged, a system emerges that exhibits higher-level properties not present in any of the component parts. This is analogous to how many people think the C-Abilities arise from the lower level of neurons and is usually referred to under the umbrella term of an epiphenomenon.

Gaydar2000SE wrote:The difference is that 'I believe I am conscious' is not an empirically testable fact; it is in fact a vague construct having no meaning in science. However, 'I say that I am conscious' is indeed empirically testable as true or false. Your argument is the same as:

A stone believes he is conscious
...
...
All stones are.

Of course, then you're going to say, 'Well, but stones don't believe that,' and then I'll say, 'And neither do you.' It cannot be proven that you believe you are, only that you say you are.

Please tell me, how is your saying a stone is not conscious any more valid than my saying, likewise, that you are not?

Ultimately, your saying 'I am conscious because I feel it' is as arbitrary as my saying 'A piece of paper is conscious, because it feels it'; one must prove either to feel it, something that can hardly be done.

If a stone could exhibit C-Abilities and indeed claim that it was conscious, I would happily accept it as such. At that point the burden is on me to present some sort of evidence that it is not conscious. Lacking evidence to dispute the claim, I would have to at least accept it as very likely. Or at the very least remain agnostic (but that's a thoroughly boring position).

You seem to think that the claimer and the doubter are on equal footing. They are not.

Consider Fred who, to an outside observer, exhibits C-Abilities equal to that of a normal human.

If Fred makes the claim "I am conscious", the doubter says "prove it". But Fred's mere appearance of possessing C-Abilities is already a mountain of evidence in his defense. If someone wishes to doubt Fred's consciousness, they had better have some evidence to the contrary. Doubting Fred's claim of consciousness without evidence to the contrary is foolish. The doubter's skepticism is baseless. If you want to be skeptical about something you also need a basis for your skepticism. Absence of an air-tight proof is not sufficient impetus for skepticism in the face of copious evidence to the contrary.

Gaydar2000SE wrote:There is no overwhelming evidence, far from it; there is only humans continually claiming that they can feel and have emotions and senses and perception, but that's it. And that can easily be explained by evolution having shaped them to claim that, because other, likewise evolved humans then do things they want. All other physical theories strongly indicate it's implausible for a human central nervous system to have any sense of 'perception' or 'feeling' or 'emotion'.


Sure, if you think "feelings" or "emotions" are something magical or other-worldly. I don't. They're just specific patterns of firing neurons that elicit a particular response. That doesn't mean they aren't being perceived by the being "experiencing" the patterns of firing neurons.

hammerkrieg wrote:Alright, so, if the thermostat has beliefs corresponding to its changes of state, what else does?

I reject the notion that the thermostat has any level of consciousness or beliefs. It does not exhibit C-Abilities to the extent that I require for even weak "consciousness". Further, subatomic particles are right out as they don't even meet the bare minimum of the ability to perceive their environment.
Last edited by 0xBADFEED on Mon Jun 29, 2009 1:35 pm UTC, edited 1 time in total.

phillipsjk
Posts: 1213
Joined: Wed Nov 05, 2008 4:09 pm UTC
Location: Edmonton AB Canada

Re: Can a machine be conscious?

Postby phillipsjk » Mon Jun 29, 2009 6:54 am UTC

at the risk of quote-sniping:
Gaydar2000SE wrote:
A stone believes he is conscious
...
...
All stones are.



You don't know that stones aren't conscious. Rocks can be thought of as life forms with an extremely long life cycle: so long that we never perceive it. It is sort of like how mayflies (living for a day) wouldn't be able to perceive trees as living things.

Which brings up another question: Does something have to be living to be conscious (however we define consciousness)?
Did you get the number on that truck?

OOPMan
Posts: 314
Joined: Mon Oct 15, 2007 10:20 am UTC
Location: Cape Town, South Africa

Re: Can a machine be conscious?

Postby OOPMan » Mon Jun 29, 2009 7:06 am UTC

Humans are biological machines. They are conscious. 'Nuff said.

As for all the philosophical ramblings...stuff that bollocks where the sun don't shine...

Isotope_238
Posts: 285
Joined: Fri Mar 27, 2009 9:59 pm UTC
Location: The Galaxy of a Thousand Rubies

Re: Can a machine be conscious?

Postby Isotope_238 » Tue Jun 30, 2009 2:03 am UTC

I, personally, would consider a machine to be self-aware when it, of its own accord, declares itself to be self-aware. That's the same system most of us humans tend to use: "I know I'm self-aware. How about you?"

I now have the urge to go read the Space Odysseys and The Moon is a Harsh Mistress, as well as play Portal.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Wed Jul 01, 2009 2:10 am UTC

Isotope_238 wrote:I, personally, would consider a machine to be self-aware when it, of its own accord, declares itself to be self-aware. That's the same system most of us humans tend to use: "I know I'm self-aware. How about you?"

I now have the urge to go read the Space Odysseys and The Moon is a Harsh Mistress, as well as play Portal.


That's kind of behaviorist of you.

I guess we'll never really know.

0xBADFEED wrote:I reject the notion that the thermostat has any level of consciousness or beliefs. It does not exhibit C-Abilities to the extent that I require for even weak "consciousness". Further, subatomic particles are right out as they don't even meet the bare minimum of the ability to perceive their environment.


If you view consciousness as existing on a continuum with no real beginning, middle, or end, what McCarthy said really starts to make sense.

somebody already took it
Posts: 310
Joined: Wed Jul 01, 2009 3:03 am UTC

Re: Can a machine be conscious?

Postby somebody already took it » Wed Jul 01, 2009 3:35 am UTC

0xBADFEED wrote:Sure, if you think "feelings" or "emotions" are something magical or other-worldly. I don't. They're just specific patterns of firing neurons that elicit a particular response. That doesn't mean they aren't being perceived by the being "experiencing" the patterns of firing neurons.


Feelings probably are the result of neurons firing off, but do you have any ideas about what they are on the abstract "software" level?
They are so different from everything else that they kinda make me wonder if they are magical or other-worldly.

hammerkrieg
Posts: 16
Joined: Thu Jun 25, 2009 11:48 am UTC

Re: Can a machine be conscious?

Postby hammerkrieg » Wed Jul 01, 2009 12:43 pm UTC

somebody already took it wrote:
0xBADFEED wrote:Sure, if you think "feelings" or "emotions" are something magical or other-worldly. I don't. They're just specific patterns of firing neurons that elicit a particular response. That doesn't mean they aren't being perceived by the being "experiencing" the patterns of firing neurons.


Feelings probably are the result of neurons firing off, but do you have any ideas about what they are on the abstract "software" level?
They are so different from everything else that they kinda make me wonder if they are magical or other-worldly.


For all I know that could be true, but if biological brains can tap into whatever causes thoughts, so can nonbiological brains.

Really, I have nothing to go on but a vague feeling that there might be another reality beneath this one, so I don't think about it much. I don't even really want to guess.

Isotope_238
Posts: 285
Joined: Fri Mar 27, 2009 9:59 pm UTC
Location: The Galaxy of a Thousand Rubies

Re: Can a machine be conscious?

Postby Isotope_238 » Wed Jul 01, 2009 4:35 pm UTC

hammerkrieg wrote:That's kind of behaviorist of you.


I never said I was right. I did say that it was merely a personal conclusion.

Mr. Samsa
Posts: 144
Joined: Thu Sep 20, 2007 11:14 pm UTC
Location: Down south.

Re: Can a machine be conscious?

Postby Mr. Samsa » Wed Jul 01, 2009 5:33 pm UTC

hammerkrieg wrote:
samk wrote:Define awareness.


Well, I don't subscribe to behaviorism per se, which rejects the idea of "thoughts" entirely, but I feel the need, or at least the desire, to stick to what is empirically testable, and so I would define awareness as changes of internal state in an object mediated by some sensory apparatus.


Just pointing out that this is entirely false unless you feel some desire to associate with the outdated form of behaviourism. Radical behaviourism recognises thoughts and other internal processes, labeling them private behaviours.

VorpalSword
Posts: 15
Joined: Fri Jan 30, 2009 5:15 am UTC

Re: Can a machine be conscious?

Postby VorpalSword » Thu Jul 02, 2009 5:16 am UTC

Gaydar2000SE wrote:The difference is that 'I believe I am conscious' is not an empirically testable fact; it is in fact a vague construct having no meaning in science. However, 'I say that I am conscious' is indeed empirically testable as true or false. Your argument is the same as:

A stone believes he is conscious
...
...
All stones are.

Of course, then you're going to say, 'Well, but stones don't believe that,' and then I'll say, 'And neither do you.' It cannot be proven that you believe you are, only that you say you are.

Please tell me, how is your saying a stone is not conscious any more valid than my saying, likewise, that you are not?


"I believe I am conscious" can be proven internally with the argument "cogito ergo sum". I do not have a definition of consciousness, and feel the list of C-abilities seems more about ability to react to stimuli than anything else, but surely anything that is capable of reflecting on its own existence and deciding on an answer (and it does have to exist, or else who is doing the questioning?) can be considered conscious. With this I can prove (to my satisfaction anyway, although not to external observers) that I am conscious. I lack the ability to definitively prove that anyone else is conscious, but Occam's razor suggests that anything that claims to be conscious, is constructed in materials and manner similar to something I know is conscious (myself) and engages in behavior similar to things proven conscious is probably conscious. I have only circumstantial evidence for the existence of conscious minds behind other people, but in lacking any reason for them not to be conscious, it seems highly likely.

somebody already took it
Posts: 310
Joined: Wed Jul 01, 2009 3:03 am UTC

Re: Can a machine be conscious?

Postby somebody already took it » Thu Jul 02, 2009 8:45 am UTC

hammerkrieg wrote:If biological brains can tap into whatever causes thoughts, so can nonbiological brains.

I don't think there is a way to establish that it is true without knowing what feelings are.
How would you argue against a theory where feelings are considered other-worldly?
For instance, what if the physical world is tapped into or even constructed by feelings?
Feelings might not be willing to inhabit/create nonbiological brains.

VorpalSword wrote:"I believe I am conscious" can be proven internally with the argument "cogito ergo sum".

How do you arrive at the conclusion that you believe you are conscious from cogito ergo sum? I think all that it says is that thinking implies existence (perhaps there is more to it; I only read the Wikipedia summary). Furthermore, you should take a look at some of the critiques of the argument, in particular Kierkegaard's critique, which states that the existence of "I" is presupposed by it.

0xBADFEED
Posts: 687
Joined: Mon May 05, 2008 2:14 am UTC

Re: Can a machine be conscious?

Postby 0xBADFEED » Thu Jul 02, 2009 3:19 pm UTC

VorpalSword wrote:"I believe I am conscious" can be proven internally with the argument "cogito ergo sum".

The "cogito ergo sum" argument is just that, an argument. It is not a proof. And even as an argument it's not entirely convincing.
VorpalSword wrote:I do not have a definition of consciousness, and feel the list of C-abilities seems more about ability to react to stimuli than anything else, but surely anything that is capable of reflecting on its own existence and deciding on an answer (and it does have to exist, or else who is doing the questioning?) can be considered conscious.

It's not just about reacting to stimuli. It's about recognizing cause/effect relationships between stimuli and being able to generalize them and suppose new relationships. A computer with a webcam attached and fairly simple software can beep every time something red comes into its field of "vision". Surely this system does not meet any interesting or useful definition of "conscious". Perception and reaction are the bare minimum to enable the other higher-level abilities that mostly have to do with "learning" (remembering relationships) and "thinking" (generalizing observed relationships and predicting relationships that have not yet been observed). Perception and reaction are necessary for consciousness but not sufficient. As I have said before, I think internal "self-awareness" naturally and necessarily arises out of external awareness and that treating "self-awareness" as something special or separate from external awareness is an incorrect distinction.
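(For what it's worth, that red-beeper really is only a dozen lines. A rough sketch, assuming the third-party opencv-python and numpy packages, with made-up thresholds:)

import cv2
import numpy as np

cap = cv2.VideoCapture(0)                  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Pixels whose hue falls in an arbitrary "red" band:
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255]))
    if cv2.countNonZero(mask) > 5000:      # enough red pixels in view
        print("\a", end="", flush=True)    # terminal bell as the "beep"
    cv2.imshow("view", frame)
    if cv2.waitKey(30) & 0xFF == ord("q"): # press q to quit
        break
cap.release()
cv2.destroyAllWindows()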

somebody already took it wrote:I don't think there is a way to establish that it is true without knowing what feelings are.
How would you argue against a theory where feelings are considered other-worldly?
For instance, what if the physical world is tapped into or even constructed by feelings?
Feelings might not be willing to inhabit/create nonbiological brains.

You keep referring to "feelings". I'm not sure what you mean by this. I personally don't find anything special about "feelings" that requires or even points to an other-worldly or magical explanation. Which of these "feelings" do you feel seem "other-worldly" or magical?
* I am hungry.
* I am tired.
* I am horny.
* I am sad.
* I am confused.
* I am in love.
Or can you give me an example so I might know where you're coming from?

Josephine
Posts: 2142
Joined: Wed Apr 08, 2009 5:53 am UTC

Re: Can a machine be conscious?

Postby Josephine » Thu Jul 02, 2009 6:31 pm UTC

0xBADFEED wrote:* I am hungry.
* I am tired.
* I am horny.
* I am sad.
* I am confused.
* I am in love.



* You need food.
* You need sleep.
* Your body has a drive to reproduce.
* (A little more metaphysical) you need some way to ingrain memories and learn from mistakes, right?
* Your senses or ability to comprehend are conflicting or overloaded.
* Your body chose a prospective mate.

I love reductionism.
Belial wrote:Listen, what I'm saying is that he committed a felony with a zoo animal.

VorpalSword
Posts: 15
Joined: Fri Jan 30, 2009 5:15 am UTC

Re: Can a machine be conscious?

Postby VorpalSword » Sat Jul 04, 2009 5:37 am UTC

somebody already took it wrote:
VorpalSword wrote:"I believe I am conscious" can be proven internally with the argument "cogito ergo sum".

How do you arrive at the conclusion that you believe you are concious from cogito ergo sum? I think all that it says is that thinking implies existence (perhaps there is more to it, I only read the Wikipedia summary). Furthermore, you should take a look at some of the critiques of the argument, in particular Kierkegaard's critique, which states that the existence of "I" is presupposed by it.


Kierkegaard's critique seems to me to be more that the argument assumes that the thinking is done by "I". He doesn't dispute that thinking occurs, but gets upset that "I" is already presupposed and that the argument only demonstrates that "I" is thinking. I think this misses the point that clearly something exists to do the thinking; "I" is just a convenient label for whatever this thinker is.

I did jump from existence to consciousness. I assumed that what is being proved is that my mind exists on some level, and that (because of the definition of a mind, and because we already know it can think) such an "I" would qualify as conscious.

7598462153
Posts: 8
Joined: Sat Jul 04, 2009 4:51 pm UTC

Re: Can a machine be conscious?

Postby 7598462153 » Sat Jul 04, 2009 5:45 pm UTC

First of all, to actually respond to the question, until I see definitive evidence that machines cannot be conscious, I will assume that technological systems have the same potential to become conscious as do biological systems.

However, I think it is important in this debate to make a distinction between "intelligence" and "consciousness."

I would define "intelligent" in a similar way to the "C-abilities" listed above - basically, ability to respond, remember, and learn from external stimuli. Ability to generalize learning, while definitely helpful, may or may not be a necessary criterion. We have certainly already created machines/programs that are "intelligent" by the above definition; they are commonly shipped with modern computer games, among other uses.

"Consciousness" or "sentience," by contrast, is more difficult to pin down. We have no real knowledge of how consciousness develops in humans; for the moment, the only real workable definitions are either along the lines of "Cogito ergo sum" or simply "I know it when I see it" (which in itself can be deceiving; I don't hold much faith in the Turing test.)

This distinction is not only philosophically important; if or when we do create sentient machines, they must be afforded the same rights as humans. As anyone familiar with science fiction knows, the robot uprising problems come when machines are treated unfairly.

