
AI philosophy

lkxambp
Posts: 46
10/29/2014 8:24:51 AM
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?
SNP1
Posts: 2,403
10/29/2014 8:49:04 AM
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

I will address this using the definition of a person provided in Star Trek (which allowed Data to be considered a person), and assuming that personhood, not being human, is what entitles you to rights.

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Murder

3) Switched it off.

Putting it into stasis against its will (also illegal), not murder. This also assumes that there is no loss of memory from being turned off.

4) Switched it off then turned it back on again.

Same as the last one.

What is your opinion?
#TheApatheticNihilistPartyofAmerica
#WarOnDDO
zmikecuber
Posts: 4,082
10/29/2014 10:32:49 AM
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

I don't think so, personally.
"Delete your fvcking sig" -1hard

"primal man had the habit, when he came into contact with fire, of satisfying the infantile desire connected with it, by putting it out with a stream of his urine... Putting out the fire by micturating was therefore a kind of sexual act with a male, an enjoyment of sexual potency in a homosexual competition."
lkxambp
Posts: 46
10/29/2014 3:32:47 PM
At 10/29/2014 8:49:04 AM, SNP1 wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

I will address this using the definition of a person provided in Star Trek (which allowed Data to be considered a person), and assuming that personhood, not being human, is what entitles you to rights.

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Murder

3) Switched it off.

Putting it into stasis against its will (also illegal), not murder. This also assumes that there is no loss of memory from being turned off.

4) Switched it off then turned it back on again.

Same as the last one.

What is your opinion?

Slightly different: what if I had a person who had an AI in place of their brain? If I killed their body, but then removed the AI and placed it in an identical new body, would I have committed murder?
SNP1
Posts: 2,403
10/29/2014 3:39:41 PM
At 10/29/2014 3:32:47 PM, lkxambp wrote:
At 10/29/2014 8:49:04 AM, SNP1 wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

I will address this using the definition of a person provided in Star Trek (which allowed Data to be considered a person), and assuming that personhood, not being human, is what entitles you to rights.

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Murder

3) Switched it off.

Putting it into stasis against its will (also illegal), not murder. This also assumes that there is no loss of memory from being turned off.

4) Switched it off then turned it back on again.

Same as the last one.

What is your opinion?

Slightly different: what if I had a person who had an AI in place of their brain? If I killed their body, but then removed the AI and placed it in an identical new body, would I have committed murder?

Not murder, but bodily mutilation (which is still illegal).
lkxambp
Posts: 46
10/29/2014 3:50:56 PM
At 10/29/2014 3:39:41 PM, SNP1 wrote:
At 10/29/2014 3:32:47 PM, lkxambp wrote:
At 10/29/2014 8:49:04 AM, SNP1 wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

I will address this using the definition of a person provided in Star Trek (which allowed Data to be considered a person), and assuming that personhood, not being human, is what entitles you to rights.

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Murder

3) Switched it off.

Putting it into stasis against its will (also illegal), not murder. This also assumes that there is no loss of memory from being turned off.

4) Switched it off then turned it back on again.

Same as the last one.

What is your opinion?

Slightly different: what if I had a person who had an AI in place of their brain? If I killed their body, but then removed the AI and placed it in an identical new body, would I have committed murder?

Not murder, but bodily mutilation (which is still illegal).

So I'm guessing you would say that the mind is the defining thing in deciding who a person is. In that case, what if I somehow took someone and surgically altered their brain to completely destroy their mind, leaving them comatose with no hope of ever waking up?
suttichart.denpruektham
Posts: 1,115
10/29/2014 3:56:57 PM
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Assuming that you use exactly the same algorithm that makes up its existence, not murder. Our consciousness is sustained by the continuous writing and rewriting of our existing memory; my theory is that we practically die and are reborn every nanosecond that our brain's electrical signals fire. Our consciousness lies in our software; as long as you don't actually delete that software, it is more like a body replacement than an actual killing.

3) Switched it off.
Not murder; you practically gave them a tranquillizer.

4) Switched it off then turned it back on again.
You gave them a tranquillizer, then woke them up.

What is your opinion?
SNP1
Posts: 2,403
10/29/2014 4:31:12 PM
At 10/29/2014 3:50:56 PM, lkxambp wrote:
At 10/29/2014 3:39:41 PM, SNP1 wrote:
At 10/29/2014 3:32:47 PM, lkxambp wrote:
At 10/29/2014 8:49:04 AM, SNP1 wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

I will address this using the definition of a person provided in Star Trek (which allowed Data to be considered a person), and assuming that personhood, not being human, is what entitles you to rights.

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Murder

3) Switched it off.

Putting it into stasis against its will (also illegal), not murder. This also assumes that there is no loss of memory from being turned off.

4) Switched it off then turned it back on again.

Same as the last one.

What is your opinion?

Slightly different: what if I had a person who had an AI in place of their brain? If I killed their body, but then removed the AI and placed it in an identical new body, would I have committed murder?

Not murder, but bodily mutilation (which is still illegal).

So I'm guessing you would say that the mind is the defining thing in deciding who a person is. In that case, what if I somehow took someone and surgically altered their brain to completely destroy their mind, leaving them comatose with no hope of ever waking up?

You would have destroyed who they are. If they no longer have a working cerebral cortex (which controls higher brain functions), I would consider that murder.

BTW, I am guessing you believe in dualism?
lkxambp
Posts: 46
10/29/2014 4:38:17 PM
At 10/29/2014 4:31:12 PM, SNP1 wrote:
At 10/29/2014 3:50:56 PM, lkxambp wrote:
At 10/29/2014 3:39:41 PM, SNP1 wrote:
At 10/29/2014 3:32:47 PM, lkxambp wrote:
At 10/29/2014 8:49:04 AM, SNP1 wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

I will address this using the definition of a person provided in Star Trek (which allowed Data to be considered a person), and assuming that personhood, not being human, is what entitles you to rights.

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Murder

3) Switched it off.

Putting it into stasis against its will (also illegal), not murder. This also assumes that there is no loss of memory from being turned off.

4) Switched it off then turned it back on again.

Same as the last one.

What is your opinion?

Slightly different: what if I had a person who had an AI in place of their brain? If I killed their body, but then removed the AI and placed it in an identical new body, would I have committed murder?

Not murder, but bodily mutilation (which is still illegal).

So I'm guessing you would say that the mind is the defining thing in deciding who a person is. In that case, what if I somehow took someone and surgically altered their brain to completely destroy their mind, leaving them comatose with no hope of ever waking up?

You would have destroyed who they are. If they no longer have a working cerebral cortex (which controls higher brain functions), I would consider that murder.

BTW, I am guessing you believe in dualism?

I do, but I'm just asking because I'm interested in what people think. I'm not really sure how I'd answer these questions.
lkxambp
Posts: 46
10/29/2014 4:39:55 PM
At 10/29/2014 3:56:57 PM, suttichart.denpruektham wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Assuming that you use exactly the same algorithm that makes up its existence, not murder. Our consciousness is sustained by the continuous writing and rewriting of our existing memory; my theory is that we practically die and are reborn every nanosecond that our brain's electrical signals fire. Our consciousness lies in our software; as long as you don't actually delete that software, it is more like a body replacement than an actual killing.

3) Switched it off.
Not murder; you practically gave them a tranquillizer.

4) Switched it off then turned it back on again.
You gave them a tranquillizer, then woke them up.

What is your opinion?

For a slightly different one, let's pretend that cloning worked the way it does in sci-fi, so I could make a clone of a person with all their memories, which comes into being at the same age as them. If I killed someone and then replaced them with one of these clones, would that be murder?
SNP1
Posts: 2,403
10/29/2014 5:15:40 PM
At 10/29/2014 4:38:17 PM, lkxambp wrote:
I do, but I'm just asking because I'm interested in what people think. I'm not really sure how I'd answer these questions.

It all depends on how you define "person".
xXCryptoXx
Posts: 5,000
10/29/2014 6:04:44 PM
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?
Nolite Timere
n7
Posts: 1,360
10/30/2014 12:37:39 AM
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

1) Destroyed it.

Yes
2) Destroyed it then made a replacement which was identical in every way including having all its memories.

It would share a temporal path with the original, but it wouldn't be on the original's temporal path.
3) Switched it off.

No more than sleeping or going unconscious kills someone.
4) Switched it off then turned it back on again.

What is your opinion?
404 coherent debate topic not found. Please restart the debate with clear resolution.


Uphold Marxist-Leninist-Maoist-Sargonist-n7ism.
lkxambp
Posts: 46
10/30/2014 2:57:13 AM
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?
lkxambp
Posts: 46
10/30/2014 2:58:11 AM
At 10/29/2014 5:15:40 PM, SNP1 wrote:
At 10/29/2014 4:38:17 PM, lkxambp wrote:
I do, but I'm just asking because I'm interested in what people think. I'm not really sure how I'd answer these questions.

It all depends on how you define "person".

How about this: if someone was born with no higher brain functions, would killing them be murder?
xXCryptoXx
Posts: 5,000
10/30/2014 7:33:49 AM
At 10/30/2014 2:57:13 AM, lkxambp wrote:
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

Well, if it is a sentient, conscious, rational person, then of course it would be immoral to do many of the things you listed. However, just because it acts like a person doesn't mean it is a person. If it only acts exactly like a human being, but actually lacks sentience, consciousness, rationality, and therefore personhood, then none of the things you listed would be immoral.
Envisage
Posts: 3,646
10/30/2014 7:55:09 AM
At 10/30/2014 7:33:49 AM, xXCryptoXx wrote:
At 10/30/2014 2:57:13 AM, lkxambp wrote:
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

Well, if it is a sentient, conscious, rational person, then of course it would be immoral to do many of the things you listed. However, just because it acts like a person doesn't mean it is a person. If it only acts exactly like a human being, but actually lacks sentience, consciousness, rationality, and therefore personhood, then none of the things you listed would be immoral.

Note that I could say the exact same thing about you, since there is no way I can know you are a conscious person. So this line of reasoning is irrelevant to determining what we ought to do. If an AI has values, and is capable of thought, then I hardly see a case that can be made against it except "but it's still not human!".
zmikecuber
Posts: 4,082
10/30/2014 8:56:45 AM
At 10/30/2014 7:55:09 AM, Envisage wrote:
At 10/30/2014 7:33:49 AM, xXCryptoXx wrote:
At 10/30/2014 2:57:13 AM, lkxambp wrote:
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

Well, if it is a sentient, conscious, rational person, then of course it would be immoral to do many of the things you listed. However, just because it acts like a person doesn't mean it is a person. If it only acts exactly like a human being, but actually lacks sentience, consciousness, rationality, and therefore personhood, then none of the things you listed would be immoral.

Note that I could say the exact same thing about you, since there is no way I can know you are a conscious person. So this line of reasoning is irrelevant to determining what we ought to do. If an AI has values, and is capable of thought, then I hardly see a case that can be made against it except "but it's still not human!".

ur both pee zombes
suttichart.denpruektham
Posts: 1,115
10/30/2014 11:17:48 AM
At 10/29/2014 4:39:55 PM, lkxambp wrote:
At 10/29/2014 3:56:57 PM, suttichart.denpruektham wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

1) Destroyed it.

Murder

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

Assuming that you use exactly the same algorithm that makes up its existence, not murder. Our consciousness is sustained by the continuous writing and rewriting of our existing memory; my theory is that we practically die and are reborn every nanosecond that our brain's electrical signals fire. Our consciousness lies in our software; as long as you don't actually delete that software, it is more like a body replacement than an actual killing.

3) Switched it off.
Not murder; you practically gave them a tranquillizer.

4) Switched it off then turned it back on again.
You gave them a tranquillizer, then woke them up.

What is your opinion?

For a slightly different one, let's pretend that cloning worked the way it does in sci-fi, so I could make a clone of a person with all their memories, which comes into being at the same age as them. If I killed someone and then replaced them with one of these clones, would that be murder?

Given that you can replace the biological program that makes up my consciousness as well, no.

This is just my theory, but I believe that when we sleep, most of our consciousness is lost when the biological electricity in the brain is put to rest. The me that you're talking to is a new me, created within my brain this morning, or possibly even this nanosecond, as a new wave of electrical impulses runs through my biological circuitry once again.

Provided that you can regenerate my consciousness in this manner, destroying my clone or even my body isn't a killing at all. I am completely revived, no one dies, so no murder.
SNP1
Posts: 2,403
10/30/2014 1:20:35 PM
At 10/30/2014 2:58:11 AM, lkxambp wrote:
At 10/29/2014 5:15:40 PM, SNP1 wrote:
At 10/29/2014 4:38:17 PM, lkxambp wrote:
I do, but I'm just asking because I'm interested in what people think. I'm not really sure how I'd answer these questions.

It all depends on how you define "person".

How about this: if someone was born with no higher brain functions, would killing them be murder?

No, unless they had the equivalent (like that AI example you gave earlier).
xXCryptoXx
Posts: 5,000
10/30/2014 4:02:01 PM
At 10/30/2014 7:55:09 AM, Envisage wrote:
At 10/30/2014 7:33:49 AM, xXCryptoXx wrote:
At 10/30/2014 2:57:13 AM, lkxambp wrote:
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

Well, if it is a sentient, conscious, rational person, then of course it would be immoral to do many of the things you listed. However, just because it acts like a person doesn't mean it is a person. If it only acts exactly like a human being, but actually lacks sentience, consciousness, rationality, and therefore personhood, then none of the things you listed would be immoral.

Note that I could say the exact same thing about you, since there is no way I can know you are a conscious person.

I was really speaking from an objective standpoint.

So this line of reasoning is irrelevant to determining what we ought to do.

We could take a look at the AI's structure and probably determine whether it possesses consciousness, sentience, etc.

If an AI has values, and is capable of thought, then I hardly see a case that can be made against it except "but it's still not human!".

Well, in order to have values and be capable of thought, it must be conscious/rational. The question is whether the AI is conscious/rational (which would be impossible, IMO) or whether it simply acts as if it is conscious/rational.
zmikecuber
Posts: 4,082
10/31/2014 12:46:42 PM
At 10/30/2014 4:02:01 PM, xXCryptoXx wrote:
At 10/30/2014 7:55:09 AM, Envisage wrote:
At 10/30/2014 7:33:49 AM, xXCryptoXx wrote:
At 10/30/2014 2:57:13 AM, lkxambp wrote:
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

Well, if it is a sentient, conscious, rational person, then of course it would be immoral to do many of the things you listed. However, just because it acts like a person doesn't mean it is a person. If it only acts exactly like a human being, but actually lacks sentience, consciousness, rationality, and therefore personhood, then none of the things you listed would be immoral.

Note that I could say the exact same thing about you, since there is no way I can know you are a conscious person.

I was really speaking from an objective standpoint.

So this line of reasoning is irrelevant to determining what we ought to do.

We could take a look at the AI's structure and probably determine whether it possesses consciousness, sentience, etc.

If an AI has values, and is capable of thought, then I hardly see a case that can be made against it except "but it's still not human!".

Well, in order to have values and be capable of thought, it must be conscious/rational. The question is whether the AI is conscious/rational (which would be impossible, IMO) or whether it simply acts as if it is conscious/rational.

ur a pee zombie
xXCryptoXx
Posts: 5,000
10/31/2014 4:08:25 PM
At 10/31/2014 12:46:42 PM, zmikecuber wrote:
At 10/30/2014 4:02:01 PM, xXCryptoXx wrote:
At 10/30/2014 7:55:09 AM, Envisage wrote:
At 10/30/2014 7:33:49 AM, xXCryptoXx wrote:
At 10/30/2014 2:57:13 AM, lkxambp wrote:
At 10/29/2014 6:04:44 PM, xXCryptoXx wrote:
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinions on these questions are. Let's say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed murder if I:

Well this raises an important question. Is it a person, or does it act like a person? Is it conscious, or does it act conscious? Is it sentient, or does it act sentient?

What would your opinion of those questions be?

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

Well, if it is a sentient, conscious, rational person, then of course it would be immoral to do many of the things you listed. However, just because it acts like a person doesn't mean it is a person. If it only acts exactly like a human being, but actually lacks sentience, consciousness, rationality, and therefore personhood, then none of the things you listed would be immoral.

Note that I could say the exact same thing about you, since there is no way I can know you are a conscious person.

I was really speaking from an objective standpoint.

So this line of reasoning is irrelevant to determining what we ought to do.

We could take a look at the AI's structure and probably determine whether it possesses consciousness, sentience, etc.

If an AI has values, and is capable of thought, then I hardly see a case that can be made against it except "but it's still not human!".

Well, in order to have values and be capable of thought, it must be conscious/rational. The question is whether the AI is conscious/rational (which would be impossible, IMO) or whether it simply acts as if it is conscious/rational.

ur a pee zombie

Are you referencing something?
zmikecuber
Posts: 4,082
10/31/2014 4:30:14 PM
At 10/31/2014 4:08:25 PM, xXCryptoXx wrote:
Are you referencing something?

yeh u. ur a pee zombie coz shawn sed u wree. a pee zombee is a person who doesnt have consciueness. but u cant kno how to kno if they have it or not cuz tehy just look the same
"Delete your fvcking sig" -1hard

"primal man had the habit, when he came into contact with fire, of satisfying the infantile desire connected with it, by putting it out with a stream of his urine... Putting out the fire by micturating was therefore a kind of sexual act with a male, an enjoyment of sexual potency in a homosexual competition."
xXCryptoXx
Posts: 5,000
10/31/2014 4:56:16 PM
Posted: 2 years ago
At 10/31/2014 4:30:14 PM, zmikecuber wrote:
yeh u. ur a pee zombie coz shawn sed u wree. a pee zombee is a person who doesnt have consciueness. but u cant kno how to kno if they have it or not cuz tehy just look the same

For yours and everyone else's sake, please stop lol
dylancatlow
Posts: 12,245
10/31/2014 5:17:27 PM
Posted: 2 years ago
At 10/29/2014 8:24:51 AM, lkxambp wrote:
I'm just wondering what your opinion of these questions are. Lets say that I built an AI so perfect that it was completely impossible to distinguish it from a real person. Would I then have committed a murder if I:

1) Destroyed it.

2) Destroyed it then made a replacement which was identical in every way including having all its memories.

3) Switched it off.

4) Switched it off then turned it back on again.

What is your opinion?

It depends. Just because an AI can perfectly imitate a human doesn't mean it's conscious in the sense we are. There's no reason to think the experience of being such an AI would be the same as our experience just because the AI is externally indistinguishable from a human with respect to behavior.
zmikecuber
Posts: 4,082
10/31/2014 6:11:18 PM
Posted: 2 years ago
At 10/31/2014 4:56:16 PM, xXCryptoXx wrote:
For yours and everyone else's sake, please stop lol

u mad bro? i smoked sum weed at college theat sheit is fire
"Delete your fvcking sig" -1hard

"primal man had the habit, when he came into contact with fire, of satisfying the infantile desire connected with it, by putting it out with a stream of his urine... Putting out the fire by micturating was therefore a kind of sexual act with a male, an enjoyment of sexual potency in a homosexual competition."
zmikecuber
Posts: 4,082
10/31/2014 6:12:22 PM
Posted: 2 years ago
At 10/31/2014 4:56:16 PM, xXCryptoXx wrote:
For yours and everyone else's sake, please stop lol

A p-zombie is short for "philosophical zombie." It's a thought experiment made popular by David Chalmers. If p-zombies are metaphysically possible, that shows that consciousness cannot be purely physical.
"Delete your fvcking sig" -1hard

"primal man had the habit, when he came into contact with fire, of satisfying the infantile desire connected with it, by putting it out with a stream of his urine... Putting out the fire by micturating was therefore a kind of sexual act with a male, an enjoyment of sexual potency in a homosexual competition."