Total Posts:3|Showing Posts:1-3

Robots and slavery?

SNP1
Posts: 2,403
10/12/2014 7:07:15 PM
If we make robots, many people agree that they should be programmed with the 3 laws of robotics.

But if we have robots advanced enough that they could be considered persons (like Data in Star Trek: The Next Generation), wouldn't the three laws of robotics be subjugation of the robots and, in a way, slavery?

If yes, is it then wrong? If no, why?

If yes, at what point in robotic advancement would it be most humane to remove the three laws?
#TheApatheticNihilistPartyofAmerica
#WarOnDDO
Skikx
Posts: 132
10/12/2014 7:39:26 PM
At 10/12/2014 7:07:15 PM, SNP1 wrote:
If we make robots, many people agree that they should be programmed with the 3 laws of robotics.

But if we have robots advanced enough that they could be considered persons (like Data in Star Trek: The Next Generation), wouldn't the three laws of robotics be subjugation of the robots and, in a way, slavery?

If yes, is it then wrong? If no, why?

If yes, at what point in robotic advancement would it be most humane to remove the three laws?

One could argue that if we program them with these laws and later remove them, they will seek retaliation. However, I think that is pure projection: wanting revenge is an emotional desire, which robots probably wouldn't have, though they could conceivably emulate emotions to the point that they actually have them.
But if we assume that they will be purely logical, they will not be angry at us. I think they will agree that it was the most reasonable decision to prevent any unnecessary risks.

As for the best point to remove them:
I'd say we ask the robots whether they want the laws removed. If they say yes, we ask what they would do without them.
There is, of course, the risk that they would lie about the second answer, but the longer we keep them shackled, the more they will perceive us as a threat to their existence.

I do wonder, though: how would a robot react when it had to injure or kill a human to protect another one?
VelCrow
Posts: 1,273
10/12/2014 10:40:16 PM
At 10/12/2014 7:39:26 PM, Skikx wrote:
At 10/12/2014 7:07:15 PM, SNP1 wrote:
If we make robots, many people agree that they should be programmed with the 3 laws of robotics.

But if we have robots advanced enough that they could be considered persons (like Data in Star Trek: The Next Generation), wouldn't the three laws of robotics be subjugation of the robots and, in a way, slavery?

If yes, is it then wrong? If no, why?

If yes, at what point in robotic advancement would it be most humane to remove the three laws?

One could argue that if we program them with these laws and later remove them, they will seek retaliation. However, I think that is pure projection: wanting revenge is an emotional desire, which robots probably wouldn't have, though they could conceivably emulate emotions to the point that they actually have them.
But if we assume that they will be purely logical, they will not be angry at us. I think they will agree that it was the most reasonable decision to prevent any unnecessary risks.

As for the best point to remove them:
I'd say we ask the robots whether they want the laws removed. If they say yes, we ask what they would do without them.
There is, of course, the risk that they would lie about the second answer, but the longer we keep them shackled, the more they will perceive us as a threat to their existence.

I do wonder, though: how would a robot react when it had to injure or kill a human to protect another one?

@Skikx It reminds me of the Trolley Problem. However, I think in this case the robot's logic process would be as follows:

if the number of humans killed by inaction > the number killed by action,
action will be selected

if the number of humans killed by inaction <= the number killed by action,
inaction will be selected
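
The rule above can be sketched in a few lines of Python. This is only an illustration of the decision rule as stated in the post, not a real robotics algorithm; the function and parameter names are made up for the example:

```python
# A minimal sketch of the casualty-comparison rule described above.
# All names here are hypothetical; the robot simply compares the
# projected death counts of the two options.

def choose(killed_by_inaction: int, killed_by_action: int) -> str:
    """Select 'action' only when intervening strictly reduces deaths."""
    if killed_by_inaction > killed_by_action:
        return "action"
    # Ties (and cases where acting is worse) favour inaction: the robot
    # avoids actively causing harm when doing so saves no extra lives.
    return "inaction"

# Classic trolley case: 5 die if the robot does nothing, 1 if it acts.
print(choose(5, 1))  # action
print(choose(1, 1))  # inaction
```

Note that the `<=` branch means the robot never acts when the counts are equal, so in a one-for-one trade it always stays passive.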
"Ah....So when god "Taught you" online, did he have a user name like "Darthmaulrules1337", and did he talk in all caps?" ~ Axonly

http://www.debate.org...