Total Posts: 53 | Showing Posts: 1-30

Moral Experiment

ethang5
Posts: 4,084
2/25/2015 11:08:39 AM
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all the animals to another world and then destroying their original world. ________
c) Destroying a world when you don't know whether it has life on it. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______

Feel free to include your reasons for your answers.
Envisage
Posts: 3,646
2/25/2015 11:39:42 AM
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

5

--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

6. Neither moral nor immoral
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all the animals to another world and then destroying their original world. ________
c) Destroying a world when you don't know whether it has life on it. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______

Feel free to include your reasons for your answers.

5. Neither moral nor immoral

Reasons: I am a moral non-cognitivist.
Burzmali
Posts: 1,310
2/25/2015 11:57:26 AM
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

Moral though undesirable. Preface: I don't view morality as a binary matter. Some choices and actions are more moral than others. The situation above doesn't allow for the most moral solution, so a lesser one must be chosen.

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral
---------------------------------------------------------------------------------------------------------------------------

Absent any further information, always immoral. That doesn't necessarily mean I wouldn't substitute, but if I did, it would be an immoral choice.

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all the animals to another world and then destroying their original world. ________
c) Destroying a world when you don't know whether it has life on it. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______

Feel free to include your reasons for your answers.

All of these assume no additional info is available.

a) Immoral
b) Immoral
c) Immoral
d) Probably Morally Neutral
e) Immoral
f) Immoral
ThinkFirst
Posts: 1,391
2/25/2015 3:56:58 PM
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral
--------------------------------------------------------------------------------------------------------------------------

1 - Moral, though undesirable. I must note here that the death would be as swift and painless as I could make it, given the available tools, drugs, etc.

This further presupposes that the knowledge that the person being killed had Ebola was 100% certain, unmistakable, and unequivocal.

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral
---------------------------------------------------------------------------------------------------------------------------

1 - Always Immoral
According to my moral code, even taking the chance that the "green" might have been good would make the action immoral. Unless I know, unequivocally, without doubt, that the "green" is "evil," substituting them is murder. I am assuming that your conditions (extreme as they are) are absolute. Taking the chance that the "green" is evil is tantamount to drinking heavily, then driving a big truck, knowing there is a 50/50 risk that I would run over an innocent on the way home. Taking the risk is intentionally risking harm to an innocent.

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
1 - Having the power does not justify using it.
b) Taking all the animals to another world and then destroying their original world. ________
1 - Having the power does not justify using it.
c) Destroying a world when you don't know whether it has life on it. ________
1 - Having the power does not justify using it.
d) Destroying a world with no living things on it. _______
1 - Having the power does not justify using it.
e) Killing all living things on a world so you can keep the world for yourself. _______
1 - Having the power does not justify using it.
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______
1 - Having the power does not justify using it.

Feel free to include your reasons for your answers.
"Never attribute to villainy that which can be adequately explained by stupidity"
-----
"Men rarely if ever dream up a god superior to themselves. Most gods have the manners and morals of a spoiled child."

-- Robert A Heinlein
ethang5
Posts: 4,084
2/26/2015 6:56:42 AM
At 2/25/2015 11:39:42 AM, Envisage wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

5

--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

6. Neither moral nor immoral

That would be #5 then.
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all the animals to another world and then destroying their original world. ________
c) Destroying a world when you don't know whether it has life on it. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______

Feel free to include your reasons for your answers.

5. Neither moral nor immoral

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?
Bennett91
Posts: 4,194
2/26/2015 7:09:26 AM
At 2/25/2015 11:08:39 AM, ethang5 wrote:


What's the point of all these questions? They seem too narrow and unrealistic to support any meaningful conclusion.
ethang5
Posts: 4,084
2/26/2015 7:19:45 AM
At 2/25/2015 11:57:26 AM, Burzmali wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

Moral though undesirable. Preface: I don't view morality as a binary matter. Some choices and actions are more moral than others. The situation above doesn't allow for the most moral solution, so a lesser one must be chosen.

Seems reasonable. But you are aware that some would call your decision here immoral?
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

Absent any further information, always immoral. That doesn't necessarily mean I wouldn't substitute, but if I did, it would be an immoral choice.

I'm interested in when this happens, not just in others but in myself also: when we knowingly go against our moral code. You're saying here that though it is against your moral code, you might do it. If you did do it, why would that be? To avoid the pain? Or because the person substituted wouldn't die in real life? Or because the consequences to you are relatively small?

For example, if it wasn't a game, and the person substituted would really die, would it be harder to do it or just as easy?
----------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all the animals to another world and then destroying their original world. ________
c) Destroying a world when you don't know whether it has life on it. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______

Feel free to include your reasons for your answers.

All of these assume no additional info is available.

a) Immoral
b) Immoral
c) Immoral
d) Probably Morally Neutral
e) Immoral
f) Immoral

Is f) immoral because of the "all" or would killing a single animal also be immoral? Or is it the trivial reason (not liking them) that makes it immoral?
ethang5
Posts: 4,084
2/26/2015 7:28:53 AM
At 2/25/2015 3:56:58 PM, ThinkFirst wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

1 - Moral, though undesirable. I must note here that the death would be as swift and painless as I could make it, given the available tools, drugs, etc.

Would you resist being killed if you were the person with Ebola? Or would you allow yourself to be killed for the sake of the others?

This further presupposes that the knowledge that the person being killed had Ebola was 100% certain, unmistakable, and unequivocal.

Yes, right.
-------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

1 - Always Immoral
According to my moral code, even taking the chance that the "green" might have been good would make the action immoral. Unless I know, unequivocally, without doubt, that the "green" is "evil," substituting them is murder.

Why is it not considered murder if he is known to be evil?

I am assuming that your conditions (extreme as they are) are absolute. Taking the chance that the "green" is evil is tantamount to drinking heavily, then driving a big truck, knowing there is a 50/50 risk that I would run over an innocent on the way

What does "innocent" mean here?

---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
1 - Having the power does not justify using it.
b) Taking all the animals to another world and then destroying their original world. ________
1 - Having the power does not justify using it.
c) Destroying a world when you don't know whether it has life on it. ________
1 - Having the power does not justify using it.
d) Destroying a world with no living things on it. _______

But why is destroying a lifeless world immoral? In what way is it immoral?

1 - Having the power does not justify using it.
e) Killing all living things on a world so you can keep the world for yourself. _______
1 - Having the power does not justify using it.
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______
1 - Having the power does not justify using it.

Do you consider it immoral now when we kill, say, all the ants nesting in our basement?

Feel free to include your reasons for your answers.
Graincruncher
Posts: 2,799
2/26/2015 7:30:42 AM
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

Somewhere between (1) and (2); perhaps "immoral though desirable".

--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white, who will always attempt to kill you painfully; "good" people dressed in black, who will always attempt to save you from the evil people; and "unknown" people, who are either "good" or "evil" but all wear green, so you cannot tell which side they are on until they act.

When trapped by evil people, you are able to substitute someone else in your place, and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, whom you do not know to be good or evil. Your choices are to suffer the pain yourself without substituting the person in green, or to substitute them anyway, not knowing whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be:

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

(1)

---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please assign a number:

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all the animals to another world and then destroying their original world. ________
c) Destroying a world when you don't know whether it has life on it. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them (like roaches, snakes, or houseflies). _______

Feel free to include your reasons for your answers.

Impossible to answer without context. If "for no reason" then immoral. Those where a reason is given (e & f) are both immoral.
Envisage
Posts: 3,646
2/26/2015 7:46:31 AM
At 2/26/2015 6:56:42 AM, ethang5 wrote:
At 2/25/2015 11:39:42 AM, Envisage wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

5

--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided between "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people whom are either "good" or "evil" but all wear green so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

6. Neither moral nor immoral

That would be #5 then.

No. Morally neutral is not equivalent to amorality, since moral neutrality presupposes moral cognitivism, while amorality does not. To give an analogy:

If we compare the weights of two people, you can say that one is heavier than the other, or vice versa, or that both are the same weight (neutral), whereas a non-cognitivist would dispute that the very act of measuring a "weight" is coherent in the first place. Thus a non-cognitivist wouldn't say they have equal value; they would dispute the very idea of value in the first place.

.. Meh bad analogy... I tried.
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

5. Neither moral nor immoral

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.
bornofgod
Posts: 11,322
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 7:51:49 AM
Posted: 1 year ago
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = A Matrix-like world where people feel pain, but dying there only wakes you from the matrix, which you can then re-enter.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

This is why some people believe God is immoral;

Genesis 6
12: And God saw the earth, and behold, it was corrupt; for all flesh had corrupted their way upon the earth.
13: And God said to Noah, "I have determined to make an end of all flesh; for the earth is filled with violence through them; behold, I will destroy them with the earth.

17: For behold, I will bring a flood of waters upon the earth, to destroy all flesh in which is the breath of life from under heaven; everything that is on the earth shall die.
YassineB
Posts: 1,003
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 7:52:38 AM
Posted: 1 year ago
At 2/25/2015 11:08:39 AM, ethang5 wrote:

- Interesting! My Moral compass is gonna be based on an Islamic Paradigm, for I am a Muslim.

Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

- N° 2 = Immoral. Reasons:
> The Life of the Ebola Person is not forfeit, even if it may cause Death.
> In this situation, & according to Shari'a, a Life can only be forfeited if it caused Death with both:
1. Aggression ('Udwan).
2. Wilfulness ('Amd).

=> The Ebola Person neither willingly nor aggressively caused Death, & thus killing him is Immoral.

--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = A Matrix-like world where people feel pain, but dying there only wakes you from the matrix, which you can then re-enter.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

- N° 1 = Immoral. Reasons:
> Saving one's Life at the expense of another, regardless of whether the other is Good or Evil, is Immoral, & thus saving one's own skin in that manner is even more Immoral.

---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

- All are Immoral, big time.
Current Debates In Voting Period:

- The Qur'an We Have Today is Not What Muhammad Dictated Verbatim. Vs. @Envisage:
http://www.debate.org...

- Drawing Contest. Vs. @purpleduck:
http://www.debate.org...

"It is perfectly permissible to vote on sources without reading them" bluesteel.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 9:47:49 AM
Posted: 1 year ago
At 2/26/2015 7:30:42 AM, Graincruncher wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

Somewhere between (1) and (2); perhaps "immoral though desirable".

Interesting take! I guess you mean that killing the Ebola victim to save the others is what is desirable (I hope), and not that killing him is desirable in itself.
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = A Matrix-like world where people feel pain, but dying there only wakes you from the matrix, which you can then re-enter.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

(1)

Would it be moral if the person was evil and you knew it?

---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

Impossible to answer without context. If "for no reason" then immoral. Those where a reason is given (e & f) are both immoral.

Why is it immoral to destroy a planet with animals on it? I know it may seem obvious to you, but I want to know the answer your personal morality gives.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 9:54:01 AM
Posted: 1 year ago
At 2/26/2015 7:46:31 AM, Envisage wrote:
At 2/26/2015 6:56:42 AM, ethang5 wrote:

Experiment #2

The world = A Matrix-like world where people feel pain, but dying there only wakes you from the matrix, which you can then re-enter.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

6. Neither moral nor immoral

That would be #5 then.

No. Morally neutral is not equivalent to amorality, since moral neutrality presupposes moral cognitivism, while amorality does not. To give an analogy:

If we compare the weights of two people, you can say that one is heavier than the other, or vice versa, or that both are the same weight (neutral), whereas a non-cognitivist would dispute that the very act of measuring a "weight" is coherent in the first place. Thus a non-cognitivist wouldn't say they have equal value; they would dispute the very idea of value in the first place.

.. Meh bad analogy... I tried.

I think I got you. Ok, I know you sometimes take positions you don't personally hold just for arguments sake. Are you actually a moral non-cognitivist?

If you don't want to say that's fine too.
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

5. Neither moral nor immoral

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.

If you do actually hold such a position, what governs your code of behavior? What makes you do this or the other? Utilitarianism?
Envisage
Posts: 3,646
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:00:50 AM
Posted: 1 year ago
At 2/26/2015 9:54:01 AM, ethang5 wrote:
At 2/26/2015 7:46:31 AM, Envisage wrote:
At 2/26/2015 6:56:42 AM, ethang5 wrote:

Experiment #2

The world = A Matrix-like world where people feel pain, but dying there only wakes you from the matrix, which you can then re-enter.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

6. Neither moral nor immoral

That would be #5 then.

No. Morally neutral is not equivalent to amorality, since moral neutrality presupposes moral cognitivism, while amorality does not. To give an analogy:

If we compare the weights of two people, you can say that one is heavier than the other, or vice versa, or that both are the same weight (neutral), whereas a non-cognitivist would dispute that the very act of measuring a "weight" is coherent in the first place. Thus a non-cognitivist wouldn't say they have equal value; they would dispute the very idea of value in the first place.

.. Meh bad analogy... I tried.

I think I got you. Ok, I know you sometimes take positions you don't personally hold just for arguments sake. Are you actually a moral non-cognitivist?

If you don't want to say that's fine too.
--------------------------------------------------------------------------------------------------------------------------

I am one, yes. It's right there in my profile. By extension, I am a moral nihilist.


Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

5. Neither moral nor immoral

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.

If you do actually hold such a position, what governs your code of behavior? What makes you do this or the other? Utilitarianism?

In terms of pragmatics, a form of collectivist egoism/contractarianism, since both of those directly relate to the self-interest of the subject.

Collectivist egoism is a form of "enlightened self-interest", where you consider beyond the immediate scope of a situation to find the actions which are ultimately within your self-interest.

Elements of contractarianism recognise that society and social contracts provide a pragmatic route to maximising the majority of our self-interests (ease of access to employment, resources, entertainment, community, etc.), so this generally overrides any myopic actions taken to maximise self-interest (e.g. robbing a shop, where the risk of punishment outweighs the short-term financial gain).

As for *why* I care about that, well because it's within my self-interest to care about my self-interest, it's a tautology.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:05:19 AM
Posted: 1 year ago
At 2/26/2015 7:52:38 AM, YassineB wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:

- Interesting! My Moral compass is gonna be based on an Islamic Paradigm, for I am a Muslim.

You are most welcome.

Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing:

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

- N° 2 = Immoral. Reasons:
> The Life of the Ebola Person is not forfeit, even if it may cause Death.
> In this situation, & according to Shari'a, a Life can only be forfeited if it caused Death with both:
1. Aggression ('Udwan).
2. Wilfulness ('Amd).

=> The Ebola Person neither willingly nor aggressively caused Death, & thus killing him is Immoral.

What about the 98 other people who will certainly die? How is upholding your moral code better when the result will be 100 dead and 0 alive if you stick with your code, but 1 dead and 99 alive if you kill the one with Ebola?
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = A Matrix-like world where people feel pain, but dying there only wakes you from the matrix, which you can then re-enter.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green whom you don't know to be evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway even though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

- N° 1 = Immoral. Reasons:
> Saving one's Life at the expense of another, regardless of whether the other is Good or Evil, is Immoral, & thus saving one's own skin in that manner is even more Immoral.

Question then. Is it immoral for a police officer to shoot and kill a person assaulting another if he fears the assaulter will kill the victim if allowed?

---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

- All are Immoral, big time.

Why is it immoral to destroy a world if there are no living things on it?
Graincruncher
Posts: 2,799
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:05:45 AM
Posted: 1 year ago
At 2/26/2015 9:47:49 AM, ethang5 wrote:
Interesting take! I guess you mean that killing the Ebola victim to save the others is what is desirable. (I hope) and not that killing him is desirable.

Exactly. It is desirable relative to the alternative, but that does not necessarily make it moral. A big issue I have with a lot of moral problems is that they pose situations where something is either "ok" or "not ok", when often the case is that it's simply choosing the least "not ok". In this instance, killing one person who is almost certainly going to die (horribly, for that matter) to prevent a lot of other people dying (horribly) is the lesser evil.

Would it be moral if the person was evil and you knew it?

No, but I think this is the most complex of the problems, so I've got a lot of caveats. It would be another discussion in itself.

Why is it immoral to destroy a planet with animals on it? I know it may seem obvious to you, but I want to know the answer your personal morality gives.

This is one of those questions that does seem obvious, but isn't necessarily so. I would argue that it is purposeless cruelty, that destroying something is a one-way act (for practical purposes) that needs substantial justification, and that living things have degrees of value proportional to their level of sentience. To give it a perhaps useful context, I feel vaguely uneasy about the fact that I eat meat.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:17:09 AM
Posted: 1 year ago
At 2/26/2015 10:00:50 AM, Envisage wrote:
At 2/26/2015 9:54:01 AM, ethang5 wrote:

Ok, I know you sometimes take positions you don't personally hold just for arguments sake. Are you actually a moral non-cognitivist?

If you don't want to say that's fine too.

I am one, yes. It's right there in my profile. By extension, I am a moral nihilist.

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.

If you do actually hold such a position, what governs your code of behavior? What makes you do this or the other? Utilitarianism?

In terms of pragmatics, a form of collectivist egoism/contractarianism, since both of those directly relate to the self-interest of the subject.

If you found yourself in a vastly different culture, would that affect your definition of "collectivist"?

Collectivist egoism is a form of "enlightened self-interest", where you consider beyond the immediate scope of a situation to find the actions which are ultimately within your self-interest.

Elements of contractarianism recognise that society and social contracts provide a pragmatic route to maximising the majority of our self-interests (ease of access to employment, resources, entertainment, community, etc.), so this generally overrides any myopic actions taken to maximise self-interest (e.g. robbing a shop, where the risk of punishment outweighs the short-term financial gain).

Do you ever act in your short-term best interest anyway? How can you ever be sure that you are maximizing majority interests? Human nature, it seems to me, would make convincing oneself that personal interest was global interest very likely.

As for *why* I care about that, well because it's within my self-interest to care about my self-interest, it's a tautology.

Not always. But more interesting are the reasons why you think those things are in your self-interest.
Envisage
Posts: 3,646
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:26:15 AM
Posted: 1 year ago
At 2/26/2015 10:17:09 AM, ethang5 wrote:
At 2/26/2015 10:00:50 AM, Envisage wrote:
At 2/26/2015 9:54:01 AM, ethang5 wrote:

Ok, I know you sometimes take positions you don't personally hold just for arguments sake. Are you actually a moral non-cognitivist?

If you don't want to say that's fine too.

I am one, yes. It's right there in my profile. By extension, I am a moral nihilist.

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.

If you do actually hold such a position, what governs your code of behavior? What makes you do this or the other? Utilitarianism?

In terms of pragmatics, a form of collectivist egoism/contractarianism, since both of those directly relate to the self-interest of the subject.

If you found yourself in a vastly different culture, would that affect your definition of "collectivist"?

Obviously, yes. Different cultures will yield different effects for the same inputs. Some behaviours which are seen as acceptable in some places are unacceptable in others. And if I care about not upsetting people, then that will shape my behaviour.

Collectivist egoism is a form of "enlightened self-interest", where you consider beyond the immediate scope of a situation to find actions which are ultimately within your self-interest.

Elements of contractarianism recognise that society and social contracts provide a pragmatic route to maximising the majority of our self-interests (ease of access to employment, resources, entertainment, community, etc.), thus this generally overrides any myopic actions taken to maximise self-interest (e.g. robbing a shop: the risk of punishment deters from the short-term financial gain).

Do you ever act in your short-term best interest anyway? How can you ever be sure that you are maximizing majority interests? Human nature, it seems to me, would make convincing oneself that personal interest was global interest very likely.

1. Obviously, yes. We aren't all Isaac Newtons. We have impulses and I am quite open that I often act on them without much thought.
2. One can never be certain of anything, but one can make a best attempt via logic, reason and analysis of the environment. A thought-through act is more likely to be within that realm than one with no thought applied at all. Generally, the more parametrised and learned a position is, the closer it is to the answer that is sought.
3. Agreed, that's called ethnocentrism, which I regard as problematic.

As for *why* I care about that: well, because it's within my self-interest to care about my self-interest; it's a tautology.

Not always. But more interesting are the reasons why you think those things are in your self-interest.

Yes, always, because if something else was within my self-interest other than my self-interests, then it would be included in the set. It's tautological; a set contains itself. If it's within my self-interest to act altruistically (outside my immediate self-interest, for others), for example, then that's still egoism. You are just redefining the contents of the set.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:47:56 AM
Posted: 1 year ago
At 2/26/2015 10:26:15 AM, Envisage wrote:
At 2/26/2015 10:17:09 AM, ethang5 wrote:
At 2/26/2015 10:00:50 AM, Envisage wrote:
At 2/26/2015 9:54:01 AM, ethang5 wrote:

Ok, I know you sometimes take positions you don't personally hold just for arguments sake. Are you actually a moral non-cognitivist?

If you don't want to say that's fine too.

I am one, yes. It's right there in my profile. By extension, I am a moral nihilist.

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.

If you do actually hold such a position, what governs your code of behavior? What makes you do this or the other? Utilitarianism?

In terms of pragmatics, then a form of collectivist egoism/contractarianism. Since both of those directly relate to the self-interest of the subject.

If you found yourself in a vastly different culture, would that affect your definition of "collectivist"?

Obviously, yes. Different cultures will yield different effects for the same inputs. Some behaviours which are seen as acceptable in some places are unacceptable in others. And if I care about not upsetting people, then that will shape my behaviour.

Collectivist egoism is a form of "enlightened self-interest", where you consider beyond the immediate scope of a situation to find actions which are ultimately within your self-interest.

Elements of contractarianism recognise that society and social contracts provide a pragmatic route to maximising the majority of our self-interests (ease of access to employment, resources, entertainment, community, etc.), thus this generally overrides any myopic actions taken to maximise self-interest (e.g. robbing a shop: the risk of punishment deters from the short-term financial gain).

Do you ever act in your short-term best interest anyway? How can you ever be sure that you are maximizing majority interests? Human nature, it seems to me, would make convincing oneself that personal interest was global interest very likely.

1. Obviously, yes. We aren't all Isaac Newtons. We have impulses and I am quite open that I often act on them without much thought.
2. One can never be certain of anything, but one can make a best attempt via logic, reason and analysis of the environment. A thought-through act is more likely to be within that realm than one with no thought applied at all. Generally, the more parametrised and learned a position is, the closer it is to the answer that is sought.
3. Agreed, that's called ethnocentrism, which I regard as problematic.

As for *why* I care about that: well, because it's within my self-interest to care about my self-interest; it's a tautology.

Not always. But more interesting are the reasons why you think those things are in your self-interest.

Yes, always, because if something else was within my self-interest other than my self-interests, then it would be included in the set. It's tautological; a set contains itself. If it's within my self-interest to act altruistically (outside my immediate self-interest, for others), for example, then that's still egoism. You are just redefining the contents of the set.

lol, no, I meant that sentences like that ("it's within my self-interest to care about my self-interest") are not always tautologies.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:52:13 AM
Posted: 1 year ago
At 2/26/2015 10:05:45 AM, Graincruncher wrote:
At 2/26/2015 9:47:49 AM, ethang5 wrote:
Interesting take! I guess you mean that killing the Ebola victim to save the others is what is desirable. (I hope) and not that killing him is desirable.

Exactly. It is desirable relative to the alternative, but that does not necessarily make it moral. A big issue I have with a lot of moral problems is they pose situations where something is either "ok" or "not ok", when often the case is that it's simply choosing the least "not ok". In this instance, killing one person who is almost certainly going to die (horribly, for that matter) to prevent a lot of other people dying (horribly) is the lesser evil.

Would it be moral if the person was evil and you knew it?

No, but I think this is the most complex of the problems so I've got a lot of caveats. It would be another discussion in itself.

Why another discussion? I doubt if there is a Gentle Reader on this thread who doesn't want to hear your take on the issue.

Why is it immoral to destroy a planet with animals on it? I know it may seem obvious to you, but I want to know the answer your personal morality gives.

This is one of those questions that does seem obvious, but isn't necessarily so. I would argue that it is purposeless cruelty, that destroying something is a one-way act (for practical purposes) that needs substantial justification, and that living things have degrees of value proportional to their level of sentience. To give it a perhaps useful context, I feel vaguely uneasy about the fact I eat meat.

Even when it's some slow cooked baby-back ribs????
YassineB
Posts: 1,003
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 10:55:21 AM
Posted: 1 year ago
At 2/26/2015 10:05:19 AM, ethang5 wrote:

- N° 2 = Immoral, Reasons:
> The Life of the Ebola Person is not forfeited, even if it may cause Death.
> In this situation, & according to Shari'a, a Life can only be forfeited if it caused Death with both:
1. Aggression ('Udwan).
2. Wilfulness ('Amd).

=> The Ebola Person caused Death neither willingly nor aggressively, & thus killing him is Immoral.

What about the 98 other people who will certainly die? How is upholding your moral code better when the result will be 100 dead, 0 alive if you stick with your code, and 1 dead, 99 alive if you kill the one with Ebola?

- Well, their time has come, & God predestined that they will all die there, but that's no excuse to take the Life of someone:
"whoever kills a soul - unless for a soul or for corruption [done] in the land - it is as if he had slain mankind entirely. And whoever saves one, it is as if he had saved mankind entirely." (5:32)
=> The value of an innocent Life is not measurable by quantity; by contrast, the value of a forfeited Life measures only itself. The Ebola Person is innocent, & thus his Life is no less than the Life of the other 99.

- In the Islamic Paradigm, the Self is a trust given by God temporarily to the Humans & they are ordained to preserve it, until they return to Him, & Preserving the Self includes preserving:
> The Religion, because Muslims believe that all humans are created equal in a pure initial state of Fitrah, which is a state of submission to the One God (i.e. = Islam).
> The Life, also a trust given by God to us & we should preserve it, unless it is forfeited by the said conditions.
> The Intellect, also must be preserved.
> The Lineage, idem.
> The Wealth, idem.

=> This concept is called al-Amana (the Trust).

--------------------------------------------------------------------------------------------------------------------------

- N° 1 = Immoral, Reasons:
> Saving one's Life at the expense of another, regardless of whether the other is Good or Evil, is Immoral, & thus saving one's own skin in that manner is even more Immoral.

Question then. Is it immoral for a police officer to shoot and kill a person assaulting another if he fears the assaulter will kill the victim if allowed?

- If killing the assaulter is the only way to stop him, then no, it's not Immoral, for the assaulter has potentially forfeited his Life by threatening another.
- Self-defence is part of the Preservation of Life, for to preserve life you must protect it.
> Abu Huraira reported:
"A person came to the Messenger of Allah and said: 'Messenger of Allah, what do you think if a man comes to me in order to appropriate my property?' The Prophet said: "Don't surrender your possession to him". He said: 'What if he fights me?' The Prophet remarked: "Then fight back". He again said: 'What if I was killed?' The Prophet said: "You would be a martyr". He said: 'What if I killed him'. The Prophet said: "he would be in the Fire"."
> The Prophet said: "Whoever is killed protecting his wealth, he is a martyr. Whoever is killed protecting his family, he is a martyr. Whoever is killed protecting his religion, he is a martyr. Whoever is killed protecting himself, he is a martyr."

---------------------------------------------------------------------------------------------------------------------------

Why is it immoral to destroy a world if there are no living things on it?

- That's the second type of Amana (the Trust), called: Isti'mar (Settling in the Earth), & this Trust is conditioned by Preserving the Earth & its Creatures. & we are told that in the Hereafter, everything we have ever harmed will testify against us (the rivers, the trees, the seas, the soil. . .).
Current Debates In Voting Period:

- The Qur'an We Have Today is Not What Muhammad Dictated Verbatim. Vs. @Envisage:
http://www.debate.org...

- Drawing Contest. Vs. @purpleduck:
http://www.debate.org...

"It is perfectly permissible to vote on sources without reading them" bluesteel.
ethang5
Posts: 4,084
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 11:01:22 AM
Posted: 1 year ago
At 2/26/2015 10:55:21 AM, YassineB wrote:
At 2/26/2015 10:05:19 AM, ethang5 wrote:

- N° 2 = Immoral, Reasons:
> The Life of the Ebola Person is not forfeited, even if it may cause Death.
> In this situation, & according to Shari'a, a Life can only be forfeited if it caused Death with both:
1. Aggression ('Udwan).
2. Wilfulness ('Amd).

=> The Ebola Person caused Death neither willingly nor aggressively, & thus killing him is Immoral.

What about the 98 other people who will certainly die? How is upholding your moral code better when the result will be 100 dead, 0 alive if you stick with your code, and 1 dead, 99 alive if you kill the one with Ebola?

- Well, their time has come, & God predestined that they will all die there, but that's no excuse to take the Life of someone:
"whoever kills a soul - unless for a soul or for corruption [done] in the land - it is as if he had slain mankind entirely. And whoever saves one, it is as if he had saved mankind entirely." (5:32)
=> The value of an innocent Life is not measurable by quantity; by contrast, the value of a forfeited Life measures only itself. The Ebola Person is innocent, & thus his Life is no less than the Life of the other 99.

- In the Islamic Paradigm, the Self is a trust given by God temporarily to the Humans & they are ordained to preserve it, until they return to Him, & Preserving the Self includes preserving:
> The Religion, because Muslims believe that all humans are created equal in a pure initial state of Fitrah, which is a state of submission to the One God (i.e. = Islam).
> The Life, also a trust given by God to us & we should preserve it, unless it is forfeited by the said conditions.
> The Intellect, also must be preserved.
> The Lineage, idem.
> The Wealth, idem.

=> This concept is called al-Amana (the Trust).

Ok, I understand. But couldn't killing the one with Ebola be called self-defense, since the infected person is going to kill all the others?

--------------------------------------------------------------------------------------------------------------------------

- N° 1 = Immoral, Reasons:
> Saving one's Life at the expense of another, regardless of whether the other is Good or Evil, is Immoral, & thus saving one's own skin in that manner is even more Immoral.

Question then. Is it immoral for a police officer to shoot and kill a person assaulting another if he fears the assaulter will kill the victim if allowed?

- If killing the assaulter is the only way to stop him, then no, it's not Immoral, for the assaulter has potentially forfeited his Life by threatening another.

- Self-defence is part of the Preservation of Life, for to preserve life you must protect it.
> Abu Huraira reported:
"A person came to the Messenger of Allah and said: 'Messenger of Allah, what do you think if a man comes to me in order to appropriate my property?' The Prophet said: "Don't surrender your possession to him". He said: 'What if he fights me?' The Prophet remarked: "Then fight back". He again said: 'What if I was killed?' The Prophet said: "You would be a martyr". He said: 'What if I killed him'. The Prophet said: "he would be in the Fire"."
> The Prophet said: "Whoever is killed protecting his wealth, he is a martyr. Whoever is killed protecting his family, he is a martyr. Whoever is killed protecting his religion, he is a martyr. Whoever is killed protecting himself, he is a martyr."

---------------------------------------------------------------------------------------------------------------------------

Why is it immoral to destroy a world if there are no living things on it?

- That's the second type of Amana (the Trust), called: Isti'mar (Settling in the Earth), & this Trust is conditioned by Preserving the Earth & its Creatures. & we are told that in the Hereafter, everything we have ever harmed will testify against us (the rivers, the trees, the seas, the soil. . .).

So according to Islam, environmental abuse is a sin?
Envisage
Posts: 3,646
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 11:31:06 AM
Posted: 1 year ago
At 2/26/2015 10:47:56 AM, ethang5 wrote:
At 2/26/2015 10:26:15 AM, Envisage wrote:
At 2/26/2015 10:17:09 AM, ethang5 wrote:
At 2/26/2015 10:00:50 AM, Envisage wrote:
At 2/26/2015 9:54:01 AM, ethang5 wrote:

Ok, I know you sometimes take positions you don't personally hold just for arguments sake. Are you actually a moral non-cognitivist?

If you don't want to say that's fine too.

I am one, yes. It's right there in my profile. By extension, I am a moral nihilist.

Reasons: I am a moral non-cognitivist.

Can you please tell us what the moral non-cognitivist position is?

The position holds that "morals", "goods", "evils", "rights" or "wrongs" don't refer to tangible concepts, and thus are inherently meaningless. Thus when you say "x is moral", you might as well be saying "x is a beboowawa" (some made-up word); the concept is regarded as fundamentally meaningless.

If you do actually hold such a position, what governs your code of behavior? What makes you do this or the other? Utilitarianism?

In terms of pragmatics, then a form of collectivist egoism/contractarianism. Since both of those directly relate to the self-interest of the subject.

If you found yourself in a vastly different culture, would that affect your definition of "collectivist"?

Obviously, yes. Different cultures will yield different effects for the same inputs. Some behaviours which are seen as acceptable in some places are unacceptable in others. And if I care about not upsetting people, then that will shape my behaviour.

Collectivist egoism is a form of "enlightened self-interest", where you consider beyond the immediate scope of a situation to find actions which are ultimately within your self-interest.

Elements of contractarianism recognise that society and social contracts provide a pragmatic route to maximising the majority of our self-interests (ease of access to employment, resources, entertainment, community, etc.), thus this generally overrides any myopic actions taken to maximise self-interest (e.g. robbing a shop: the risk of punishment deters from the short-term financial gain).

Do you ever act in your short-term best interest anyway? How can you ever be sure that you are maximizing majority interests? Human nature, it seems to me, would make convincing oneself that personal interest was global interest very likely.

1. Obviously, yes. We aren't all Isaac Newtons. We have impulses and I am quite open that I often act on them without much thought.
2. One can never be certain of anything, but one can make a best attempt via logic, reason and analysis of the environment. A thought-through act is more likely to be within that realm than one with no thought applied at all. Generally, the more parametrised and learned a position is, the closer it is to the answer that is sought.
3. Agreed, that's called ethnocentrism, which I regard as problematic.

As for *why* I care about that: well, because it's within my self-interest to care about my self-interest; it's a tautology.

Not always. But more interesting are the reasons why you think those things are in your self-interest.

Yes, always, because if something else was within my self-interest other than my self-interests, then it would be included in the set. It's tautological; a set contains itself. If it's within my self-interest to act altruistically (outside my immediate self-interest, for others), for example, then that's still egoism. You are just redefining the contents of the set.

lol, no, I meant that sentences like that ("it's within my self-interest to care about my self-interest") are not always tautologies.

*scratches head*
Burzmali
Posts: 1,310
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 1:28:59 PM
Posted: 1 year ago
At 2/26/2015 7:19:45 AM, ethang5 wrote:
At 2/25/2015 11:57:26 AM, Burzmali wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing,

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

Moral though undesirable. Preface, I don't view morality as some binary situation. Some choices and actions are more moral than others. The situation above doesn't allow for the most moral solution, so a lesser solution must be chosen.

Seems reasonable. But you are aware that some would call your decision here immoral?

Yeah, I completely understand that. This is an example of the subjective aspect of morality. It's a valuation of existence of the species versus the life of the individual.

--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, who you don't know is evil or good. Your choices are to suffer the pain and not substitute the person in green, or substitute them anyway though you don't know if they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

Absent any further information, always immoral. That doesn't necessarily mean I wouldn't substitute, but if I did, it would be an immoral choice.

I'm interested in when this happens, not just in others but in myself also. When we knowingly go against our moral code. You're saying here that though it is against your moral code, you may do it. If you did do it, why would that be? To avoid the pain? Or because the person substituted wouldn't die in real life? Or because the consequences to you are relatively small?

For example, if it wasn't a game, and the person substituted would really die, would it be harder to do it or just as easy?

It actually might be harder for me to resist substituting if it was real death. I don't know how I would behave given whatever panic I would feel in that situation. It's easy for me, or anyone really, to say what is right or wrong in a situation and pretend what we would or wouldn't do. But until you're actually in that situation, you don't know for sure.

----------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

All of these assume no additional info is available.

a) Immoral
b) Immoral
c) Immoral
d) Probably Morally Neutral
e) Immoral
f) Immoral

Is f) immoral because of the "all" or would killing a single animal also be immoral? Or is it the trivial reason (not liking them) that makes it immoral?

It's the trivial reason. If all of those animals represented a real threat of some kind that could only be solved by eradicating them, then that would be more moral than eradicating them for the simple reason of disliking them.
Graincruncher
Posts: 2,799
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 3:01:28 PM
Posted: 1 year ago
At 2/26/2015 10:52:13 AM, ethang5 wrote:
Why another discussion? I doubt if there is a Gentle Reader on this thread who doesn't want to hear your take on the issue.

Indubitably, but I meant that it's a very involved discussion that wouldn't fit in with all the rest. Seeing as 'all the rest' isn't that much, it can probably be had. I'll do it in another post.

Even when it's some slow cooked baby-back ribs????

Regrettably, even with bacon. I'm a martyr to my own taste-buds.
mrsatan
Posts: 417
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 3:24:18 PM
Posted: 1 year ago
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing,

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

The lesser of two evils, not as immoral as letting everyone die through inaction, but still immoral.
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green, who you don't know is evil or good. Your choices are to suffer the pain and not substitute the person in green, or substitute them anyway though you don't know if they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

Probably immoral, but there's not enough info to be sure. Do people choose to enter this matrix, or are they forced to? If they choose to, are they aware of the situation they are entering? Do they choose what color they wear, is it related to their actual character, or is it arbitrarily assigned?

---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
b) Taking all animals to another world and destroying their original world ________
c) Destroying a world that you don't know has life or not. ________
d) Destroying a world with no living things on it. _______
e) Killing all living things on a world so you can keep the world for yourself. _______
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______

Feel free to include your reasons for your answers.

The way I see it, morality is the concept that there is a standard by which sentient beings should treat each other. So, if I were the only sentient being in existence, then the concept of morality would be null. Judging my own morality, in these scenarios, strikes me as an absurd notion.
To say one has free will, to have chosen other than they did, is to say they have will over their will... Will over the will they have over their will... Will over the will they have over the will they have over their will, etc... It's utter nonsense.
ThinkFirst
Posts: 1,391
Add as Friend
Challenge to a Debate
Send a Message
2/26/2015 4:54:00 PM
Posted: 1 year ago
At 2/26/2015 7:28:53 AM, ethang5 wrote:
At 2/25/2015 3:56:58 PM, ThinkFirst wrote:
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person, saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing,

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

1 - Moral, though undesirable. I must note, here, that the death would be as swift and painless as I could make it, given the availability of tools/drugs, etc.

Would you resist being killed if you were the person with Ebola? Or would you allow yourself to be killed for the sake of the others?

Never having been in such a position, I can't state that I know 100% that self-preservation would not make me act to the contrary of what I would know to be correct... With that stated, I would like to think that:
1) I would immediately speak up, upon suspecting that I might have contracted it.
2) I would offer no resistance to a humane (quick/painless) death.
No one actually KNOWS their behavior under hypothetical circumstances. My rational mind, at present, indicates that peaceful submission to death is the morally correct choice, and the most rational decision that I could make, preventing suffering to other human beings.

This further presupposes that knowledge that the person being killed had ebola was 100%, unmistakable, and unequivocal.

Yes, right.
-------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green who you don't know is evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway, though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral

1 - Always Immoral
According to my moral code, even taking the chance that the "green" might have been good would make the action immoral. Unless I know, unequivocally, without doubt, that the "green" is "evil," substituting them is murder.

Why is it not considered murder if he is known to be evil?

Because, within your scenario, the "evil" is known to be murderous. Taking the life of one that would violently harm another, unprovoked, is not murder. It is the rational course toward the greater good. Removing evil is never evil. Yes, I am one of those that would kill Hitler at age 14, knowing what he would become later. It would be a reasonable assumption that, if the person is "evil," they have at least attempted to take another human life, as that is the parameter you set:

""evil" people dressed in white who will always attempt to kill you painfully"


I am assuming that your conditions (extreme as they are) are absolute. Taking the chance that the "green" is evil is tantamount to drinking heavily, then driving a big truck, knowing there is a 50/50 risk that I would run over an innocent on the way.

What does "innocent" mean here?
Anyone who has not done any harm to another that merits death. Even if the person hit is "evil" and has committed atrocities beyond description, I would have no way of knowing it within the context of my parameters in the example. Running over ANYONE, under those circumstances, would be the result of a very bad decision, and should be punished.


---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
1 - Having the power does not justify using it.
b) Taking all animals to another world and destroying their original world ________
1 - Having the power does not justify using it.
c) Destroying a world that you don't know has life or not. ________
1 - Having the power does not justify using it.
d) Destroying a world with no living things on it. _______

But why is destroying a lifeless world immoral? In what way is it immoral?

As listed, having a power is not sufficient justification to use it. Furthermore, nothing was mentioned about knowing whether or not life would ever evolve on it. Moreover, I have absolutely no idea what effect the destruction of that planet might have on the rest of the solar system.

Is destroying that world the beginning of a collapse, of any kind?
Would such a collapse impact another solar system?
Am I aware of whether or not any life exists on any other planets in that solar system?
If there is, am I impacting that system in such a way as to change that life-bearing planet's balance beyond a point where it could no longer support life (thereby effectively destroying that life, anyway...)?
Without the answers to these questions (and many others), taking such a risk is immoral.

1 - Having the power does not justify using it.
e) Killing all living things on a world so you can keep the world for yourself. _______
1 - Having the power does not justify using it.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______
1 - Having the power does not justify using it.

Do you consider it immoral now when we kill say, all the ants nesting on our basement?

No. Those nests are inherently dangerous to us, and eliminating that colony would not endanger or make that species extinct. Its survival has a distinct, definite, negative impact on my existence. In such a case, I would not be killing just to kill, or destroying just to destroy.
"Never attribute to villainy that which can be adequately explained by stupidity"
-----
"Men rarely if ever dream up a god superior to themselves. Most gods have the manners and morals of a spoiled child. "

-- Robert A Heinlein
YassineB
Posts: 1,003
2/26/2015 6:26:53 PM
Posted: 1 year ago
At 2/26/2015 11:01:22 AM, ethang5 wrote:
Ok, I understand. But couldn't killing the one with Ebola be called self-defense, since the infected person is going to kill all the others?

- Self-defence according to Islam is only legitimate if the person causing death is doing it with both:
1. Aggression ('Udwan). <<< a killing may be done with aggression, though not willingly: eg. a man bullying another with no weapon, & no intention of killing him, & the bullied accidentally died in that situation, or because of it. <<< Here the killer will not be subject to the death penalty, for the killing is deemed involuntary manslaughter, & so, in that case, he has to pay the Blood-Money to the victim's family (as the Prophet decreed, about $160,000 in today's money).
2. & Wilfulness ('Amd). <<< a killing may be done willingly but without the killer being an aggressor: eg. a man killing another to protect himself or his family because the other was the aggressor & intended to kill him = self-defence. <<< Here again the killer will not be subject to the death penalty, only Blood-Money.

- Now, in our case, the one causing Death (the Ebola Person) would kill these people with neither Aggression nor Wilfulness, & so killing this person in Self-defence is not allowed, & it will also be subject to the rules cited above.

--------------------------------------------------------------------------------------------------------------------------

So according to Islam, environmental abuse is a sin?

- Indeed it is.

> "Eat and drink from the provision of Allah , and do not commit abuse on the earth, spreading corruption." (2:60)

> "That Day [of Resurrection] it (the Earth) will declare its information (about all what happened over it of good or evil)." (99:4)

> Narrated Jabir:
"The Messenger of Allah forbade to urinate in stagnant water."
[Muslim #281]

> The Prophet said: "He who has land should cultivate it, but if he does not find it possible to cultivate it, or finds himself helpless to do so, he should lend it to his Muslim brother (to cultivate it), but he should not accept rent from him."
[Muslim #1536]

> He also said "If the Final Hour comes while you have a palm-cutting in your hands and it is possible to plant it before the Hour comes, you should plant it." (Final Hour = Apocalypse)
[Al-Adab Al-Mufrad #479]

> He also said: "If anyone cuts a tree (without just cause), Allah brings him headlong into Hell."
[Abi Dawud #5239]

> The Prophet - when going to battle - said: "Don't kill a child, nor a woman, nor an old man; bring no harm to the trees, nor mutilate a dead body or an enemy's flock." [Bayhaqi #17594]

> He also said: "Whoever kills a small bird for no just reason, it will beseech Allah on the Day of Resurrection saying: O Lord, so and so killed me for no reason. And he did not kill me for any beneficial purpose." and said: "There is no person who kills a small bird or anything larger for no just reason, but Allah, the Mighty and Sublime, will ask him about it." It was said: "O Messenger of Allah, what does just reason mean?" He said: "That you slaughter it and eat it, and do not cut off its head and throw it aside."
[an-Nasa'i #4349 #4446]

> He also said: "Do not take any living creature as a target."
[Muslim #1378]

> He also said: "There was a dog moving around a well whom thirst would have killed; an Israelite prostitute saw it and took off her shoe and watered it. So Allah forgave her because of that good deed."
[al-Bukhari #3467]

> He also said, "A woman was punished and put in Hell because of a cat which she had kept locked till it died of hunger. She did not allow it either to eat or drink as it was confined, nor did she free it so that it might eat the vermin of the earth."
[al-Bukhari #2365]

> Narrated Abdullah ibn Mas'ud:
'We were with the Messenger of Allah during a journey. He went to ease himself. We saw a bird with her two young ones and we captured her young ones. The bird came and began to spread its wings. The Messenger of Allah came and said: "Who grieved this for its young ones? Return its young ones back to it." He also saw an ant village that we had burnt. He asked: "Who has burnt this?" We replied: 'We'. He said: "It is not proper to punish with fire except the Lord of fire." '
[Abi Dawud #2675]

> Narrated 'Abdur Rahman bin 'Uthman:
"A doctor consulted the Prophet about extracting medicine from a frog but he prohibited killing it"
[Abi Dawud #3871]

> Reported Muhammad Ibn Umar al-Aslami:
"When the Prophet marched in his army from al-Araj in the direction of Mecca, he saw a bitch (on the way) whining over its puppies, while they were around her suckling. He commanded one of his men called Ja'il Ibn Suraq'a to stay by its side and guard it so that none of the army would harm it and its puppies" <<< He changed the course of his army for a dog because it was feeding its puppies.
[al-Maghazi, al-Waq'idi]

. . . etc.
Current Debates In Voting Period:

- The Qur'an We Have Today is Not What Muhammad Dictated Verbatim. Vs. @Envisage:
http://www.debate.org...

- Drawing Contest. Vs. @purpleduck:
http://www.debate.org...

"It is perfectly permissible to vote on sources without reading them" bluesteel.
FaustianJustice
Posts: 6,205
2/27/2015 1:28:22 AM
Posted: 1 year ago
At 2/25/2015 11:08:39 AM, ethang5 wrote:
Experiment #1

The world = Huge Empty Locked Warehouse
The People = You and 99 other people (females, males, and children)
The Setup = The 99 other people are docile and forgiving and will not harm you in any way regardless of what you do.

You find out that one of the 99 has Ebola. There is no way to quarantine the person. You kill the person saving "the world" from death by Ebola.

According to your own moral code, the one you live by, was the killing ,

1. Moral though undesirable
2. Immoral
3. Morally dependent on your intent
4. Morally dependent on some factor outside of you
5. Neither moral nor immoral

1. And were it me with the Ebola, not much changes.
--------------------------------------------------------------------------------------------------------------------------

Experiment #2

The world = Matrix-like world where people feel pain but dying in that world only wakes you from the matrix where you can enter again.

The People = You and 1,000 other people (females, males, and children)
The Setup = The 999 other people are divided among "evil" people dressed in white who will always attempt to kill you painfully, "good" people dressed in black who will always attempt to save you from the evil people, and "unknown" people who are either "good" or "evil" but all wear green, so you cannot tell their position until they act.

When trapped by evil people, you are able to substitute someone else in your place and that person will suffer horrible real pain before dying in the matrix.

You get trapped by evil people, and the only person you can substitute is a person dressed in green who you don't know is evil or good. Your choices are to suffer the pain and not substitute the person in green, or to substitute them anyway, though you don't know whether they are good or evil.

According to your own moral code, the one you live by, would substituting the person in green be,

1. It's always immoral. Based on the 'world', it's essentially one giant game situation, and we restart again, I am assuming as healthy as before the unpleasantness happened. Yeah, we feel pain, which is a bummer, but the way the world is presented, it's fleeting.


1. Always Immoral
2. Moral
3. Immoral if the person turns out to be "good"
4. Moral if the person turns out to be "evil"
5. Morally neutral
---------------------------------------------------------------------------------------------------------------------------

Experiment #3

The world = The Universe
The People = Only You
The Setup = You can hop from world to world. You have world destroying power. Some worlds have living animals, but none more sentient than a lizard.

For each action, please associate a number....

1. Immoral
2. Moral
3. Morally Neutral
4. Don't Know

a) Destroying a world with living animals on it. _______
1. There are plenty of worlds, I am assuming infinite. No reason to blow this one up.
b) Taking all animals to another world and destroying their original world ________
1. Again, why?
c) Destroying a world that you don't know has life or not. ________
1. Same thing. No reason.
d) Destroying a world with no living things on it. _______
3. and pretty cool. I don't consider this any more immoral or moral than flattening a sandcastle.
e) Killing all living things on a world so you can keep the world for yourself.
1. Though, not entirely sure what that means, since I am the only sentient thing around. I think that by default makes them mine, were I hung up on ownership.
f) Causing all animals of one kind in the universe to die because you don't like them. (Like roaches or snakes or houseflies). _______
1. Why? If I am not a fan of them, just hop to another world. Pretty easy.

Feel free to include your reasons for you answers.
Here we have an advocate for Islamic arranged marriages demonstrating that children can consent to sex.
http://www.debate.org...