
Is the Trolley problem practical?

FaustianJustice
Posts: 6,235
10/24/2016 3:03:37 PM
Posted: 1 month ago
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?
Here we have an advocate for Islamic arranged marriages demonstrating that children can consent to sex.
http://www.debate.org...
Benshapiro
Posts: 3,966
10/24/2016 3:50:57 PM
Posted: 1 month ago
At 10/24/2016 3:03:37 PM, FaustianJustice wrote:
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?

I think the problem with the Trolley problem is that one can't deduce a moral solution because whether the agent's decision is moral, amoral, or immoral depends on the agent's motive.
keithprosser
Posts: 2,042
10/24/2016 5:38:57 PM
Posted: 1 month ago
I never thought of trolley problems as anything other than 'stepping-off points' for thinking about 'morality'. I think their real purpose is to lay open issues such as how much of our 'moral sense' is intuitive and how much is strictly rational.

The important thing is not whether one would (or should) switch the points, but to analyse the mental processes that lead to one decision or the other. The basic form of the original trolley problem is quite minimalist, so it is possible to modify it in subtle ways to see what seems to matter and what does not - does it matter if the 'one' is female, or a child, for instance? That makes trolley problems near ideal for exploring moral issues in a systematic way.

One problem with trolley problems is that they encourage the notion of reducing moral judgements to simple arithmetic and over-emphasise utilitarian and consequentialist approaches to morality. But I think that, as a means to begin exploring moral issues, trolley problems have their uses; they are only tools for exploring wider issues - they are not ends in themselves. They do not demand a solution.
Subutai
Posts: 3,235
10/24/2016 8:53:58 PM
Posted: 1 month ago
At 10/24/2016 3:03:37 PM, FaustianJustice wrote:
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?

There have been, and will certainly continue to be, real life examples of the trolley problem. In the vast majority of cases, the option was taken to sacrifice the one for the lives of the many. There were probably ramifications that arose from such decisions. Nevertheless, it's obviously not the final say on morality, as there are many other ethical dilemmas than just the trolley problem, and there is more to ethics than ethical dilemmas.
I'm becoming less defined as days go by, fading away, and well you might say, I'm losing focus, kinda drifting into the abstract in terms of how I see myself.
R0b1Billion
Posts: 3,733
10/24/2016 9:02:35 PM
Posted: 1 month ago
At 10/24/2016 3:50:57 PM, Benshapiro wrote:

I think the problem with the Trolley problem is that one can't deduce a moral solution because whether the agent's decision is moral, amoral, or immoral depends on the agent's motive.

That is exactly right, Ben. Without that crucial piece of information - something the TP absolutely cannot address - the TP cannot be said to be a moral dilemma.

The TP is pretty much useless in real life; it is more indicative of the type of problem-solving a fictional superhero would face.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
R0b1Billion
Posts: 3,733
10/25/2016 2:54:05 AM
Posted: 1 month ago
At 10/24/2016 8:53:58 PM, Subutai wrote:
At 10/24/2016 3:03:37 PM, FaustianJustice wrote:
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?

There have been, and will certainly continue to be, real life examples of the trolley problem. In the vast majority of cases, the option was taken to sacrifice the one for the lives of the many. There were probably ramifications that arose from such decisions. Nevertheless, it's obviously not the final say on morality, as there are many other ethical dilemmas than just the trolley problem, and there is more to ethics than ethical dilemmas.

In my opinion the TP doesn't even touch morality. If you want to say that it concerns ethics, then that's fine, because ethics is a much looser term that can be applied to whatever you'd like. But morality concerns what is right and what is wrong, and there simply is not a right or wrong choice in the TP. It's not because the TP is complicated, either; it's that rightness/wrongness doesn't apply to critical thinking. An easy way to prove that is to imagine somebody inept, such as a cognitively impaired individual, being tested and not being able to take the best course of action. Nobody would call an inept person "immoral" for not being able to produce benefit.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
keithprosser
Posts: 2,042
10/25/2016 5:28:04 AM
Posted: 1 month ago
But surely if the TP does not touch on morality, it is still useful, because one can ask why it doesn't touch morality, or ask why some people think it does, how it can be changed to ask a valid moral question, and so on.

As a self-contained moral dilemma it might be uninteresting, but as a means of starting an investigation into the can of worms that is morality, it is as good a place to start as any, and better than most.

As we move towards driverless cars, the software controlling them will have to be able to handle scenarios analogous to the trolley problem (such situations shouldn't occur, but they have to be handled in case they do). In that case the TP will not be purely theoretical.
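A minimal sketch of what such a choice could look like in a hypothetical planner that simply minimises expected casualties; the names here (Maneuver, expected_casualties, choose_maneuver) are invented for illustration and do not correspond to any real driving stack.

# Hypothetical sketch: a toy planner that picks whichever maneuver has the
# lowest expected harm - a utilitarian rule baked into a single line of code.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_casualties: float  # an estimate; the true outcome is only known after the fact

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    # Purely utilitarian decision rule: minimise expected casualties.
    return min(options, key=lambda m: m.expected_casualties)

if __name__ == "__main__":
    stay = Maneuver("stay in lane", expected_casualties=5.0)
    swerve = Maneuver("swerve onto shoulder", expected_casualties=1.0)
    print(choose_maneuver([stay, swerve]).name)  # -> swerve onto shoulder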
R0b1Billion
Posts: 3,733
10/25/2016 5:32:58 PM
Posted: 1 month ago
At 10/25/2016 5:28:04 AM, keithprosser wrote:
But surely if the TP does not touch on morality, it is still useful, because one can ask why it doesn't touch morality, or ask why some people think it does, how it can be changed to ask a valid moral question, and so on.

Ok.

As a self-contained moral dilemma it might be uninteresting, but as a means of starting an investigation into the can of worms that is morality, it is as good a place to start as any, and better than most.

Well, if somebody approaches the TP looking to "do the right thing," then the moral test is finished before the actual problem is discussed - is it not? Most people would judge the morality of the situation by how able the subject is to minimize harm, but that is obviously flawed reasoning, as I have already explained. I would much rather judge people based on simple, everyday scenarios. How does the subject react when being cut off by a motorist? When being heckled? When it is time to tip a waiter or donate to charity? One extremely useful scenario is how one deals with perceived inferiors: fast food workers, gas station attendants, maids, pets, subordinates... Somebody could perform horribly in all of these scenarios with REAL people yet pass with flying colors on some game (the TP) which simply puts them in a position to answer a riddle. In that sense, I think the TP is absolutely worthless. Your first paragraph I cannot strictly disagree with because of your wording.

As we move towards driverless cars, the software controlling them will have to be able to handle scenarios analogous to the trolley problem (such situations shouldn't occur, but they have to be handled in case they do). In that case the TP will not be purely theoretical.

Well it certainly could be used for purposes of liability, but I never associate legal, business, insurance, or political interests with morality.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
keithprosser
Posts: 2,042
10/25/2016 5:44:27 PM
Posted: 1 month ago
At 10/25/2016 5:32:58 PM, R0b1Billion wrote:

Well it certainly could be used for purposes of liability, but I never associate legal, business, insurance, or political interests with morality.

I'll have to think about whether it makes sense to talk about 'liability' without reference to 'moral responsibility'....
dylancatlow
Posts: 12,249
10/25/2016 7:37:43 PM
Posted: 1 month ago
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.
R0b1Billion
Posts: 3,733
10/26/2016 12:42:23 AM
Posted: 1 month ago
At 10/25/2016 5:44:27 PM, keithprosser wrote:
At 10/25/2016 5:32:58 PM, R0b1Billion wrote:

Well it certainly could be used for purposes of liability, but I never associate legal, business, insurance, or political interests with morality.

I'll have to think about whether it makes sense to talk about 'liability' without reference to 'moral responsibility'....

Liability is simply a legal/business term. I wouldn't use that term as a synonym for responsibility, nor would I equate it with culpability.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
R0b1Billion
Posts: 3,733
10/26/2016 12:44:39 AM
Posted: 1 month ago
At 10/25/2016 7:37:43 PM, dylancatlow wrote:
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.

Great. It's not morality though. Call it important, call it practical, call it anything you want. Just don't call it morality and we'll be cool.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
NHN
Posts: 624
10/26/2016 1:02:04 PM
Posted: 1 month ago
At 10/24/2016 3:03:37 PM, FaustianJustice wrote:
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

As such, the trolley problem can help reveal anything from ethical confusion to outright psychopathy in any given individual. Moreover, if there is a general consensus among utilitarians that this could be justified as a course of action in a thought experiment, the conformist/authoritarian personality will naturally agree.

The real question -- beyond ethical confusion and utilitarian conformism -- is whether there is a functional platform for psychopathy, as it occasionally serves a greater civilizational purpose (the nuking of Hiroshima and Nagasaki; the carpet-bombing of Dresden; "Operation Paperclip," i.e., the U.S. government employing thousands of Nazi scientists).
keithprosser
Posts: 2,042
10/26/2016 5:33:57 PM
Posted: 1 month ago
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
NHN
Posts: 624
10/26/2016 7:11:25 PM
Posted: 1 month ago
At 10/26/2016 5:33:57 PM, keithprosser wrote:
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
The trolley problem cites five casualties, but you could enter any number -- feel free to use "all of humanity" as your quantity. Either way, you're stuck with the same logical construction: kill X amount of people to save Y, which can only be known ex post facto.

As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.
dylancatlow
Posts: 12,249
10/26/2016 7:12:59 PM
Posted: 1 month ago
At 10/26/2016 12:44:39 AM, R0b1Billion wrote:
At 10/25/2016 7:37:43 PM, dylancatlow wrote:
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.

Great. It's not morality though. Call it important, call it practical, call it anything you want. Just don't call it morality and we'll be cool.

Morality pertains to what we ought to do, so I don't see how your demand makes any sense.
keithprosser
Posts: 2,042
10/26/2016 7:24:35 PM
Posted: 1 month ago
At 10/26/2016 7:11:25 PM, NHN wrote:
At 10/26/2016 5:33:57 PM, keithprosser wrote:
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
The trolley problem cites five casualties, but you could enter any number -- feel free to use "all of humanity" as your quantity. Either way, you're stuck with the same logical construction: kill X amount of people to save Y, which can only be known ex post facto.

As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.

I think that is 'yes', the long way around! Interesting that killing one person to save all of humanity is something only a manipulative psychopath would do.
sdavio
Posts: 1,798
10/26/2016 7:54:38 PM
Posted: 1 month ago
At 10/26/2016 7:11:25 PM, NHN wrote:
At 10/26/2016 5:33:57 PM, keithprosser wrote:
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
The trolley problem cites five casualties, but you could enter any number -- feel free to use "all of humanity" as your quantity. Either way, you're stuck with the same logical construction: kill X amount of people to save Y, which can only be known ex post facto.

As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.

I don't understand your criticism here.

We could even make the example more extreme. Let's say I need to defuse a bomb (or whatever) to stop the whole world from exploding, but in the process of getting to the location I would brush someone's arm. It's the same "utilitarian" decision between everyone dying and a guy's arm being brushed past. He might be a bit startled, but in the end my ideology would be that it's worth it.

The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

It seems if anything, even more ideologically biased to abstract from two given options and base one's evaluation upon comparison to some third perfect situation that doesn't exist. Of course, it is unfortunate to opt to kill one person to save five. But this is in a situation where the only other option is to kill five people in order to save one.
"Logic is the money of the mind." - Karl Marx
NHN
Posts: 624
10/26/2016 8:16:22 PM
Posted: 1 month ago
At 10/26/2016 7:24:35 PM, keithprosser wrote:
At 10/26/2016 7:11:25 PM, NHN wrote:
At 10/26/2016 5:33:57 PM, keithprosser wrote:
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
The trolley problem cites five casualties, but you could enter any number -- feel free to use "all of humanity" as your quantity. Either way, you're stuck with the same logical construction: kill X amount of people to save Y, which can only be known ex post facto.

As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.
I think that is 'yes', the long way around! Interesting that killing one person to save all of humanity is something only a manipulative psychopath would do.
You're missing the point. Since we don't have access to hindsight, a rationale must be constructed. This, in turn, leads to a preposterous scenario that only reveals the extent to which an individual would justify an unjustifiable act.

And no, beyond psychopaths I also left this justification open to conformists/authoritarians, those easily manipulated by ideology/religion/structures of power, as well as the ethically confused.
dylancatlow
Posts: 12,249
10/26/2016 8:24:32 PM
Posted: 1 month ago
At 10/26/2016 7:24:35 PM, keithprosser wrote:
At 10/26/2016 7:11:25 PM, NHN wrote:
At 10/26/2016 5:33:57 PM, keithprosser wrote:
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
The trolley problem cites five casualties, but you could enter any number -- feel free to use "all of humanity" as your quantity. Either way, you're stuck with the same logical construction: kill X amount of people to save Y, which can only be known ex post facto.

As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.

I think that is 'yes', the long way around! Interesting that killing one person to save all of humanity is something only a manipulative psychopath would do.

Those god damn psychopaths running around with their corrupt ideas of trying to REDUCE human suffering. If they had any compassion at all, they wouldn't attempt to think their way out of their initial moral knee-jerk reaction, and would rejoice at their own moral superiority as the whole of humanity minus one perished.
Graincruncher
Posts: 2,799
10/26/2016 8:35:30 PM
Posted: 1 month ago
At 10/24/2016 3:03:37 PM, FaustianJustice wrote:
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?

I think you're misunderstanding the value of trolley problems.

1) Moral reasoning

It gets us thinking about the elements involved in our 'moral process'. Is it just about numbers? Are we as morally responsible for inaction as action? How does intent relate to actions in our moral understanding?

2) Imperfect knowledge

Quite the opposite of your objection that we never encounter examples like it in the real world, I'd argue we do. By changing facts in the problem - is the solo victim a doctor and the group of five all paedophiles? - we have to confront our limited knowledge in real situations. It should also make us think about how even perceived 'good' actions may have bad consequences, or intentionally bad acts good ones. What is our scope of judgement for an act? How far do we go with 'should have known' when deciding whether an act was moral or not?

3) Classification

What kind of moral picture do I hold? Am I consequentialist? Why? What issues may that raise? Can I address them? How?

Basically, the problems are great foundations for exploring what we mean by moral discourse, how we process moral reasoning and what that might mean for any 'better world' we try to build.
NHN
Posts: 624
10/26/2016 8:43:19 PM
Posted: 1 month ago
At 10/26/2016 7:54:38 PM, sdavio wrote:
As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.

I don't understand your criticism here.

We could even make the example more extreme. Let's say I need to defuse a bomb (or whatever) to stop the whole world from exploding, but in the process of getting to the location I would brush someone's arm. It's the same "utilitarian" decision between everyone dying and a guy's arm being brushed past. He might be a bit startled, but in the end my ideology would be that it's worth it.
That's not what the trolley problem entails. Feel free to make adjustments, but don't eliminate its significance in the process.

You stand before a trolley headed for a group of X individuals standing on the track. (1) Do nothing, and the trolley kills all of the people on the track. (2) Pull the lever, diverting the trolley onto the side track, where it will kill one person.

Knowing the outcome requires foresight, which we don't have. In that sense, we can see an accident occur before us, or we can actively kill someone due to a utilitarian imperative regarding a likely future scenario.

The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.
It seems if anything, even more ideologically biased to abstract from two given options and base one's evaluation upon comparison to some third perfect situation that doesn't exist. Of course, it is unfortunate to opt to kill one person to save five. But this is in a situation where the only other option is to kill five people in order to save one.
You are not actively killing people by seeing them hit by a train in front of you. But if you pull the lever to kill one person, you are committing murder while deluded by an imperative. That says something about the person in question.

"Participants who indicated greater endorsement of utilitarian solutions had higher scores on measures of Psychopathy, machiavellianism, and life meaninglessness. These results question the widely-used methods by which lay moral judgments are evaluated, as these approaches lead to the counterintuitive conclusion that those individuals who are least prone to moral errors also possess a set of psychological characteristics that many would consider prototypically immoral."
(http://leeds-faculty.colorado.edu...)

As Thoreau put it, "If I knew for a certainty that a man was coming to my house with the conscious design of doing me good, I should run for my life."
dylancatlow
Posts: 12,249
10/26/2016 8:45:45 PM
Posted: 1 month ago
At 10/26/2016 7:11:25 PM, NHN wrote:
At 10/26/2016 5:33:57 PM, keithprosser wrote:
The one who would actively opt to kill one person in order to save five -- again, not through self-defense but through active engagement where a victim is picked out -- is either deluded by conviction (ideology/religion) or a manipulative psychopath looking to justify an unjustifiable act.

Does that apply if it is not 5 but 500, or 5 million?
The trolley problem cites five casualties, but you could enter any number -- feel free to use "all of humanity" as your quantity. Either way, you're stuck with the same logical construction: kill X amount of people to save Y, which can only be known ex post facto.

As you can see, the trolley problem goes from unjustifiable to preposterous once you inflate the number of people saved. In effect, it becomes a fool's defense.

At first you were opposed to the idea of killing one to save many out of principle, but now you're opposing it on the grounds that we can never know how many people, if any, will be saved by killing someone, so any attempt to implement utilitarian morality is pure, psychopathic speculation, which requires a rationale derived from unlikely thought experiments. It's a bad defense, and in fact doesn't even address what Keith was really challenging in your original post.
NHN
Posts: 624
10/26/2016 9:43:17 PM
Posted: 1 month ago
At 10/26/2016 8:45:45 PM, dylancatlow wrote:
At first you were opposed to the idea of killing one to save many out of principle, but now you're opposing it on the grounds that we can never know how many people, if any, will be saved by killing someone, so any attempt to implement utilitarian morality is pure, psychopathic speculation, which requires a rationale derived from unlikely thought experiments. It's a bad defense, and in fact doesn't even address what Keith was really challenging in your original post.
Opposing hindsight bias does not prohibit me from simultaneously opposing the utilitarian imperative, i.e., the active choice to kill one in order to save many. And yes, the trolley problem is psychopathic speculation, but that doesn't make those who agree with this thought experiment into psychopaths -- although it most likely makes them conformists and toadies.

As I suggested above, I view psychopathy and its practical application as a far more interesting topic for discussion, as in the case with the Dresden bombing or the nuking of Hiroshima and Nagasaki or the implementation of Operation Paperclip. But utilitarians presuppose that these actions, which are pure expressions of power, are guided by reason. They are not. Calculations of power build on the perspective of one and are only accidentally to the benefit of all.
R0b1Billion
Posts: 3,733
10/26/2016 10:18:53 PM
Posted: 1 month ago
At 10/26/2016 7:12:59 PM, dylancatlow wrote:
At 10/26/2016 12:44:39 AM, R0b1Billion wrote:
At 10/25/2016 7:37:43 PM, dylancatlow wrote:
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.

Great. It's not morality though. Call it important, call it practical, call it anything you want. Just don't call it morality and we'll be cool.

Morality pertains to what we ought to do, so I don't see how your demand makes any sense.

No, morality pertains to what we ought NOT to do.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
dylancatlow
Posts: 12,249
10/26/2016 10:30:29 PM
Posted: 1 month ago
At 10/26/2016 10:18:53 PM, R0b1Billion wrote:
At 10/26/2016 7:12:59 PM, dylancatlow wrote:
At 10/26/2016 12:44:39 AM, R0b1Billion wrote:
At 10/25/2016 7:37:43 PM, dylancatlow wrote:
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.

Great. It's not morality though. Call it important, call it practical, call it anything you want. Just don't call it morality and we'll be cool.

Morality pertains to what we ought to do, so I don't see how your demand makes any sense.

No, morality pertains to what we ought NOT to do.

It pertains to right and wrong, which can be expressed both as what we ought to do and what we ought not to do. We ought to act in ways which do not run counter to what we ought not to do, and vice versa. If you ought not to stand by and let 5 innocent people die instead of 1, then you are morally obliged to intervene.
R0b1Billion
Posts: 3,733
10/26/2016 10:57:13 PM
Posted: 1 month ago
At 10/26/2016 10:30:29 PM, dylancatlow wrote:
At 10/26/2016 10:18:53 PM, R0b1Billion wrote:
At 10/26/2016 7:12:59 PM, dylancatlow wrote:
At 10/26/2016 12:44:39 AM, R0b1Billion wrote:
At 10/25/2016 7:37:43 PM, dylancatlow wrote:
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.

Great. It's not morality though. Call it important, call it practical, call it anything you want. Just don't call it morality and we'll be cool.

Morality pertains to what we ought to do, so I don't see how your demand makes any sense.

No, morality pertains to what we ought NOT to do.

It pertains to right and wrong, which can be expressed both as what we ought to do and what we ought not to do. We ought to act in ways which do not run counter to what we ought not to do, and vice versa. If you ought not to stand by and let 5 innocent people die instead of 1, then you are morally obliged to intervene.

Moral imperatives must always be negative. Otherwise, can we ever really know if we are positively moral? If my morality is based upon an infinite number of choices of what I could do to create this arbitrary and ridiculous standard of utility or happiness or whatever it is you would like to call it, then I can never know if there was something else I could have done to create even more utility. Morality becomes infinite and meaningless, subject to any person's interpretation no matter how ridiculous. Furthermore, it becomes based on ability. Meek individuals are never as capable as powerful individuals to simply be moral people, because they are incapable of wielding the power to create as much utility. What a disaster this way of thinking is!

Morality is a sentient being's willpower to resist selfishness and indulgence. When you resist your immoral tendencies, you succeed in being a moral person. Simple. Easy. No fallacies. No infinite hazy ability-based complex explanations which are impossible to measure. No illogical advantages for the powerful over the meek. My explanation has all the elements of a perfect theory while yours has all the elements of an atomic bomb shitstorm of illogic.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
dylancatlow
Posts: 12,249
10/26/2016 11:41:00 PM
Posted: 1 month ago
At 10/26/2016 10:57:13 PM, R0b1Billion wrote:
At 10/26/2016 10:30:29 PM, dylancatlow wrote:
At 10/26/2016 10:18:53 PM, R0b1Billion wrote:
At 10/26/2016 7:12:59 PM, dylancatlow wrote:
At 10/26/2016 12:44:39 AM, R0b1Billion wrote:
At 10/25/2016 7:37:43 PM, dylancatlow wrote:
The Trolley Problem forces us to decide how much weight we give to utilitarian concerns over "rights", which is certainly an important thing to consider, both practically and abstractly.

Great. It's not morality though. Call it important, call it practical, call it anything you want. Just don't call it morality and we'll be cool.

Morality pertains to what we ought to do, so I don't see how your demand makes any sense.

No, morality pertains to what we ought NOT to do.

It pertains to right and wrong, which can be expressed both as what we ought to do and what we ought not to do. We ought to act in ways which do not run counter to what we ought not to do, and vice versa. If you ought not to stand by and let 5 innocent people die instead of 1, then you are morally obliged to intervene.

Moral imperatives must always be negative. Otherwise, can we ever really know if we are positively moral? If my morality is based upon an infinite number of choices of what I could do to create this arbitrary and ridiculous standard of utility or happiness or whatever it is you would like to call it, then I can never know if there was something else I could have done to create even more utility. Morality becomes infinite and meaningless, subject to any person's interpretation no matter how ridiculous. Furthermore, it becomes based on ability. Meek individuals are never as capable as powerful individuals to simply be moral people, because they are incapable of wielding the power to create as much utility. What a disaster this way of thinking is!

Well, there's a difference between a person being moral and an action being moral. In my opinion, a person can be perfectly moral without necessarily carrying out moral actions, assuming they mean well, and have made sincere efforts to formulate a moral standard which they try to live up to and which they honestly believe is right (emphasis on sincere). What more could you ask of a person? When I call an action moral all I'm saying is that it's better if it were done than not done. It's what we ought to think is right, in a sense. Someone can be moral and fail to behave in the way they should, assuming they don't knowingly do wrong. Achieving a perfect state of affairs is almost certainly beyond our reach, so we are always left wondering if there was something more we could have done, but that doesn't mean we can't decide between things, and it doesn't mean that we are immoral if we can't figure out exactly how to behave. So long as we try, I don't see how we can be called immoral. So your criticism of utilitarianism as a system in which no one can be moral makes no sense. Your suggestion that meek individuals couldn't be as moral since they wouldn't wield the power to effect positive change forgets that morality requires choice. If someone can't help, then we don't criticize them for not helping. But if someone can help, we do.

Morality is a sentient being's willpower to resist selfishness and indulgence. When you resist your immoral tendencies, you succeed in being a moral person. Simple. Easy. No fallacies. No infinite hazy ability-based complex explanations which are impossible to measure. No illogical advantages for the powerful over the meek. My explanation has all the elements of a perfect theory while yours has all the elements of an atomic bomb shitstorm of illogic.

I think it goes beyond that.
FaustianJustice
Posts: 6,235
10/27/2016 11:09:25 AM
Posted: 1 month ago
At 10/26/2016 8:35:30 PM, Graincruncher wrote:
At 10/24/2016 3:03:37 PM, FaustianJustice wrote:
We tend to look at the trolley problem as a mock-up of ethical behavior, but does the concept of practical morality override a model one?

That is to say, what are the odds of the trolley problem ever coming to fruition? Will such moral dilemmas ever be so clear-cut, with no outward ramifications?

I don't feel as though morality can really be summed up in a simple thought experiment. Surely there must be a better way to express morality through topical means than by inventing a line of bad people to put in front of a runaway train.

Thoughts?

I think you're misunderstanding the value of trolley problems.

Quite possibly.

1) Moral reasoning

It gets us thinking about the elements involved in our 'moral process'. Is it just about numbers? Are we as morally responsible for inaction as action? How does intent relate to actions in our moral understanding?

Sure, but why can't such thinking be broken down, immediately, into real-world application? The Michigan water issues, drone strikes, the responsibility of fracking... to me, these all take into consideration things beyond simple life and death, rely on the opiner to research the issue, and look to more complicated issues than life/death for a solution. Wouldn't that, as a means of morality, be more "true" than (almost literally) the Joker's Ferry Problem from "The Dark Knight"?

2) Imperfect knowledge

Quite the opposite of your objection that we never encounter examples like it in the real world, I'd argue we do. By changing facts in the problem - is the solo victim a doctor and the group of five all paedophiles? - we have to confront our limited knowledge in real situations. It should also make us think about how even perceived 'good' actions may have bad consequences, or intentionally bad acts good ones. What is our scope of judgement for an act? How far do we go with 'should have known' when deciding whether an act was moral or not?

I am not sure I have seen too many instances where a collection of pedophiles was in one perilous location, much less one that required action or inaction on someone's behalf to 'save' them, or to leverage their lives against someone else's.

3) Classification

What kind of moral picture do I hold? Am I consequentialist? Why? What issues may that raise? Can I address them? How?

Basically, the problems are great foundations for exploring what we mean by moral discourse, how we process moral reasoning and what that might mean for any 'better world' we try to build.

So what you are stating is that we are not specifically finding good or bad, but a hierarchy (possibly a personal one, maybe a universal one) as to how good or bad a problem might be.
Here we have an advocate for Islamic arranged marriages demonstrating that children can consent to sex.
http://www.debate.org...