
Utilitarianism

Envisage
Posts: 3,646
1/15/2015 6:11:04 PM
Posted: 1 year ago
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.
Amoranemix
Posts: 521
1/17/2015 4:29:02 AM
Posted: 1 year ago
One should care about well-being because it has intrinsic value. Most morally aware entities care about their own well-being and morality is a concept that describes the importance one gives to the well-being of others in relation to one's own well-being. Being benevolent implies assigning value to what we think others value.

One should only do something in relation to a goal, standard, or opinion; that is, something one cares about.

Christian : You should adhere to God's morality.
Atheist : Why should I care about God's morality ?
Christian : Because if you don't, you'll burn in Hell for eternity.
Atheist : Why should I care about burning in Hell ?
Christian : Because it would cause you immense suffering.
Atheist : Why should I care about my suffering ?
Christian : . . .

If you don't care about anything, then nothing matters to you and you shouldn't do anything.

Utilitarianism isn't perfect, but that is not a sufficient reason to dismiss it. You should only dismiss something if you have a better alternative. A morality where raping babies for fun is all right has its own problems.
The earth does not belong to man; man belongs to the earth.
XVIII18
Posts: 71
1/17/2015 12:42:28 PM
Posted: 1 year ago
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Then you could very well be deemed a psychopath and incapable of empathy.
XVIII18
Posts: 71
1/17/2015 12:43:40 PM
Posted: 1 year ago
At 1/17/2015 12:42:28 PM, XVIII18 wrote:
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Then you could very well be deemed a psychopath and incapable of empathy.

Plus, in this system, since you are limiting happiness for others, you would be thrown in jail.
R0b1Billion
Posts: 3,726
1/20/2015 1:52:32 PM
Posted: 1 year ago
Utilitarianism has several huge pitfalls. First, it's too complex. If morality is to be useful, it must be simple enough to be used. Putting aside for the moment the fatal attribute that utility is impossible to measure, how does one meaningfully and succinctly apply such ideals to every-day actions? How does one, furthermore, pass this wisdom on to others in a form digestible-enough to be useful? A utilitarian is forever unsure about any and all actions, inept at creating any sort of principle that may guide them for future reference. Morality to the consequentialist is as elusive and complex as quantum mechanics is to the statistician. All knowledge is a posteriori.

Also, how do we know if we are correctly viewing the outcomes of any act? If I beat my dog, and assign a value of +1 towards the quality of "increasing discipline of dog," how can I know that I'm not making bogus assumptions? Beating a dog may increase its discipline, or it may not. A consequentialist is impotent in every scenario because they are constantly trying to use their intellect to determine value. Morality is, at its core, a recognition that we tend naturally towards vice (e.g., selfishness, self-superiority, indulgence) and our values of utility cannot cope with such inherent biases. Morality should be aimed at eliminating bias, not trying to incorporate bias into the moral process.
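To make the worry concrete, here is a minimal sketch of the kind of calculation being criticized; the probabilities and utility values below are invented purely for illustration and are nobody's actual numbers:

# Toy expected-utility tally; every number below is an assumption.
def expected_utility(outcomes):
    # outcomes: list of (probability, utility) pairs for one action
    return sum(p * u for p, u in outcomes)

# Assumption A: beating the dog very likely improves discipline (+1),
# with a small chance of lasting harm (-5).
beat_dog_optimistic = [(0.9, +1), (0.1, -5)]
# Assumption B: it rarely improves discipline and usually just causes harm.
beat_dog_pessimistic = [(0.2, +1), (0.8, -5)]

print(expected_utility(beat_dog_optimistic))   #  0.4 -> the act looks net positive
print(expected_utility(beat_dog_pessimistic))  # -3.8 -> the same act looks clearly negative

The verdict flips entirely on the assumed probability that discipline improves, which is exactly the assumption in dispute.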

Morality should be based on principles which are simple and useful, as well as objective from bias. The Categorical Imperative is a good place to start, and works as a good test of our principles as we create them even though it isn't very prescriptive in its own right. The principles I have noticed work every time are the Seven Deadly Sins. If you are taking an action, ask yourself if the MAIN impetus behind your action is one of the following (if there is a logical reason that supports it independent of these intentions, it is justified):
Am I acting this way because I feel
- Superior to others? [pride]
- self-interested at the expense of others? [greed]
- angry? [wrath]
- hurt at the success of others? [envy]
- unreasonably consumptive? [gluttony]
- unreasonably lazy? [sloth]
- unreasonably desirous? [lust]
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
MyDinosaurHands
Posts: 203
1/22/2015 8:28:35 AM
Posted: 1 year ago
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Your objection is an extreme minority view however.

I think this system of morality is the best for the majority of people.
Guess what I used to type this..

Careful! Don't laugh too hard.
Envisage
Posts: 3,646
1/22/2015 8:29:49 AM
Posted: 1 year ago
At 1/22/2015 8:28:35 AM, MyDinosaurHands wrote:
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Your objection is an extreme minority view however.

I think this system of morality is the best for the majority of people.

What kind of crap argument is that?

The exact same arguments were used to maintain slavery and black discrimination. Wtf.
MyDinosaurHands
Posts: 203
1/22/2015 8:31:56 AM
Posted: 1 year ago
At 1/20/2015 1:52:32 PM, R0b1Billion wrote:
Utilitarianism has several huge pitfalls. First, it's too complex. If morality is to be useful, it must be simple enough to be used. Putting aside for the moment the fatal attribute that utility is impossible to measure, how does one meaningfully and succinctly apply such ideals to every-day actions?

Well in situations where you cannot be sure of an act's utility, the act could be deferred to other kinds of morality. Just because it cannot be easily applied in all situations does not mean we should instantly discard it.
Guess what I used to type this..

Careful! Don't laugh too hard.
R0b1Billion
Posts: 3,726
1/22/2015 8:37:26 AM
Posted: 1 year ago
At 1/22/2015 8:31:56 AM, MyDinosaurHands wrote:
At 1/20/2015 1:52:32 PM, R0b1Billion wrote:
Utilitarianism has several huge pitfalls. First, it's too complex. If morality is to be useful, it must be simple enough to be used. Putting aside for the moment the fatal attribute that utility is impossible to measure, how does one meaningfully and succinctly apply such ideals to every-day actions?

Well in situations where you cannot be sure of an act's utility,

Any situation...

the act could be deferred to other kinds of morality.

That's like saying that, if I can't figure out a phenomenon with science, I should defer to other kinds of science. Either you are using correct science or morality, or else you are not.

Just because it cannot be easily applied in all situations does not mean we should instantly discard it.

Yes, it does. We should replace it with a theory that can be easily applied in all situations. That's like saying you shouldn't discard an automobile that only runs when the temperature is about 70 degrees; why not just get a car that always runs?
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
MyDinosaurHands
Posts: 203
1/22/2015 8:39:16 AM
Posted: 1 year ago
At 1/22/2015 8:29:49 AM, Envisage wrote:
At 1/22/2015 8:28:35 AM, MyDinosaurHands wrote:
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Your objection is an extreme minority view however.

I think this system of morality is the best for the majority of people.

What kind of crap argument is that?

The exact same arguments were used to maintain slavery and black discrimination. Wtf.

When you take my statements out of context, sure. But when you acknowledge that those statements are in defense of a system of morality that is against those things you mentioned, you look like an a-hole.

Utilitarianism takes the general human want for people to be happy, and attempts to measure out actions in accordance with meeting that want as well as possible. Now, you mentioned, "What if I want there to be as much pain as possible?" Obviously this would not be compatible with Utilitarianism; however, because most people do not think the way you do (in other words, not as a-holes), Utilitarianism will meet the wants of most people.

And yes, meeting the wants of most people is what moral justifications for slavery did, HOWEVER, the wants that moral justifications for slavery meet and the wants that Utilitarianism meet are different, in that the former is generally held to be bad, while the latter is generally held to be good.
Guess what I used to type this..

Careful! Don't laugh too hard.
Envisage
Posts: 3,646
1/22/2015 8:51:05 AM
Posted: 1 year ago
At 1/22/2015 8:39:16 AM, MyDinosaurHands wrote:
At 1/22/2015 8:29:49 AM, Envisage wrote:
At 1/22/2015 8:28:35 AM, MyDinosaurHands wrote:
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Your objection is an extreme minority view however.

I think this system of morality is the best for the majority of people.

What kind of crap argument is that?

The exact same arguments were used to maintain slavery and black discrimination. Wtf.

When you take my statements out of context, sure. But when you acknowledge that those statements are in defense of a system of morality that is against those things you mentioned, you look like an a-hole.

I literally quoted your entire post in my post. How the f did I take it out of context? Your entire two-sentence "argument" committed the following fallacies:

1. Appeal to minority
Whether or not my objection is a minority view is completely irrelevant to whether or not it is sound.

2. Bare assertion fallacy
You literally declare your own opinion baldly: "I think this system of morality is the best for the majority of people." I think we should start selling baby foetuses as fertiliser; does that make my argument a good one?

So no, your argument is a load of crap; you didn't bring up context or other arguments when you made your piece-of-crap argument. I responded to exactly what you wrote. To try and switch things around is both dishonest and disingenuous.

Utilitarianism takes the general human want for people to be happy, and attempts to measure out actions in accordance with meeting that want as well as possible. Now, you mentioned, "What if I want there to be as much pain as possible?" Obviously this would not be compatible with Utilitarianism; however, because most people do not think the way you do (in other words, not as a-holes), Utilitarianism will meet the wants of most people.

The OP is about the "best moral system", thus EVEN IF the majority of people's wants are met by utilitarianism (even that I doubt), it doesn't demonstrate what the OP is claiming. You need to compare other ethical systems to do that. For the sake of argument, if you want to maximise the wants of people, then collectivist egoism is going to be a much better system than utilitarianism, since it by definition regards self-interest (and thus the wants), and not well-being (although it will entail it for most people, as that would be a self-interest).

And yes, meeting the wants of most people is what moral justifications for slavery did, HOWEVER, the wants that moral justifications for slavery meet and the wants that Utilitarianism meet are different, in that the former is generally held to be bad, while the latter is generally held to be good.

They weren't at the time of slavery; you are just asserting your subjective opinion on the matter. It doesn't change the fact that your crap two-sentence argument was equally applied then, and validly so. We hold it to be generally bad now, but that's exactly the point I am making. Your crap argument entails absurd conclusions.
MyDinosaurHands
Posts: 203
1/22/2015 10:08:56 AM
Posted: 1 year ago
At 1/22/2015 8:37:26 AM, R0b1Billion wrote:
At 1/22/2015 8:31:56 AM, MyDinosaurHands wrote:
At 1/20/2015 1:52:32 PM, R0b1Billion wrote:
Utilitarianism has several huge pitfalls. First, it's too complex. If morality is to be useful, it must be simple enough to be used. Putting aside for the moment the fatal attribute that utility is impossible to measure, how does one meaningfully and succinctly apply such ideals to every-day actions?

Well in situations where you cannot be sure of an act's utility,

Any situation...
True, but you'll hardly deny that there are situations where the odds are strongly in favor of one kind of outcome. In these situations, it would be better to bet on the outcome with high odds of good utility than to not. You're ensuring the best possible outcomes as often as possible by doing this.
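A minimal sketch of that betting idea, with invented numbers (nothing here is measured): choosing by expected utility just means picking the act whose probability-weighted payoff is highest, and over many repeated choices that policy accumulates more utility than ignoring the odds.

import random

# All payoffs and probabilities are hypothetical, for illustration only.
def expected_value(p_good, u_good, u_bad):
    # probability-weighted payoff of one act
    return p_good * u_good + (1 - p_good) * u_bad

act_a = dict(p_good=0.9, u_good=10, u_bad=-2)  # high odds of a modest good
act_b = dict(p_good=0.1, u_good=30, u_bad=-2)  # long shot at a bigger good

print(expected_value(**act_a))  # 8.8
print(expected_value(**act_b))  # 1.2 -> "bet on the high odds" picks act A

# Repeating the choice many times, the higher-expected-value policy wins overall.
random.seed(0)
def realised(act):
    return act["u_good"] if random.random() < act["p_good"] else act["u_bad"]

total_a = sum(realised(act_a) for _ in range(10_000))
total_b = sum(realised(act_b) for _ in range(10_000))
print(total_a > total_b)  # True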
the act could be deferred to other kinds of morality.

That's like saying that, if I can't figure out a phenomenon with science, I should defer to other kinds of science.
Yeah.. why not? If I can't solve something from a physiological perspective, why would it be wrong of me to look at it from a psychological perspective?
Either you are using correct science or morality, or else you are not.
I guess that's where we're going to differ. I don't think morality is objective.
Just because it cannot be easily applied in all situations does not mean we should instantly discard it.

Yes, it does. We should replace it with a theory that can be easily applied in all situations.
Just because a solution is simple does not mean it is the best. An eye for an eye is simple, but it is often not the best solution, as it only furthers animosity and other attitudes that lead to more conflict.
That's like saying you shouldn't discard an automobile that only runs when the temperature is about 70 degrees; why not just get a car that always runs?
No, what I am proposing is like using different tools for different situations, each one best suited for its situation. What you're saying is we try to get some kind of fix-all tool. The problem with that though is that the tool, having focused on range, cannot focus on complex skill.
Guess what I used to type this..

Careful! Don't laugh too hard.
n7
Posts: 1,358
1/22/2015 1:06:19 PM
Posted: 1 year ago
Here's one problem I've found with Utilitarianism.

It would hold that one death is preferable to 100 mutilations. 100 mutilations would cause more harm than one death. It seems that one big harm is better than many, many smaller harms. This is where utilitarianism runs into problems. What if letting one man suffer for a day will stop 5 billion migraines? A migraine seems minor compared to that suffering. When dealing with a lot of small injuries, utilitarianism produces disproportionate results.
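As a minimal sketch of that aggregation worry, with purely invented badness values, a simple additive tally lets enough tiny harms outweigh any single large one:

# Arbitrary units, chosen only to show the arithmetic of additive aggregation.
harm_of_day_of_suffering = 1_000_000   # one man suffering for a day
harm_of_migraine = 1                   # one migraine
migraines_averted = 5_000_000_000

total_migraine_harm = harm_of_migraine * migraines_averted
print(total_migraine_harm > harm_of_day_of_suffering)
# True: whatever finite value the single large harm is given, a big enough
# pile of tiny harms will outweigh it on a purely additive account.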
404 coherent debate topic not found. Please restart the debate with clear resolution.


Uphold Marxist-Leninist-Maoist-Sargonist-n7ism.
Sidewalker
Posts: 3,713
1/22/2015 3:08:51 PM
Posted: 1 year ago
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

The problem with Utilitarianism is that it is inconsistent with the moral experience and it is impractical.

When we are confronted with a moral situation, what occurs is not a process of calculation and decision; it is a matter of discernment and response. Morality operates as if it is a matter of apprehending a moral dimension of reality; it involves more than our rational faculties. Even if it were a simple matter of utilitarian logic, it isn't really practical to even attempt to calculate the greatest good every time we are faced with a situation that requires a moral response.
"It is one of the commonest of mistakes to consider that the limit of our power of perception is also the limit of all there is to perceive." " C. W. Leadbeater
MyDinosaurHands
Posts: 203
1/22/2015 3:57:35 PM
Posted: 1 year ago
At 1/22/2015 1:06:19 PM, n7 wrote:

When dealing with a lot of small injuries, utilitarianism produces disproportionate results.
I have to disagree. You don't lose proportion because the injuries are numerous. Like you said, 100 mutilations versus 1 death. A death is how many times worse than mutilation? Obviously this question doesn't have a very solid answer, but assuming there is one, or assuming we get to a number so high that we are fairly certain of the outcome, we can see if killing the one person would produce good results.
Guess what I used to type this..

Careful! Don't laugh too hard.
n7
Posts: 1,358
1/22/2015 5:01:30 PM
Posted: 1 year ago
At 1/22/2015 3:57:35 PM, MyDinosaurHands wrote:
At 1/22/2015 1:06:19 PM, n7 wrote:

When dealing with a lot of small injuries, utilitarianism produces disproportionate results.
I have to disagree. You don't lose proportion because the injuries are numerous. Like you said, 100 mutilations versus 1 death. A death is how many times worse than mutilation

Well, one death would cause hardship for a certain group of people and remove one life. 100 mutilations cause hardship for hundreds of people plus their families and remove quality of life for hundreds. It at the very least shows a hard-to-answer problem for utilitarianism.
404 coherent debate topic not found. Please restart the debate with clear resolution.


Uphold Marxist-Leninist-Maoist-Sargonist-n7ism.
R0b1Billion
Posts: 3,726
1/23/2015 7:56:09 PM
Posted: 1 year ago
At 1/22/2015 10:08:56 AM, MyDinosaurHands wrote:

Well in situations where you cannot be sure of an act's utility,

Any situation...
True, but you'll hardly deny that there are situations where the odds are strongly in favor of one kind of outcome. In these situations, it would be better to bet on the outcome with high odds of good utility than to not. You're ensuring the best possible outcomes as often as possible by doing this.

Well in such situations, where the choice is rather obvious, we're not actively thinking about ethics, are we? If my friend is drowning, I'm not going to be confused about what to do and require a theory to guide me. Ethics is most interesting in situations where we are presented with a challenge. There are (usually) two courses of action and asking different people would yield different results. There are probably two main ways of handling the situation: 1) think consequentially, and pick the course of action which yields the best ends or 2) think deontologically, and look at a principle to adhere to (means). If your principles are strong, the deontological thinker is going to identify one course of action which is selfish and one which is selfless (compare this to the utilitarian, which must compare values of results and can make endless mistakes). The former will seem good in some intellectual sense; it will appear to yield ends we desire, if we can handle questionable means. The latter will require discipline; we might not see how the ends are desirable right away. But without fail, the latter choice will, in the end, provide us with something to be proud of and help us avoid guilt. However, if one believes they are a know-it-all (like utilitarians usually are), they are going to insist they can comprehend the ends fine and don't need principles to work by. Such thinking also has the very convenient effect of allowing cognitive dissonance to settle in (i.e., allowing us to minimize things we don't wish to see and justify results we don't wish to take seriously).

the act could be deferred to other kinds of morality.

That's like saying that, if I can't figure out a phenomenon with science, I should defer to other kinds of science.
Yeah.. why not? If I can't solve something from a physiological perspective, why would it be wrong of me to look at it from a psychological perspective?

My analogy was worded terribly. You are either using morality, or you are not. You are either using the scientific method, or you are not. I didn't mean to say different kinds of science, I meant to say different kinds of reasoning other than science.

Either you are using correct science or morality, or else you are not.
I guess that's where we're going to differ. I don't think morality is objective.

Are murder and rape wrong?

Just because it cannot be easily applied in all situations does not mean we should instantly discard it.

Yes, it does. We should replace it with a theory that can be easily applied in all situations.
Just because a solution is simple does not mean it is the best. An eye for an eye is simple, but it is often not the best solution, as it only furthers animosity and other attitudes that lead to more conflict.

An eye for an eye can be worded simply, but it is far from simple if you break it down logically. Why, for example, would hurting somebody produce me good results? If I am hurt by somebody, what is the logic that brings me to conclude "therefore, I should hurt them back"? Compare that to my personal principle, the principle Jesus proposes, which is to forgive those who trespass against you. There are logical reasons that hurting others is not a good thing, and it is simple to show.

That's like saying you shouldn't discard an automobile that only runs when the temperature is about 70 degrees; why not just get a car that always runs?
No, what I am proposing is like using different tools for different situations, each one best suited for its situation. What you're saying is we try to get some kind of fix-all tool. The problem with that though is that the tool, having focused on range, cannot focus on complex skill.

My tool works for everything, and it works better in any and all situations than a thousand utilitarian tools. If you want to test our tools on hypothetical scenarios I would love to prove it to you. I'll even let you construct the scenarios to suit your tools perfectly if you'd like...
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
Amoranemix
Posts: 521
1/24/2015 5:16:48 AM
Posted: 1 year ago
R0b1billion 6
Utilitarianism has several huge pitfalls. First, it's too complex. If morality is to be useful, it must be simple enough to be used. Putting aside for the moment the fatal attribute that utility is impossible to measure, how does one meaningfully and succinctly apply such ideals to every-day actions? How does one, furthermore, pass this wisdom on to others in a form digestible-enough to be useful? A utilitarian is forever unsure about any and all actions, inept at creating any sort of principle that may guide them for future reference. Morality to the consequentialist is as elusive and complex as quantum mechanics is to the statistician. All knowledge is a posteriori.
Utilitarianism isn't a method, but a goal or measure for success. That the most good for the most people cannot be achieved perfectly is not good reason not to strive for it.
Raising a child is very complicated. Does that mean a parent shouldn't try ?
Eliminating poverty from the world is very complicated. Does that mean we shouldn't try ?
Preserving all species from the earth's ecosystem is very complicated. Does that mean we shouldn't try ?
Running a country is very complicated. Does that mean we shouldn't try ?
I think almost everyone agrees the answer is a resounding no every time. We may not be able to achieve the most good for most people, but at least we will achieve more good for more people by trying.

R0b1billion 6
Also, how do we know if we are correctly viewing the outcomes of any act? If I beat my dog, and assign a value of +1 towards the quality of "increasing discipline of dog," how can I know that I'm not making bogus assumptions? Beating a dog may increase its discipline, or it may not. A consequentialist is impotent in every scenario because they are constantly trying to use their intellect to determine value.[1] Morality is, at its core, a recognition that we tend naturally towards vice (e.g., selfishness, self-superiority, indulgence) and our values of utility cannot cope with such inherent biases. Morality should be aimed at eliminating bias, not trying to incorporate bias into the moral process.[2]
[1] Animals don't work that way. They are able to take reasonably accurate decisions with insufficient information. That's what they have evolved for. Humans have developed as social animals by millions of years of evolution and have years of training while growing up. That allows them to gauge in most circumstances what the morally right thing to do is. Following a simpler moral standard that ignores context may be easier and thus behaviour will be closer to the ideal of that standard, but will it also be better ?
[2] You seem to be saying that morality should aim at valuing the interests of others as much as one's own interests, which happens to be what utilitarianism promotes. Where do you get that values of utility cannot cope with inherent biases ?

R0b1billion 6
Morality should be based on principles which are simple and useful, as well as objective from bias. The Categorical Imperative is a good place to start, and works as a good test of our principles as we create them even though it isn't very prescriptive in its own right. The principles I have noticed work every time are the Seven Deadly Sins. If you are taking an action, ask yourself if the MAIN impetus behind your action is one of the following (if there is a logical reason that supports it independent of these intentions, it is justified):
Of course it is useful to work out how one could achieve the most good for the most people and set up intermediate goals and guidelines to make decisions. In a chess game the goal is to checkmate your opponent. Yet chess players don't think about that all the time. They consider many aspects of the position that are relevant to checkmating their opponent (and not getting checkmated themselves), like material, space and pawn structure. All that doesn't prevent the ultimate goal being to checkmate your opponent.
The categorical imperative is a rule that may be useful to achieve more good, but it isn't always practical either. Your seven sins may also serve that purpose. What is the goal of those rules if not to achieve the most good for the most people ?

Sidewalker 16
The problem with Utilitarianism is that it is inconsistent with the moral experience and it is impractical.

When we are confronted with a moral situation, what occurs is not a process of calculation and decision; it is a matter of discernment and response. Morality operates as if it is a matter of apprehending a moral dimension of reality; it involves more than our rational faculties. Even if it were a simple matter of utilitarian logic, it isn't really practical to even attempt to calculate the greatest good every time we are faced with a situation that requires a moral response.
See my response to R0b1Billion's second paragraph.

n7 17
Well, one death would cause hardship for a certain group of people and remove one life. 100 mutilations cause hardship for hundreds of people plus their families and remove quality of life for hundreds. It at the very least shows a hard-to-answer problem for utilitarianism.
Indeed, the world is complicated, but as I said, that doesn't imply we shouldn't try to deal with it. Legislation is very complicated and yet that is the set of rules democracies rely on to guide the behaviour of their citizens. Besides, what is bad about complexity ?

R0b1billion 18
Well in such situations, where the choice is rather obvious, we're not actively thinking about ethics, are we? If my friend is drowning, I'm not going to be confused about what to do and require a theory to guide me.
The best course of action, probably saving your friend, would be in accordance with utilitarianism. Whether you know that and how you decide for that action are different issues.

R0b1billion 18
My analogy was worded terribly. You are either using morality, or you are not. You are either using the scientific method, or you are not. I didn't mean to say different kinds of science, I meant to say different kinds of reasoning other than science.
You can more or less divide the sciences into fundamental science and practical science. Fundamental sciences are more accurate, but also more complicated and therefore harder to apply. Describing a lion with quantum mechanics wouldn't work in practice because it would be too complicated. Biology works better for that. To describe the motions of planets it is usually not worth the trouble to use the general theory of relativity.
So if we use science as an analogy for morality, utilitarianism could be the most accurate science, while other moralities would be simplifications that can be applied in practice or even pseudoscience.

R0b1billion 18
An eye for an eye can be worded simply, but it is far from simple if you break it down logically. Why, for example, would hurting somebody produce me good results? If I am hurt by somebody, what is the logic that brings me to conclude "therefore, I should hurt them back"? Compare that to my personal principle, the principle Jesus proposes, which is to forgive those who trespass against you. There are logical reasons that hurting others is not a good thing, and it is simple to show.
Forgiving others seems more benevolent than utilitarianism would recommend, but by doing so you may hurt yourself as people could exploit you.
The earth does not belong to man; man belongs to the earth.
Graincruncher
Posts: 2,799
1/24/2015 5:53:06 AM
Posted: 1 year ago
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Yep - it isn't actually a moral system, rather a description of what it is to be a moral system. It borders on tautology: better is better.
R0b1Billion
Posts: 3,726
1/24/2015 11:31:33 AM
Posted: 1 year ago
At 1/24/2015 5:16:48 AM, Amoranemix wrote:

Utilitarianism isn't a method, but a goal or measure for success. That the most good for the most people cannot be achieved perfectly is not good reason not to strive for it.

If it's not a method, that means it doesn't tell you what to do. It's worthless.

Raising a child is very complicated. Does that mean a parent shouldn't try ?
Eliminating poverty from the world is very complicated. Does that mean we shouldn't try ?
Preserving all species from the earth's ecosystem is very complicated. Does that mean we shouldn't try ?
Running a country is very complicated. Does that mean we shouldn't try ?
I think almost everyone agrees the answer is a resounding no every time. We may not be able to achieve the most good for most people, but at least we will achieve more good for more people by trying.

So let me get this straight. There exist complicated goals in the world; therefore we should remove any efforts to be succinct in all endeavors? Maybe instead of red, yellow, and green lights at intersections we can institute 1,000 different shades of colors which all mean different things. Because, you know, raising children and running a country are complex, so all our rules should be too.

Also, how do we know if we are correctly viewing the outcomes of any act? If I beat my dog, and assign a value of +1 towards the quality of "increasing discipline of dog," how can I know that I'm not making bogus assumptions? Beating a dog may increase its discipline, or it may not. A consequentialist is impotent in every scenario because they are constantly trying to use their intellect to determine value.[1] Morality is, at its core, a recognition that we tend naturally towards vice (e.g., selfishness, self-superiority, indulgence) and our values of utility cannot cope with such inherent biases. Morality should be aimed at eliminating bias, not trying to incorporate bias into the moral process.
[1] Animals don't work that way. They are able to take reasonably accurate decisions with insufficient information. That's what they have evolved for. Humans have developed as social animals by millions of years of evolution and have years of training while growing up. That allows them to gauge in most circumstances what the morally right thing to do is. Following a simpler moral standard that ignores context may be easier and thus behaviour will be closer to the ideal of that standard, but will it also be better ?

Yes.

[2] You seem to be saying that morality should aim at valuing the interests of others as much as one's own interests, which happens to be what utilitarianism promotes. Where do you get that values of utility cannot cope with inherent biases ?

Utilitarianism promotes whatever you want it to promote. Evil, money, astrology, etc.

Of course it is useful to work out how one could achieve the most good for the most people and set up intermediate goals and guidelines to make decisions. In a chess game the goal is to checkmate your opponent. Yet chess players don't think about that all the time. They consider many aspects of the position that are relevant to checkmating their opponent (and not getting checkmated themselves), like material, space and pawn structure. All that doesn't prevent the ultimate goal being to checkmate your opponent.

Chess is inherently the opposite of morality. In chess, your only goal is your opponent's demise...

R0b1billion 18
Well in such situations, where the choice is rather obvious, we're not actively thinking about ethics, are we? If my friend is drowning, I'm not going to be confused about what to do and require a theory to guide me.
The best course of action, probably saving your friend, would be in accordance with utilitarianism. Whether you know that and how you decide for that action are different issues.

No utilitarianism doesn't necessarily dictate that. What if the actual best possibility is that you leave immediately, uncover buried treasure that is a mile away, sell it and invest the money in some little-known medical technology, and then that technology is used to save many people? Let's further assume that if you don't discover the treasure immediately, somebody else will and use that money to hurt people (i.e., you don't have time to save your friend first).

R0b1billion 18
My analogy was worded terribly. You are either using morality, or you are not. You are either using the scientific method, or you are not. I didn't mean to say different kinds of science, I meant to say different kinds of reasoning other than science.
You can more or less divide the sciences into fundamental science and practical science. Fundamental sciences are more accurate, but also more complicated and therefore harder to apply. Describing a lion with quantum mechanics wouldn't work in practice because it would be too complicated. Biology works better for that. To describe the motions of planets it is usually not worth the trouble to use the general theory of relativity.
So if we use science as an analogy for morality, utilitarianism could be the most accurate science, while other moralities would be simplifications that can be applied in practice or even pseudoscience.

No, utilitarianism would be a complex, inaccurate form of science and my version would be a simple, accurate form.

R0b1billion 18
An eye for an eye can be worded simply, but it is far from simple if you break it down logically. Why, for example, would hurting somebody produce me good results? If I am hurt by somebody, what is the logic that brings me to conclude "therefore, I should hurt them back"? Compare that to my personal principle, the principle Jesus proposes, which is to forgive those who trespass against you. There are logical reasons that hurting others is not a good thing, and it is simple to show.
Forgiving others seems more benevolent than utilitarianism would recommend, but by doing so you may hurt yourself as people could exploit you.

Exploit me for what? I don't have much money, I live simply and what I have I spend on food or give to charity. Only those who are not living morally are able to be exploited (i.e., the wealthy).
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
Juan_Pablo
Posts: 2,052
1/24/2015 11:48:25 AM
Posted: 1 year ago
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Those of us who want peace and well-being for good citizens would then be required to place you in prison. So there is a place for your perspective in our civilization: it's behind prison walls.

Now, hopefully that experience will transform to rethink this philosophical approach in life. If not however, then prison will have to be your permanent dwelling place.
Juan_Pablo
Posts: 2,052
1/24/2015 11:49:59 AM
Posted: 1 year ago
At 1/24/2015 11:48:25 AM, Juan_Pablo wrote:
At 1/15/2015 6:11:04 PM, Envisage wrote:
At 1/15/2015 6:06:56 PM, darthebearnc wrote:
The best moral system. Any objections?

Why should I care about well-being? Perhaps I like being in pain and causing pain to others. It would be a pretty useless moral system to me then.

Those of us who want peace and well-being for good citizens would then be required to place you in prison. So there is a place for your perspective in our civilization: it's behind prison walls.

Now, hopefully that experience will transform to rethink this philosophical approach in life. If not however, then prison will have to be your permanent dwelling place.


Correction:

transform = transform you

If not however = If not, however,
Juan_Pablo
Posts: 2,052
1/24/2015 12:25:28 PM
Posted: 1 year ago
Now, I also want to be mindful of the small percentage of people in our society who are wrongly placed in prison: who are convicted for crimes they did not commit. For this reason I do believe it's useful for the justice to re-examine cases to let these people go free.

Also, it's absolutely critical that people in prison are treated with respect, provided with human rights, and are allowed to participate in rehabilitative therapy and processes. The goal of any prison should be to incarcerate the convicted, but also to rehabilitate them back into our communities.

If a prisoner is provided with a life sentence, or is placed on death row (to be executed), he must still be treated with dignity and provided with human rights up until the very moment that he is executed humanely.
Juan_Pablo
Posts: 2,052
1/24/2015 12:26:19 PM
Posted: 1 year ago
At 1/24/2015 12:25:28 PM, Juan_Pablo wrote:
Now, I also want to be mindful of the small percentage of people in our society who are wrongly placed in prison: who are convicted for crimes they did not commit. For this reason I do believe it's useful for the justice to re-examine cases to let these people go free.

Also, it's absolutely critical that people in prison are treated with respect, provided with human rights, and are allowed to participate in rehabilitative therapy and processes. The goal of any prison should be to incarcerate the convicted, but also to rehabilitate them back into our communities.

If a prisoner is provided with a life sentence, or is placed on death row (to be executed), he must still be treated with dignity and provided with human rights up until the very moment that he is executed humanely.


Correction:

justice = justice system
Amoranemix
Posts: 521
1/25/2015 5:14:53 AM
Posted: 1 year ago
- Amoranemix 19
Utilitarianism isn't a method, but a goal or measure for success. That the most good for the most people cannot be achieved perfectly is not good reason not to strive for it.
- R0b1Billion 21
If it's not a method, that means it doesn't tell you what to do. It's worthless.
That does not follow. Utilitarianism tells you what to strive for.

R0b1Billion 21
So let me get this straight. There exist complicated goals in the world; therefore we should remove any efforts to be succinct in all endeavors?[3] Maybe instead of red, yellow, and green lights at intersections we can institute 1,000 different shades of colors which all mean different things.[4] Because, you know, raising children and running a country are complex, so all our rules should be too.
[3] I already answered that with no.
[4] That does not follow. That some goals are complex is no good reason to add complexity to the rules for unrelated goals.

- Amoranemix 19
[1] Animals don't work that way. They are able to take reasonably accurate decisions with insufficient information. That's what they have evolved for. Humans have developed as social animals by millions of years of evolution and have years of training while growing up. That allows them to gauge in most circumstances what the morally right thing to do is. Following a simpler moral standard that ignores context may be easier and thus behaviour will be closer to the ideal of that standard, but will it also be better ?
- R0b1Billion 21
Yes.
Prove it.

- Amoranemix 19
[2] You seem to be saying that morality should aim at valuing the interests of others as much as one's own interests, which happens to be what utilitarianism promotes. Where do you get that values of utility cannot cope with inherent biases ?
- R0b1Billion 21
Utilitarianism promotes whatever you want it to promote. Evil, money, astrology, etc.
No. Utilitarianism promotes well-being.

- R0b1Billion 21
Chess is inherently the opposite of morality. In chess, your only goal is your opponent's demise...
Which is completely beside the point.

- Amoranemix 19
The categorical imperative is a rule that may be useful to achieve more good, but it isn't always practical either. Your seven sins may also serve that purpose. What is the goal of those rules if not to achieve the most good for the most people ?
- R0b1Billion 21
[no response]
You forgot to answer my question.

- Amoranemix 19
The best course of action, probably saving your friend, would be in accordance with utilitarianism. Whether you know that and how you decide for that action are different issues.
- R0b1Billion 21
No utilitarianism doesn't necessarily dictate that.[5] What if the actual best possibility is that you leave immediately, uncover buried treasure that is a mile away, sell it and invest the money in some little-known medical technology, and then that technology is used to save many people? Let's further assume that if you don't discover the treasure immediately, somebody else will and use that money to hurt people (i.e., you don't have time to save your friend first).[6]
[5] Like you, I develop my own moral philosophy rather than just adhering to an existing one. Utilitarianism/consequentialism matches mine most closely. The version of utilitarianism that you seem to purposefully corrupt is not one I adhere to.
[6] You are describing extraordinary and seemingly unrealistic circumstances, but if going for the treasure is the best thing to do, then that is what you should do.

R0b1Billion 21
No, utilitarianism would be a complex, inaccurate form of science and my version would be a simple, accurate form.
How is utilitarianism inaccurate and how is your version accurate ?

- Amoranemix 19
Forgiving others seems more benevolent than utilitarianism would recommend, but by doing so you may hurt yourself as people could exploit you.
- R0b1Billion 21
Exploit me for what? I don't have much money, I live simply and what I have I spend on food or give to charity. Only those who are not living morally are able to be exploited (i.e., the wealthy).
I am not arguing that it would be best for the world that you personally adhere to utilitarianism as I don't know you and you may be an exception. My remark however applied to most people.
To your claim about the wealthy : utilitarianism promotes the distribution of wealth.
The earth does not belong to man; man belongs to the earth.
R0b1Billion
Posts: 3,726
1/25/2015 1:36:55 PM
Posted: 1 year ago
At 1/25/2015 5:14:53 AM, Amoranemix wrote:

Before we continue, I have to say that I see you are a new member and I want to introduce you to some forum etiquette. I'm not trying to be difficult, I am saying this because I am having trouble communicating with you and I want to help you better use the site going forward. There are four main issues I see with your posts so far:

1) You're deleting the top line of the reply. The top line, ": At x/x/x x:x:x AM, Rob1billion wrote:", is important to keep because it alerts the person you're arguing with that there is a reply waiting for them. Since you are not leaving it, I don't get notified when you reply and this site is way too big to go scouring through looking for responses. I understand you want to delete the unnecessary thread of replies for character limit concerns, but make sure you leave the very top line.

2) You're combining replies between different people. I doubt Sidewalker has any idea you rebutted him in your second-to-last post because you included it in your response to me. Reply to each user separately, again leaving the very top line, so that they know they've been rebutted. Users commonly skip rebuts to other users because of the sheer number of replies in each thread.

3) Instead of inserting numbers into my responses and then numbering your responses to correspond to them, you should interrupt my response with the "enter" key and then re-insert a colon before the rest of it. So, for example, let's say you took exception with my point, which ends right here ->. You should put the cursor at the arrow, hit enter twice, then go to the beginning of the next line and insert one colon before the new paragraph you created (i.e., the other half of my response that you just split up). I will know my responses because mine are all the ones with one colon before them (you don't have to keep re-writing my name). And you won't need to number them because you've created a new paragraph and are simply replying to the whole thing. If this is confusing, look at the way I and others use the site. Believe me, it's much easier for you and especially the people you are communicating with!

4) Be courteous to other users by doing the leg-work for them, as opposed to expecting them to connect the dots and research your points. If you rebut them, explain your point completely so they don't have to guess what you mean. In one of your responses, you simply posted "Miller-Urey Experiment." Instead of making me look the experiment up and try and figure out what you mean, include an explanation of why exactly this experiment rebuts my point. When I look the experiment up myself, I should be looking to confirm your reasoning, not completely deduce your reasoning from scratch. Remember, the reason we disagree is because we have a different perspective on the issue - don't assume we know the issue from your POV.
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
R0b1Billion
Posts: 3,726
1/25/2015 1:58:00 PM
Posted: 1 year ago
At 1/25/2015 5:14:53 AM, Amoranemix wrote:

- Amoranemix 19
Utilitarianism isn't a method, but a goal or measure for success. That the most good for the most people cannot be achieved perfectly is not good reason not to strive for it.
- R0b1Billion 21
If it's not a method, that means it doesn't tell you what to do. It's worthless.
That does not follow. Utilitarianism tells you what to strive for.

Please be more specific as to why my logic "does not follow." Explain how utilitarianism "tells you what to strive for."

R0b1Billion 21
So let me get this straight. There exist complicated goals in the world; therefore we should remove any efforts to be succinct in all endeavors?[3] Maybe instead of red, yellow, and green lights at intersections we can institute 1,000 different shades of colors which all mean different things.[4] Because, you know, raising children and running a country are complex, so all our rules should be too.
[3] I already answered that with no.

I'm simply taking your logic to its insinuated ends. I said my system was simple, and therefore useful, and you brought up a bunch of complex life scenarios and exclaimed that, since they exist, we shouldn't feel bad about using complex systems of analysis. If a simple one works (e.g., only three lights), why make it more complex?

[4] That does not follow. That some goals are complex is no good reason to add complexity to the rules for unrelated goals.

Of course it doesn't follow; it is an analog to your own logic!
Following a simpler moral standard that ignores context may be easier and thus behaviour will be closer to the ideal of that standard, but will it also be better ?
- R0b1Billion 21
Yes.
Prove it.

OK so this guy got me really mad because he disagreed with me about morality on DDO. I got angry, and I want to find him and murder him because I'm angry. Let's explore my system's methodology versus utilitarianism.

My system: anger, being one of the seven cardinal vices of a conscious being, is an intention that should never form the basis of an action. Therefore I should not commit the murder.

Utilitarianism: we must weigh the utility of killing the offensive member versus letting him live. How one proceeds from here has infinite possibilities, as we are going to weigh all the ends of my choices as far out as we can to determine which one, in the end, yields the best consequences. Perhaps my opinion of the offensive member is so low that I perceive a grave disutility from them being alive, and I can justify the world being a better place with them dead. I could write an infinite number of scenarios from here on in about all the judgments I can make and the perception of results based on those judgments that would occur. And all the while, my anger fuels the analysis without being identified... heck, perhaps even my anger being sated is perceived as a positive.

- Amoranemix 19
[2] You seem to be saying that morality should aim at valuing the interests of others as much as one's own interests, which happens to be what utilitarianism promotes. Where do you get that values of utility cannot cope with inherent biases ?
- R0b1Billion 21
Utilitarianism promotes whatever you want it to promote. Evil, money, astrology, etc.
No. Utilitarianism promotes well-being.

That is the theoretical goal, but the problem lies in how one arrives there in practice. Murdering an offensive DDO member perhaps creates a net-positive "well-being" in the world.

- R0b1Billion 21
Chess is inherently the opposite of morality. In chess, your only goal is your opponent's demise...
Which is completely beside the point.

I can't follow the thread back to what the original point was because you deleted it. I believe you were drawing a parallel between chess and morality... I would seem to have rebutted that quite nicely.

- Amoranemix 19
The categorical imperative is a rule that may be useful to achieve more good, but it isn't always practical either. Your seven sins may also serve that purpose. What is the goal of those rules if not to achieve the most good for the most people ?
- R0b1Billion 21
[no response]
You forgot to answer my question.

Well, conscious, sentient beings like ourselves necessarily have vice. We are selfish. Our selfishness is the basis of immorality. Animals, while self-interested, occupy a perfect equilibrium with the environment because they are not artificially-capable, as we are with our high intelligences. The "goal" of my rules is to eliminate vice from the decision-making process, thus leaving only pure, unadulterated reason. Can this be achieved 100%? Probably not, but one can achieve a very high % if they are extremely enlightened and disciplined.

- Amoranemix 19
The best course of action, probably saving your friend, would be in accordance with utilitarianism. Whether you know that and how you decide for that action are different issues.
- R0b1Billion 21
No utilitarianism doesn't necessarily dictate that.[5] What if the actual best possibility is that you leave immediately, uncover buried treasure that is a mile away, sell it and invest the money in some little-known medical technology, and then that technology is used to save many people? Let's further assume that if you don't discover the treasure immediately, somebody else will and use that money to hurt people (i.e., you don't have time to save your friend first).[6]
[5] Like you, I develop my own moral philosophy rather than just adhering to an existing one. Utilitarianism/consequentialism matches mine most closely. The version of utilitarianism that you seem to purposefully corrupt is not one I adhere to.

I'm unclear as to exactly how your moral system is useful at all. Does it work in everyday life? You are dating a girl. You're not sure if you love her anymore. You meet another girl, and you feel like you may be in love (or is it passion of the moment?). Describe how your moral system guides you here.

[6] You are describing extraordinary and seemingly unrealistic circumstances, but if going for the treasure is the best thing to do, then that is what you should do.

But the point is that there's always going to be a hypothetical better scenario. How can you ever rest? How can you ever know you're doing the "right" thing? Is it "immoral" for you right now not to devise some elaborate scheme to create a spike of positive utility in your community? If you can never know if you're doing the best thing, how can you establish right from wrong?

R0b1Billion 21
No, utilitarianism would be a complex, inaccurate form of science and my version would be a simple, accurate form.
How is utilitarianism inaccurate and how is your version accurate ?

My system is based on either acting on vice or not. Your system requires establishing values of utility... very unwieldy indeed!

- Amoranemix 19
Forgiving others seems more benevolent than utilitarianism would recommend, but by doing so you may hurt yourself as people could exploit you.
- R0b1Billion 21
Exploit me for what? I don't have much money, I live simply and what I have I spend on food or give to charity. Only those who are not living morally are able to be exploited (i.e., the wealthy).
I am not arguing that it would be best for the world that you personally adhere to utilitarianism, as I don't know you and you may be an exception. My remark, however, applied to most people.

It's troublesome that your system would include the caveat that it is useful for some unknown % of the population.

To your claim about the wealthy : utilitarianism promotes the distribution of wealth.

To you, maybe, but to somebody else - probably not!
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
R0b1Billion
Posts: 3,726
1/26/2015 12:07:17 AM
Posted: 1 year ago
At 1/25/2015 5:14:53 AM, Amoranemix wrote:

Excuse me, I got you confused with another poster about the experiment I referenced. You're probably wondering where I pulled that out of!
Beliefs in a nutshell:
- The Ends never justify the Means.
- Objectivity is secondary to subjectivity.
- The War on Drugs is the worst policy in the U.S.
- Most people worship technology as a religion.
- Computers will never become sentient.
Amoranemix
Posts: 521
1/27/2015 4:09:03 AM
Posted: 1 year ago
At 1/25/2015 1:36:55 PM, R0b1Billion wrote:
1) You're deleting the top line of the reply. The top line, ": At x/x/x x:x:x AM, Rob1billion wrote:", is important to keep because it alerts the person you're arguing with that there is a reply waiting for them. Since you are not leaving it, I don't get notified when you reply and this site is way too big to go scouring through looking for responses. I understand you want to delete the unnecessary thread of replies for character limit concerns, but make sure you leave the very top line.
I don't delete anything. I start with a blank canvas and add. I didn't know the forum's quote header triggers the alert. Thanks for the info; I'll try it out. It would allow me to alert several people with a single post. You should have gotten an alert this time.
I agree there is a problem with the forum's thread alert system, and for some reason they can't get it fixed, which is a reason to use the quote alert system instead.

- R0b1Billion 27
2) You're combining replies between different people. I doubt Sidewalker has any idea you rebutted him in your second-to-last post because you included it in your response to me. Reply to each user separately, again leaving the very top line, so that they know they've been rebutted. Users commonly skip rebuttals aimed at other users because of the sheer number of replies in each thread.
I agree most people lack the basic etiquette to read other posters' posts. Sidewalker is probably one of those people. Feel free to educate him. If people respected that basic etiquette, there would be fewer and higher-quality posts. I may try the trick of using several quote headers in a single post.

- R0b1Billion 27
3) Instead of inserting numbers into my responses and then numbering your responses to correspond to them, you should interrupt my response with the "enter" key and then re-insert a colon before the rest of it. So, for example, let's say you took exception to my point, which ends right here ->. You should put the cursor at the arrow, hit enter twice, then go to the beginning of the next line and insert one colon before the new paragraph you created (i.e., the other half of my response that you just split up). My responses will be easy to identify because mine are all the ones with one colon before them (you don't have to keep re-writing my name). And you won't need to number them because you've created a new paragraph and are simply replying to the whole thing. If this is confusing, look at the way I and others use the site. Believe me, it's much easier for you and especially for the people you are communicating with!
I find breaking up texts reduces their readability. It separates fragments that belong together and may even rob them of context. It also increases gaps, spreading texts out and thus requiring more scrolling. But I shall make an exception for you.

- Amoranemix 26
That does not follow. Utilitarianism tells you what to strive for.
- R0b1Billion 28
Please be more specific as to why my logic "does not follow." Explain how utilitarianism "tells you what to strive for."
You are shifting the burden of proof. It is up to you to show your conclusion (utilitarianism is worthless) follows from your premise. However, counterexamples that refute your deduction can easily be found.
The most good for the most people is a goal. A goal is something to strive for. For me the ultimate goal is to maximise the world's well-being.

- R0b1Billion 21
So let me get this straight. There exist complicated goals in the world, therefore we should abandon any effort to be succinct in all endeavors?[3] Maybe instead of red, yellow, and green lights at intersections we can institute 1,000 different shades of colors which all mean different things.[4] Because, you know, raising children and running a country are complex, so all our rules should be too.
- Amoranemix 26
[3] I already answered that with no.
- R0b1Billion 28
I'm simply taking your logic to its insinuated ends. I said my system was simple, and therefore useful, and you brought up a bunch of complex life scenarios and exclaimed that, since they exist, we shouldn't feel bad about using complex systems of analysis. If a simple one works (e.g., only three lights), why make it more complex?
You seem to misunderstand my analogy. You want to dismiss utilitarianism because it is complex. I gave counterexamples of complex goals that should not be dismissed and replaced with simpler goals. Is utilitarianism too complex to achieve perfectly ? Yes, it is. So are eliminating poverty, saving the ecosystem and raising children. Wouldn't it be great if those things were easy ? Yes, it would be. Unfortunately things are the way they are and we'll just have to manage the complexity.

- Amoranemix 26
[4] That does not follow. That some goals are complex is no good reason to add complexity to the rules for unrelated goals.
- R0b1Billion 28
Of course it doesn't follow; it is an analogue of your own logic!
My logic does not include non sequiturs, nor does it promote spreading the bad aspects of one situation to other situations. In addition, utilitarianism is against complexity. The simpler the rules, the less time (a resource) they require and the less likely mistakes are to occur.

- Amoranemix 19
Following a simpler moral standard that ignores context may be easier and thus behaviour will be closer to the ideal of that standard, but will it also be better ?
- R0b1Billion 21
Yes.
- Amoranemix 26
Prove it.
- R0b1Billion 28
OK, so this guy got me really mad because he disagreed with me about morality on DDO. I got angry, and I want to find him and murder him because I'm angry. Let's explore my system's methodology versus utilitarianism's.

My system: anger, being one of the seven cardinal vices of a conscious being, is an intention that should never form the basis of an action. Therefore I should not commit the murder.

Utilitarianism: we must weigh the utility of killing the offensive member versus letting him live. How one proceeds from here has infinite possibilities, as we are going to weigh all the ends of my choices as far out as we can to determine which one, in the end, yields the best consequences. Perhaps my opinion of the offensive member is so low that I perceive a grave disutility in him being alive, and I can justify the world being a better place with him dead. I could write an infinite number of scenarios from here on out about all the judgments I could make and the results I would perceive based on those judgments. And all the while, my anger fuels the analysis without being identified... heck, perhaps the sating of my anger is even perceived as a positive.
Again, you seem to be assuming that utilitarianism excludes simple rules. You can dismiss murder as bad with the rule : murder is wrong. Very simple. In fact, you also propose a rule to follow : negative emotions rarely lead to good deeds. So, in addition to murder being wrong, the motive is anger. That is not a good justification.
However, under utilitarianism, these rules are tricks to deal with the world's complexity. They are not absolute. There may be circumstances where murder is not wrong. If murder serves the greater good, then you should murder. In addition, in common circumstances (those that people evolved for and were raised for), people can handle complex rules intuitively. Of course people can make mistakes, just like with raising children, eliminating poverty and saving the ecosystem.
The earth does not belong to man; man belongs to the earth.