Liberals are winning the culture war, as shown by the Affordable Care Act and the spread of gay rights in America. The GOP has lost two straight presidential elections, and there are arguments that George W. Bush should never have been put in the White House in the first place. Elections and lawsuits keep delivering victories to the left.
Liberals are now winning the culture war in American politics. This is probably because they are seen as more tolerant than conservatives, and as the ones who can help when somebody is in financial trouble and needs the government's assistance to get back on track.
Liberal values have taken hold in the US today, in the social sense. The problem comes after people agree on the social issues: most people do not believe that liberal views of government are right, and the majority of Americans today see the government as the single biggest threat to American life.
I believe liberals are winning on some fronts in American politics, but they are certainly far from winning the battle or taking over. Universal health care is a major liberal ideal that has gained recognition, and a plan has been implemented. It's far from what we wanted, but it's a start. Legalizing marijuana has been a strong liberal ideal, and it is gaining traction. Gay marriage rights are also coming to pass.
No, liberals are not winning the culture war in American politics, because they underestimate how conservative the American public still is. One good example is the reelection of George W. Bush, even during wartime. When Defense of Marriage Acts are proposed, they pass. Americans remain a very conservative group of people.