"Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other." -- John Adams
Just because some people believe the United States is not a Christian nation doesn't make it so. The United States' founding documents demonstrate a basis in Judeo-Christian laws and principles.
The Declaration of Independence mentions God as the one we petition to guide our nation.
Just because this is 'modern-day America' does not mean things are better or more constitutional. And just because the Constitution does not specifically mention Christianity does not create a basis for a secular society.
"[...] endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness." (From the Declaration of Independence)
Yes, America is still considered by most to be a "Christian" nation, both numerically and in ideology. Although no single Christian denomination is professed by a true majority of the population, the sum of all those who claim a Christian religion is larger than the number of believers of any other non-Christian religion. Racially, the decrease in statistical dominance held by white people of Western European descent has only helped to increase the number of Christians in America, particularly Roman Catholics, due primarily to the correlation between Latin America and Catholicism.
Furthermore, common cultural narratives continue to reinforce the dominance of Christianity over all other religious beliefs, particularly among the more conservative groups and individuals.
The vast majority of Americans still consider themselves Christian, and therefore we should still be considered a Christian nation. This does not mean that the laws will be based on Christianity or that people will be forced to follow its doctrine. It is just a way to represent what most Americans believe.
People may claim that the majority of our Founding Fathers were Christian, but the truth is that most of them were deists; they left England during the Enlightenment, a period when deism was prominent. Even the Founding Fathers who claimed to be Christian had major deist tendencies in their beliefs. When "God" was invoked in the Pledge of Allegiance and other national documents, it referred to the deist god. And many of them were Freemasons, and in Freemasonry you did not have to believe in a Christian god; all you needed to believe in was a god that is everybody's god.
Today our nation is full of beliefs and disbeliefs that people have been arguing about forever. What if God is just an idea? Then why do people call us atheists dumb because of our beliefs? When I die, I can't say "I told you so," because I'll be dead. I can't tell you anything, because when you're dead there is nothing. It isn't even dark when you're dead; you can't see anything at all. Stop and think for five minutes about why we believe, or why I would believe, in a god. God did not make me; my parents, who decided to have a baby, are the ones who made me. Pray to a milk jug and you will get more help from it than from my "Lord and savior."
If the "religious right" wants to declare the United States a "Christian nation," then by the same logic America is a white nation because the majority of its citizens are white, a female nation since women hold a slim majority over men, and an urban nation because most people live in cities. How many of those who identify themselves as "Christians" are familiar with the history of their religion? As soon as Rome embraced Christianity, it forced the faith on its subjects. Christian holidays were based on pagan holidays to gain converts, and at some times and places people were even put to death for not converting. Unfortunately, the reverberations of that history are still felt today. In recent decades the influence of Christian institutions has begun to decline, which is why Christian leaders have become so adamant about declaring America a "Christian nation." By doing so, America would be telling the world that our nation is not secular but based on religion, like many primitive nations in the Middle East. It also insinuates that those who do not identify themselves as Christians do not represent America. In truth, America is a melting pot of many religions, philosophies, beliefs, and non-beliefs. These are personal matters, and our government should never promote one over another; to do so would be to move centuries backwards. It is arrogant, irresponsible, biased, and supercilious to suggest that America is a Christian nation rather than one where all should feel welcome, not just the majority.
Although there are Christians in America, it should not be considered a Christian nation. I would be ashamed to call this country Christian given the amount of corruption, sexual immorality, and other immoral behavior.
Only when the entire nation gives up its impurities and sins will I call America a Christian nation. Until then, I won't.
America, from its inception, has been based upon the separation of church and state. It is in the Constitution, which many religious people like to tout when it serves their needs but ignore when it doesn't. Even if a majority of the population is Christian, not everyone is, and our government is supposed to remain secular. There is also a growing number of non-Christians and non-believers in the United States every day.
America never should have been considered a Christian nation to start with. The Founding Fathers were deists, and they made certain to separate the state from religious establishments in the First Amendment. Any time you hear a Christianist lamenting the "fact" that this is no longer a Christian nation, remind them that it never was.
America should not be considered a Christian nation, because it isn't one. The Constitution and Bill of Rights do not mention Christianity once. And in modern-day America, only 17% of the public goes to church every Sunday. America is a secular nation and always will be.