Was the United States founded as a Christian nation?

  • Yes, I believe that the United States was founded based on Christianity.

    Many of the people who settled the United States were escaping the restrictive Roman Catholic Church, which had committed many outrages in the name of religion. However, anyone who reads the Bible can see that those outrages were not truly Christian. While the First Amendment makes it unconstitutional to force someone to believe in God, the principle that people should not be persecuted for their beliefs is itself essentially a Christian convention. Additionally, the Bible only advises that people follow Christianity; those who wished to follow the divine laws would have to follow Christianity of their own accord.

  • It is impossible

    The United States could not have been founded as a Christian nation, and one need only look at our Constitution to know that. Nowhere does the Constitution say that the United States is a Christian nation, and many appeals by Christian groups to change the preamble have failed despite years of attempts.

  • Not just that

    No, it was not founded as just a Christian nation. At the start of the country, several different religious groups were coming over here. They all had a hand in making the country, and Christians were not the only group here at the start.

  • By Freedom Not Faith

    While the United States is arguably a predominantly Christian nation, and some of its first major settlers in colonies like Plymouth were Puritans (adherents of a more stringent form of Anglicanism, itself a branch of Christianity), the Founding Fathers and those who designed the documents that govern the nation deliberately left out any religion or faith as a basis for its founding. In fact, several of the Founding Fathers were Deists or Theists, meaning they believed in a higher power but not in any organized form of worshiping or following that power; this made them Christian-like in their belief in a single deity, but not Christians outright. As its crucial documents show, among them the Declaration of Independence, the United States Constitution, and even Thomas Paine's 'Common Sense', the United States of America was founded on the need of men to express their desire to change those who govern them and to ensure, through representation, that the changes necessary for a better society are made. In that sense, the United States of America was founded on sociopolitical principles, not religious beliefs, and its very founding rested on the concept of a democratic republic, which has nothing whatsoever to do with a savior, a deity, or a system of religious practices.

  • No, the United States was not founded as a Christian nation

    While some may argue that the principles of the Founding Fathers were Christian-based, the United States was in no way founded as a Christian nation. This is obvious because the US Constitution was the first constitution in history to establish a secular government. The First Amendment protects religious freedom and also prevents the government from establishing or favoring any religion.

  • Freedom of religion

    The United States was founded on the tenets of liberty and freedom, including freedom of religion, by a group of largely non-religious Deists. The predominant religion of its citizens was indeed Christianity, but that was never by design, and nothing in the Constitution or in the history of our country suggests otherwise.
