  • Lol, sticking to the question itself, yes, it was.

    The Continental Congress was the governing body of the United States prior to the creation of the Constitution. When this Congress established our rights (prior to their being protected by the Constitution), it did so by ratifying the Declaration of Independence. The DOI states clearly that life, liberty, and the pursuit of happiness are the founding principles of the United States and are self-evident truths. The DOI also goes on to clarify that these rights (which we still enjoy today) are endowed upon us by Man's Creator and that they are unalienable. By definition, these rights could not have been given to us by a human, because humans don't possess the power to endow rights that cannot be given or taken away. Consequently, the fundamental principles, the infrastructure, and the underlying solid ground that the United States is built upon did in fact originate from God.

  • I'm afraid not.

    If we completely disregard the history of the colonial era and only look at when the US was truly founded, then the answer is no. This nation was founded upon secularism, and while there was a Christian-leaning minority at the Constitutional Convention, the idea of founding the US upon religious beliefs was quickly shot down.

  • Nothing Godly in America

    There is nothing Godly about the United States of America. It seems like everything is based on money, and yes, it's a capitalist society, but to have such an emphasis on money and what money can or cannot do for you is ridiculous. You rarely see anyone praying like they used to.

