Myths/America is a Christian nation
Myth
It is often argued, especially by Biblical fundamentalists, that the United States is a Christian nation – that is, that one or both of the following claims are true:
- The country's laws (especially the Constitution) are based on Christian teachings, especially the Ten Commandments.
- The founding fathers intended to endorse and support Christianity, or Christian principles, by codifying them in law.
Reality
No, it isn't.
- The Constitution codifies a principle known as separation of church and state: the First Amendment specifically states that "Congress shall make no law respecting an establishment of religion."
- The Constitution specifically prohibits any religious test as a qualification for holding public office in the US (Article VI).
- "In God We Trust" on US money and the phrase "under God" in the US Pledge of Allegiance were not part of the original design of the US and were not officially adopted until the mid-20th century, almost two centuries after the US was founded.