We've all heard this one before: "America is a Christian Nation!"
But what, exactly, does "Christian Nation" mean?
Is it:
- A majority of Americans are Christian, or
- America was founded on Christian principles by men who intended America to be a Christian Nation.
The first is obviously true: somewhere between 70% and 80% of Americans identify as Christians. However, by itself it means nothing. By the same logic, our country could be called a "Pepsi Nation" simply because many people pick it over Coke. Our government and the laws that support it favor neither Christianity nor Pepsi.
The second claim is the one we hear about most - the one that sparks controversy - so it's the one I'll be discussing.
If you ask someone who supports the idea of America being a Christian Nation why they support it, you will usually be met with quotes from founding fathers (the select few who were Christian, of course) and sometimes a "fun fact" or two, such as "George Washington's dog attended church for X years."
So... what does that mean?
Nothing.
It distracts from the question at hand - was our government meant to favor Christian ideals?
Nowhere in the Constitution is Christianity even mentioned... Its only references to religion are exclusionary: Article VI bans religious tests for public office, and the First Amendment forbids any establishment of religion.
Our government's purpose is to secure our freedoms, not to spout opinions.
So most people will go back to the quotes - as if there were some hidden intention that was never written into our founding documents (but, they insist, should have been).
Some of the founding fathers were Christian, yes, but many of the most influential were Deists.
In fact, many of the quotes used in favor of America being a Christian Nation are taken out of context - words from founders who believed in God only in the deistic sense and did not accept the Bible or Christian doctrine.
Many of the founding fathers explicitly stated that our country was founded not on religion but on FREEDOM of religion.
In fact, several founding fathers were openly critical of Christianity (Thomas Jefferson, who famously cut the miracles out of his own copy of the Gospels, comes to mind).
So do our laws reflect uniquely Christian ideals?
"Do not kill" and "do not steal" are in the Ten Commandments... But they also appear in older texts such as the Egyptian Book of the Dead, and they should be obvious regardless of religion.
There is no legitimate reason to believe that America is a Christian Nation.
For anybody who still has doubts:
"As the Government of the United States of America is not, in any sense, founded on the Christian religion"
- The Treaty of Tripoli