Tuesday, July 12, 2011

Do Christians think America was founded as a Christian nation?

As an ex-Christian and former minister, I will say that every Christian I know believes the USA was founded as a Christian nation. They also tend to believe the founding fathers belonged to their own denomination (I'm being a little facetious, but only a little). This closed-mindedness was one of the red flags that kept me questioning things until I finally got some useful answers.
