America Is a "Christian Nation" — Since When?


The Constitution never mentions God, and the Founding Fathers were adamant about the separation of Church and State. The Pledge of Allegiance didn't include "under God" until the 1950s, and that's also when the phrase "In God We Trust" showed up on paper currency. Those are examples used by Kevin Kruse, a Princeton historian, to support the thesis of his new book, One Nation Under God: How Corporate America Invented Christian America.

So why do so many Americans consider this a Christian nation? We hear how Big Business used religion to discredit the New Deal, organized labor and government regulation. Socialism was demonized, and God was associated with Free Enterprise Capitalism… all the way to the White House. Is that legacy still with us?