How Christianity Is Disappearing In America

inkandvoice.com

Introduction

If you consider yourself a conservative, you likely already know that America was founded on Christian values. “In God We Trust,” right? And for many decades now, Christianity has been the most practiced religion in America, to the point that it’s hard to think of the United States as anything other than a Christian nation. But the reality is that Christianity is now facing its most difficult days yet in our nation. Let’s take a look at how Christianity is disappearing in America.
