Western Church in Decline

America is no longer a Christian nation. I don’t mean there aren’t plenty of Christians in America; I mean that our leaders and our neighbors no longer base their decisions on biblical truths. To deny that is to deny reality. The America with the Bible as her belt and the…