What has happened to American Religion?
It's obvious that American religion – especially the Christian churches – has become corporate. When did the Church become a business, instead of the Church being the people first?
Churches seem to care more about the BUSINESS than the PEOPLE.
Christian music is a corporate business.
Christian bookstores are corporate businesses.
Christian radio is a corporate business.
Christian evangelism is a corporate business.
It seems to be all about the money for so many.
– J, of Shadows of WCG, discussing on his message board how consumerism and corporatism have corrupted American evangelical Christianity