I hate to say it, but I really do. I can't stand it. I think the focus of American Christianity is all wrong, I believe it goes about things the wrong way, and I believe that it's intrusive to others. Let me explain.
American Christianity is the reason why LGBT people still do not have equal rights in America. American Christianity is the reason why helpful organizations such as Planned Parenthood face extinction due to defunding (I'm 100% against the defunding of PP; though they do provide abortions, they also do a whole lot of good, and none of their public funding goes to abortion anyway). It is the reason why schoolchildren aren't getting a proper education in the science classroom. The list goes on and on.
The way I see it is this: American Christianity couldn't care less about the well-being of others or the state of other people's souls. All it cares about is having the Bible legislated and forced on the general public through the government. American Christians aim for theocracy, so to speak, and they will stop at nothing to attain it.
Nothing makes me madder than Christians trying to legislate Christianity. Jesus Christ never legislated the Gospel; in fact, he discouraged doing so. Why would God give every individual free will if he wanted his word to be the law of the land? That right there is a contradiction in and of itself. Not everyone is a Christian, and the Christian moral code does not apply to non-Christians, so why should it be legislated?
America was not founded as a Christian nation, despite claims to the contrary. America's government has always been a secular state, and everyone has always been allowed to practice whatever religion they want (despite some American Christians claiming "freedom of religion" means freedom to be Catholic or Protestant). I think it's time for mainstream American Christianity to examine itself in the mirror, get out of the business of legislating Christianity, and get back to the business of winning souls to Christ. Alas, that won't happen, and I know it's just wishful thinking (given the typical American Christian's inability to separate patriotism from religion), so whatever. I'm glad I'm not associated with it. I'll just say that.

Do you think that all of American Christianity is in search of a theocracy, or is it just the vocal majority? Are there Christians in America who are doing good and serving others without being intrusive?