Sunday, 30 August 2009
Now I'm not highly religious or anything, but if there's one thing I've learned from church, it's that God wants us to treat each and every individual with respect. Yet throughout history, religious wars involving Christians have been pretty common. I live in Texas, and there is a very strong religious foundation here, but what worries me is the condemning nature I've observed in "true" and "real" Christians. I don't really understand it: the whole fire-and-brimstone thing, the whole 'call your neighbor a devil worshipper and don't ever contact them again' thing. I don't understand making Christianity look ugly. Honestly, I think that kind of stuff just drives people away. Sometimes I wonder if it's a question of morals or maturity.
I remember there was a "real" Catholic girl in my class, and she wouldn't talk to anyone who she knew used tampons, because in her eyes they weren't virgins. Sure, there are more controversial issues over which Christians have been ugly, homosexuality being a very popular one nowadays. People come from all different walks of life, yet I see highly religious Christians disowning good people because they don't go to church every Sunday.
Why is it that believers find it justified to be UGLY to other human beings in order to stand up for their God? Is that really what the Bible teaches? Shun the nonbeliever?
Now, I have seen plenty of ultra-religious people who are good to everyone, and they are some of the nicest, most courteous people I know, but I have also seen some ugly, ugly, ugly Christians, and it's embarrassing to me.
Is it right, or is it wrong?
Here are just some of the slogans that have made it to television... but I know you know what I'm talking about:
"God hates the USA"
"God hates you"
"You're going to Hell"
Why would I want to believe in a God that hates me?????
How can anyone justify this???
I want to know...
Do you have any answers to these questions?