Originally Posted by S.P.Q.R.
What the fuck are you talking about? It's terrible to frown upon a group of people who openly believe that their way is the only way? It's the only damn religion that teaches you to spread your faith and look down on those who don't share it. I find it pathetic how everyone wants to defend such an ignorant, offensive religion.
And if I don't want to see what the Christians believe, I shouldn't go around them? Are you fucking kidding me? I'm in the US, and they're EVERYWHERE! You can't get away from them. Their morals, values, and beliefs have been woven into society as a whole, whether it's at the workplace, the grocery store, or even a school! You'll see hostility the moment anyone hears the word "atheist," "Muslim," or "Pagan" come out of someone's mouth. People are ostracized in this country, and in most others, for the sole fact that they don't support Christianity.