Well, that's unfair to say, because it has fucked up the Eastern Hemisphere as well. It has done nothing but perpetuate white power, racism, sexism, and homophobia.

In fact, Christianity is perhaps the worst social ill entrenched in American culture at this time.

When is this worthless religion going to die out? Its time ended long ago.

Beliefs not based on truth: why waste time on them?