Well, I only skimmed a couple posts on the last page, but I have something to say about "animal rights". Animals eat each other. That's how it's been done for... ever. Right? It's a completely natural thing, much like many things that have grown to be considered taboo in society. Seeing as we're animals ourselves, who's to tell us we can't continue the tradition? Shit, even homosexuality is seen in nature. Open your eyes, people. Nature isn't evil. It just is.