I believe animals are more important than humans, because who made us the "masters" of this world? We made ourselves masters of everything and took control of animals: domesticating them, killing them, and so on. They have no say in what their lives turn out to be. Yes, some humans are kind to animals, but others fight animals for money, beat them, starve them, and force them to do things they can't or don't want to do. Animals take care of the environment, while we humans keep making more stuff for ourselves that harms it: cars (pollution), axes (cutting down trees and destroying animals' homes), and more. We harm them in all sorts of ways, yet we keep going on with our lives without a care. Who do we think we are?
First of all, anyone can hack into Wikipedia and change the facts. Just saying. Anyway, all humans do is pollute and destroy. Anything good you can think of, I can link to something bad we do.