Is It True That White People Conquer, Harm, Kill and Destroy Everything?

Asked by: Eqcuis
  • Seriously? You had to ask this question?

    OF COURSE they do. The evidence and ruins of what they have done remain as a scar on humanity and the globe in a very literal sense. They destroyed societies, the environment, and the natural order of things in which people shared everything. They have acted almost as devils on this earth since they first appeared. And no, it is not "everyone," so don't come back with the usual whataboutisms. The only people who would answer no to the question in this topic are white people.

  • It's really silly to think so.

    Humans are horrible. Throughout history there are traces of war. The first war recorded in history took place in Mesopotamia around 2700 BCE, so saying it's just one group of people doing it is stupid. Nations have been conquering other nations for ages; just look at Genghis Khan or Tamerlane in Asia, or the native peoples of the Americas.
    The Native Americans were in a near-constant state of warfare with one another; many tribes became extinct and disappeared even before Columbus as a result of tribal conflict.
