Britain (mostly the Scottish lowlands), Denmark, and Sweden did the things often ascribed to all Western countries. You can't say that every country did them -- certainly not England, France, Germany, Italy, or any of those countries. So, is it all Western countries that are bad, or just some of them?
History does prove the harmfulness of the Western way of life and Western culture. For most of human history, the earth was doing just fine with humans on it. Around the time of the Renaissance, however, the balance between the earth and everything else went slightly out of whack due to the West's expansion.
Compared to other countries, Western culture does not cause any real harm. Westerners are doing much better than people in most other countries when poverty and mortality rates are considered, so there really isn't any true concern about Western culture harming anyone or anything. Countries that do not provide their residents with decent wages, health care, or protection from environmental hazards are far more destructive to their own citizens and to the countries surrounding them. Western culture also provides a democratic process through which citizens can ensure that a level playing field is maintained and that most harm to citizens is minimized.