To me, Western culture is the very definition of materialism. To be part of, or to advocate for, Western culture is to advocate for free-market economies, economic globalization, and deregulation. Western culture is also a trademark for democracy around the world; so-called "Western" countries were the first to embrace democracy as a form of government. However, I disagree that Western culture means superiority. Western countries use developing nations as a source of profit, though they also advocate for the growth of other countries and buy their products.
The term suffers from Eurocentrism, geographic irrelevance, and an invalid definition that keeps changing depending on political context. In fact, Judeo-Christianity, Mesopotamia, and Egypt are hijacked as "Western", while anything non-Western is lumped together without distinction, as if it were homogeneous. The colonial and Eurocentric days are over. The word "civilization" is intentionally used to contrast with "uncivilized" Easterners.