Since Western culture originally created those distorted gender norms, the answer would be yes. The idea that women should stay home, that men do all the work -- distorted. How women and men should dress -- distorted. What toys boys and girls can like -- distorted. We created a rather toxic soup with those distorted "norms" that we're just now getting around to undoing.
I believe Western culture does distort gender norms, and this is often done by corporations trying to sell their products. These companies would prefer that people fall into large groups so they can advertise to them, so essentially Western culture has been conditioned into accepting these norms, so to speak.
People in some other countries are trapped and treated like slaves. In Western civilization we are allowed more rights and opportunities, while those countries are living more in the old ages. Women don't belong at home taking care of kids, and men aren't the only breadwinners. We women must get jobs and provide for our families. There's no excuse for why we should be brought back to that hell hole! We started the women's rights movement because we were miserable! Who wants to wash dishes and clean all day? It's degrading and pathetic. So "distort" is a negative word for it. "Distort" in this question means more that we have a variety of opportunities and more freedom.
Gender norms are constantly evolving in Western culture, especially over the past decade or so. From the women's suffrage movement in the early twentieth century to first- and second-wave feminism, women's rights movements have had a momentous effect on the way men and women relate to one another in society.
Western culture is more open to letting people of any gender take on any role. From other areas of the world it may look like we distort gender norms, but I think a big factor is that we allow people more freedom and rights to do what they want.