I am not doubting in any way that women are oppressed in certain societal positions, in the workplace and in politics for example, and must fight to be taken seriously in such circumstances. But am I naive when I say that women are naturally better at particular tasks, and vice versa? Have women not supported men for years in the background and been seen as the stronger one? There's an old saying: "behind every successful man is a strong woman". I believe this to be true, and I am happy to assume certain roles, as it just fits better for my situation. Does anyone else believe full gender equality is an unreachable target?