I believe that schools should teach about traditional gender roles between boys and girls.
I think it should be taught because of the sexism in our schools. I have observed that the girls are always on their own unless it's to annoy a male student by bullying him or even kicking him in the groin, and most of the time there are no consequences for the girls.
For example, a girl does something bad and most or all of the time it's okay, but a boy does something bad and he gets in so much trouble.
Women were once expected to work only in the house and never do anything but that; they did not have good jobs. Girls today don't need to be told that they can't do anything. That is not fair, especially not in a school environment, which happens to be the foundation for their future. Dads used to not have much to do with the children; they worked and came home. That does not happen today. We should give children the choice of how their family is going to be. The schools don't have anything t
There's no reason to teach outdated ideas that idealize men and confine women to a position of subservience. Tradition paints the man as a strong, emotionless workhorse who must provide for his mandatory wife and children, while presenting women as emotional, weak creatures who cannot fend for themselves and need a man to survive.
If you teach children gender roles, that will lead to sexism and bullying. I think they should grow up and act however they feel like; teach them to be nice and generous. That way no one would be harassed (or at least not as often), and then we'll finally have a kinder generation.
I don't think girls typically get away with bad behavior that boys don't, but let children explore their environment. Gender is a very personal issue, and if you try to box someone in, they may get embarrassed or lash out. Kids will figure this stuff out for themselves. All they need is some guidance from their parents and other adults in their lives. Let schools be for learning subjects.
Maybe "back in the day" this wouldn't have been an issue, but with this generation steering away from traditional upbringing, it would be. If parents want their children to be accustomed to traditional gender roles, then let them teach that to their children. There's a good chance a teacher could have students whose parents would rather not restrict their child to gender roles, and the teacher would just be getting into trouble by teaching them.
Traditional gender roles are ideas, but once a person knows about them, they end up in a catch-22: their only safe choice is to accept the role of their own gender, because if they take the role of the other gender they could get bullied as sissies (for boys) or butch (for girls), and if they ignore gender roles entirely (being genderless), others will try to fit them into the box of gender roles anyway, with the same consequences. Of course, refusing to tell kids about the differences between boys and girls is unacceptable (experiments suggest gender is more than a social construct), so kids should at least learn the basics in school. But kids shouldn't be led to believe that one gender is inferior to the other, or that gender should only be expressed in the traditional way.