Hollywood needs more women working behind the scenes. It needs more women writing, directing, producing, and filling other leadership roles. At present, most of these areas are still dominated by men, and women's voices and narratives are still largely excluded. Additionally, there need to be more stories that are primarily about women; not everything needs to be about men.
Hollywood has seen some truly great women over the years, women who pushed the definition of what a woman in acting could be. Unfortunately, many of those women are aging or have been lost to us, and there don't seem to be many women of comparable talent moving up to take their place. It would be good to see young women refuse to fall prey to what Hollywood expects of them and instead carve out their own careers, even if it means going against the grain.
The trouble with women in Hollywood is that it's hard to relate to any of them. They do not reflect what a real woman looks like. They are small and extremely thin, far removed from the realities that women today actually face. I would love to see real women in Hollywood.
Hollywood needs to change its view of women rather than simply adding or subtracting women from its ranks. Hollywood perpetuates the social construction of women as sexual objects, good for nothing beyond a physical body, a sweet voice, and minor acting skills. It is time to put that aside and work toward showing young girls women they can look up to, rather than just bodies.