Women are gaining respect in Hollywood. Although early figures suggest that many women worked in the industry during the 1920s, as the industry matured, women were pushed out of writing and directing roles. That is no longer the case, with more women assuming these roles and older female actors landing parts.
If women were respected as much as men in Hollywood, there would be equality in pay. Hollywood tends to be liberal, so you would think it would insist on equal pay for equal performance. Unfortunately, Hollywood doesn't respect its women, because it still pays them significantly less than men earn for similar work.
In most careers and occupations, women are neither paid enough nor respected enough. In today's Hollywood, women are often seen merely as sexual objects, and the older they are, the less likely they are to get work. Men take the leading roles, while most women are cast as supporting actresses. That said, women are taking on more production work in film, including roles as producers, directors, and writers, which do earn respect.
I believe that women are not as respected in Hollywood as they were in the early days of film. While there are many wonderful, talented women acting today, I believe that even the top-earning women in film are still making less than their male counterparts for similar films. In addition, I feel there is a lack of roles for women over a certain age and for women from diverse backgrounds. If a man creates a role or a franchise, he is allowed to play that role until he no longer wants to, while I feel the same isn't true for women.