Yes, college gives women an advantage in advancing in the workplace, because it shows that they are developing their skills. A college degree would give anyone, not just a woman, an advantage in the workplace. Many jobs specifically require a degree, often in a particular field. A woman with an education shows that she takes herself seriously and is willing to do what it takes to get ahead. She would also have more confidence in her career field.
I believe that college gives women an advantage in advancing in the workplace. A college degree is highly valued in our society and gives a woman the credentials often needed for a promotion. Also, women who pursue an advanced degree, such as a master's or doctorate, are demonstrating that they are ambitious and want to further improve themselves.
College gives everybody attending a step up in the workplace. It's especially helpful for women: in industries where things like Title IX haven't opened the doors, it's the diploma itself, and verification that they've earned it, that makes more opportunities available. This country has pretty clearly put women behind when it comes to job advancement, but with such a high female enrollment, college can help close that gap. Not a bad thing.
There are now more female college graduates than male ones. This can only help women appear more qualified and advance more easily in the workplace. That being said, women still face a lot of problems in the workplace due to gender discrimination. College can help relieve some of that burden.
Yes, college gives women an advantage in advancing in the workplace, but it is no different from the advantage given to men who complete college. College is a positive way to advance a career in general, but its benefits are not gender specific. College helps people learn practical skills in a particular field, and it also builds professionalism and positive social contacts.