Some jobs still pay women less than men for the same tasks and responsibilities, which is wrong and, in fact, illegal. Equal pay for both sexes was established under federal law in 1963, at least in the United States. Other than the wage gap, I believe women are pretty much equal to men in the modern workforce.
Now more than ever, men in the workplace are made to feel guilty for the results of their honest efforts. People quote flawed statistics on a mythical wage disparity and use them to demonize a very specific group of Americans: white males. Women are the recipients of special scholarships that give them an academic advantage, as shown by the fact that women are statistically more likely than men to obtain a college degree. Job for job, with the same level of experience and the same hours worked, the "wage gap" disappears, and in some professions women tend to make more than men when compared on those terms. Men are unfairly persecuted and denied sufficient societal acknowledgement for the success they achieve; instead, they are painted as the evil recipients of some sort of innate privilege and have their accomplishments symbolically stripped from them accordingly.
Women on average are paid 77% of the average man's salary in the US. This figure shows the huge chasm in opportunity between the two sexes. I believe this stems from the majority of bosses around the country being male and rather backward in their views towards women; whether consciously or subconsciously, they are denying women opportunities for promotion into better-paid, more powerful positions.
Women still do not receive the same wages as men even though they do the same work, and sometimes more. Because of this, gender discrimination is still a concern in America. Everyone in the workforce, no matter their gender, should be paid equally, and this still does not happen. The gender bias in wages is absolutely obvious.
While it is not as bad as it was decades ago, it is still a problem. This is something that has been deeply rooted in society for generations, so it is not going to disappear overnight; people will still harbour their feelings towards each other despite changes in the law. Feelings tend to overpower logic, which is why so many of the problems we face today, gender discrimination among them, linger on.
Yes, gender discrimination is still a major concern in the American workforce. Although women have made great strides in the boardroom in recent years, they are still underrepresented. Many people still assume that women cannot handle the pressures of high-level jobs the same way that men can. Thus, women may get passed over for these positions even when they have earned them.