I think that, overall, affirmative action programs do more harm than good. They promote the idea that races are unequal and that the government needs to step in to correct that. Affirmative action should be eliminated, and people should be hired for jobs based on ability and nothing else.
There are equal rights in the United States now. At both the local and federal level, more often than not, people are treated as one and the same. Of course there is still discrimination, but it occurs on a person-to-person basis. The USA is the most diverse nation on earth.
There are still struggles in the politics of race and gender that need to be overcome. Even though we as white men may believe that equal rights have largely been achieved, there are still subtle influences holding back minorities and women in the workplace, and this has not been fully addressed.
In some cases these programs aren't needed and can even be a hindrance, but as a whole, and in some places in particular, they are still necessary. Some companies and institutions wouldn't develop diversity on their own and may even discriminate against certain people, so affirmative action is needed to ensure that everyone has a fair chance.
A lot of people assume that because an African-American has been elected and reelected President, racism and discrimination are somehow over. That is simply untrue. The African-American population still lives in a depressed socio-economic state compared to whites. Consider also the growing Latino minority, and you should see that we still need to make sure everyone gets a fair shot.