I think America is more racist today than it was before Obama took office. I'm not sure it has much to do with President Obama directly; perhaps more African-Americans expected a bigger change when he was elected. America has always faced racism, but now it is simply publicized more. Racism against whites is also becoming more of a problem than it was in the past.
At face value, America may not be more racist, but we certainly hear about more racial issues than we did before Obama. One example is the Black Lives Matter movement. Its supporters argue that white people discriminate against black people, yet at the same time some have committed hate crimes against white people, which did not seem to be as common before the Obama administration.
I don't necessarily believe that racism is any worse than ever before. I think it is just being talked about more, and the media is covering the issue more thoroughly than it used to. Racism was once overlooked, brushed under the rug, and basically not talked about. Now people are speaking out, coming forward, and making noise to create change and build better lives and futures for minorities. We haven't seen a spike in racial issues; we're just hearing more about them.
Having a black president has not made America more racist; it has probably just brought racism to the forefront. There certainly seem to be more negative headlines and hate crimes. We are a nation publicly divided on the nightly news, but those feelings were simply hidden behind closed doors before.