Not every professor, mind you, but college professors overall. It's a good deal for right-wing students, because they end up receiving a more diverse education. It's a bum deal for left-wing students, because they see only one viewpoint of America, which, ironically, contradicts the "open-mindedness and tolerance" that the left often preaches.
Professors are definitely partly to blame for America's shift to the left politically. You would be hard pressed to find many conservative professors. I think this does a disservice to society. Professors should remain neutral and allow the young people of America to make up their own minds.
In general, Americans are becoming more educated, but the economic disparity between rich and poor is increasing, so people are growing disenchanted with capitalism. This is leading educated people toward more liberal and leftist ideas. Professors and academics tend to be more interested in science and progressive thought, which is sometimes in conflict with right-wing thought. Because the education system is more progressive, people are also becoming more interested in liberal and left-leaning ideas. Americans have also had to face war and recession, and they want to try out the liberal ideas that professors and academics suggest.
Since there are a lot of bad professors out there who teach only to make money, not to help students learn, I think they are part of the problem and not the solution. Although you cannot place the blame entirely on professors for America's shift left, it cannot be ruled out that many professors do a poor job of getting their lessons across to students. Professors need to do a better job teaching their classes, and step down if they cannot do so.
Most professors are liberal, but students are smart enough to make up their own minds about whether they want to be liberal or conservative. Also, I've personally never met anyone who changed from right wing to left wing as a result of going to college. Society does seem to be leaning more toward the left, but you cannot place the blame on professors.
Modern society as a whole has been moving "left" for many years, if "left" is defined as placing emphasis on equality and human rights. As the older generations, who are generally more conservative, lose their influence, it is natural for America as a whole to move toward the "left."