Places of higher learning, both colleges and universities, should be more involved in helping graduates find jobs. After many years of schooling, it can be overwhelming for students to finally enter the job market. Colleges hold events such as job fairs, but this isn't always enough. Professors and advisers should write recommendation letters, suggest jobs, and do all they can to help graduates.
I believe that universities should take some measures to help their graduates get jobs. A major reason to attend a university is to secure a higher-paying job than you otherwise would. Universities don't need to guarantee that students get jobs, but they should provide resources for their students to be successful in their job hunts.
Yes, universities should help their graduates get jobs. Universities have more resources than individuals do, through their alumni networks and business partners. They also have a duty to educate their students, and helping graduates find work lets them demonstrate the effectiveness of their programs. Lastly, it reflects positively on the university in the long run.
I think a great many degrees offered by universities today are worthless as paths to a career and will do nothing to lead to employment. I feel that a university should be required to help every graduate find employment in their field. Perhaps that would phase out some of the more frivolous degrees that cost a small fortune but serve no purpose.
Universities already face massive challenges in preparing students for success in life. In many cases this already goes beyond classroom instruction and training. So while universities should (and almost all do) provide access to prospective employers and job-hunting resources, it should not be the responsibility of the school to actually find employment for their graduates.