Yes, I think that now is the time to make more college majors career-focused, and to phase out liberal-studies majors, such as women's studies and philosophy, that rarely lead directly to careers. College is supposed to help you get a job after graduation, but too many majors simply don't do that.
Now, some people might say that choosing a major is already a career-focused move, so how can a college make it more career-focused? But just because you know what you want to do, and are being taught it, does not mean you are learning what you will actually need for a future job. Many jobs require specific skills that colleges are not teaching, so colleges should look at what employers today are asking for and work those skills into their lessons.