“College is where liberal professors teach ‘ridiculous’ classes and indoctrinate students who hang out and protest all day long and cry on our dime. They’ve become elitist, politically correct institutions that often fail to provide practical skills for the job market” (Washington Post, 2017). Wow! That’s quite an indictment of American colleges and universities.
College in the 21st Century (or, Is It Too Late to Change My Major?)
Dr. Allison Brown | July 24, 2019