Are professors really just liberal bullies?
According to a new Zogby poll, a significant number of Americans think that most college professors are politically biased. Given that more people who identify as "Republican" consider that bias a serious problem, we can assume they think the bias is of a "liberal" bent.
Conservative scholar Victor Davis Hanson tries to explain this bias, which he feels is leading to a massive miseducation of American students today. He gives a familiar story: up until the 1960s, higher education focused on "classical issues" about eternal truths embedded in history. Toward the end of the 1960s, liberals took over colleges and universities and started to impose a "therapeutic" agenda on education. This included the creation of programs such as women's studies, ethnic studies, peace studies, and popular culture studies, all intended to make students aware of the awful things about the society they live in--race, class, and gender inequalities, etc.
For Hanson, the problems with the new higher education are: 1) it forces students to accept a liberal party line on a variety of social issues. Students must simply "accept" their professor's "preconceived notions"; 2) it forces students to accept relativism--the idea that there is no such thing as eternal Truth. Since Truth and Reason don't exist, the only way to get people to do things is not rational persuasion but force. Only power matters--(and white, heterosexual, capitalist men have had it too long, according to this "liberal" view, so they are the bad guys); 3) it forces students to ignore vocabulary, grammar, syntax, and rhetoric as tools by which people can form ideas and lay them out in a public forum for discussion and debate.
Ultimately, Hanson believes that we ignore classical disciplines such as history, literature, philosophy, and language at our peril, because if students have no knowledge of the eternal truths that lie within the thread of humanity's history, they will be incapable of judging the present according to any decent standards.
I admit that I teach in both what Hanson would call "classical" and "therapeutic" disciplines. What I find is that students already come to those classes with a belief in relativism and a lack of history, language, literature, and critical thinking. I spend most of my time trying to convince students that an opinion is not something sacrosanct and that they ought to have a way of providing reasons and evidence for what they think. I can't count the times I hear someone say, "Well, who's to say what is true, right, moral, etc.? What right does anyone have to tell me what to think?" Perhaps they find these attitudes reinforced in some of their college classes, but I think they come to the university with these views already. Trying to convince students (who are not Christians--Christian students seem to have no problem grasping this point) that there might indeed be some ethical standards that we can all agree upon, and that are not dependent on one's culture, is one of the first and hardest tasks of the ethics and political philosophy courses I have taught.
What I find interesting is that Hanson's distinctions suggest some kind of essential core to the "therapeutic" disciplines--they are all alike in their message (America and white men are bad). It is as if there is nothing within those fields that allows them to impart wisdom of any sort. It's curious that most of the fields he mentions are heavily influenced by the social sciences, such as political science, sociology, and anthropology. One way to read his article would be to say that it merely displays professional envy by someone in the humanities who has made a career out of interpreting works that few people now (or ever) really paid much attention to. It's not clear to me why the "classical" disciplines can't be taught dogmatically, as gospel that you have to accept. Indeed, I wonder what would happen in one of Hanson's classes if a student disagreed with one of the "eternal truths" that supposedly lie within the great works of Western history, literature, and philosophy. (And it's not at all clear what those might be--it's not as if Western culture speaks with one voice and has had only one message in the course of some 4,000 years of recorded history.)
But the political dimension of this article can be tied back to the results of the Zogby poll. It seems that older, white, conservative people may feel that the university no longer reflects their understanding of knowledge and is being overrun by liberal elitists who do not have faith in American democracy, truth, and morality.
What role does the university play in our American democracy? What kind of knowledge should it pursue?