Formation, Part 5

There can be little doubt that the conditions under which scholars work shape the ideas they have. To my mind, this makes it extremely important to think seriously about those conditions. As Steve Fuller suggested already in his first book, Social Epistemology, it ought to be possible to predict what kind of knowledge a particular organization of cognitive labor might produce, or, indeed, to work out what the best way of organizing our intellectual pursuits might be if a certain kind of knowledge is our aim. More existentially, we can ask what kind of mind will result from subjecting a human body to a particular form of discipline.

That’s, of course, the question I have been asking over the past four posts about the formation of an “academic” way of thinking in elementary, secondary, and post-secondary education, then onward to graduate school. In this post, I want to think a little about the most advanced stage of academic development, namely, the mind of the full-time scholar, the tenured professor. This means I’ll be fast-forwarding from graduate school to tenure, without paying much attention to the increasingly important “formative” (deforming? disfiguring?) process that occurs in the “post-doctoral” but “pre-tenure” period. This is the life of adjunct faculty and assistant professors, and it deserves a post of its own. But I think it is useful, first, to sketch the sort of mind that those years are supposed both to produce and to allow us to select for a life (more or less) of service to the university.

I believe that a university should provide conditions under which people who have demonstrated their intellectual abilities are free to make up, and speak, their own minds. One way to put this is that it should be a place where it is possible to have an idea without holding to an ideology. A place where it is possible to think outside the orthodoxy. (Remember that story about Galileo from my last post; it is possible that the Church actually offered Galileo such intellectual freedom–albeit only to make up his mind, not to speak it.) Another way to put it is that it should be possible to think without the support of a “foundation”, i.e., a funding agency with an agenda. Rather, the university should provide the, let’s say, universal foundation of “reason” to support the inquiries of scholars.

I think there is way too much pressure, even within the university, for academics to “sign on” to one or another ideological tradition, to make themselves useful to one or another social project. Riffing on Al Gore’s famous title, I once asked Fuller whether truths could be judged more or less “convenient” to particular political interests, and whether all truth is destined always to be judged, in part, on this kind of convenience, rather than being held to some more universal, rational standard. He answered that “universities have a vital role to play in mainstreaming awkward voices … by integrating them into a curricular narrative, so they are not seen as merely slaughtering the sacred cows but as replacing them with a more durable species.” For “awkward” here, we can read “ideologically inconvenient”. Most importantly, we can imagine a university that does not let a truth that is inconvenient for some current constellation of interest groups also be inconvenient for an individual scholar to believe and, as it were, “profess”. Indeed, we can make it entirely convenient for professors to believe and say whatever they want.

It seems to me that the value of such an institution, where ideas would be able to flourish independently of their utility for enterprises of state or business, is obvious. The same institution would expose each new generation to a similarly flourishing kind of mind in the classroom. And it also seems to be a pretty straightforward matter to arrange the necessary conditions. Compared to the enormous costs of running today’s universities, I think such institutions would be relatively inexpensive to establish. Obviously, governments and corporations could always try to entice the greatest intellects out of their garden of free inquiry and into more “gainful” pursuits. Let those whose first love is truth remain behind. There’s nothing shameful about either set of values; they’re just different. The important thing is to ensure that universities are staffed by people who value freedom and stability over profit and innovation.

Formation, Part 4

One of the things we learn about ourselves as undergraduates is whether we have a natural disposition for “academic” work. While a university education is increasingly necessary to success in professional life, it isn’t something that everyone finds equally enjoyable or interesting. Some people just “get through it”, even with very good grades, and look forward to earning their degree and starting their careers. Others, however, are saddened by the prospect of leaving school and getting “real jobs”. For some of them, there is grad school.

Graduate studies, of course, require a certain aptitude for intellectual labor. But they also require a set of (for lack of a better word) “moral” competences to engage with others in a common project of maintaining and extending our cultural heritage. Graduate students have the double task of demonstrating that they are able to study an issue carefully to arrive at qualified conclusions and showing that they are able to participate in a community of inquirers who will both agree and disagree with them. In some disciplines there is plenty of room for cantankerousness or independence of mind (as you choose), whereas other disciplines are less tolerant of aggressively critical personalities. Indeed, some disciplines outright require a certain assertiveness, while others, conversely, require deference, often no less explicitly. Finding the right field, for you, involves being sensitive to the tone of discourse and gauging your own reaction to it. Is this the sort of relationship you’d like to develop with your peers?

These posts on academic “formation” have been concerned with the place where our moral and intellectual competences meet–particularly as they meet in statements of what can be called “doctrine”, i.e., orthodox truths. I have been trying to argue that being too eager to indoctrinate our students with what we believe is true (and sometimes good) might interfere with their ability to form the habits of mind that are needed, later in life, to form their own beliefs in a scholarly or scientific way. Sometimes (indeed, most times) we have to let them hold false beliefs for good reasons, lest we merely get them to hold (or just profess) true beliefs on our authority. I’ve shown how this might work in primary, secondary, and post-secondary education. But can this tolerance for falsehood be sustained at all levels of education? Does it apply, for example, to grad school as well?

One of the most important things I learned from Steve Fuller many years ago was to treat “theories” as “presumptions”. This, he suggested (in Philosophy, Rhetoric and the End of Knowledge), might help me to feel less oppressed by the intellectual orthodoxies that one constantly encounters on university campuses. Instead of imagining that I have to believe an orthodox opinion, I can treat it as a presumption that governs how I am allowed to discuss it. It guides the process of conversation, the “procedure” by which the truth may be challenged and defended. In this sense, it works much like the presumption of innocence in a trial. I don’t have to believe that the accused is innocent, but I do have to treat the accused as though she’s innocent until such time as my challenge succeeds.

How can this inform the development of the graduate student’s mind? Well, instead of forcing our graduate students, whether at the MA or PhD level, to adopt as gospel the currently fashionable theory of the phenomena they are interested in, we can ask them merely to presume that their elders are right about it until the preponderance of the evidence they gather persuades the community to change its mind. That is, we can allow them to earnestly pursue and expose our errors, so long as they grant us that those errors are the result of our own earnest and sincere attempts to discover the truth. We can require them to acknowledge the orthodoxy, that is, without demanding that they genuflect to it.

The traditional symbol of such genuflection is, of course, Galileo’s renunciation, at the demand of the Catholic Church, of his belief that the Earth moves. I won’t get into the details in this blog post, nor presume to settle the issue in such a place, but it is important to note that the conventional caricature of Galileo as a victim of orthodoxy has been plausibly challenged (by Paul Feyerabend, among others). The truth, some argue, is that the Church was perfectly willing to let Galileo continue his inquiries into the hypothesis that the Earth moves, so long as he did not publicly denounce the current orthodoxy until a sufficiently robust and elaborate alternative could be constructed. The Church feared that proceeding less cautiously (as the fiery Galileo would have preferred) would only cause confusion and draw its general authority into question. The result would be chaos, both moral and intellectual. On the face of it, there is some wisdom in that attitude.

Today, the University plays the role of the Church. It demands that scholars be careful in their public pronouncements (about the climate, vaccines, evolution, gender, etc.), always acknowledging the dominant view (as that held by “97% of all scientists”, etc.), while at the same time promoting and defending their right to pursue their inquiries wherever their curiosity leads. It’s freedom with responsibility, to use a favorite conservative slogan. And it is sometimes forgotten that a university is much better thought of as a knowledge-conserving institution than a knowledge-innovating one.

The existential question that graduate students should be trying to answer, and should be helped by their supervisors to answer, is whether they have the personal disposition to work in an environment that presumes the truth of a number of statements that they, personally, know to be false. We must, I would argue, never demand that they believe something they have good reasons to reject, but we can, in fact, ask them to proceed on the presumption that what we’ve known for decades more or less holds. We should not be ashamed of testing their knowledge of the tradition, no matter how “conservative” that makes us look.

Formation, Part 3

This series on the formation of an academic habit of mind will probably end up being in five parts. I’ve written about primary and secondary education. I’m now going to write about “higher” or “post-secondary” education, then graduate school, and then a career in research. In all cases, I’m trying to focus attention on how we know things, rather than what we know. Indeed, I’m worried that in our eagerness to get students, and, later, citizens, to believe particular things we are undermining their ability to make reasoned judgments about what is true or false. Sometimes, I want to suggest, you have to let someone arrive, for the moment, at what you think is the wrong conclusion so as not to undermine their ability to think.

I’m generally in sympathy with those who think that today’s undergraduates are being subjected to an outrageous demand for “political correctness”. These students are being told that there is nothing wrong with being gay, but there is something very wrong with thinking that there is something wrong with being gay. They are told to be acutely conscious of the history of oppression of women and minority groups, and yet that their suspicion that gender or race or disability is a marker of inexorable differences between people is simply a continuation of that history of oppression. They are told that a whole series of “questions” that they might have aren’t really questions at all, but refusals to listen to what other people say. They are being told not to ask those questions. They are being told the answers to them.

Let’s keep in mind that we are talking about students who are usually between 17 and 23 years of age. My own ideas about the culture I lived in were undergoing radical changes during this time. If I had not learned to empathize with my younger self, I would be continuously cringing at the thoughts and feelings I cultivated at that age. Many of them were just wrong and silly, but some of them were seriously “incorrect”. Some of them were racist and some were sexist. Some of them were attempts to justify my privilege and some of them were the false problems that such privilege suggests. I don’t claim to have corrected all my views, but I do think I remain corrigible.

Though I was not aware of it very explicitly, I think I enjoyed, in my undergraduate days (the early 1990s), what Steve Fuller has since taught me to call “the right to be wrong”. That is, I was allowed to express my beliefs, no matter how erroneous, albeit always at the risk of being corrected by my peers about how wrong I was. I had the right to be wrong, we might say, but not the right to remain blissfully ignorant about what is right. I was entitled to hold my beliefs, even in the face of evidence to the contrary, but not to remain in the dark about that evidence.

Perhaps this was the privilege of being a philosophy major. But I think it applied quite generally at the university I attended. The “right to be wrong” is in effect wherever it is possible to succeed as long as your arguments are sound and reasonable, regardless of the conclusions you reach. In an important sense, you could be a homophobe in those days as long as you weren’t a bigot about it. It wasn’t about the beliefs you held, but the way you held those beliefs. It was not about your conclusions but about your comportment.

Today, words are considered acts, and sometimes even acts of violence; underlying sentiments are deemed right or wrong, and to even look at another person in a certain frame of mind is deemed inappropriate, oppressive, hurtful. So I worry about today’s students. It must be impossible to think under those conditions, acutely aware of a whole list of “thoughts” that, no matter what your basis might be for thinking them, reveal that you are already a contemptible person. We have these absurd cases now of students being expelled for the “micro-aggression” of reading, in public, a book that is merely about the incorrect view.

In our eagerness to make universities “safe places” for people of all races, genders, creeds, backgrounds, and orientations, we have forgotten how fundamentally “safe” a college education already is. There are few if any sticks and stones wielded on campus (certainly not by students against students in the course of their daily studies, though sometimes, let’s grant, students are involved in clashes between demonstrators and police). There are just lots of words. And, while there is a certain competitiveness and lots of social intrigue, it is subtended by a great deal of freedom: to change courses, majors, and even universities, to leave a party or to stay home in the first place. Most importantly, there is the certainty that it will soon end–at the end of the semester and, ultimately, with the granting of the degree.

As I’ve said before, school is a leisure activity. It lacks the seriousness of “real life” and should therefore be a place where a broader range of experimentation is allowed, where a greater freedom of expression can be afforded. We should not be so keen to identify and denounce particular beliefs that people hold. Instead, we should seize upon those beliefs as occasions to form a properly “intellectual” way of holding them. Though it may feel odd, we have to make people better at holding beliefs we don’t agree with. After all, being good at holding a belief means being able to change your mind about it in the light of evidence.

Formation, Part 2

Schools, of course, produce an “academic” frame of mind. As Pierre Bourdieu was fond of pointing out, the root meaning of “school” is leisure; the learning we engage in there is detached from the day-to-day labor that social life requires. In school, we learn to justify our beliefs, not in terms of our experiences, but in terms of the evidence that can be adduced for them. We learn to construct arguments for our claims rather than appealing merely to our own authority or that of others. That does not, of course, deny that authority exists. In real life, we do sometimes just have to do as we are told. But in the relative leisure of school we learn another kind of legitimacy, the authority of “right reason”. We are taught a rational procedure for forming beliefs.

In my last post I talked mainly about primary education. In my next post I will talk about higher education: universities, or what is also called “post-secondary” education. In between, then, there is something called “secondary” education: high schools and “prep schools” in North America, the lycée in France, gymnasium here in Denmark. It is here that the academic habit of mind is really supposed to form, to “prepare” the student for a university education, or to demonstrate very clearly that the student is not suited, either by ability or interest, for such an education. Interestingly, this happens during the student’s adolescence, which is a formative period in many other ways as well.

During their secondary education, students not only continue to develop distinct academic competences, they also develop their sexual interests, orientations, and competences, their political views, their sense of style, their athletic ability, their artistic talent, and so forth. Much of this is integrated into their school experience, some as part of the curriculum, some as extra-curricular activity, and some as part of ambient social life. Although middle-aged adults often don’t much recognize their teenage selves in their current beliefs and attitudes, much of our personality is no doubt grounded in our experiences in these years. And with this personality there is also what may be called an intellectual style.

Since my focus is on the formation of the academic habit of mind, I would like to identify distinctly scientific attitudes by first distinguishing and then connecting them to distinctly political attitudes. Both sets of sentiments, if you will, are explicitly educated in school–in the science classroom and the civics (or social studies) classroom. As I pointed out in my last post, we are trying both to get them to think in a certain way and to get them to think particular things. In the science classroom we want to develop a general “empirical” frame of mind in students, and also get them to understand a specific set of truths. In the civics classroom we want them to develop a healthy “deliberative” approach to politics, but also a belief in the virtues of democracy and the rule of law. If schools produced a lot of climate-change-denying fascists, for example, we’d consider them a failure.

But should we also be worried if our schools produced a few climate skeptics and a few fascists? Should we make students who don’t arrive at the orthodox scientific and political conclusions before they reach the age of eighteen feel “wrong”, or “stupid”, or even “evil”? Is it really so bad for a high school student to be unconcerned about global warming or sanguine about the prospect of dictatorship? What of the student who simply doesn’t believe the climate is changing? Or believes the change is natural, like the coming and going of ice ages in the past? What of the student who isn’t convinced that our democracy is working or, having listened a bit too much to George Carlin, that the country is run “by the owners for the owners”? What do you do with students who are just a bit too skeptical, or a bit too cynical, to “buy in” to what normal, “decent” people believe about our world and our history? What sense of their place should they have as they graduate from high school?

My sense is that educators are a bit too worried about what the students believe when they are eighteen. And they are not worried enough about what the students are able to do at the same age. In fact, it almost seems like they are worried that the students think too much and believe too little. The students themselves internalize this anxiety and, depending on their natural degree of conformity, look for beliefs they can either subscribe to or rebel against. Their teachers differ, of course. Some inspire them to conform, others inspire them to rebel, but the effect is the same. They become oriented around the doctrine, not the inquiry. The students want to be told what to think, not how to think. Or that, in any case, is what I worry is too often the case.

Perhaps, in high school, it is too soon to talk about climate change and democratic process. Perhaps, as in the case of biology and evolution in primary school, there are simpler processes, smaller machines, to think about and understand before the students should be asked to apply them to understanding global weather patterns and national election results. Perhaps students could be taught about relatively familiar institutions, like money and family in the civics classroom. Perhaps they could be taught about ocean circulation and cloud formation in the science classroom. Learning to understand these things will also teach them how to gain an understanding of the larger issues when, as adults, they will need to engage with them directly, as members of the citizenry.

They will then be able to form a qualified opinion about carbon taxes and campaign funding. They will understand how the basic components work and what proposed changes might accomplish. As they learned these basic components, they will have been treated as serious thinkers with open minds that don’t have to be “made up” in a hurry. They will come out of high school, perhaps with a wide variety of provisional beliefs about the world in which they live, but also with an understanding that those beliefs are likely to change over the next eighteen years of their lives as they are confronted with new evidence.

They will see not their current beliefs but their ability to think, not how they now experience the world but how they evaluate evidence, as the foundation for the next phase of their education, whether that be in college or the school of hard knocks. No matter how “dangerous” their ideas are at this point, they will not have gotten the sense that their teachers (and therefore their culture) found them threatening. Their minds will have been kept open for further participation in the “ongoing conversation of mankind”, at their leisure.

Formation, Part 1

Education is a formative process. At school, we learn not just what to think but how to think, and those two kinds of learning are, famously, not always pulling in the same direction. The better students are at critical thinking, the better they are at resisting your attempts to indoctrinate them. This is why you had better be selective in your choice of what to get them to believe. Basically, you want to make sure they come to think rationally what you want them to think ideologically. If you don’t respect their (correct) reasoning, you are going to do more harm than good, even if the end result is that they believe what you tell them, and even if what you tell them is true. It’s the process by which they arrived at the conclusion that matters.

Now, at each level, there are some truths that, as ends, seem to justify the means. Most curricula have a body of “doctrine” that it is deemed very important for students to learn. This is the legitimate content of, precisely, “indoctrination”. At the beginning, you don’t just want to teach them how to add, you do actually want them to think that two plus two is four. Later on, you don’t just want them to be able to read the classics, you want them to know that it was Polonius, not Hamlet, who said “This above all: to thine own self be true.” The point, however, is that you can, hopefully, get them to believe these things by reference to the evidence of their senses. You can put two apples next to two other apples and count. You read scene 3 of act 1 of Shakespeare’s Hamlet with your students and note the speaker of line 564. That is, you can persuade the students of the rightness of the doctrine without offending their sense of reason.

I wonder if we respect this principle as much as we should. Do we confine ourselves, at each level, to expecting students to believe us only about those things that we can defend with reasons? Do we sometimes say things that, though true, and though we do have reasons to believe them, we are unable, given the level of the students, to get them to truly understand? Do we sometimes expect them to believe things they can’t reasonably understand?

Let me take a perhaps controversial example, namely, the teaching of evolution in schools. There is a certain percentage of students, most famously in the United States, whose parents tell them that evolution is not true, and that God instead created them and made them the way they are. They would rather have their children believe that they are loved by God than that their dispositions are the result of a long, more or less random, process of mutation and selection. And their children arrive at school prepared to learn with that belief pretty firmly entrenched. We can say it’s not a “rational” belief, but it is one of the things they believe because the people they love most of all, namely, their parents, told them so. Believing what their parents tell them has, on the whole, proven to be a good strategy of belief formation for them. It has served them, so far, as a reliable “method”. (And keep in mind that many children who do believe in evolution do so for the very same “irrational” reasons. Their parents told them so.)

Then they arrive in school. Here they learn how to (better) read and write, add and subtract, observe and inquire. In the science classroom, these skills are then increasingly brought together as they learn how nature works. They see how dye diffuses in water, how larvae turn into moths, how full of life the water in the local pond is under a microscope. And they are told about a number of “theories” that explain these observations. They are told, for example, that a glass of water is really full of “molecules”, which are too small to see, but really are what the water is. (For fun, they go home and label the kitchen faucet H2O.) They learn the important concept of metamorphosis (if not the word) and get an insight into the life cycle, including, of course, the rudiments of an understanding of their own progress from a single cell to the child they are. (For fun, they come home and announce they’d like to be butterflies when they grow up!) They come to understand that “life” belongs not just to cats and dogs and trees, but also to very tiny things they can’t see with the naked eye. In other words, that life begins with something very small, and that there is, perhaps, some sort of connection between life and matter, between the smallest cell and the biggest molecule. (For fun, they imagine that God has a giant microscope he sees them through.)

In other words, they are learning some very basic things that they will need in order to assimilate the modern theory of evolution. There is still a lot to learn, both about natural history and about biological cultures, but they are well on their way. Given time, they may one day be able to understand how life on Earth might have started with simple proteins and progressed, through millions and millions of years, and dinosaurs and mammoths and Neanderthals, to us. But it’s a really big idea. It was first proposed less than 200 years ago, and the pivotal “neo-Darwinian synthesis” has only about seven decades behind it. It’s a difficult theory to learn, both for the culture and the individual. It took us about 200,000 years of being human to get there. It took Darwin about fifty years of being alive to come up with it. And it took Darwinians about another century to get it firmly established and integrated into our theories about life.

All along, some parents were telling their children that they should be grateful for the life that their loving God (or Great Spirit or what have you) had given them. I think many people who talk about the importance of “teaching evolution” (and not teaching creation) forget what an amazing scientific accomplishment the theory of evolution is, and what an amazing intellectual accomplishment it is to understand it. The idea is every bit as amazing as the idea that we’re here because some higher being loves us. And the thought that we descended from apes that in turn descended from bacteria is as profound as the feeling of being loved by an omniscient being. I think we do great harm to our children by rushing them into a “belief in evolution” at an early age, and certainly by denouncing a theory of life that they have been given by parents who love them. What we should be doing is forming their ability to observe nature and inquire into its workings. We should be helping them to formulate the question, “Why am I here?” Not giving them the answer.

In time, as they grow into mature thinkers, they will come up with their own “great synthesis” of God’s love and nature’s wonder. Maybe some of them will have to abandon their faith in order to keep their heads straight. Others will decide that the scientific view of life is not for them. Why should anyone mind? If we teach them how to think, they won’t need us to do it for them. If we help them become reasonable people, we won’t need them to think anything in particular.