How Do We Know?

This is a question that has occupied me all my professional life. How, in fact, do we know what is true and what is false? I have at times thought about this question in metaphysical terms and at times in sociological terms, at times as a problem of phenomenology and at times as a problem of psychology. I believe, quite generally, that knowing is something we do with our bodies, which is to say with our minds and our hearts, and that we do it together. We can’t know anything by ourselves. But we do have to do something. We can’t leave it altogether to others to know things for us.

All of this is quite trivial, perhaps. But I think our epistemological situation today has a certain urgency. Sometimes it feels like an outright emergency. Once we have raised the question of how we know things, and looked at the chaos of our means to do it, we can easily find ourselves wondering whether it is possible to know anything at all. For my part, I am optimistic. I not only believe that we can know a great many things, but that we in fact do know a great many things. And I’m confident that we will learn a great many more. When I say “we” I mean us as a culture, of course, but also you and me. I am entirely hopeful that I’m going to learn something in the years to come.

How, then, do I propose we do this? My epistemology these days is built out of three basic competencies. To know something we have to be able to (1) make up our minds, (2) say what’s on them, and (3) write it down. More specifically, we have to be able to (1) form justified, true beliefs, (2) discuss them with other knowledgeable people, and (3) compose coherent prose paragraphs about them. While I’m happy to discuss the standards against which we might evaluate these competencies, and while I of course grant that mastery is always relative, I will not consider someone knowledgeable (not even you, dear reader) if they declare that they lack these abilities. If you haven’t made up your mind about something, are unwilling to discuss it, and/or refuse to put it in writing, I will not grant that you know it. I think it would be great if we approached all claims to know things in this way.

Remember that every time someone expects you to believe them they are claiming to have knowledge. If not — if, when you scratch them, they say that they don’t actually know — then you have no reason to believe what they are telling you. The reason for this isn’t really philosophical but rhetorical: when someone claims to know something and asks you to believe them there is a conversation you can have to help you make up your mind. You can ask how they know. What tools and materials did they use to attain this knowledge? What theories framed their inquiries, what methods guided their investigations? What concepts did they use to grasp the data they were given? And their stories here will be plausible to you or not. You may find the stories so convincing that you simply believe them then and there. Or you may choose to retrace some of their steps, to try to replicate their findings. You may suspend belief as long as you like. At some point, however, if you want to know this thing, you will have to make up your mind.

Again, trivial notions that, to me, have some urgency in these times. I sometimes sense an indignation in my fellow human beings that it should be this hard to know things. “Why can’t we just believe what we are told?” people seem to say. They seem disappointed in their “post-truth” politicians and “fake news” journalists. They seem to feel entitled to be told the truth at all times. They won’t accept the burden of checking the facts for themselves. We seem to have lost our sense of charity about the “honest error”, both in the thought and the speech of our interlocutors and in our own hearing and thinking. We imagine everyone is always either speaking their minds plainly or deliberately concealing what they think. We imagine that what we hear is always what was said, and what we understand by it is always what was meant. We have lost our taste for the difficult work of believing things.

The knowledge of a culture is the product of countless well-meaning, but often misguided, people trying to figure out what the facts are, working among countless more who are trying to accomplish their goals, with or without our knowledge. The facts, of course, don’t make themselves known, even under the best of conditions. We must do it for them. But at some point, it sometimes seems to me, a generation decided that the work was more or less finished and there was nothing left for us but to believe. I think we’re beginning to see that this won’t work. I’m hopeful we’ll soon learn how to know again.

Flipped, Blended

There’s a lot of talk about “blended learning” and “flipped classrooms” these days, also here at CBS. I, too, deliver much of my writing instruction using a combination of online and in-person interaction. But I sometimes wonder if we’re not a little too excited about all this. I mean, hasn’t education always involved a “blend” of media, hasn’t the classroom always been “flipped”?

Consider the most traditional of classrooms. The students have been given an assigned reading and are told to show up to class prepared to discuss it. How is this less “flipped” than asking the students to watch a lecture or documentary at home and having them show up for class prepared to engage in some “interactive” task based on it? Likewise, isn’t traditional education already a “blending” of reading assignments, writing assignments, group work, classroom discussion and lecturing? The real issue, it seems to me, is not whether we are using multiple modalities to educate the students, but whether we can get them to do the activities we assign, show up for the encounters we have planned.

The introduction of IT has, indeed, changed this game somewhat. There is, especially, a tendency to immerse students in one or another form of “social media”, through which their learning process is continuously in contact with that of their peers and, at least as sometimes experienced, their teachers. Such contact is important, but I think we do well to reflect on whether it really does have to be continuous. I don’t want to get into the very deepest layers of this thorny issue, but we do have to be mindful of the difference between the slogan “Social media means you never have to be alone” and “Social media means you can never be alone.” One problem with our online lives is that they don’t really know when to leave us alone. Students participating in a course cannot be expected to react to updates at all hours, nor can they expect to receive information that was given out at a particular time and place that they were prevented from attending. We need to maintain a certain amount of order.

I think this is an important challenge for education in the future. While I am embracing the new technologies in my own practice (I’ve been blogging since before it was mainstream), I still defend a lot of “old school” sensibilities–brick and mortar, chalk and talk. I don’t, in particular, like the anti-lecturing rhetoric of those who are promoting the new pedagogies, just as I’m wary of the anti-prose rhetoric of the promulgators of “new literacies”. If we really did train the ability of students to read treatises, write essays, and attend lectures, I think our culture would be stronger as a result of it. The ability to read a 25-page chapter, write a 5-paragraph essay, or listen to a 45-minute lecture is not trivial. Training these abilities shapes minds of a particular kind. I believe those minds have a value.

Charity and Literacy

“Charity I have had sometimes…” (Ezra Pound)

I’ve been thinking about Twitter lately. When Carlo Scannella suggested, back in 2008, that Twitter might be an “oral” (sub-)culture, not a strictly “literate” one, the project was still in its infancy. Like him, I think a strong case can be made that the way people ordinarily use the platform does not exhibit their literacy. That sounds almost like I’m trying to be funny, but there’s a serious meaning behind it. In this post, I want to say a little more about that but keep in mind that Scannella has made a similar argument for blogging. I’ll look at that another time, as a way of picking up a project I started on my now-retired blog but never developed fully.

Let’s begin with the strongest argument for considering Twitter a site of literacy practices. There are explicitly literary projects on the platform, like those of Teju Cole and Eric Jarosinski. These are clearly meant to be read, however ironically, as writing within a symbolic order that connects it to the full corpus of our literature. In other words, Twitter certainly can be a site of literacy practices–it can even be used to produce literature. Of course, you don’t have to produce literature in order to demonstrate literacy, so the question remains. Is Twitter willy-nilly a site of literacy–simply, for example, because it requires people to use written signs?

Well, first of all, it doesn’t actually require that, does it? Twitter now lets people share images, still and moving, as well as recorded sounds, as easily as strings of words. It is possible to retweet someone else’s picture, with a comment that (literally!) is nothing more than a thumbs up. Being able to use Twitter, then, requires a very limited mastery of written signs. But that’s too easy an argument, I’ll grant. The interesting question (especially back in 2008) is whether Twitter is a literate culture even when it uses written signs, letters, words, sentences.

My first hint that it is not comes from my own experience. I have found distressingly many typos in my tweets. Apparently, I don’t proofread them properly; I just fire them off — just as I sometimes deploy imperfect sentences in conversation or extemporaneous lecturing. I treat my sentences on Twitter as utterances that I don’t know how will turn out when I begin. I don’t seem to feel nearly as compelled to edit them as I do when I’m actually writing. I’m also not especially bothered by other people’s typos. While spelling mistakes and missing words might make me stop reading a book someone had published, they don’t make me abandon that person’s Twitter feed. They seem perfectly understandable and, where errors make a tweet difficult to understand, I proceed as I would in conversation, asking my interlocutor to clarify what they must have meant. I don’t “correct” them. I think that’s a telling sign.

The same can be said about serious disagreements over facts and ideas. I approach any particular utterance as a tiny piece of a larger puzzle that doesn’t have to make sense on its own but will only be given a meaning as the conversation proceeds. Twitter is, ideally, located on what Rosmarie Waldrop called “the lawn of excluded middle”, “where full maturity of meaning takes time the way you eat a fish, morsel by morsel, off the bone.” Or, as Pound would perhaps have suggested, we must read each other’s tweets with charity, for maximum truth and meaning. Perhaps “it coheres all right / even if my notes do not cohere.”

But now we come to the crux of the issue. Are we engaging correctly with what Twitter is when we seize on an individual tweet and denounce it as an expression of someone’s racism, sexism or other vile disposition? Should we not approach an out-of-context tweet initially as a mere fragment of a conversation we’ve overheard? “Did s/he really say that?” does not have to be a rhetorical question. Perhaps there’s a typo or a missing word that accounts for the apparent error in thought, i.e., “incorrectness” of opinion. Perhaps it “coheres all right” in the larger whole of a detailed argument.

One of the characteristics of a literate culture is that it has a “record”, a body of “documents”. But not every utterance of a literate culture is, in fact, a document, even when, technically speaking, “written down”. Not every receipt or shopping list or scribbled note to oneself is a record of what a culture thinks and feels. We are pretty good at distinguishing passing remarks from serious propositions in speech. (We get a bit confused, however, when someone makes a recording of them.) Perhaps we need to practice greater charity in our approach to tweets. They do not document what any of us believes. They are merely part of the stream of language within which we have to find our composure. It is when we find it that we can produce a proper “text”, i.e., practice our literacy. Only then are we actually writing.

Curb Your Impact

Researchers are often advised to have an impact “beyond academia”. Indeed, Andrew Gunn and Michael Mintrom (2016) see the so-called “impact agenda” as focused on “university-based research that has non-academic impact, meaning it delivers some change or benefit outside the research community.” The basic idea is that the value of our universities lies in their contribution to society, for which the economy often serves as a proxy.

The EU’s 2020 target is to spend 3% of GDP on research and development, about a third of which is likely to be invested in university-based research (the rest being government and business-sector R&D). While 1% of GDP may not seem like much, it’s certainly a significant amount of money (about €150 billion) and it’s reasonable to expect a return. Indeed, Gunn and Mintrom detect “an impatience to see rapid and significant returns on the investments being made.” This agenda can certainly be felt by researchers, whose performance is increasingly measured through indicators of societal impact. They are now acutely aware of their stakeholders outside the university and they are earnestly engaged in catching their attention. But perhaps the impatience that Gunn and Mintrom detect in all this activity should give us pause.
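For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch. The EU GDP figure of roughly €15 trillion is my assumption, not stated in the post; it is simply what the €150 billion estimate implies.

```python
# Back-of-the-envelope check of the figures above.
# Assumption (not in the original text): EU GDP of roughly €15 trillion,
# which is what the post's ~€150 billion figure implies.
gdp = 15e12                       # EU GDP in euros (assumed)
rd_target = 0.03 * gdp            # Europe 2020 target: 3% of GDP on R&D
university_share = rd_target / 3  # about a third goes to university research

print(f"Total R&D target: €{rd_target / 1e9:.0f} billion")
print(f"University-based share (≈1% of GDP): €{university_share / 1e9:.0f} billion")
```

On these assumptions the university-based share works out to 1% of GDP, or about €150 billion, matching the figure in the text.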

Consider another target in the Europe 2020 strategy: 40% of people aged 30–34 are to have completed higher education. Already today, about 25% of 20–29-year-olds are enrolled in university in most EU countries, so this target (which is already close to being reached) is entirely realistic. In any case, a few percentage points either way won’t affect the point I’m going to try to make. With such a high percentage of the population being either the product of higher education or in the process of getting such an education, it seems odd to get the people (researchers, faculty) who are spending a mere 1% of GDP to invest a great deal of their time (a portion of that economic activity) thinking about how their research makes a contribution “beyond academia”. If the researchers confined their efforts to their peers and students, their impact would by no means be insignificant.

With the passage of time this impact would spread throughout the population as the students graduate and begin to apply the skills, facts and theories they have learned. While this will be slower than the impact a piece of research might have by directly shaping government policies and corporate strategies, it will be deeper and more lasting. This is especially true when we consider that the 40% of thirty-somethings who have completed higher education will presumably have a disproportionate impact on GDP (i.e., as a group they will contribute more than 40% of their cohort’s GDP-contribution). Finally, by letting students round off their understanding of the theories in “the school of hard knocks” before they are implemented at the policy level, some of the worst excesses of Ivory Tower thinking can be avoided.

Imagine for a moment what would happen if academics came to measure themselves only by the external performance indicators stipulated by the “impact agenda”. (Perhaps we could add to this an interest mainly in student evaluations as regards their teaching.) Suppose it was a matter of complete indifference to them what the students actually learned (so long as they were “satisfied” while at university) and what happened to the society and economy in the long run (so long as they were feted by bureaucrats and entrepreneurs.) In the short term, it is possible that high-quality research would inform well-crafted policy initiatives. But, after a time, researchers will learn (for they are not stupid) how to game the system to maximize their immediate, apparent impact. Mediocre but shiny ideas will crowd out the ideas that might actually work in practice. Many mistakes will be made. Meanwhile, since the minds of the students have been mostly neglected, the next generation of researchers won’t be up to the task of delivering what the best minds of the last generation were capable of. The race to the bottom will be on.

To avoid this, I would recommend evaluating academic research by measuring the value of higher education according to its long-term impact on the economy and society, not the immediate measurable effects of discrete pieces of research on the policy process. I think the short-term indicators (student satisfaction, policy impact) are ultimately distractions, and we might find that academia is performing well by them but driving the culture into the ground. If researchers think the most important audiences are outside their own disciplines and classrooms, they will stop giving their best thinking to their students and colleagues, waiting instead for them to occupy some position of power wherefrom they can commission a policy paper. Education becomes merely a way of conditioning students to draw on “academic” expertise when they get out into the real world. “Life-long learning” becomes a kind of conditioned dependence on epistemic authority. And peer review becomes the mutual appreciation society it has long been accused of being.

In short, I propose we curb our socio-economic enthusiasm, our pursuit of “impact”. Let’s direct our energy back into academia and wait for it to diffuse organically throughout society. As T.S. Eliot once said, you don’t make flowers grow by pulling on them.

Update: See also Tina Basi and Mona Sloane’s post at the LSE Impact Blog.

Peer Grading and Peer Review

There is an important connection between my thoughts on peer grading and my thinking on peer review. Peer review in the familiar sense of having two or three qualified scholars read and comment on a paper before it is published in a journal was once necessary because publication was a costly process and there had to be some fairness about who got access to the scarce resource of printed pages. The process had to be blinded (in my opinion) to encourage frank assessment of work and discourage nepotism that would have wasted valuable space. Similar practical considerations justify the traditional approach to grading, in which students submit a single copy of their essay and the teacher gives it a grade that is none of anyone else’s business. For centuries it was simply impractical to demand that students produce 25 copies (or more for larger classes) of their essays. But information technology has made the cost of publishing and making copies negligible. Or, rather, once a platform exists, the cost of publishing an additional paper or making an additional copy is essentially zero.

This makes it possible to imagine an “open” peer review system in which papers are posted online along with the reviews as they come in. I’m not against such a system but I’m not sure it’s actually necessary. To me, it seems easier just to have people post their work to their own website (which may be personal or hosted by a university, but should in any case link to an institutional repository to provide a stable URL) and see what other people do with it. “Publication” just means uploading it to a server and sending an email announcing it to your peers. All review in this world would be “post-publication”. That is, our ideas would be evaluated as we would, ideally, evaluate the results in published papers. Today, unfortunately, the “peer-reviewed” stamp has a tendency to discourage serious criticism of published work. That’s an important reason we’re in the middle of a “replication and criticism crisis”.

Now think of students and their grades. They work alone (or in groups isolated from the rest of the class) on papers they submit to the teacher. After some time they get a grade that passes some final judgment on them. They do of course hear about the grades their peers received on the same assignment, but how many of them then sit down and read those papers to learn what a “better” or “worse” paper looks like? No, once the grade is received, closure has (in the vast majority of cases) been reached. Just as too many researchers put their published work behind them, the students move on; it’s time to focus on the next assignment, the next grant proposal.

But we have long had the technology to make a completely different approach possible. In addition to their assigned reading, students can be asked to read each other’s work. They can be expected to “publish” weekly assignments to a class website, where their peers can comment on them. Here, they can teach others what they’ve learned from their readings and discuss the issues that have come up in class. And they can do so by way of acquiring the writing skills that scholars have. The peer feedback can be as mandatory as the assignments themselves. And it, too, can be graded. Indeed, even the students can grade each other, trying to apply the same criteria that their teachers apply. And they can in turn be graded on their ability to do this, their eye for quality work.

I have long suspected that the culture of peer review is an extension, or continuation, of the culture of examination. There are many who argue that the peer review system is broken, or at least seriously bent. Perhaps we can only change the way we engage with the published work of our peers by changing the way we engage with the work of students, and the way we ask them to engage with each other’s. After all, today’s students are tomorrow’s scholars.