Must Our Writing Instruction Be Research-Based?

At first pass, this might seem like a no-brainer. “Of course our writing instruction should be based on research about writing!” we want to say. But some strong versions of this claim have given me pause. “If you’re not in writing studies, you no longer get to write those articles about how students can’t write,” Amanda Fields recently said on Twitter. “All you have to do is scratch the surface of composition history and theory to understand how ridiculous you sound when you claim that students can’t write.” I think she was referring to the same Robert Zaretsky piece in the Chronicle of Higher Education that I mentioned on Friday. That piece (and my post) is probably also what Elizabeth Wardle was thinking of when she tweeted that “there is something to know about teaching and learning writing. Our students deserve to be taught from research-based practices. Otherwise you are actually harming them.” Can we really dismiss experience-based criticism of student writing as “ridiculous”, merely because it doesn’t come from a professor of writing studies? Does it “actually harm” our students to be taught how to write by teachers who, let’s say, “merely” do a lot of writing and reviewing themselves as part of their own scholarship? That’s what I want to think about.

Consider a more modest activity. Suppose we weren’t talking about the basis on which we teach writing but the basis on which we offer advice. Does writing advice have to be research-based? I can’t imagine that Fields and Wardle would insist on it. After all, a lot of the advice we pass on to our students comes from working journalists and novelists, i.e., people who have a demonstrated ability to write but have neither the time nor the desire to study it formally. They just have their experiences to draw on, and they’re often happy to share them in the form of tips, or sometimes even “rules”, for good writing. Do we dismiss Stephen King’s writing advice as baseless because he doesn’t have a PhD in writing studies? Or do we just take it with the requisite grain of salt? I think the answer there is obvious and I think we’d have to grant that Zaretsky is an accomplished enough scholar to offer academic writing advice in this spirit.

And research, in fact, also has to be taken with a grain of salt. I used to cite Arum and Roksa’s Academically Adrift to my students, hoping to persuade them that “research shows” that writing makes you smarter and group work makes you dumber. (I didn’t say it exactly like that.) But then I read Freddie deBoer’s critique and realized that it wasn’t as simple as that. I still believe that writing is good for you and that group work is better for learning how to work in groups than for learning course content, but I no longer declare this to be demonstrated by “the research”. (I still sometimes mention it, I must admit.) The point is that the rhetorical force of an appeal to research isn’t univocal. To invoke research is always to invite critique and this commits you to a longer conversation. At the extreme end, we have research like that of the Cornell Food Lab, which had a great influence on the management of, e.g., school cafeterias, until it was discovered to be of doubtful quality. Interestingly, the problems were not discovered by peers and practitioners applying the insights, but by outsiders who were intrigued by the lab’s director’s methodological claims.

Indeed, application itself can become a source of error. If you’re going to provide research-based instruction you’re going to have to translate a study’s results into implications for your own practice. You’ll have to prepare a lecture that will work with your particular students and design an assignment that is relevant for them. If the study is going to guide your own practices, you have to be very sure that you understand its results. Even if a certain intervention has been found to have a positive effect, it may be that too much of it will end up having a negative effect. Or it may be that the effect depends on doing or not doing something else at the same time, or before, or after. Failing this, the whole thing may either be a waste of effort or outright harmful. I should say that I’m less worried about “doing harm” than some writing scholars — like Wardle, quoted above — but if we’re “actually harming” our students when we don’t base our pedagogy on research, then surely that possibility remains if we misunderstand that research?

Wardle describes pieces like Zaretsky’s as expressions of “frustration with a few student writers” from which teachers draw “conclusions based solely on their own experiences, histories, and biases.” Against this, she demands “not personal feelings and frustrations, but research-based evidence grounded in more than a sample of one.” Indeed, “the field of rhetoric and composition [has] spent decades gathering such research about student writing,” she tells us.

This all sounds very impressive and it seems perfectly reasonable to value such research. But isn’t she here pitting it against a straw man? Is it fair to characterize Zaretsky’s ideas about student writing as grounded in a sample of one? Hasn’t he been exposed to hundreds, even thousands of student essays by now? And can’t he also claim to have thought about the problem for decades, i.e., his whole career? Is it accurate to say he’s merely expressing his “personal feelings”? I imagine his experiences are a lot more serious than that, and he has probably even carried out a few experiments to test his ideas. Also, though he may use only a few students as examples, he is obviously presenting them as representative of a general problem. He’s not saying it holds for all students, only that there is a significant group of students that can’t write as well as he’d like. Importantly, it’s his considered opinion that they could write better if he took the time to teach them. Does he really need to carry out a full-blown study of writing practices to support that claim? Can’t he just express his experience-based opinion and see where the discussion leads?


No Man’s Land?

At the start of the summer, Robert Zaretsky published a piece (“Our Students Can’t Write”) in the Chronicle of Higher Education that was roundly denounced on Twitter by my fellow academic writing instructors. John Warner, author of the widely praised Why They Can’t Write, was perhaps its harshest critic, suggesting that Zaretsky is “plug ignorant about how writing works.” Nigel Caplan objected to the characterization of “writing as ‘tool,’ writing teacher as ‘mechanic’ and writing center as ‘repair shop’,” and Elizabeth Wardle questioned Zaretsky’s expertise on student writing. All seemed to agree that the piece would have been better left unwritten or, if it really had to be written, that it should not have been published in the CHE. Last week, in any case, Wardle published a (somewhat oblique) response, “What Critics of Student Writing Get Wrong,” which was well-received by the same community. In this post, I want to look at the tension between university writing instructors and content teachers that Wardle and Zaretsky instantiate.

It’s important to begin by noting that both are accomplished scholars in their respective fields. Zaretsky is professor of world cultures and literatures at the University of Houston. Wardle is professor of written communication and director of Miami University of Ohio’s writing center. I’m going to presume that both are tenured or enjoy tenure-like job security. Both have substantial academic CVs and writing credits. In fact, I was taken aback by Warner’s suggestion that Zaretsky doesn’t know anything about “how writing works”; I read some of what he’s written about Albert Camus and intellectual life and it seems like altogether competent craftsmanship to me.

What Warner probably meant was that Zaretsky doesn’t understand how student writing works, and this, indeed, is also what Wardle was after with her jab at his “expertise”. “It’s easy for teachers to take their frustration with a few student writers and extrapolate from it a number of conclusions based solely on their own experiences, histories, and biases,” she writes. “But academics should demand more from such public statements. We should demand not personal feelings and frustrations, but research-based evidence grounded in more than a sample of one.” That is, while Zaretsky, like Wardle, has a great deal of experience with student writing, his reflections are not research-based and, while he may be an expert academic writer, he cannot be considered an expert in writing pedagogy. This, I think, marks one important fault line in the debate.

The other is institutional, and here it was Zaretsky who threw down the gauntlet. In addition to “scrawling a couple of comments at the bottom of [a student’s] paper,” he suggests, “we might urge the student to pay a visit to the writing center. Such centers, however, are as easily abused as used, often reduced to the pedagogical equivalent of the confessional, a place where students are absolved, not cured, of their writing sins.” (I think I know what he’s getting at here; a lot of writing instruction these days seems to take the student’s side on the stuffy conventions of academic writing. Jennie Young, who directs the first-year writing program at the University of Wisconsin, for example, recently called for dropping conventional citation requirements.) Zaretsky believes that the solution is to give content teachers more time to teach writing as part of the curriculum: “writing-intensive courses must become the norm, not the exception, over the entire course of a four-year education.” And they should “reward tenured professors who retooled as composition teachers and reassure tenure-line professors that teaching writing is as important as writing monographs.”

John Warner took particular exception to the last suggestion: “as though there isn’t already a wealth of highly skilled composition teachers out in the world eager to do the job, but unable to do so because of their contingent status. WTF?” That comment is somewhat ironic in light of a post Warner wrote back in 2015 on his IHE blog, in which he made it clear that composition teachers cannot prepare Zaretsky’s students to write the papers he wants them to write for his class. In fact, he offers nine whole reasons why composition courses cannot guarantee the requisite writing ability needed in “history, philosophy, sociology, economics, political science, whatever” courses. Surely that leaves some room for the sort of thing Zaretsky proposes?

The battle lines seem to be drawn up. On the one hand, there’s the question of who should be teaching students to write — content teachers or writing instructors; on the other, there’s the question of what the basis of writing pedagogy should be — personal experience or scientific research. The tragedy is that everyone seems to agree that the students need to be able to write, and that most of them need help learning how to do it better. I’m going to spend a few posts looking at the arguments on both sides in greater detail. After all, I find myself in an odd position, worthy of some careful reflection. Institutionally, I feel more aligned with Wardle (or, indeed, Warner): I’m a writing consultant based at a university library, not a content teacher in an academic department (though I do some adjunct teaching on the side). Intellectually, however, I feel a greater kinship with Zaretsky’s position: writing instruction, I fundamentally believe, should be derived from the writing experience of scholars, not “scientific” studies of student (or even faculty) writing practices. This is no doubt why I sometimes feel like a lonely wanderer in the wasteland between two entrenched armies. And these blog posts, perhaps, are nothing more than so many fragments to shore against my ruin. Let’s see.

Good Writing is the Creative Destruction of Bad Ideas

Yesterday, while preparing some brief remarks for my students in a course on the management of innovation in organisations, I came up with what I thought was a pretty good aphorism. A few hours after tweeting it, I realized it could be much improved (for both truth and style) by adding a qualifier. It’s not all writing that is the creative destruction of bad ideas, but good writing almost always is. When done right, our writing exposes our ideas to the winds of criticism. Some ideas stand up, others yield, bend and survive, and some are swept away like so many dead leaves that have served their purpose. The metaphor has its limitations, as all metaphors do, but I must say I am rather fond of it. Let’s see how well it stands up to being written down.

“Schumpeter’s Gale” is the process by which innovation renders existing practices, and even whole industries, obsolete. These practices weren’t “bad” at their inception, of course, but their value diminishes as the new process sweeps the land. New processes themselves, before they take, must of course stand up in the storm and stress of existing market forces. Some innovations don’t ever make it; ask your local venture capitalist how many investments turn a profit. Those that do have proven their worth, at least temporarily.

Perhaps I have an occasion to coin a phrase. Let’s call the creative destruction of bad ideas through critical discourse “Popper’s Gale”. This corresponds nicely with the division of labor between Hayek, Popper and Schumpeter proposed by The Economist.

Hayek and Popper were friends but not close to Schumpeter. The men did not co-operate. Nonetheless a division of labour emerged. Popper sought to blow up the intellectual foundations of totalitarianism and explain how to think freely. Hayek set out to demonstrate that, to be safe, economic and political power must be diffuse. Schumpeter provided a new metaphor for describing the energy of a market economy: creative destruction.

Popper didn’t just propose to think freely, of course, but critically. His signature idea was to define science, not in terms of “truth”, but in terms of the falsifiability of our claims. Science was less about what we today somewhat uneasily call “knowledge production” (as if the point is to produce ever more of the stuff) than about the identification and destruction of falsehoods. Whatever remains standing in the strong winds of criticism is true … but only for now. The point isn’t to believe but to keep testing our ideas, keep looking for their weaknesses and let them fall as they must. An open society, we might say, is not to be measured by how frequently it makes new discoveries but how efficiently it discards false hypotheses.

Steve Fuller has recently taken this further, construing Schumpeter and Popper’s twin gales as harbingers of “the post-truth condition”. The authority of science itself can be seen as undergoing a kind of creative destruction, a sort of secularization applied to itself, as Fuller puts it (2018, p. 108). But, as he’s quick to remind us, this isn’t really some new crisis for our culture. Indeed, he would probably point out that both Schumpeter and Popper appropriated some crucial concepts from the Marxist tradition, which gave us not just “creative destruction” but also the evocative notion of “permanent revolution”. Popper deployed the latter in his engagement with Kuhn, and their disagreement remains an “essential tension” in the philosophy of science.

Let me conclude with one last gesture at the winds of history, call it “Wittgenstein’s Gale”. When he took a job as a country school teacher, his sister famously complained that this was like using a precision instrument to open crates. His response is no less famous:

You remind me of somebody who is looking out through a closed window and cannot explain to himself the strange movements of a passer-by. He cannot tell what sort of storm is raging out there or that this person might only be managing with difficulty to stay on his feet.

When we’re teaching students to write we are teaching them to stand up in what must sometimes appear to be a looming storm of criticism. It’s important that they learn to face it because if they just stay in the house they’ll never know how good their ideas really are. Of course, there’s no way around showing them how bad some of their ideas are too. But if they don’t hold on to them too tightly, too sentimentally, if they are willing, sometimes, to give them up to the wind, they will find walking much easier. In science, we might say, revolution is normal. Ideas are destroyed only to make the creation of better ones possible.

A Bigger Iceberg (4)

I normally imagine an ordinary empirical research paper in the social sciences as an arrangement of about 40 paragraphs. The first three serve as the introduction. There are then five paragraphs in each of the background, theory and methods sections, followed by a fifteen-paragraph analysis (preferably divided into about three themes), and a five-paragraph discussion section. The remaining two paragraphs go into the conclusion. That’s the order they appear in the paper and, ideally (though often not really), the order in which they are read. We might call it their “narrative” or “rhetorical” order. Each paragraph occupies about one minute of the reader’s attention, after which the reader, again ideally, moves on to the next. It should take about forty minutes altogether to read the paper.
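As a playful sketch of the arithmetic (the section names and paragraph counts are just my idealization, not a formula), the outline above can be tallied in a few lines of Python:

```python
# An idealized outline of an ordinary empirical research paper,
# with each paragraph budgeted at roughly one minute of reader attention.
outline = {
    "introduction": 3,
    "background": 5,
    "theory": 5,
    "methods": 5,
    "analysis": 15,  # preferably divided into about three themes
    "discussion": 5,
    "conclusion": 2,
}

total_paragraphs = sum(outline.values())
reading_minutes = total_paragraphs * 1  # one minute per paragraph

print(total_paragraphs)  # 40
print(reading_minutes)   # 40
```

The point of writing it out this way is just to make the budget visible: the analysis gets as many paragraphs as the three framing sections combined, and the whole thing fits in forty minutes of a reader’s attention.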

But what about the logical order of these paragraphs? And what about their epistemological foundations? Here it will be useful to arrange them somewhat differently and, as my title suggests, Hemingway’s conception of the “dignity” of our writing will be useful too: “The dignity of the movement of an iceberg is due to only one eighth of it being above water.” Now, the entire paper is, of course, “above water” and, in my last three posts, I’ve tried to convince you that there needs to be a solid mass of experience, reason, and reading below the surface to bear it up. But suppose we cut the forty paragraphs of your paper off at the waterline and pushed it into the sea to fend (float) for itself. By Hemingway’s math, 5 would now float on top and the rest (35) would sink out of view beneath the waves. Which five would we see?
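Hemingway’s one-eighth ratio, applied to the forty-paragraph paper, works out as follows (again, a sketch of my own metaphor, not a rule):

```python
# Hemingway: "The dignity of the movement of an iceberg is due to
# only one eighth of it being above water."
total_paragraphs = 40
above_waterline = total_paragraphs // 8      # the visible tip
below_waterline = total_paragraphs - above_waterline

print(above_waterline)  # 5
print(below_waterline)  # 35
```

Five paragraphs float above the waterline; the other thirty-five sink out of view and bear them up.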

I think the obvious answer here is the introduction and the conclusion. In fact, they would float in the opposite order, with the conclusion at the top and the introduction below it. Indeed, I’d expect the key sentence of paragraph 39 (the first paragraph of the conclusion) to outmaneuver paragraph 40 for the top of the tip of the iceberg, planting a flag there with your thesis statement emblazoned upon it. Under paragraph 40, you have paragraphs 1 and 2. And under paragraph 39 you have paragraph 3 (which contains your thesis statement but does not state it). If we were now to dive down and explore beneath the surface of the water, we’d find paragraphs 4 through 8 (the background section) bearing up paragraph 1 and paragraphs 9 through 13 (theory) bearing up paragraph 2. Under paragraph 3 things are a bit more complicated because of all the work it does in the introduction: paragraphs 14 through 18 (method) bearing up a part of it, 19 through 33 (analysis), another part, and 34 through 38 (discussion) the rest. Many teachers and reviewers will tell you that the quality of the introduction and conclusion of a paper predicts the quality of the rest. This is because the dignity of the movement of your prose in those five paragraphs depends on the existence of the rest of the paper. You don’t want your iceberg to be a mere inflatable beach toy, a mere surface full of air. What you see above the water should indicate a much greater, much more dignified mass below. Something solid.

Like I say, I’m working on some illustrations for this conception of a paper, and I hope to have them finished at a reasonable quality by mid-September. But I’m happy to hear what you think of the idea as expressed in words. I’m also interested to see how you visualize it.

A Bigger Iceberg (3)

If you want to do science you’re going to need theory and method, and you’ll have to write about them. Here, too, Hemingway’s ratio holds. The theory section will not exhaust your theoretical understanding, and the methods section will not be able to explain everything you did to collect your data. Below the surface of your text lies all the reading of the literature in your field that you’ve done, and all the care you took in selecting, sampling, observing, recording, transcribing, coding and so on. You could write a whole book on the theories that frame your research (thankfully that’s usually already been done for you). You could talk for days about your fieldwork or experiments or surveys. The dignity of the movement of your writing depends on all the things you could say but don’t, all the questions you are able to answer if someone should ask.

Think of theorizing as a particular kind of reading. It’s not just reading, of course; it takes a lot of thinking too, but much of this thinking can be understood as a way of engaging with what other people have written. You can’t tell the reader about everything you’ve read, and there should be a reserve underneath the surface of the text. You should have a much wider knowledge of the discourse than you make explicit in your theory section. What’s important is to approach your theory as a shared framework for thinking about your research. Your reader is as theoretically sophisticated as you are; their mind has been formed by the same discipline as yours. You review the literature, not so that you can spare your reader the trouble of reading it, but to learn how your reader thinks. The theory section is mainly a reminder to the reader of what the theory should lead us to expect of the analysis. It activates the reader’s theoretical dispositions.

This makes it very different from your background section, which is intended to inform the reader about things the reader may not be aware of. The sources you use in your background section will therefore not be as familiar to the reader as those you use in your theory section. The background is provided by newspaper coverage, popular histories, non-fiction books, company reports, official statistics and so forth. You are telling the reader something that you are entitled to presume the reader doesn’t know. You will be explaining it in those terms, in a particular, helpful tone of voice. But again you will not be able to tell them everything there is to know about the subject. Think about the sorts of questions a reasonable, curious reader might have and make sure you’d be able to answer them from the depth of your knowledge. There should be many more details under the surface than you make explicit in your text.

Your methods section explains how you gathered your data. It is important to keep in mind that your basis for writing it is your own experience, i.e., what you actually did to collect your data. This part of the iceberg is actually very similar to the little anecdote we began with. The tip is just your honest narration of a number of things you did. Of course, it is framed by the methodological literature as well, and it also requires you to reflect seriously about sources of error and ethical issues, but the substance of your methods lies in your own personal research experience, “the sequence of motion and fact,” we might say, cribbing from Hemingway, that produced the data and which might ensure its validity going forward. Need I say that you will have done much more in this regard than you can say? The tip of the methodological iceberg is just a simple story, but underneath it are all the questions you can answer about how and why you did the things you did.

Finally, we have that special class of experience we call “data”, “the given”. These are the brute facts that you can take for granted in your analysis and interpretation of your research objects. The reader has been induced by your methods section to trust that this is how the people in your sample answered the surveys, or this is what they said during the interviews, or this is how they behaved during your field visits. Or the reader trusts that the data has been collected responsibly from a publicly available database and organized properly for analysis. The data that is available to you for analysis should far exceed the data points you present in the analysis section of the paper. You will present summaries of your data, not the raw data itself. But the raw data should, of course, exist. Indeed, these days you are likely to receive an email from a critic who’d like to examine it. So make sure it’s there. Your dignity depends on it.

By distinguishing the reading you share with the reader (theory) from the reading you don’t (background), and the experience that is yours alone (method) from the shareable product of that experience (data), you get a good sense of what the iceberg of your text looks like under the surface. In my next post I’ll try to bring it all together into a coherent image of a whole research paper.