Tag Archives: Usage

2015 in Review

Happy New Year! Before heading into a new year of blogging, I’d like to take a quick look back at the past year. My first post of 2015 was my attempt to articulate what I found so troubling about the tone of Steven Pinker’s 2014 book on academic writing. In this post, I argue that the value of Pinker’s insights on writing is obscured by his overly broad characterization of what ails academic writing.

I followed this defence of academic writing with a somewhat related topic: how we use metadiscourse. In this post, I talked about the evolution of signposting, suggesting that even boilerplate metadiscourse can be transformed into something that informs the reader while being well-integrated into the text. I think this notion is important because the prevalence of clunky metadiscourse shouldn’t be treated as an argument against the value of effective metadiscourse in academic writing.

My next topic was a very different one: whether I should use the singular ‘they’. At the simplest level, I decided to do so (spoiler!) because this blog is a place where I can pick and choose among style conventions as I wish. But, more generally, I made this particular decision to embrace the singular ‘they’ because I believe that it is necessary, correct, and beneficial.

During our summer term, I taught a thesis writing course. At the end of that course, a student sent me an interesting email questioning some of the assumptions animating my discussion of productivity. This note gave me a welcome opportunity to think more about the ethos underlying the notion of productivity, especially as it pertains to graduate student writers.

In lieu of writing a reasonable number of new posts this year, I spent an unreasonable amount of time classifying my old posts. What I came up with was an annotated list, published in September as How to Use this Blog. This list is now a permanent page on the blog, allowing me to update it as needed. It can be found by using the For New Visitors tab. This list is a good way to see the type of topics that I have discussed here and to find groupings of posts on particular topics. If you are interested in finding out if I’ve covered a specific issue, you might prefer to use the search function (located near the top of the left-hand column).

After a short post previewing AcWriMo 2015, I ended the year with a post on the way we write our presentation slides. In this post, I discussed the possibility that our presentations may suffer if we compose slides in the same way we compose our other written work.

As you can tell from the brevity of this list, the past year was a relatively quiet one here on Explorations of Style. The blog is now five years old, a milestone which has given me the opportunity to reflect on what comes next. While I will continue to publish new posts, my main project for the upcoming year will be to rework some of the older posts. I’ve learned so much over these five years—from readers, from students, from colleagues, from other bloggers—and some of my original posts need updating to reflect that development.

As always, I’m happy to hear about topics that you’d like to see discussed or questions that you’d like answered. In the meantime, thanks for reading and good luck with your writing!

Choosing the Singular They 

In this post, I want to talk about an issue that has been troubling me for as long as I have been writing this blog. Should I be using the singular they? That is, should I be using they as a gender-neutral pronoun for a grammatically singular antecedent? In general, I have not done so, but trying to fix this sentence from a recent post forced me to revisit that policy:

An established Harvard academic writing a book is doing something very different than a new doctoral student attempting their first article.

My usual way to circumvent this issue has been to use the plural. But that solution—‘doctoral students attempting their first articles’—worked dismally here. Making the whole sentence plural sounded daft, and making only the second half plural upset the comparison. So I left it as it was and made a note to make a more systematic decision later (and to make it the topic of a post).

People with much more expertise can give you actual reasons for using the singular they without compunction; I’ll include some helpful links at the end of this post. I’m only going to give my reflections:

1. It’s necessary. We need a gender-neutral pronoun in order to refer to a singular antecedent without specifying gender. The phrase that I often need to use in my blog writing is some variant of ‘When a student shows me their writing …’. Up till now, I have edited such sentences to read ‘When students show me their writing …’. While this is a solution of sorts, I’ve never particularly liked it; I want to be talking about a single generic student, not a bunch of students.

2. It’s correct. Despite what you may have heard, it’s not incorrect to use the singular they. The decision is ultimately a judgment call: Should we use the singular they or might it be disturbing to our readers? Will those readers recognize what we are doing? Might they find it incorrect or excessively informal? My main concern about adopting the singular they in this blog has been one about reception; if enough people believe it to be wrong, I’ve worried that it might be an unnecessary distraction. I’m ready now to let that worry go.

3. It’s beneficial. Using the singular they solves a real problem and gives us important flexibility in the way we reference gender. We should do more than just say that he can’t be a generic pronoun. Even saying he or she—which is obviously stylistically insupportable—makes it seem unduly important that we identify people by gender. Given our understanding of the complex ways that we perform and present gender, it seems entirely desirable to enrich our capacity to leave gender binaries out of places where they are irrelevant. Of course, there are those who argue for an entirely new gender-neutral pronoun, one which could refer to a specific person without identifying gender. Using the singular they doesn’t obviate this perceived need for a gender-neutral pronoun, but it does help. It may be that we will eventually say ‘Sam came to my office and showed me their writing …’ as a way of making Sam’s relationship to traditional gender categories irrelevant. Or it could be that a newly coined gender-neutral pronoun will emerge and take root. In the meantime, it is still beneficial to be able to use the singular they to refer to singular generic nouns and indefinite pronouns.

4. Finally, I can if I want to. If I see this practice as necessary and correct and beneficial, why not do it? In particular, why not do it in this space where I’m answerable only to myself? In the unlikely event that anyone cares enough to judge this decision, it won’t matter. I can continue to make decisions about this issue in other contexts as I wish. And I will certainly continue to teach this issue in such a way that students are aware of a range of opinions and practices. But if I think this usage is desirable and the main impediment is that it may ‘seem wrong’ to some, I think it behooves me to follow my inclinations.

So I’m making it official—this blog will use the singular they, as needed. I totally get that this is immaterial to all of you, but making the decision is a weight off my mind. If you are still troubled by this issue, I suggest having a look at the resources below. And, if you only have time for one, I recommend the first: Tom Freeman does an excellent job explaining the full range of associated issues on his terrific editing blog, Stroppy Editor.

Everything you ever wanted to know about singular “they”, Stroppy Editor

Singular ‘They,’ Again, Lingua Franca (Anne Curzan)

Epicene ‘they’ is gaining greater acceptance, Copyediting (Mark Allen)

There’s (Starting to Be) Some ‘They’ There, Lingua Franca (Ben Yagoda)

Singular they, you, and a ‘senseless way of speaking’, Sentence First

Dogma vs. Evidence: Singular They, Lingua Franca (Geoffrey Pullum)

The Oxford Comma, or the Limits of Expectation

Why talk about the Oxford comma? Surely everything has already been said, sung, or drawn. But why even have a blog if you can’t use it to share your great love of the Oxford comma? More importantly, its use comes up in all my classes; if I say ‘comma’ to a room of graduate students, someone will immediately ask whether it’s right or wrong to put a comma after the penultimate item in a list. I can explain both sides of the issue, but I cannot, in the end, answer the ‘right or wrong’ part of the question. Which leads me to another reason to talk about it: the very fact that there’s no simple answer to the question is instructive for the way we think about issues of style in our academic writing. Before delving into this topic, let’s look at what the Oxford comma actually is.

The first thing to note is that it is properly called a ‘serial comma’, the comma that appears before the conjunction in a list.

Peace, love, and happiness

Do you put a comma after ‘love’ or don’t you? That’s what my students want to know. Actually, that question is pretty easy—I do, always. The harder question—and the one they care about—is whether you should put a comma there. That’s the question that can’t be answered on the grounds of correctness. Both are correct, unless you are writing for a particular journal or press with a stated preference. Since most of my students are primarily concerned with writing their theses, they often have to make this decision themselves. And so they want a definitive answer. After I give a full accounting of the subtle pleasures and perils of the serial comma, someone is sure to say, while stifling a yawn, ‘so should we use it or not?’. Again, either way is fine, but here is a summary of my non-answer.

1. Despite the name and the song, the Oxford comma isn’t more formal. Students often tell me that they think it’s inessential but probably a good idea for a formal piece of writing. In truth, however, the serial comma is neither formal nor particularly stuffy. It’s not even more British: its presence in the Oxford style guide notwithstanding, it’s much more common in the US than in the UK. The only way in which it’s more formal is that it isn’t generally used in newspapers, where narrow columns mean space is at a premium.

2. All the evidence about ambiguity can cut both ways, but in ordinary academic writing—in which internally complex list items are routine—the serial comma will help more often than it hinders.

This theory of community engagement addresses tensions within the spheres of politics, arts and culture and finance.

This theory of community engagement addresses tensions within the spheres of politics, arts and culture, and finance.

In this case, I think that using the serial comma is the easiest way to show what goes with what. I also recommend using semicolons when list items get unruly, and those are always used serially. Overall, avoiding ambiguity is our responsibility as writers. If commas can’t help us, then we need to reword; that is, there are many instances in which the problem is not the presence or absence of the serial comma, but rather the awkwardness of the list itself.

During diagnosis, treatment and monitoring of a patient’s pathologies, measurements of medication levels are often essential.

During diagnosis, treatment, and monitoring of a patient’s pathologies, measurements of medication levels are often essential.

Measurements of medication levels are often essential during diagnosis, treatment, and monitoring of a patient’s pathologies.

While the second version avoids the obvious potential for misreading found in the first, the third may be better overall. I’ve written it with the serial comma, but it would, of course, be fine without.

3. Even if neither option is wrong, it’s still a good idea to make a decision. That is, just because both options are correct doesn’t make flipping back and forth desirable. Consistency is a useful principle when making style decisions. Doing the same things the same way every time is a kindness to your readers—since they will be saved the bother of wondering about stylistic inconsistencies—and a kindness to yourself—since you’ll be saved the bother of wondering what’s right each time. Some people will use the serial comma only when ambiguity might result from not using it, but I think that practice can potentially confuse the reader.

It took me a long time to get over my belief that the serial comma was inherently better. I believed this for years, and that was before I went to work for Oxford University Press, where it is house style. My editorial eyes are deeply attuned to it, and its absence trips me up every single time. But that just shows the limits of our own expectations. The serial comma is only better in a world where it is expected to appear. The only surefire solution is a world in which everyone does everything the same way, which is implausible and slightly creepy. The best we can do is choose our way and stick to it consistently, so our readers—consciously or unconsciously—become accustomed to our punctuation habits. Again, this is why I don’t like the practice of using a serial comma only when it’s obviously beneficial; inconsistency undermines our readers’ ability to become habituated to our writing style.

While the Oxford comma may seem more like a punchline than a punctuation mark at this point, I think that there is something important in this conversation. Writing requires us to make choices far more often than it demands simple rule following. And those choices should be made based on our best understanding of style conventions and reader expectations. The serial comma shows us the limits of expectation, but it also confirms the importance of making informed and consistent decisions about academic writing style.

Commas and Relative Clauses

Our task for today is to understand how we punctuate relative clauses. In the simplest terms, a relative clause is a clause that begins with a relative pronoun (which, that, who, whom, whose). Let’s begin by looking at this example of a sentence with a relative clause:

CNCP patients, whose complaints of pain are not adequately addressed, start to display aberrant drug-related behaviours that are mistaken for addiction.

This sentence—taken directly from student writing—is not incorrect as written, but it doesn’t say what the author intended. Here is what the author meant to say:

CNCP patients whose complaints of pain are not adequately addressed start to display aberrant drug-related behaviours that are mistaken for addiction.

The difference? The second version of the sentence shows that it is about a subgroup of CNCP patients ‘whose complaints of pain are not adequately addressed’. There are many CNCP patients in the world and only some of them suffer in this manner. The first version says that all CNCP patients have complaints of pain that are not adequately addressed. Because of the commas, we have to read the relative clause as supplementary information about all CNCP patients. Technically, our first sentence could be reworded as follows:

CNCP patients [all of them] have complaints of pain that are not adequately addressed. CNCP patients start to display aberrant drug-related behaviours that are mistaken for addiction.

Rewording the sentence in this way reflects the fact that the original sentence portrayed the relative clause as supplementary. But the author’s intention was not to provide extra information about this group of patients; instead, the author wanted to define a particular group of patients under discussion. The lack of commas in our revised version indicates that the information following the relative pronoun is integral to the antecedent noun:

CNCP patients whose complaints of pain are not adequately addressed start to display aberrant drug-related behaviours that are mistaken for addiction.

The integration of the relative clause is conveyed to the reader by the absence of commas. When we do use commas, we are telling the reader that we are providing supplementary information.

I chose this example because it is easy to see—even without being familiar with the subject matter—that the punctuation in the original sentence was probably misleading. One of the great difficulties in explaining how to punctuate relative clauses is that context matters. I always tell students to take whatever I have said about punctuating relative clauses home with them: only in applying those principles to their own sentences—sentences that they themselves fully grasp—will they come to understand whether a relative clause is integral or supplementary.

If you are familiar with this topic, you will notice that I am not using the traditional terminology (restrictive and nonrestrictive) or the usual variants (defining and non-defining, essential and inessential, identifying and non-identifying). It is possible, of course, to explain what is meant by these terms, but I have never found the common terminology to be particularly intuitive. More recently, I have noticed people using the terms integral relative clause and supplementary relative clause. I find these terms to be more intuitive, which is why I have started to use them in my classroom teaching. I would be interested to know if anyone has thoughts about whether this different terminology is helpful or just confusing.

Now let’s look at some more examples to reinforce the distinction between integral and supplementary relative clauses.

There are many narratives that can be used to illuminate the psychological concept of extraversion.

The relative clause is integral to the meaning of ‘narratives’. The sentence isn’t just telling us that there are many narratives. It is telling us that there are many narratives that can be used in a particular fashion.

The philosophical approach that is articulated by Rorty will set the tone for the proceedings at the conference.

This sentence is telling us what will set the tone for this conference. And it isn’t just any philosophical approach: it is the philosophical approach that is articulated by Rorty. Again, the relative clause is integral to the meaning of ‘philosophical approach’. Now let’s look at some examples of supplementary relative clauses:

Given the educational conditions in Malawi, which is located in eastern Africa, creative teacher training programs are essential.

Using transactional memory, which requires special hardware or software support, will address the problems associated with using locks.

Theorists argue that gender equity, which is defined here in economic terms, is a crucial component in any attempt to address the global AIDS crisis.

In each of these cases, the antecedent of the relative clause is completely sufficient without the relative clause. A country is a useful example since it is easy to see that you don’t need any additional information to know what is meant by Malawi. Its location within its continent is obviously supplementary information. Likewise, ‘transactional memory’ is a fully defined term: the fact that it requires special hardware or software support is extra information. Take that information away and the term itself is just as informative. In the third example, even though the supplementary relative clause claims to be defining ‘gender equity’, it is doing so in a supplementary way. The sentence is telling us that gender equity is crucial and it is also clarifying what gender equity means in this context.

Here is a final example, one that gives three different versions of the same sentence:

The articles, which stem from the 1970s and the early 1980s, show Lefort intent on persuading the reading public about the totalitarian nature of the Soviet Union and the countries of the Eastern bloc.

The articles that stem from the 1970s and the early 1980s show Lefort intent on persuading the reading public about the totalitarian nature of the Soviet Union and the countries of the Eastern bloc.

The articles which stem from the 1970s and the early 1980s show Lefort intent on persuading the reading public about the totalitarian nature of the Soviet Union and the countries of the Eastern bloc.

The first two sentences follow the pattern I have been discussing. I chose this example because it shows how easily ambiguity can arise when we’re not clear about the punctuation we need. The first sentence is discussing a group of articles and using its relative clause to give us extra information about when they were written. The second sentence, on the other hand, is using its relative clause to identify a particular subset of articles. The implication of the first sentence is that all the articles were written in the 70s and early 80s. The implication of the second sentence is that there is a broader group of articles (presumably spanning a broader time frame); the author is drawing your attention to a subset of that broader group. Needless to say, it is important for the author to clarify which is meant. In my own experience, the decision about how to punctuate relative clauses often helps me to clarify my own meaning. Similarly, in discussing this issue with students, it often emerges that they aren’t quite sure what they were hoping to convey through their punctuation choices.

But what of the third sentence? Is it the same as the second sentence or is it different? In other words, is it okay to use ‘which’ to introduce an integral relative clause? Yes, it is. But while I would love to leave it at that, I feel I should say something about how I view this issue. The good news is that we have already covered the important part: you must signal your intention to your reader through your use of commas. If the information is integral, skip the commas; if, on the other hand, the information is supplementary, show that with your use of commas. Simple enough. But you do need to choose a relative pronoun and, for many, that decision raises a certain anxiety. When I ask students about their habits in this regard, I get a range of replies (often involving something a high school English teacher once said): guessing and then feeling bad; turn taking (first ‘which’, then ‘that’); using ‘which’ because it is more formal; never thinking about it. For a fairly typical prescriptive discussion, see this post from APA Style. For a more nuanced, historical view, try Stan Carey’s excellent post on this topic (as usual, Carey also provides a very helpful roundup of what others have said about this issue).

Given the general uncertainty this topic engenders, what should we do? My own preference—and that is all it is, a preference—is to use ‘that’ without commas and ‘which’ with commas. The first part of this practice is unexceptionable: nobody uses ‘that’ to introduce supplementary information. It is the second part that causes heartache. Look at this simple table:

            integral    supplementary
that        YES         NO
which       ??          YES

My preference is to replace those two question marks with a ‘NO’. Not, to repeat, because I think this use of ‘which’ is wrong, but only because I like the clarity and simplicity of reserving ‘which’ and ‘that’ for different uses. I start with the important question—do I need commas or not?—and then use that as the basis for my decision about what relative pronoun to use. I explain this to students in just these terms: once they have sorted out the important issue of how to punctuate, they are free to choose their relative pronouns however they wish. But I do stress that this distinction is often treated in more absolute terms in advice on scientific writing. Whether or not this is true across the board, I do suggest that students preparing scientific papers consider reserving ‘which’ for instances in which they are using commas to convey supplementarity. For me it all comes down to this principle: if our audience might find a particular usage to be ambiguous—even if we know that it is perfectly acceptable—it can make sense to avoid that usage.

There is much more that could be said, but this post is already far longer than a blog post should be! If there is anything that you see as needing further explanation or elaboration, I would love to hear about it in the comments.

This post is the fourth in a series of posts on comma use. The first post dealt with commas and coordinating conjunctions. The second dealt with non-standard commas and punctuating for length. The third dealt with the importance of knowing when you need a pair of commas.

Impactful Pet Peeves

Everywhere I’ve been over the past week, people have been sharing this list of ‘grammar mistakes’. You don’t need to click on the link to know the sort of thing: a list of errors that are terribly egregious despite the fact that everyone makes them all the time. I am fascinated by the mindset that is unmoved by the prevalence of such ‘errors’. The pleasure of being right when everyone else is wrong seems to be so great that it obscures any sense that we should view the prevalence of a particular practice as relevant.

I generally try to avoid linking to things that I find as unhelpful as this list; you surely don’t need my help finding shoddy advice on the Internet. But I went ahead and did so because I want to point to two key issues with this list. First, very little on this list is grammar (and the bits that are grammar are either wrong or dismally explained). This observation is more than just a quibble. The perception among students that their writing problems primarily involve grammar means that they often view their path to improvement as both narrow and fundamentally uninteresting. This is not to say that grammar is actually uninteresting (obviously!), but rather that students might engage more readily with the task of improving their writing if they conceived of the task as having a broader intellectual basis. Improving your writing isn’t just fiddling with technicalities and arcane rules; it is a matter of thinking deeply about your ideas and your communicative intent. Calling it all grammar can be both dismissive and uninspiring.

The second—and more important—issue is the reasoning that underlies this list. A list like this says ‘all educated people should know these things, so avoid these errors lest you seem uneducated’. This edict misses an opportunity to talk about better reasons for avoiding certain usage patterns. For example, should you say ‘impactful’? It is meaningless to say that it isn’t a word: it is so obviously a word (if you aren’t sure, contrast it with ‘xsxsjwcrt’ and you’ll see the difference). But that doesn’t mean the world needs more instances of ‘impactful’. Use it at your own risk: most people find it icky and its presence in your writing may make them think unkind thoughts about you. Moreover, if something is having an impact on something else, you can likely convey that more effectively with a clear subject and a strong verb. Your writing will improve much more decisively if you disregard unnecessary discussions of legitimacy and instead think more about why certain usage patterns are so widely disliked.

After I had written this, I found a great roundup on this topic from Stan Carey. He discusses a range of these sorts of lists and provides his usual insightful response. He concludes with an excellent warning about grammar pet peeve lists: “Read them, if you must, with extreme caution, a policy of fact-checking, an awareness of what grammar isn’t, and a healthy disrespect for the authority they assume.”

Lastly, I really enjoyed the inaugural episode of the new language podcast from Slate, Lexicon Valley. The highly entertaining and wide-ranging conversation about dangling prepositions ends with an amusing discussion of Paul McCartney’s famous double preposition. A preposition at the end of a sentence is generally permissible, but it is probably best not to split the difference in this fashion: “But if this ever-changing world in which we live in/Makes you give in and cry/Say live and let die”.

Links: Language Sticklers

Last week, I posted a link to a brief discussion in The New Yorker about the inclusion of abbreviations (OMG), symbols (♥), and slang (muffin top) in the OED. I included it originally because I was amused at the predictable outrage.* But as I thought about it more, I began to feel that there was an important point to be made about usage and writing. Not only is the outrage based on a misunderstanding of the proper role of a dictionary, it also overlooks the importance of context for writing decisions. The dictionary doesn’t tell us what words are suitable in our writing; the inclusion of annoying usage in a dictionary has no real role to play in our usage decisions for formal writing. We have to decide what language is appropriate for our purposes and our audience, and our ability to make those decisions comes from having a sound grasp of the specific context in which we write.

This brief consideration of language outrage brings me to a post from the New York Times’ Schott’s Vocab blog by Robert Lane Greene. Drawing on his book You Are What You Speak, Greene discusses what language sticklers get wrong. I particularly liked his consideration of what he calls ‘declinism’: the view that language was better yesterday and will be worse tomorrow. His most interesting point concerns the role of mass literacy: “So a bigger proportion of Americans than ever before write sometimes, or even frequently, maybe daily…. A century ago, a nation of 310 million engaged with the written word on a daily basis was unthinkable. Now its uneven results are taken as proof by some that language skills are in decline. That is far from obvious.” Here is a review of Greene’s book, also from the New York Times. In this review, Geoffrey Nunberg provides an amusing summary of Greene’s critique of modern ‘declinism’: “We’ve passed from the thoughtful homilies of Fowler to the pithy dictums of Strunk and White to the operatic curmudgeonry of modern sticklers like Lynne Truss, whose gasps of horror at the sight of a misplaced apostrophe are a campy cover for self-congratulation.” I agree, and I love the irony of the sentiment: The written word may not be in decline, but the quality of the jeremiads is definitely slipping.

* I was also reminded about an article by Ben Yagoda that I discussed in an earlier post; Yagoda observes that student writers don’t generally use slang in their writing, preferring instead to use the vaguely elevated language that he calls ‘clunk’. The inclusion of slang in a reputable dictionary isn’t likely to cause an outcropping of informal academic writing. Novice writers may need help managing formality in their writing, but not because they are confused between their academic writing and their social media writing.

Links: Journal Article Publishing, Paywall at the Times, Additions to the OED

The blog PhD2Published recently ran a three-part series on journal article publishing: getting started; choosing a journal; and dealing with rejection. If you are thinking about publishing for the first time, it is a great idea to expose yourself to as many sources of information and opinion as you can; this blog has an extensive list of academic publishing resources. If you are working in the sciences, you may also be interested in this piece from Science on publishing in scientific journals.

This blog post from The Scholarly Kitchen discusses the new paywall at the New York Times. The author points out that the paywall allows the paper to charge organizations for access. While some individuals may get around the paywall by accessing Times’ stories through social media or blog readers, institutions will pay for subscriptions, giving the Times the financial support it both needs and deserves.

Lastly, here is something from The New Yorker Book Bench blog on the new additions to the OED. Ian Crouch has given an amusing account of the predictable outrage that attends any inclusion of novel coinage in an authoritative dictionary. In his words, the OED is a “far-reaching collection of English words, with an eye to history, which aims to be both prescriptive (these words and only these words are correct) and descriptive (these are the words that are used, and here’s how). When historians, linguists, and the generally curious want to know how people spoke in the twenty-first century, it will be useful to know about OMG and LOL, and how the phrases reflected usage that ranged from serious, to semi-serious, to full-on ironic.”

Links: Grammar Day, Fowler Revisited, Googlization

Happy National Grammar Day, everyone! Needless to say, it is always grammar day around here, but I am glad all of you get to join in the fun once a year. If you click the link, you’ll find grammar day e-cards, a theme song, a recipe for the grammartini, an exposé of common grammar myths, and even merchandise!

From The New Criterion, here is a review of Fowler’s A Dictionary of Modern English Usage: The Classic First Edition. In the review, Barton Swaim discusses a reprint of Fowler’s first edition as a way of revisiting the ongoing debate between descriptivism and prescriptivism. Swaim argues, in effect, that prescriptivism is both inevitable and way more fun. We will always, in his view, go looking for expert opinion about our writing decisions. And those expert opinions will be more stimulating than the bland descriptivist work of academic linguists. I have no trouble with the view that prescriptivism is more entertaining and often more immediately satisfying. However, Swaim’s highly dismissive account of descriptivism fails to consider the possibility that persistent exposure to descriptivism might actually change the way we approach questions of usage. That is, a dominant descriptivist view might discourage our belief that all educated writers should use language in only one way and that all deviance from that way is deficiency. It may be unsatisfying to be told that a particular usage will be acceptable to some readers and unacceptable to others, but that may be all we, as writers, can hope for: a sound description of current practice to help us make up our own minds.

From Inside Higher Ed, here is a review of Siva Vaidhyanathan’s book The Googlization of Everything (And Why We Should Worry). The review includes an interview with Vaidhyanathan, who offers a clear-eyed case that Google’s ubiquity is significant. If everyone gets their information via Google, it matters how Google’s search standards are constructed. Despite its title, the book is less a condemnation of Google than a call for greater public conservancy of the wealth of human knowledge; Vaidhyanathan is urging us to have a serious discussion about the responsibilities and risks of the digital scholarship era.