In Defense of the Grammar Nazi

Watching television is often a complete waste of time and a total abdication of Life. I admit that I suffer from anguish, shame, and guilt if I’ve just spent three hours viewing re-runs of Family Guy, Frasier, and Bar Rescue, so I’m as guilty as the rest of the world when it comes to Couch Potato Syndrome [1].

Nevertheless, if you try really hard, you can turn your sin into a virtue by questioning what you’re seeing and thinking about how it applies to what you should be doing instead. And one of my more recent observations has been in relation to attitudes towards “skills” and “expertise.” So let’s start with one of my favorite gripes – the Celebrity Chef.


[Image: “Chef in Black and White” by Fuurin Kazan / CC BY 2.0]

Americans in general seem to love watching people in conflict. I dare say that almost every “reality show” is predicated on the need to see people fighting while trying to live on an island, etching tattoos, singing, dancing, or even, for goodness’ sake, baking a cake! (“You’re going down, punk, when I whip your ass with this amazing fondue!”) The more spiteful and bitter the contestants, the better the ratings. Rome had the Colosseum with gladiators and Christians; we have cable TV with hosts and contestants.

But what all of these shows include is the Expert who exemplifies the target skill; the master who can turn something mundane and mindless into a work of sheer brilliance. Celebrity Chefs are such people. For them, the difference between a winning dish and one that has them vomiting into a bucket is whether the cook added two bay leaves or three. And woe betide anyone who should cook a steak for three seconds too long – because there lies the way to the door. Fundamentally, the message being sold to us here is that fine attention to minuscule details is what makes a chef a great chef.

And we all sort of accept that.

Meanwhile, over in the sports arena, the same message is being played, except that the skill here is seen in the slightest of motions and the briefest of actions. In golf, a putt that falls just an inch short of the hole is a miss that could have been avoided if the merest extra effort had gone into the hit. In swimming, that last kick while extending the fingertips to touch the wall is the difference between a gold medal and nothing. In baseball, two degrees of extra angle when the batter hits the ball can mean the difference between a World Series and a long, quiet flight home. And in basketball, that extra tap to push the ball an inch over the edge of the hoop can turn a player from good to legendary.

And we all sort of accept this.

Now consider the reaction I get at work when someone says, “So which of these two designs is the best?” and I reply with, “Er, better. It’s the comparative, not the superlative. If we had three choices, we could have a best.” Does this “attention to detail” come across as acceptable? Is this modest defense of some sort of standard seen as the equivalent of Gordon Ramsay tossing a whole plate of food into the trash because the color of the scallop is browner than he thinks tolerable? [2]

Nope. Any attempt at being precise in the use of language is seen instantly as nothing more than bourgeois pedantry, trivial snobbery, or the action of a Grammar Nazi [3]. Just take a look at any discussion thread on the Internet related to some issue of language use and within six or seven responses the level of argument will have dropped to name-calling and attacks on anyone who tries to be in any way linguistically precise. There’s a good chance that even you, dear reader, are already feeling the pressure to trot out the “But language is always changing” argument in defense of anyone who seems to be having a hard time using their first language as their first language! When Sarah Palin used the word “refudiate,” rather than ’fess up that she’d made a mistake, she actually tried to argue that she was no different from Shakespeare, who “liked to make up words!” Sarah Palin as Shakespeare! And George W. Bush was the new Cicero.

Now before some of you have collective heart attacks and click repeatedly on the “Comments” button, let me be clear that I am NOT suggesting that everyone has to talk proper, avoid splitting infinitives, never use “their” when “there” is the right word, avoid ending a sentence with a preposition, or stab themselves in the eye if they say “irregardless.” No, we know that oral language is frequently dysfluent, peppered with errors and misarticulations (“nyoo kyoo luh” for nuclear, anyone?), given to jumping from topic to topic, and studded with words whose meanings can be more slippery than a bucket of eels in olive oil. And written language can be similarly dotted with spelling mistakes, wandering apostrophes, malapropisms [4], and just plain unreadable rubbish.

Where the disconnect appears is that while people will accept a Ramsay tantrum to defend standards in cookery, a Simon Cowell insult in defense of musical talent, or Tyra Banks tossing out some poor unfortunate judged not good enough to be America’s Next Top Model, they see no value whatsoever in the idea that there may be some standards in the use of the English language.

A big part of why this happens is that we are all, in our own heads, experts at language. After all, we’ve been speaking it all our lives, so we must be experts. So how dare some self-appointed, smug-faced, pedantic, “no-life” critic tell ME that I used the wrong word… or can’t spell… or don’t know my own language.

It’s a manifestation of the well-known difference between knowing a language and knowing about a language. And knowing about language is not regarded as a skill or expertise in the same way that knowing about cookery, golf, basketball, singing, tattooing, baking cakes, surviving on an island, or any other such endeavor is viewed.

In a recent post at Gizmodo, Casey Chan pointed out that Google’s entry for the word literally now includes the following definition:

Used to acknowledge something that is not literally true but is used for emphasis or to indicate strong feeling

Words do, of course, change meaning over time – less than a century ago being gay had nothing to do with sexuality – but there is nothing “pedantic” or “petty” about taking a stand to prefer one definition over another. In fact, the failure to try to preserve a word’s meaning can lead to it being totally hijacked by special interest groups. Take the word socialized as in “socialized medicine.” Here’s a word that has been used particularly by the political right because it sounds close to socialist and serves to taint the very concept of “free health care” as being somehow close to communism – and you don’t support communism, do you? Listen to any Talk Radio show and you’ll hear it being used in the pejorative sense by all right-wing commentators, whereas left-wingers are more likely to talk about “affordable health care” or just “health care.” It’s a good example of where allowing a word’s meaning to change ends up with it becoming pejorative; like gay, or queen, or fag – all of which have slid from having a non-pejorative, non-sexual meaning to become almost taboo [5].

So unfashionable as it may be to talk about things such as “standards” and “norms,” it is possible to be fully aware of the evolutionary nature of language while at the same time taking some effort to protect the features that keep the system rich and fascinating. The alternative is to let it degenerate into an “anything goes” mish-mash of rough words strung loosely together with no thought for the comprehensibility, flow, phrasing, and even beauty of language.

And after all:

A thing of beauty is a joy for ever:
Its loveliness increases; it will never
Pass into nothingness; but still will keep
A bower quiet for us, and a sleep
Full of sweet dreams, and health, and quiet breathing.

[1] The saddest and most soul-destroying conclusion that one can come to is that it’s not just the watching of TV shows that is pointless but that one is watching the same thing over and over! With an average three-score years and ten allotted to our miserable time on the earth, depression can really set in when you realize that this is the tenth time you’ve seen Peter Griffin try to flip a dead frog out of the window, and it’s still funny. I guess when I’m lying on my death-bed about to croak, I’ll think, “Gee, if I’d only skipped those re-runs I’d have another few years to live.”

[2] It seems to be de rigueur for celebrity chefs to be loud-mouthed and arrogant, so much so that contestants in cookery contests appear to have developed these qualities before actually learning to cook. Thus the pleasure in watching these types of show is as much about seeing pride going before a fall as it is about having any genuine interest in a winner.

[3] Even the term “Grammar Nazi” itself illustrates the negative regard people have toward those who want to pay attention to the details that make language special and interesting. A Google search for the phrase turns up over 2 million instances, and Wikipedia provides the definition, “A Grammar Nazi is a common term used on the internet and on social websites for an individual noticing a grammatical mistake and correcting obsessively. ‘Grammar Nazis’ usually correct any punctuation or spelling errors they find in a comment or post.” British comedians Mitchell and Webb have an interesting take on the Grammar Nazi.

[4] A malapropism is where someone uses a wrong word that is phonetically similar to the intended one. Examples of malapropisms would include “Magellan circumvented the world” for circumnavigated; “He was wearing a turbine on his head” instead of turban; and “When a baby’s born you have to cut the biblical cord” instead of umbilical.

[5] For those curious, the word gay appears to have taken on its meaning of homosexual in the 1920’s. At the end of the 1700’s it was used as a euphemism to describe a female prostitute – a “gay lady.” Queen was first used as slang to refer to male homosexuals way back in 1729 (“Where have you been you saucy Queen? If I catch you Strouling and Caterwauling, I’ll beat the Milk out of your Breasts I will so.” From the book Hell upon earth: or the town in an uproar. Occasion’d by the late horrible scenes of forgery, perjury, street-robbery, murder, sodomy, and other shocking impieties.) Finally, fag (or faggot) comes from US slang in the early 1920’s, most likely by way of its use as a term of abuse for a woman in the 1840’s.


6 responses to “In Defense of the Grammar Nazi”

  1. “not everyone has to talk proper”–you mean, “properly”.

    • LOL! Yes, and congratulations, Katie, on being the first to spot this deliberate mistake. Using an adjective instead of an adverb to describe… well, a verb… is one of those stylistic shibboleths that test the unwary! Sure, Apple got away with “Think Different” but that’s because the word different can be used as both an adverb and an adjective. Back in the mid-1700’s, you could use either different or differently as an adverb but by the end of the 1800’s, the more common version was differently – the “-ly” ending sealing its identity as a “normal” adverb. I took a quick look at the Corpus of Historical American English (COHA) and found no examples of different as an adverb but plenty for differently.

      Sadly, any other errors in this article are not deliberate; they’re the product of sloppiness and late-night writing after a couple of Cuba Libres 😉

  2. Ahhhhh… as an SLP I disagree. Most SLPs I’ve met are NOT linguistic experts. To borrow your analogy, an individual can only use the input of an “expert chef” if they have the ingredients and tools in their pantry. If I’m cooking my oft-made pasta stir fry, and a chef asks me where the fresh organic bay leaves are (or the pasta maker), I’m going to have to inform him “not here.” If an “expert” in language spends time informing me of their knowledge and the contents of their pantry, they come off as patronizing and, yes, pedantic. Imagine if I’d never heard anyone use that term correctly, or if I didn’t know organic ingredients existed and that many label them “better”?

    I was raised in a home in which the difference between shall and will was observed. I answered “this is she” when requested by name on the telephone from age 5. I knew Mother gave the instructions “to Michael and me” even when my friends were trying to “spruce it up” incorrectly to “Michael and I.” But I did not match my peers. And my tips were not well received when not requested, even when I was proofing a paper for “spelling”.

    Teachers and SLPs use improper language (as compared to SAE) ALL. THE. TIME (notice the improper punctuation for emphasis) and it is not my job to be an expert grammarian. I am a communication professional, not a red pen. My mother is a grammar expert and she was not well-liked when “correcting”. She could teach, model, or explain WHY certain choices were easier to read, more well-received, or portrayed a certain sense of culture and education, but to say “er, better” is the correct comparative IS pedantic. It sends the message that “you are wrong”, which while true, is unnecessary when a spoken message only needs to be easily interpreted.

    In essence, a tasty stir fry without organic bay leaves is appreciated by my family more than an expert chef’s meal in 9/10 situations. If I am cooking for Oprah, let me have it. I’ll go buy it, plan it, research it, etc. If I am writing a grant for my school, I will get the “grammar Nazi” (hoping they know their stuff; most on social media who claim this odious title are simple nitpickers with little knowledge) to proofread each sentence. If I speak at ASHA, I’ll go over my terminology with that same intensity. But what is the purpose in conversation? What is the purpose in art? I leave Stephen Fry to state it so eloquently in this video.

    • I guess my first clarification should be that I’m not suggesting that all SLP’s are experts in language, although I have to believe that their knowledge ought to be more extensive than that of the general public; otherwise we might as well go back to being Speech Therapists and focus on teaching people to “talk properly,” mainly about how the rain in Spain allegedly stays mainly on the plain. However, what I’m trying to express is that if we have expert chefs, expert basketball players, expert lawyers, and expert surgeons, whom we regard positively in relation to their level of skill in what they do, why does the notion of an “expert in language” have such a negative connotation? When contestants on Masterchef eventually get kicked off, usually after suffering tirades of abuse from the judges, they inevitably say, “I learned so much from my time here with these wonderful chefs,” whereas if an expert in language points out to someone that “I ain’t done nothing” would make more logical sense as “I ain’t done anything,” this is rarely seen as a learning experience but as a personal critique. It’s almost as if society at large abhors not just a vacuum but also a language expert.

      Of course, as with telling a joke, timing and delivery are very important. In education, we hear about the “teachable moment,” and leveraging that is where the language expert can, in fact, play a role. But there does seem to be some fundamental opposition to the very concept of a “language expert,” to the point that even using the phrase itself gets people’s backs up. It isn’t helped by the fact that even the full-time professional linguists who teach, research, and write about language for a living are often so democratic and accommodating that they fear being labeled “prescriptivist” and seem to suggest that we should just accept whatever anyone says and just “go with the flow.” In fact, the most vehemently anti-language experts are usually the language experts! It appears that just as fine wines and gourmet dinners all come down to taste, so does language.

      As you might notice, I’m in a bit of a prescriptivist mood this week 😉 The pendulum has certainly swung toward the “it don’t matter just do what u fink is good coz I’m an invididual and you ain’t not got no right to crit me” side, so I’m just pushing back a bit and suggesting there is nothing wrong with notions of “standards,” “common practices,” and “best practice” when it comes to language. And sure, just as you point out, how we use it when writing a grant will be different from how we use it when out for a night drinking with the lads. But if we don’t point out such things, we will soon end up with folks not knowing the difference, and more and more employers are seeing how not knowing it results in job applications that appear to have been written by a 7-year-old. Dude. And if you can’t express yourself clearly when applying for a job, what makes me think you’ll be better once you have it?

      And as for Stephen Fry, I love him dearly, the whimsical little darling that he is, but if you watch him on QI sometime, you’ll see that he, too, is not immune to being a pompous pendant, and has become, to some extent, the very thing he parodied in an old Fry & Laurie sketch called “Welcome to the Smug Hour!”

  3. Now all bets are off if you are teaching a grammar/ELA/journalism class. We have a class for education majors which addresses this very thing, teaching the little nuances saved for the most formal of instruction. And I do think language experts are acknowledged and lauded. They are hired as editors, English teachers or linguistics professors. They are not asked to comment on my FB statuses or my conversations to friends or co-workers. They do not have the right to rank my students’ communication as “wrong” compared to theirs if those students write and speak in the dialect of their communities.

    This argument is old (as in Shakespeare old) in that change is always seen as leaving a standard. Google Shakespeare coins new terms, improper grammar, critics, etc and you will find that the new is always given the stank eye (I love this term. Is it wrong? Stink eye? Stank eye is what I hear in my head and so I choose to spell it as such). The most educated group will always be a “Standard”, for better or worse.

    And I’m a huge fan of overuse of commas. See above! My mother would rip that up.

  4. A pendant, you say? (just joshing… I know a typo when I see it)

    I teach social skills classes, and the most professorial students (“Well actually, that is an improper use of the term ______”) have the fewest friends. There is something about communication that is different, and we as SLPs know why. But I agree: if you are an actual expert and using the teachable moments when people are open to being taught, then go for it. Sometimes the moment is, “this is not a job interview… it’s ok to sound like your peers”. Are you training corporate speakers or helping teens make friends? The expert needs to teach what Fry spoke about with context and communication. It’s just so nuanced… and can come off as more than a little elitist.
