On the first day of what would be a depressing and alienating two-year trudge under the fluorescent lights of a rural high school, a soft-spoken bald man stood in front of my English class and looked at the ceiling as if trying to remember what he was going to say.
“So. In the past few years, you’ve all learned that an essay should be five paragraphs. The first paragraph states your argument and includes a topic sentence. You develop your argument over the next three paragraphs, and finish with a conclusion paragraph that starts with the words ‘in conclusion’ or something.”
Silent assent from thirty smallish heads.
Small gasps. Heresy!
“They probably taught you never to start a sentence with ‘and’ or ‘but.’ Forget it. Don’t use adverbs? Forget it. Forget,” he pointed at us, “all of it.”
My class drove multiple teachers to tears, and substitutes swore blood oaths on our principal’s desk sealing their promise never to teach again until we were all dead and buried under crossroads, but this man never even had to raise his voice. He’s among the pivotal figures who made me want to write, and not give up through all the years I was terrible at it.1 He understood writing, and just as important, he understood his students.
A little too well, actually; he had to resign after they caught him banging a student.2 Let’s focus on the writing part. He knew the rules that had been ice-picked into our heads over the years were just ways to get something out of the kids who didn’t want to write and give the teachers a few things to decorate with smiley faces and Xs. Some of the rules were stylistic nonsense passed down by Hemingway to better distinguish American writing from British frivolity and ensure we never expressed amusement during the cold, hard, manly work of scraping a pencil across paper. Some rules were artifacts from other languages that don’t apply to English in any logical or meaningful way. Some rules were just the usual authoritarian madness: once, somebody smarter than the rule makers said something offhand, and what little of it the rule makers understood became out-of-context gospel.
They always promised us that if we mastered all the rules, we could find our voice later. That would have been true if they just meant the basic rules of grammar, but they meant all the rules pitched in the non-bald-not-banging-student-guy classes, and following all of those would have left us with exactly one voice with which to go write another grammar textbook. Then, in the middle of these rule lectures, they gave us Shakespeare, a man so unsatisfied with the state of his language that he invented words even when he didn’t need to rhyme.
Bald guy reminded us writing was art. He reminded us that English is a rich and flexible language, and sifting something new out of it is half the fun. He reminded us that the structure of a sentence can be funny or sad. Most of all, he reminded us that writing is about communication. Writing is the most explicit art form; you can communicate enormously complex ideas or explore the oddest and most trivial quirks of the human experience.
Because everybody has a blog or is at least spitting bile at teenagers in a YouTube comment, people are by and large remembering this or figuring it out. The results are mixed, but at least the power and variety of expression through writing are on more constant display.
Yet just as the grammar Nazis are being crushed by the weight of a billion “how r u” text messages, the punctuation terrorists are coming out of the woodwork and fighting over the use and non-use of Oxford commas, and the rule war is being waged anew because nobody seems to understand that punctuation is as much an art as the rest of writing. Instead, they smugly post contrived sentences that mean different things depending on the placement of commas, because this tactic was so successful in fixing that thing that time.
Yes, you can use punctuation in incorrect ways, but that does not mean there is only one way to use it. A friend recently told me publishers don’t care whether you use an Oxford comma or not, as long as you pick one and stick with it. This is stupid. If punctuation obscures or distorts the meaning of a sentence in an unintended way, it is wrong, but apart from that, punctuation is about rhythm. An Oxford comma is not a switch flipped once in an author’s voice; it’s a decision made in the moment to maintain the flow of the idea. Momentum, syncopation, rhythm and pattern make a sentence flow, because writers are trying to transfer the voices in their heads into yours. You can hear punctuation in speech: politicians talk in periods, Morgan Freeman is liberal with the commas, and Jon Stewart is a master of parentheses. Lewis Black made a career out of the exclamation point while Denis Leary barely uses any punctuation at all. If you told Denis Leary he needed more Oxford commas, I can only hope he’d put a cigarette out in your eye, but I heard he quit smoking.
Punctuation started with periods that told the speaker when to take a breath, and as both a longtime proponent of using the run-on sentence to better communicate the ranting rage in my head over the nonsense that people choose to fight about in this country and a person who is occasionally asked to read his work out loud, I’ve come to value this original function in a visceral way. Parentheses suggest a subtle aside (Jon Stewart lowering his voice and head) and can provide commentary or extra information while keeping you in the moment.3 Sometimes you want to keep the pace breakneck, so you use em dashes—the noblest of the dashes—to let the reader know the ride ain’t stopping and something big is coming at the end.4 In this case, em dashes do something similar to parentheses and to a pair of commas, which can also set off side information, though commas do it more casually. You can use a single em dash to serve a purpose similar to a colon—making it absolutely clear that the thing after the dash follows from the thing before it. It can also be used to signal an abrupt change in—you know, screw it, em dashes do a bunch of stuff, you get the picture. A colon is a way to introduce things and to join ideas, and says something definite: this part of the sentence is important, and you can say it in an authoritative voice. Its purpose gets muddled with the semicolon, which is like a weak link between ideas; you can forget all the stuff about clauses: a semicolon joins two sentences without a period or ‘and’ or ‘but’ or ‘so’ or whatever. Semicolons, colons, periods, dashes, parentheses, commas, and even Oxford commas overlap each other’s jobs far more than rules lawyers would like. The situation is confusing and fluid, which is why everybody is afraid of the semicolon: it’s the only punctuation mark that’s honest and says, “Well, I kinda do a little of this and a little of that.”
English is a mutt of a language, inheriting ludicrously contradictory spellings and grammars from other languages. The fact that word and whirred are pronounced exactly the same while lead and lead sound different depending on what you mean (unless the former is in the past tense, in which case it’s spelled differently and pronounced like the latter) should tell us English is not so much a black-tie affair as it is a soccer riot with a body count. But if we accept the chaos that informs the language, there’s a lot of expressive power to be found.
In conclusion, the next time somebody makes a strong case either for or against the Oxford comma, you can assume their mind is simply collapsing because they looked into the abyss too soon. If make his point clear, Yoda could, give a shit about Oxford commas, nobody should.
And beyond that, there are obvious advantages to reading other people—their intentions, their feelings, and their character—as accurately as possible.
But you and I, as perceivers, are just as vulnerable to being influenced by faulty assumptions, biases, and lenses as everyone else is.
We've got the same mental hardware as everyone else, we have the same limited time and energy, and so we take the same shortcuts without realizing it.
Only now, hopefully, you do realize it. And that's half the battle. Awareness of bias makes it easier to mitigate or root out bias entirely.
I wrote No One Understands You and What to Do About It to help people understand why they are so often misunderstood, because it happens a lot.
But the truth is, not every misunderstanding is ... well, a misunderstanding. Sometimes, the perceiver is seeing the truth about you, and you are the one with blinders on.
Really knowing yourself is harder than you might think. We don't always have access to what's going on in our own minds. And we are complicated creatures, with multiple selves to contend with. (Are you really the same person with your close friends that you are at work or with family?)
We also have particular motivations—we want to see ourselves in certain ways. There's no objectivity in perception, whether you're talking about perceiving others or perceiving yourself.
So how do you know if you are being misunderstood and misjudged or if you are fooling yourself? It's not easy to know, to be honest. And it's a topic that really deserves its own book.
But one piece of advice I can give you is to look for consistency across perceivers. In other words, if everybody—your friends, your family, your colleagues—is making the same "mistake" about you, then it's probably not a mistake at all.
And then it's time for Phase 2: questioning the assumptions you've been making about yourself and reconciling others' version of you with your own...
Perceiving people—including yourself—accurately is perhaps the most difficult thing we humans do. People are complicated, and their words and deeds are riddled with ambiguity and open to interpretation.
We don't realize that's the case, because the way our brains are wired makes perception feel so obvious and effortless. But it's neither—which is why we so often screw it up.
If you want to come across the way you intend to—to have other people see you as you (think you) are or as you'd like to be seen—you are going to have to give them a hand. Remember that it doesn't help to blame the perceiver for getting you wrong. Instead, try making it easier for him or her to get you right.
Whenever you are forming an impression or making a judgment about a person, remember to use these strategies:
Take your time
Don't rush to judgment. Keep in mind that your first impression of someone may be dead wrong, because there are always other possible interpretations of his or her behavior.
Think about the circumstances and how they might have influenced the person's actions (e.g., "Maybe Susan isn't trying to be rude. Maybe she's just nervous meeting new people, and her fear and awkwardness are making her come off poorly. She might be quite different once I get to know her.").
Commit to being fair
Remember that we all (or at least most of us) want to be fair, but that doesn't mean we are actively pursuing that goal whenever we perceive another person. A simple reminder to yourself to be fair when you judge someone else is enough to activate the goal and diminish your unconscious bias.
Make it a mantra, something you say before you walk into any meeting. Stick it to your computer with a Post-it note. The more you consciously think about being fair, the more accurate your perception will be.
Beware of confirmation bias
Once we form an impression of someone, we tend to look selectively at his or her behavior to find confirming evidence that our impression is correct, rather than looking at all the evidence available.
Imagine that you are considering two candidates for a management position—Eliot and Joanna.
You know them both, but not particularly well. You are worried that Joanna may not be assertive enough to be an effective manager—there was that one time that she seemed reluctant to take the lead on a project—so you are thinking of giving the promotion to Eliot. (The stereotype that women are less assertive may well be biasing your perception here.)
To evaluate this decision correctly, you need to consider four kinds of evidence: times Joanna was assertive, times she was not, and times other people were (or were not) assertive under similar circumstances. Thanks to confirmation bias, we tend to look only at hypothesis-confirming evidence (i.e., instances where Joanna was not assertive—just one of the four) and ignore the rest.
So when you are making judgments about other people, make sure you are checking all four quadrants—considering evidence for and against your hypothesis and considering what other people have done under similar circumstances.
Reprinted by permission of Harvard Business Review Press. Excerpted and adapted from No One Understands You and What to Do About It by Heidi Grant Halvorson. Copyright 2015 Heidi Grant Halvorson. All rights reserved.