Literally the Worst Definition of a Word Ever
Stop using 'literally' figuratively!
Ours is a cultural and linguistic moment obsessed with irony.
For just that reason, people need to stop using the word 'literally' to mean 'figuratively'. I am not the first to say this. Yet the point has failed to sink in. Worse, prompted by the addition of the non-literal sense of 'literally' to several dictionaries, including the Oxford English Dictionary Online, the usage is becoming even more accepted.
The situation is also worsening because the flames are being fanned by several reactionary articles claiming that the usage meaning 'figuratively' is perfectly legitimate. They make this claim based on three lines of reasoning: that the usage is very old, that nobody actually gets confused by the two meanings, and that language evolves naturally and we must simply describe it and conform to it, rather than judge it or make prescriptions for it.
I will describe precisely why this usage is a bad thing for the language. But first, because all of the above arguments are faulty, I want to take the time to point out why:
Bad Reason 1: "The usage is very old."

This point tends to be the primary evidence used to make the non-literal use of 'literally' look legitimate to its detractors. Why the hell does this matter to anyone? What is so much more authoritative about English speakers of centuries past than those of this century? We do not claim that modern doctors should include a medical procedure in a modern textbook merely because doctors performed it in 1759.
Sure, those who, like me, decry the usage at issue tend to associate it with younger speakers who are diluting the language with their text-speak. But isn't that a legitimate complaint? Original date of usage aside, no one is denying that this usage has exploded in popularity recently, largely due to young people who are not, so to speak, of the lettered classes. So should we really be formalizing our conventional language around people whose most thoughtful comment on a dictionary would be TL;DR?
I say they aren't even invited to the convention.
Bad Reason 2: "And anyway," they say, "it does not devalue the primary definition of the word or the value of the English language at large because one would have to be an idiot to confuse which sense of the word is being employed."
There are a few things wrong with this.
For one, it assumes the faulty premise that a usage is problematic only if hearers genuinely misunderstand it. That isn't the case. The primary stylistic problem with the non-literal 'literally' is that, though hearers can sort out which sense is being employed, that sorting is jarring.
For example, if I hear that "he literally broke my heart" or that "Transformers 2 is literally the worst thing ever," my brain goes places the speaker doesn't want it to go. Of course I am going to extrapolate the intended meaning from context. But noting only that ignores the fact that somebody who is trying to express themselves to me more, rather than less, accurately by being figurative has failed. Instead he or she has made me visualize the rending of cardiac tissue, or made me compare genocides to blockbusters. It's just not good communication.
(And anyway, the non-literal usage of "literally" is just so damned dramatic. It comes from our worst, most unpleasantly exaggerative linguistic instincts. Quoth a girl I walked by recently: "If I got the wrong shampoo, I'm literally going to kill myself.")
But the bigger problem with this Bad Reason 2 is that it simply isn't true. This reasoning ignores the possibility, and indeed the likelihood, of people's talking at cross purposes. That can be a huge mistake. For example: I once heard two very smart people (both, in fact, versed in the philosophy of language, from which the phrase 'sense of a word' comes) carry on a fierce disagreement about the artistic merits of a band for almost ten minutes. Only then did they discover that they were each discussing entirely different bands!
So yes, context is a vital and informative part of understanding words, but it is careless to assume that subtleties will never, ever get lost in translation. This inevitable phenomenon even has a name in the philosophy of language, the Inscrutability of Reference (a phrase that means nothing more and nothing less than "our inability to know precisely what the hell each other is talking about").
Bad Reason 3: "All that the maker of a dictionary can do is record uses that come into being organically. We cannot tell people what is and is not 'correct,' because the concept of correct does not even apply to language usage. The job of the dictionary editor/linguist/lexicographer is only to observe and record."
This is the old canard that because language evolves, it is somehow elitist and morally wrong to try to formalize certain usages of language as legitimate while writing others off as illegitimate. Holders of this view are called Descriptivists, while holders of the opposite are called Prescriptivists.
I do not propose to make the case for Prescriptivism writ large here, because I can just as well defeat this point on Descriptivist terms. (If you do want to see that discussion borne out, I highly recommend David Foster Wallace's classic essay "Authority and American Usage", though for the love of god ignore the footnote about Wittgenstein's private language argument, which is all wrong.)
So I have this question for the Descriptivists who draw an Ought from an Is by saying that language evolves, and can therefore never rightfully be subject to authority: In what sense are the efforts of a Prescriptivist dictionary editor to formalize or ban a word not part of that very linguistic evolution you cheer on? If everybody is supposed to have a part in the organic evolution of a living, breathing language, why not understand this as the elites, in their own snooty way, doing just that?
A language is better the more things it can say clearly. It should allow us to communicate what we mean. We need the primary definition of 'literally' to be left alone, because without it, we don't have any other way to say that thing.
Irony is the use of words to express something other than their literal meaning. I chose to open this article by noting that ours is a moment in which irony is a cultural fixation. You might have wondered why I think that is relevant.
It's relevant because having one word, 'literally,' that is exempt from ironic usage allows us to talk about that very obsession. It allows us to demarcate irony from non-irony. The non-literal definition of 'literally' makes English smaller.
Need proof? Simply consider how many times I have just had to employ the clunker of a phrase 'the non-literal usage of "literally."' (Tellingly, I stole the phrase from one of the articles arguing against me.)
We can see, then, that at the very best, using 'literally' figuratively needlessly complicates things. At worst, it diminishes the very power of the language to describe. And describing is all languages do! That is why we need to eliminate this usage from our speech and from our dictionaries.