Literally the Worst Definition of a Word Ever
Stop using 'literally' figuratively!
Ours is a cultural and linguistic moment obsessed with irony.
For just that reason, people need to stop using the word 'literally' to mean 'figuratively'. I am not the first to say this, yet the point has failed to sink in. Worse, prompted by the addition of the non-literal sense of 'literally' to several dictionaries, including the Oxford English Dictionary Online, the usage is becoming ever more accepted.
The situation is also worsening because the flames are being fanned by several reactionary articles claiming that the usage meaning 'figuratively' is perfectly legitimate. They make this claim based on three lines of reasoning: that the usage is very old, that nobody actually gets confused by the two meanings, and that language evolves naturally and we must simply describe it and conform to it, rather than judge it or make prescriptions for it.
I will describe precisely why this usage is a bad thing for the language. But first, because all of the above arguments are faulty, I want to take the time to point out why:
Bad Reason 1: "Aha!" they point out. "The usage is not some horrible new invention of the millennials; it has been around for a very long time: in the dictionary since 1903, and first used in 1759!"
This point tends to be the primary data used to make the non-literal use of 'literally' look legitimate to its detractors. Why the hell does this matter to anyone? What is so much more authoritative about English-speakers of centuries past than those from this century? We do not claim that modern doctors should include a medical procedure in a modern textbook because doctors did so in 1759.
Sure, those who, like me, decry the usage at issue tend to associate it with younger speakers who are diluting the language with their text-speak. But isn't that a legitimate complaint? Original date of usage aside, no one is denying that this usage has exploded in popularity recently, largely due to young people who are not, so to speak, of the lettered classes. So should we really be formalizing our conventional language around people whose most thoughtful comment on a dictionary would be TL;DR?
I say they aren't even invited to the convention.
Bad Reason 2: "And anyway," they say, "it does not devalue the primary definition of the word or the value of the English language at large because one would have to be an idiot to confuse which sense of the word is being employed."
There are a few things wrong with this.
For one, it assumes the faulty premise that only genuine misunderstanding of a usage's meaning indicates a problem with it. That isn't the case. The primary stylistic problem with the non-literal 'literally' is that, though hearers can sort out which sense is being employed, that sorting is jarring.
For example, if I hear that "he literally broke my heart" or that "Transformers 2 is literally the worst thing ever," my brain goes places the speaker doesn't want it to go. Now, of course I am going to extrapolate the intended meaning from context. But noting only that ignores the fact that somebody who is trying to express themselves to me more, rather than less, accurately by being figurative has failed. Instead he or she has made me visualize the rending of cardiac tissue, or made me compare genocides to blockbusters. It's just not good communication.
(And anyway, the non-literal usage of "literally" is just so damned dramatic. It comes from our worst, most unpleasantly exaggerative linguistic instincts. Quoth a girl I walked by recently: "If I got the wrong shampoo, I'm literally going to kill myself.")
But the bigger problem with Bad Reason 2 is that it simply isn't true. It ignores the possibility, indeed the likelihood, of people talking at cross purposes. That can be a huge mistake. For example: I once heard two very smart people (both, in fact, versed in the philosophy of language, from which the phrase 'sense of a word' comes) argue fiercely for almost ten minutes about the artistic merits of a band. Only then did they determine that they were each discussing an entirely different band!
So yes, context is a vital and informative part of understanding words, but it is careless to assume that subtleties will never, ever get lost in translation. This inevitable phenomenon even has a name in the philosophy of language, the Inscrutability of Reference (a phrase which means nothing more and nothing less than "our inability to know precisely what the hell each other is talking about").
Bad Reason 3: "All that the maker of a dictionary can do is record uses that come into being organically. We cannot tell people what is and is not 'correct,' because the concept of correct does not even apply to language usage. The job of the dictionary editor/linguist/lexicographer is only to observe and record."
This is the old canard that because language evolves, it is somehow elitist and morally wrong to try to formalize certain usages of language as legitimate while writing others off as illegitimate. Holders of this view are called Descriptivists, while holders of the opposite are called Prescriptivists.
I do not propose to make the case for Prescriptivism writ large here, because I can just as well defeat this point on Descriptivist terms. (If you do want to see that discussion borne out, I highly recommend David Foster Wallace's classic essay "Authority and American Usage", though for the love of god ignore the footnote about Wittgenstein's private language argument, which is all wrong.)
So I have this question for the Descriptivists who draw an Ought from an Is by saying that language evolves, and can therefore never be rightfully subject to authority: In what sense are the efforts of a Prescriptivist dictionary editor to formalize or ban a word not part of that very linguistic evolution that you cheer on? If everybody is supposed to have a part in the organic evolution of a living, breathing language, why not understand this as the elites, in their own snooty way, doing just that?
Alternative Reasoning:
A language is better the more things it can say clearly. It should allow us to communicate what we mean. We need the primary definition of 'literally' to be left alone, because without it, we don't have any other way to say that thing.
Irony is the use of words to express something other than their literal meaning. I chose to open this article by noting that this is a moment in which irony is a cultural fixation. You might have wondered why I think that is relevant.
It's relevant because having one word, 'literally,' that is exempt from ironic usage allows us to talk about that very obsession. It allows us to demarcate irony from non-irony. The non-literal definition of 'literally' makes English smaller.
Need proof? Simply consider how many times I have just had to employ the clunker of a phrase "the non-literal usage of 'literally.'" (Tellingly, I stole the phrase from one of the articles arguing against me.)
We can see, then, that at the very best, using 'literally' figuratively needlessly complicates things. At worst, it lessens the very power of the language to describe. That is why we need to eliminate this usage from our speech and from our dictionaries. Describing is all languages do!
Dogs digest human food better and poop less
A new study finds that dogs fed fresh human-grade food don't need to eat—or do their business—as much.
- Most dogs eat a diet that's primarily kibble.
- When fed a fresh-food diet, however, they don't need to consume as much.
- Dogs on fresh-food diets have healthier gut biomes.
Four diets were tested
The researchers tested refrigerated and fresh human-grade foods against kibble, the food most dogs live on. The ingredients of kibble are mashed into a dough and then extruded, forced through a die into the desired shape (think a pasta maker). The resulting pellets are sprayed with additional flavor and color.

For four weeks, researchers fed 12 beagles one of four diets:

1. an extruded diet — Blue Buffalo Chicken and Brown Rice Recipe
2. a fresh refrigerated diet — Freshpet Roasted Meals Tender Chicken Recipe
3. a fresh diet — JustFoodforDogs Beef & Russet Potato Recipe
4. another fresh diet — JustFoodforDogs Chicken & White Rice Recipe

The two fresh diets contained minimally processed beef, chicken, broccoli, rice, carrots, and various food chunks in a canine casserole of sorts.

(One can't help but think how hard it would be to get finicky cats to test new diets. As if.)

Senior author Kelly S. Swanson of U of I's Department of Animal Sciences and the Division of Nutritional Sciences was a bit surprised at how much better dogs did on people food than even refrigerated dog chow. "Based on past research we've conducted I'm not surprised with the results when feeding human-grade compared to an extruded dry diet," he says, adding, "However, I did not expect to see how well the human-grade fresh food performed, even compared to a fresh commercial processed brand."

Tracking the effect of each diet
The researchers tracked the dogs' weights and analyzed the microbiota in their fecal matter.

It turned out that the dogs on kibble had to eat more to maintain their body weight. This resulted in their producing 1.5 to 2.9 times the amount of poop produced by dogs on the fresh diets.

Says Swanson, "This is consistent with a 2019 National Institute of Health study in humans that found people eating a fresh whole food diet consumed on average 500 less calories per day, and reported being more satisfied, than people eating a more processed diet."

Maybe even more interesting was the effect of fresh food on the gut biome. Though there remains much we don't yet know about microbiota, the microbial communities found in fresh-food poo were nonetheless different.

"Because a healthy gut means a healthy mutt," says Swanson, "fecal microbial and metabolite profiles are important readouts of diet assessment. As we have shown in previous studies, the fecal microbial communities of healthy dogs fed fresh diets were different than those fed kibble. These unique microbial profiles were likely due to differences in diet processing, ingredient source, and the concentration and type of dietary fibers, proteins, and fats that are known to influence what is digested by the dog and what reaches the colon for fermentation."

How did kibble take over canine diets?
Historically, dogs ate scraps left over by humans. It has only been since 1870, with the arrival of the luxe Spratt's Meat Fibrine Dog Cakes (made from "the dried unsalted gelatinous parts of Prairie Beef", mmm) that commercial dog food began to take hold. Dog bone-shaped biscuits first appeared in 1907. Ken-L Ration dates from 1922. Kibble was first extruded in 1956. Pet food had become a great way to turn human-food waste into profit.

Commercial dog food became the norm for most household canines only after a massive marketing campaign led by a group of dog-food industry lobbyists called the Pet Food Institute in 1964. Over time, for most households, dog food was simply what dogs ate — what else? Human food? These days more than half of U.S. dogs are overweight or obese, and their diet is certainly a factor.

We're not so special among animals after all. If something's healthy for us to eat (we're not looking at you, chocolate) maybe we should remember to share with our canine compatriots. Not from the table, though.