The invention that made us human: Fire
Did fire change the development of the human brain?
- The earliest evidence for fire dates back approximately 440 million years.
- Our hominin ancestors first used natural wildfires to flush out prey and forage for food.
- Richard Wrangham's cooking hypothesis suggests that a ready supply of cooked food allowed the Homo lineage to develop its large, complex brains.
Of humanity's greatest inventions, fire remains as important today as it was to our ancient ancestors, even if its role is less apparent.
We have replaced the hearth with electric ovens and central heating, but the burning of fossil fuels accounts for 63.5 percent of U.S. electricity generation. We still heat our homes and cook our food with fire — just in a more roundabout manner.
We even use fire in ways our ancestors couldn't have imagined. The internal combustion engine has replaced animals and our own wobbly legs as the preferred method of travel. We can go farther in a day than the vast majority of our ancestors did in a lifetime and even escape the confines of our planet. Thanks to fire.
But fire has done more than create the energy that makes our lives comfortable. By one Harvard professor's account, fire altered the course of our evolution.
Fire, a brief history
Wildfires, such as this one at Yellowstone National Park, have been a recurring phenomenon for more than 440 million years.
First, some Chem 101. Fire requires three elements for its reaction: oxygen, a fuel, and a heat source. Since plants naturally supply two of the three (oxygen and fuel), the history of fire became inextricably tied to theirs.
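To make the fuel-plus-oxygen point concrete, here is the idealized combustion of cellulose, the main structural fuel in wood (a simplified sketch; real wood fires involve many messier side reactions):

```latex
\mathrm{C_6H_{10}O_5}\;(\text{cellulose unit}) + 6\,\mathrm{O_2} \;\xrightarrow{\text{heat}}\; 6\,\mathrm{CO_2} + 5\,\mathrm{H_2O} + \text{energy}
```

The heat released sustains the reaction, which is why a fire keeps burning once started: it supplies its own third ingredient.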
Some of our earliest evidence for fire goes back 440 million years to the Silurian period, when Earth's climate stabilized and plants and animals began to move to land. Of note, this period provides the earliest fossil evidence of vascular plants.
From this point, fire becomes a recurrent phenomenon, with times of high and low activity depending on environmental conditions. During the Carboniferous period, atmospheric oxygen hit a record high of 31 percent and plants spread across the supercontinent Pangea; fittingly, charcoal records suggest intense fire activity during this period. Conversely, the pittance of charcoal from the Triassic period points to lower atmospheric oxygen and sparser plant life.
Jumping ahead some 200 million years to the late Miocene, hominins moved to the grasslands and began to further diverge from their ape relatives — likely due to the difference between the African savanna and the dense jungle. Here, they would have also encountered wildfires with far more regularity.
We didn't start the fire
Prometheus Brings Fire to Mankind by Heinrich Fuger. Our early reliance on nature for fire draws parallels to later mythologies.
Low-hanging references aside, Billy Joel was on to something. Popular culture conjures the image of a caveman banging two stones together. Sparks fly, and then the eureka moment. Yet, our ancestors' first usage of fire probably wasn't a matter of control or invention. It was more likely opportunistic.
In a review for the Royal Society Philosophical Transactions B, J.A.J. Gowlett hypothesizes that hominins took advantage of natural wildfires for foraging. "For hominins," he writes, "benefits could include retrieval of birds' eggs, rodents, lizards and other small animals, as well as invertebrates. Although fire does not create such resources, it renders them far more visible, and chance cooking might well improve their digestibility."
Gowlett notes that analogues to this behavior exist in the natural world today. Savanna chimpanzees use fires to locate resources, and several bird species follow fires to snatch up any prey flushed out by the smoke and flames. There has even been anecdotal evidence of some raptors, such as Australia's "firehawks," picking up smoldering wood from one fire and carrying it elsewhere to start another.
Early hominins would have also begun to discover fire's properties by observing and interacting with these blazes. For example, if a meaty morsel proved too raw, they may have learned to place it on embers to continue the cooking process.
Given our early reliance on nature for fire, it's little wonder that the theft of fire theme has appeared time and again in the world's mythologies.
But we kept it burning
It's difficult to follow the development of hominin control over fire because of what Gowlett calls its "disappearing act." Fire isn't as well preserved in the archeological record as, say, middens or flint tools. And progress was incremental, with fire control being learned in different places at different times.
Certain archeological sites have proffered a bounty of stone tools, suggesting long-term occupation. Such long stays could mean hominins learned to at least maintain fire as far back as 2.5 million years ago. But direct evidence is scarce.
As we move forward, we see more evidence of hominins' control over fire. Archaeologists have discovered campfire traces and charred animal and plant remains at Wonderwerk Cave in South Africa, dated to approximately one million years ago. And the oldest known hearth, found at Qesem Cave, Israel, dates back more than 300,000 years.
Interestingly, archeologists aren't sure which hominin species got cozy at Qesem. "It is clearly different than [Homo] erectus and has affinities of both [Homo] sapiens and Neanderthals," Ran Barkai, an archaeologist at Tel Aviv University, told National Geographic. "Since Neanderthals appear very late in the Levant and are of European origin, and since the Qesem teeth bear more resemblance to early Homo sapiens in the Levant, we believe they are closer to Homo sapiens."
Hearths and campfires tell us hominins could maintain fires for cooking and warmth. They do not, however, prove our ability to create fire. After transferring a brand from a wildfire, a tribe member could have been assigned fire duty, tasked with keeping the flame fueled so it never went out.
Good evidence for fire creation appears around 120,000 years ago, when hominins had access to twine, a prerequisite for the bow drill. And archeologists have dated two glues used in hafting, bark pitch and gypsum plaster, to between 50,000 and 100,000 years ago. Neither of these can be prepared without fire.
At this point, Gowlett argues, the invention of fire belongs to our ancestors. "[A]n understanding is emerging that fire use is not a single technology or process, but that several scales of use, and probably several intensifying technologies, evolved over a long period, intertwined, and sometimes eventually became bound together," he writes.
Fire (and food) for thought
Cooked meats are easier to chew and digest; as a result, our bodies can extract more nutrients from the same amount of meat. Similarly, cooking vegetables increases levels of healthy stuff like antioxidants. That's because the cooking process breaks down the plants' cell walls and, as with meat, makes them easier to digest and process. (There is a tradeoff, though: some veggies are healthier raw, and much depends on how you cook them.)
Wrangham argues that the ability to create cooked foods shaped the brains and bodies of our Homo ancestors. Since our ancestors spent less energy digesting foods and could draw out additional nutrients, they had more nutrients to spend, and evolution spent those dividends on maintaining larger brains — not to mention smaller teeth and jaws. Larger brains allowed us to process more information, create more dynamic social groups, and adjust to unfamiliar habitats. All of which benefited us evolutionarily.
With that said, the cooking hypothesis has its detractors. Some argue there is little proof that humans were cooking or maintaining fire concurrent with Homo erectus' brain-size explosion (roughly 1.5 million years ago). It's also possible that a diet of raw meat and veggies could have provided the necessary nutrients for bigger brains.
Other hypotheses exist to explain the increase in hominin brain size. The social brain hypothesis, for example, argues our brains evolved to meet the challenges of living in large social groups. But even here, fire plays a role. Remember that before our ancestors could ignite fire, they had to maintain it. This required a division of labor, which is only possible in a species with a highly structured social network.
Fire may or may not prove to have been the principal driver of our evolutionary development; any such hypothesis needs more evidence. Still, fire, cooked food, and social networks likely all played a part.
Without a doubt, fire has proved a primary mover in the evolution of civilization. It helped us migrate to climates that would otherwise prove inhospitable. It was essential to the development of cuisine, agriculture, metallurgy, architecture, and a host of other industries. In short, the invention of fire has taken humanity places no other species has gone.
What is human dignity? Here's a primer, told through 200 years of great essays, lectures, and novels.
- Human dignity means that each of our lives has an unimpeachable value simply because we are human, and therefore we are deserving of a baseline level of respect.
- That baseline requires more than the absence of violence, discrimination, and authoritarianism. It means giving individuals the freedom to pursue their own happiness and purpose.
- We look at incredible writings from the last 200 years that illustrate the push for human dignity regarding slavery, equality, communism, free speech, and education.
The inherent worth of all human beings<p>Human dignity is the inherent worth of each individual human being. Recognizing human dignity means respecting human beings' special value—value that sets us apart from other animals; value that is intrinsic and cannot be lost.</p> <p>Liberalism—the broad political philosophy that organizes society around liberty, justice, and equality—is rooted in the idea of human dignity. Liberalism assumes each of our lives, plans, and preferences has some unimpeachable value, not because of any objective evaluation or contribution to a greater good, but simply because they belong to a human being. We are human, and therefore deserving of a baseline level of respect. </p> <p>Because so many of us take human dignity for granted—just a fact of our humanness—it's usually only when someone's dignity is ignored or violated that we feel compelled to talk about it. </p> <p>But human dignity means more than the absence of violence, discrimination, and authoritarianism. It means giving individuals the freedom to pursue their own happiness and purpose—a freedom that can be hampered by restrictive social institutions or the tyranny of the majority. The liberal ideal of the good society is not just peaceful but also pluralistic: It is a society in which we respect others' right to think and live differently than we do.</p>
From the 19th century to today<p>With <a href="https://books.google.com/ngrams/graph?year_start=1800&year_end=2019&content=human+dignity&corpus=26&smoothing=3&direct_url=t1%3B%2Chuman%20dignity%3B%2Cc0" target="_blank" rel="noopener noreferrer">Google Books Ngram Viewer</a>, we can chart mentions of human dignity from 1800-2019.</p><img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNDg0ODU0My9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTY1MTUwMzE4MX0.bu0D_0uQuyNLyJjfRESNhu7twkJ5nxu8pQtfa1w3hZs/img.png?width=980" id="7ef38" class="rm-shortcode" data-rm-shortcode-id="9974c7bef3812fcb36858f325889e3c6" data-rm-shortcode-name="rebelmouse-image" />
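A side note on reading that chart: the `smoothing=3` in the Viewer URL means each plotted year is averaged with up to three neighbors on each side, which is what irons out year-to-year noise. Here is a minimal sketch of that moving-average idea; the function name and sample values are ours, not Google's, and this is one common reading of how the Viewer's smoothing behaves at the edges of the series:

```python
def ngram_smooth(values, smoothing=3):
    """Ngram Viewer-style smoothing: each year's value becomes the mean
    of itself and up to `smoothing` neighbors on each side, with the
    window truncated at the ends of the series."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - smoothing)
        hi = min(len(values), i + smoothing + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Made-up per-year frequencies for a phrase, smoothed with a 1-year window
raw = [0.0, 2.0, 4.0, 2.0, 0.0]
print(ngram_smooth(raw, smoothing=1))  # → [1.0, 2.0, 2.6666666666666665, 2.0, 1.0]
```

With `smoothing=0` the raw yearly values come back unchanged, which is why unsmoothed Ngram plots look much spikier.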
American novelist, writer, playwright, poet, essayist and civil rights activist James Baldwin at his home in Saint-Paul-de-Vence, southern France, on November 6, 1979.
Credit: Ralph Gatti/AFP via Getty Images
The future of dignity<p>Around the world, people are still working toward the full and equal recognition of human dignity. Every year, new speeches and writings help us understand what dignity is—not only what it looks like when dignity is violated but also what it looks like when dignity is honored. In his posthumous essay, Congressman Lewis wrote, "When historians pick up their pens to write the story of the 21st century, let them say that it was your generation who laid down the heavy burdens of hate at last and that peace finally triumphed over violence, aggression and war."</p> <p>The more we talk about human dignity, the better we understand it. And the sooner we can make progress toward a shared vision of peace, freedom, and mutual respect for all. </p>
Scientists propose that jets in gamma-ray bursts may move faster than light travels through the surrounding jet medium, producing apparently time-reversed signals.
- Astrophysicists propose that gamma-ray bursts may exceed the speed of light.
- The superluminal jets may also be responsible for time-reversibility.
- The finding doesn't go against Einstein's theory because this effect happens in the jet medium, not a vacuum.
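The loophole in that last bullet can be stated compactly: in a medium with refractive index $n > 1$, light itself propagates at only $c/n$, so a jet component with speed $v$ can outrun the local light speed while still obeying relativity's hard ceiling:

```latex
\frac{c}{n} \;<\; v \;<\; c \qquad (n > 1)
```

This is the same condition behind Cherenkov radiation, where charged particles outpace light in water or glass.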
Jet bursting out of a blazar. Black-hole-powered galaxies called blazars are the most common sources detected by NASA's Fermi Gamma-ray Space Telescope.
Cosmic death beams: Understanding gamma ray bursts<div class="rm-shortcode" data-media_id="cu2knVEk" data-player_id="FvQKszTI" data-rm-shortcode-id="c6cfd20fdf31c82cb206ade8ce21ba3f"> <div id="botr_cu2knVEk_FvQKszTI_div" class="jwplayer-media" data-jwplayer-video-src="https://content.jwplatform.com/players/cu2knVEk-FvQKszTI.js"> <img src="https://cdn.jwplayer.com/thumbs/cu2knVEk-1920.jpg" class="jwplayer-media-preview" /> </div> <script src="https://content.jwplatform.com/players/cu2knVEk-FvQKszTI.js"></script> </div>
Is Bitcoin akin to 'digital gold'?
- In October, PayPal announced that it would begin allowing users to buy, sell, and hold cryptocurrencies.
- Other major fintech companies—Square, Fidelity, SoFi—have also recently begun investing heavily in cryptocurrencies.
- While prices are volatile, many investors believe cryptocurrencies are a relatively safe bet because blockchain technology will prove itself over the long term.
Slide from Sanja Kon's presentation on the evolution of money at the 2020 Web Summit
Credit: Sanja Kon<p>The move came shortly after the payments company Square invested $50 million into Bitcoin, and after Fidelity announced that it was opening a Bitcoin fund into which qualified purchasers could invest <a href="https://www.bloomberg.com/news/articles/2020-08-26/fidelity-launches-inaugural-bitcoin-fund-for-wealthy-investors" target="_blank">(minimum investment: $100,000)</a>. This institutional backing might have something to do with Bitcoin's recent surge back to near its 2017 price peak of $19,783. (Bitcoin is listed at $19,384.30 as of Dec. 3.)<br></p>
Slide from Sanja Kon's presentation on the evolution of money at the 2020 Web Summit
Credit: Sanja Kon<p>But more importantly, it suggests cryptocurrencies might soon have the opportunity to prove themselves in real-world use cases. After all, skeptics have long doubted the ability of cryptocurrencies to go mainstream as a form of everyday payment. But people seem increasingly comfortable with digital payment systems.</p><p style="margin-left: 20px;">"The entire world is going to come into digital first," Schulman said at Web Summit, adding that PayPal's services already go hand-in-hand with cryptocurrencies. "As we thought about it, digital wallets are a natural complement to digital currencies. We've got over 360 million digital wallets and we need to embrace cryptocurrencies."</p><p>Sanja Kon, vice president of global partnerships at the cryptocurrency payments processor company UTRUST, also spoke at Web Summit about the increasing adoption of digital payments:</p><p style="margin-left: 20px;">"Physical cash is becoming more and more obsolete. And the next step in the evolution is digital currency."</p><p>Kon noted some of the inherent advantages of cryptocurrencies, namely ownership. </p><p style="margin-left: 20px;">"For many people, this is really the main benefit of cryptocurrency: Users owning cryptocurrencies are able to control how they spend their money without dealing with any intermediary authority like a bank or a government, for example," Kon said, adding that there are no bank fees associated with cryptocurrencies, and that international transaction fees are significantly lower than wire transfers of fiat currency.</p><p>Kon said cryptocurrencies have unique growth opportunities in areas where people aren't integrated into modern banking systems:</p><p style="margin-left: 20px;">"With cryptocurrencies and blockchain, with the use of just a smartphone and access to internet, Bitcoin and cryptocurrencies can be available to populations of people and users without access to the traditional banking system."</p>
Bitcoin as 'digital gold'<p>Still, it could take years for people to start using cryptocurrencies for everyday purchases on a large scale. Despite this, many cryptocurrency advocates see digital currencies, particularly Bitcoin, as a way to store value—digital gold, essentially.</p><p style="margin-left: 20px;">"I don't think Bitcoin is going to be used as a transactional currency anytime in the next five years," billionaire investor Mike Novogratz recently told <a href="https://www.bloomberg.com/news/articles/2020-10-23/novogratz-says-bitcoin-is-digital-gold-not-a-currency-for-now?srnd=markets-vp" target="_blank">Bloomberg</a>. "Bitcoin is being used as a store of value. [...] Bitcoin as a gold, as digital gold, is just going to keep going higher. More and more people are going to want it as some portion of their portfolio."</p><p>There are obvious parallels between gold and Bitcoin: Both are mined, do not degrade over time, are finite in supply, and aren't directly tied to the value of fiat currency, making them <a href="https://www.reuters.com/article/us-gold-inflation/gold-as-an-inflation-hedge-well-sort-of-idUSKCN1GD516" target="_blank" rel="noopener noreferrer">relatively invulnerable to inflation</a>. The obvious objection is that the price of Bitcoin, and cryptocurrencies in general, is far more volatile than gold.</p><p>But for investors who believe the inherent value of cryptocurrency technology will prove itself over the long term, these price fluctuations are just bumps on the long road to the future of currency. </p><p style="margin-left: 20px;">"It's no longer a debate if crypto is a thing, if Bitcoin is an asset, if the blockchain is going to be part of the financial infrastructure," Novogratz said. "It's not if, it's when, and so every single company has to have a plan now."</p>
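The "finite in supply" point is not a policy promise but arithmetic baked into Bitcoin's protocol: the block reward starts at 50 BTC and halves every 210,000 blocks, so total issuance is a geometric series that converges just under 21 million coins. A rough sketch of that sum (floating-point, ignoring the protocol's satoshi-level integer truncation):

```python
def total_btc_supply():
    """Sum the geometric series of block subsidies: 50 BTC per block for
    the first 210,000 blocks, halving each era until the reward drops
    below one satoshi (the smallest unit, 1e-8 BTC) and rounds to zero."""
    subsidy = 50.0
    total = 0.0
    while subsidy >= 1e-8:
        total += 210_000 * subsidy
        subsidy /= 2.0
    return total

print(total_btc_supply())  # just under 21,000,000
```

The cap falls out of the series 210,000 × 50 × (1 + 1/2 + 1/4 + ...), which approaches 21 million; no central party can mint more, which is the basis of the gold analogy.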
Singapore has approved the sale of a lab-grown meat product in an effort to secure its food supplies against disease and climate change.
Approved for your dining pleasure<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="dd3f57f8baf14e654812d30a309d1f17"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/307gysA18_E?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p><a href="https://www.ju.st/en-us" target="_blank" rel="noopener noreferrer">Eat Just</a>, a company that produces animal-alternative food products, announced the news earlier this week. In what the company is calling a world first, Singapore has given it permission for a small-scale commercial launch of its GOOD Meat brand product line. For the initial run, the cultured chicken meat will be sold as an ingredient in "chicken bites."</p><p>"Singapore has long been a leader in innovation of all kinds, from information technology to biologics to now leading the world in building a healthier, safer food system. I'm sure that our regulatory approval for cultured meat will be the first of many in Singapore and in countries around the globe," Josh Tetrick, co-founder and CEO of Eat Just, <a href="https://www.businesswire.com/news/home/20201201006251/en/Eat-Just-Granted-World%E2%80%99s-First-Regulatory-Approval-for-Cultured-Meat" target="_blank" rel="noopener noreferrer">said in a release</a>.</p><p>According to the release, Eat Just underwent an extensive safety review by the Singapore Food Agency. It provided officials "details on the purity, identity and stability of chicken cells during the manufacturing process, as well as a detailed description of the manufacturing process which demonstrated that harvested cultured chicken met quality controls and a rigorous food safety monitoring system." 
It also demonstrated the consistency of its production by running more than 20 cycles in its 1,200-liter bioreactors.</p><p>While Eat Just did not offer details on its proprietary process, it likely follows <a href="https://www.newscientist.com/article/mg24032080-400-accelerating-the-cultured-meat-revolution/" target="_blank" rel="noopener noreferrer">one similar to other lab-grown meats</a>. It starts with muscle cell samples drawn from a living animal. Technicians then isolate stem cells from the sample and culture them <em>in vitro</em>. These cultured stem cells are then placed in a bioreactor, essentially a fermenter for fleshy cells. The bioreactor contains scaffolding materials to keep the growing tissue from falling apart as well as a growth medium—the sugars, salts, and other nutrients the tissue needs to grow. As the cells grow, they begin to differentiate into the muscle, fat, and other cells of meat tissue. Once grown, the tissues are formed into a meat product to be shipped to restaurants and supermarkets.</p>
An abattoir abatement?<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNDg2Mjg5OS9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTYyODg1NDI3N30.AYmFJfWQbPjK-o1IatyFHL-OLjcfXBMmQKYyvz4oT3s/img.jpg?width=980" id="8a82d" class="rm-shortcode" data-rm-shortcode-id="93f824fe4c6f397ab2b65e4665847e71" data-rm-shortcode-name="rebelmouse-image" />
A graph showing the number of animals slaughtered in the United States per year from 1961–2018.
Credit: Our World in Data<p>Singapore's approval is an important step in support for clean meats—so-called because they don't require animal slaughter and would likely leave a reduced carbon footprint—but hurdles remain before widespread adoption is possible.</p><p>The most glaring is the price. The first lab-grown hamburger was eaten in London in 2013. <a href="https://www.bbc.com/news/science-environment-23576143" target="_blank" rel="noopener noreferrer">It cost roughly $330,000</a>. As with any new technology, investment, iteration, and improved manufacturing will see the price drop substantially and quickly. For comparison, Eat Just's chicken will be priced equivalent to premium chicken.</p><p>Other hurdles include up-scaling production, <a href="https://www.nature.com/articles/d41586-019-00373-w" target="_blank" rel="noopener noreferrer">the need for further research</a>, and developing techniques to reliably produce in-demand meats such as fish and beef. Finally, not all countries may be as receptive as Singapore. Countries with large, entrenched meat industries may protect this legacy industry through a protracted and difficult regulatory process. Though, the meat industry itself is investing in lab-grown meat. Tyson Foods, for example, has <a href="https://euromeatnews.com/Article-Tyson-Foods-announces-investment-in-clean-meat/697" target="_blank" rel="noopener noreferrer">invested in the food-tech startup Memphis Meats</a>, the company that debuted the world's first beef meatball.</p><p>"I would imagine what will happen is the U.S., Western Europe and others will see what Singapore has been able to do, the rigours of the framework that they put together. 
And I would imagine that they will try to use it as a template to put their own framework together," <a href="https://www.reuters.com/article/us-eat-just-singapore/singapore-approves-sale-of-lab-grown-meat-in-world-first-idUSKBN28C06Z" target="_blank" rel="noopener noreferrer">Tetrick told Reuters in an interview</a>.</p><p>Regardless of the challenges, the demand for meat substitutes is present and growing. In 2020, plant-based substitutes like Beyond Meat and Impossible Foods <a href="https://bigthink.com/coronavirus/plant-based-meat" target="_self">gained a significant foothold in supermarkets</a> as meat-packing factories became coronavirus hotspots. The looming threat of climate change has also turned people away from meat and other animal products. Livestock production is environmentally taxing and leaves <a href="http://css.umich.edu/factsheets/carbon-footprint-factsheet" target="_blank" rel="noopener noreferrer">a much larger carbon footprint</a> than grain and vegetable production. </p><p>Then there's the moral concern of animal cruelty. In 2018 alone, 302 million cows, 656 million turkeys, 1.48 billion pigs, and a gob-smacking 68 billion chickens were <a href="https://ourworldindata.org/grapher/animals-slaughtered-for-meat" target="_blank" rel="noopener noreferrer">slaughtered for meat worldwide</a>. And those figures do not include animals killed in dairy or egg production.</p><p>If brought to scale and widely available, clean meats could become serious competitors to traditional meat. <a href="https://bigthink.com/technology-innovation/meat-alternatives" target="_self">One report has even predicted</a> that 60 percent of the meat people eat by 2040 won't come from slaughtered animals. It could be just the thing for people looking for a meat substitute but who find tofurkey as distasteful as, well, tofurkey.</p>