Want to know where the next Ebola outbreak will strike? Follow the bats
A global Ebola pandemic could cause tens of millions of deaths. To help prevent one, researchers at Lehigh University have created a predictive model based on the migratory patterns of bats.
Ebola first emerged in 1976 in what is now the Democratic Republic of the Congo (DRC), then Zaire. It’s named after the Ebola River, near which that first outbreak began; 284 people died. Several more small outbreaks occurred in Africa over the '80s and '90s, garnering the attention of the medical community. Although scientists have been studying the disease for decades, its natural reservoir, the animal in which the virus originates and persists, has never been confirmed. The stakes weren’t that high until 2014.
From 2014 to 2016, the world was gripped by the West African Ebola crisis. Beginning in Guinea, the outbreak quickly spread to several other West African countries, including Sierra Leone, Liberia, Mali, Nigeria, and Senegal. More than 28,000 people were infected and over 11,000 died. Soon, cases were reported further afield, including a handful in Spain, Italy, the U.K., and the U.S.
Such outbreaks haven’t ceased; on May 8 of this year, the DRC announced yet another, the ninth in the country’s history. So far, 51 Ebola cases have been reported and 25 people have died. Ebola spreads through contact with infected bodily fluids. Endemic poverty and government corruption often force people in these countries to hunt bush meat to supplement their diet, and sometimes the animal they’ve captured is infected. Burial practices in the region, in which relatives carefully prepare the body for the funeral, also frequently spread the virus.
The latest outbreak occurred in a rural area and was contained quickly. Even so, Bill Gates and others warn that a worldwide pandemic of Ebola or a similar disease could kill 30 million people in as little as six months, and that the U.S. and other countries are woefully unprepared. As a result, medical researchers have been scrambling to find ways to predict the timing and location of outbreaks so that they can be contained. Now, a group of researchers think they’ve found a reliable method, developed by tracking the migratory patterns of bats.
Bill Gates claims a worldwide pandemic could kill 30 million people in as little as six months, and could potentially wipe out the human race. (Credit: Getty Images.)
Bats are known carriers of Ebola. Although studies have yet to confirm them as the virus’s natural reservoir, they remain the leading suspect. Bats are also widely consumed across Africa, with hundreds of thousands eaten annually. For years, scientists have argued that, since bats are known carriers, their migratory patterns could serve as the basis for a predictive model of the disease’s spread. Now, in a study published in the journal Scientific Reports, researchers at Lehigh University in Pennsylvania have developed a framework for such a model.
Associate professor of bioengineering Javier Buceta co-authored the study. He worked alongside Paolo Bocchini, an associate professor of civil and environmental engineering, and postdoctoral researcher Graziano Fiorillo. Buceta told Live Science, "Traditionally, scientists studying the [spread] of diseases like Ebola have operated under the assumption that the disease moves uniformly. In reality, diseases that are spread by animal hosts depend on how those hosts migrate.”
Buceta and colleagues gathered satellite data along with information on seasonal changes, bat birth and death rates, habitat, and food availability. They also looked at the infection rate among these mammals and how long it takes them to recover from the disease. The researchers fed all of this into a computer model to see whether it could retroactively predict hotspots of the 2014 Ebola outbreak.
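The study's actual framework is far more elaborate, but the basic idea of feeding infection and recovery rates into a simulation can be illustrated with a standard SIR (susceptible-infected-recovered) compartmental model, a common starting point in epidemiology. This is a generic sketch for illustration only; the colony size and parameter values below are invented, not figures from the study.

```python
# A minimal SIR model for a single hypothetical bat colony.
# beta  = transmission rate (illustrative assumption)
# gamma = recovery rate, i.e. bats recover in ~1/gamma days (assumption)

def step(s, i, r, beta=0.3, gamma=0.1, dt=1.0):
    """Advance the colony's disease state by one time step (days)."""
    n = s + i + r
    new_infections = beta * s * i / n * dt  # contact-driven spread
    new_recoveries = gamma * i * dt         # infected bats recovering
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

# Start with one infected bat in a colony of 10,000 and run 90 days.
s, i, r = 9999.0, 1.0, 0.0
for _ in range(90):
    s, i, r = step(s, i, r)
```

In the real model, parameters like these would fluctuate with the environmental data the team collected (food availability, seasonality, migration), which is precisely what makes the computation so demanding.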
A predictive model would allow medical organizations to rush resources to a hotspot before an outbreak turns into an epidemic. (Credit: Getty Images)
Buceta told Medical News Today:
We needed to study the random fluctuations of available resources over the entire African continent at high resolution; it was a massive computational and probabilistic challenge. We recognized that from a mathematical point of view, the problem is similar to the random propagation of seismic waves in a region subject to earthquakes, and we could adapt our tools.
The first case in the 2014 outbreak was a two-year-old boy in a village called Meliandou in Guinea. The Lehigh University model was able to retroactively predict the outbreak starting there. Being able to accurately predict where and when an outbreak will strike can help organizations mobilize doctors, vaccination programs, and public health campaigns where they are needed most in order to contain the outbreak before it spreads.
Though prescient, the model is also delicate. According to Buceta, "In our model, the appearance of outbreaks is tightly linked to fluctuations in environmental conditions which have an impact on both bat migration patterns and infection rates.” He added, "Such findings strongly suggest that environmental factors play a key role in the spread of the Ebola virus among bats." On the strength of this work, Buceta and colleagues have received a grant from the National Institutes of Health (NIH) to develop the model further. They hope to apply it in many different countries and to track other diseases with pandemic potential as well.
From "if-by-whiskey" to the McNamara fallacy, being able to spot logical missteps is an invaluable skill.
- A fallacy is the use of invalid or faulty reasoning in an argument.
- There are two broad types of logical fallacies: formal and informal.
- A formal fallacy describes a flaw in the construction of a deductive argument, while an informal fallacy describes an error in reasoning.
Appeal to privacy<p>When someone behaves in a way that negatively affects (or could affect) others, but then gets upset when others criticize their behavior, they're likely engaging in the appeal to privacy — or "mind your own business" — fallacy. Examples:<br></p><ul><li>Someone who speeds excessively on the highway, considering his driving to be his own business.</li><li>Someone who doesn't see a reason to bathe or wear deodorant, but then boards a packed 10-hour flight.</li></ul><p>Language to watch out for: "You're not the boss of me." "Worry about yourself."</p>
Sunk cost fallacy<p>When someone argues for continuing a course of action despite evidence showing it's a mistake, it's often a sunk cost fallacy. The flawed logic here is something like: "We've already invested so much in this plan, we can't give up now." Examples:<br></p><ul><li>Someone who intentionally overeats at an all-you-can-eat buffet just to get their "money's worth"</li><li>A scientist who won't admit his theory is incorrect because it would be too painful or costly</li></ul><p>Language to watch out for: "We must stay the course." "I've already invested so much...." "We've always done it this way, so we'll keep doing it this way."</p>
If-by-whiskey<p>This fallacy is named after a speech given in 1952 by <a href="https://en.wikipedia.org/wiki/Noah_S._Sweat" target="_blank">Noah S. "Soggy" Sweat, Jr.</a>, a state representative for <a href="https://en.wikipedia.org/wiki/Mississippi" target="_blank">Mississippi</a>, on the subject of whether the state should legalize alcohol. Sweat's argument on prohibition was (to paraphrase):<br></p><p><em>If, by whiskey, you mean the devil's brew that causes so many problems in society, then I'm against it. But if whiskey means the oil of conversation, the philosopher's wine, the stimulating drink that puts the spring in the old gentleman's step on a frosty, crispy morning, then I am certainly for it.</em></p>
Slippery slope<p>This fallacy involves arguing against a position because you think choosing it would start a chain reaction of bad things, even though there's little evidence to support your claim. Example:<br></p><ul><li>"We can't allow abortion because then society will lose its general respect for life, and it'll become harder to punish people for committing violent acts like murder."</li><li>"We can't legalize gay marriage. If we do, what's next? Allowing people to marry cats and dogs?" (Some people actually made this <a href="https://www.daytondailynews.com/news/national/cats-marrying-dogs-and-five-other-things-same-sex-marriage-won-mean/dLV9jKqkJOWUFZrSBETWkK/" target="_blank">argument</a> before same-sex marriage was legalized in the U.S.)</li></ul><p>Of course, sometimes decisions <em>do </em>start a chain reaction, which could be bad. The slippery slope device only becomes a fallacy when there's no evidence to suggest that chain reaction would actually occur.</p><p>Language to watch out for: "If we do that, then what's next?"</p>
"There is no alternative"<p>A modification of the <a href="https://en.wikipedia.org/wiki/False_dilemma" target="_blank">false dilemma</a>, this fallacy (often abbreviated to TINA) argues for a specific position because there are no realistic alternatives. Former British Prime Minister Margaret Thatcher used this exact line as a slogan to defend capitalism, and it's still used today to that same end: Sure, capitalism has its problems, but we've seen the horrors that occur when we try anything else, so there is no alternative.</p><p>Language to watch out for: "If I had a magic wand…" "What <em>else</em> are we going to do?!"</p>
Ad hoc arguments<p>An ad hoc argument isn't really a logical fallacy, but it is a fallacious rhetorical strategy that's common and often hard to spot. It occurs when someone's claim is threatened with counterevidence, so they come up with a rationale to dismiss the counterevidence, hoping to protect their original claim. Ad hoc claims aren't designed to be generalizable. Instead, they're typically invented in the moment. <a href="https://rationalwiki.org/wiki/Ad_hoc" target="_blank">RationalWiki</a> provides an example:<br></p><p style="margin-left: 20px;">Alice: "It is clearly said in the Bible that the Ark was 450 feet long, 75 feet wide and 45 feet high."</p><p style="margin-left: 20px;">Bob: "A purely wooden vessel of that size could not be constructed; the largest real wooden vessels were Chinese treasure ships which required iron hoops to build their keels. Even the <em>Wyoming</em> which was built in 1909 and had iron braces had problems with her hull flexing and opening up and needed constant mechanical pumping to stop her hold flooding."</p><p style="margin-left: 20px;">Alice: "It's possible that God intervened and allowed the Ark to float, and since we don't know what gopher wood is, it is possible that it is a much stronger form of wood than any that comes from a modern tree."</p>
Snow job<p>This fallacy occurs when someone doesn't really have a strong argument, so they just throw a bunch of irrelevant facts, numbers, anecdotes, and other information at the audience to confuse the issue, making it harder to refute the original claim. Example:</p><ul><li>A tobacco company spokesperson who is confronted about the health risks of smoking, but then proceeds to show graph after graph depicting many of the other ways people develop cancer, and how cancer metastasizes in the body, etc.</li></ul><p>Watch out for long-winded, data-heavy arguments that seem confusing by design.</p>
McNamara fallacy<p>Named after <a href="https://en.wikipedia.org/wiki/Robert_McNamara" target="_blank">Robert McNamara</a>, the <a href="https://en.wikipedia.org/wiki/United_States_Secretary_of_Defense" target="_blank">U.S. secretary of defense</a> from 1961 to 1968, this fallacy occurs when decisions are made based solely on <em>quantitative metrics or observations,</em> ignoring other factors. It stems from the Vietnam War, in which McNamara sought to develop a formula to measure progress. He settled on enemy body count. But this "objective" formula didn't account for other important factors, such as the possibility that the Vietnamese people would never surrender.<br></p><p>You could also imagine this fallacy playing out in a medical situation. Imagine a terminal cancer patient has a tumor, and a certain procedure helps to reduce the size of the tumor, but also causes a lot of pain. Ignoring quality of life would be an example of the McNamara fallacy.</p><p>Language to watch out for: "You can't measure that, so it's not important."</p>
A new study looks at what would happen to human language on a long journey to other star systems.
- A new study proposes that language could change dramatically on long space voyages.
- Spacefaring people might lose the ability to understand the people of Earth.
- This scenario is of particular concern for potential "generation ships."