George Soros last Sunday blamed Angela Merkel for the euro crisis - and gave her three months to fix it. Speaking at the 7th Festival of Economics in Trento (Italy), the renowned hedge-fund magnate pinpointed the moment when Germany, traditionally the engine of European integration, reversed gears:
“The process [of European integration] culminated with the Maastricht Treaty and the introduction of the euro. It was followed by a period of stagnation which, after the crash of 2008, turned into a process of disintegration. The first step was taken by Germany when, after the bankruptcy of Lehman Brothers, Angela Merkel declared that the virtual guarantee extended to other financial institutions should come from each country acting separately, not by Europe acting jointly. It took financial markets more than a year to realize the implication of that declaration, showing that they are not perfect.”
Maastricht’s creation of a monetary union without a political one is now seen as a fundamental flaw of the euro project. A related, and almost equally common, view is that Mediterranean countries then took advantage of the stability provided by the euro to engage in gross fiscal irresponsibility. But Soros decried the euro’s political deficit as a ‘Third-World’ burden on countries on the eurozone’s economic periphery: heavily indebted, in a currency they can’t control (i.e. devalue).
The ‘centre’ (read: Germany) deserves quite a bit of blame, Soros said, for “designing a flawed system, enacting flawed treaties, pursuing flawed policies and always doing too little too late”, continuing that “[i]n the 1980’s Latin America suffered a lost decade; a similar fate now awaits Europe.”
Indeed, one of the remarkable constants in Europe’s reaction to the unfolding crisis, at crisis summit after crisis summit, is to do just enough to avert disaster, but nowhere near enough to fix the fundamental flaws of the system. Mainly, snorts Soros, because they don’t understand the system’s flaws. Or is there a hidden agenda? If Germany does just enough to save the euro, but not correct its inherent imbalance, then the European Union will become, in the words of George Soros, “a German empire with the periphery as the hinterland.”
Between total collapse and a German-dominated collection of economically subservient states, Soros sees a third option. He believes European authorities (i.e. the German government and the Bundesbank) have a “three months’ window” to correct their mistakes and reverse the trend towards disintegration.
But this will require a much larger commitment from the European summit at the end of June than the ‘temporary relief’ offered by previous ones. After the window closes, the gap between market demands and the eurozone’s options will become unbridgeable. The subsequent breakup of the euro could chain-react and cause “a collapse of the Schengen Treaty, the common market, and the European Union itself.”
All of which is both terrible and important, but what - I hear you think [1] - does this have to do with maps?
Although reports of its demise are at least three months premature, the death of the euro would also kill off an interesting experiment in monetary geography, which has been made possible by a peculiar geo-locator built into every euro coin: its face bears the imprint of the eurozone member state for which it was minted.
Progression of foreign euro coin penetration into France (March-June-September 2002)
Progression of foreign euro coin penetration into France (January 2003 - January 2007 - December 2011)
This allows for all kinds of statistical fun, including the maps published in November 2002 by INED, France’s National Institute for Demographic Studies. The euro coin diffusion study [2], conducted just a few months after their introduction on New Year’s Day 2002, showed the rapid infiltration of the coinage of France’s neighbours into the Hexagon.
These were the early, heady days of the eurozone. In the intervening decade, monetary attitudes have turned completely: instead of uniting the continent, the euro may well turn out to be the bomb in the European beehive. So when INED took another snapshot of euro coin diffusion in France earlier this year, it could be that their two studies prove to be the bookends on a very short experiment in European monetary union.
In another decade’s time, there might not be any euro coins anymore - at least not in circulation as legal tender. So perhaps this is your last chance to see a proper euro coin diffusion analysis in action. Narrow your eyes and let your imagination do the work: there are the French statisticians, mounting their horses, charging the unsuspecting public, shouting: “It’s a good day to survey!”
Their study relies on surveys conducted over the last ten years, during which respondents were asked to empty their purses and pockets of euro coins. Some conclusions:
Foreign coin share
In 2002, 24% of the surveyed Frenchmen and -women had at least one foreign coin in their purse; by 2005, this had increased to 53%, and by the end of 2011 to 89%.
The total share of foreign coins in French purses increased from 5% in March 2002 to 34% in December 2011. This is far from a 'statistically perfect' mix. France minted only 20% of all euro coins, implying a statistical ideal of 80% in non-French euro coins in French purses. "[I]t is clear that the mixing process has been slower than predicted by physicists and mathematicians", the researchers find.
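To get a feel for why the mix lags the ‘statistically perfect’ ideal, here is a minimal sketch (not INED’s methodology; the annual exchange rate is an invented parameter) that models the foreign-coin share in French purses as geometric convergence toward the 80% ideal:

```python
# Illustrative sketch (not INED's method): model the foreign-coin share in
# French purses as geometric convergence toward the 'statistically perfect'
# mix of 80% (France minted only ~20% of all euro coins).

def foreign_share(years, ideal=0.80, start=0.05, rate=0.045):
    """Foreign-coin share after `years`, assuming a constant annual
    exchange rate `rate` between purses and the eurozone-wide coin pool.
    All parameter values are assumptions chosen for illustration."""
    share = start
    for _ in range(years):
        share += rate * (ideal - share)
    return share

# March 2002 -> December 2011 is roughly ten years.
print(foreign_share(0))    # starting share: 0.05
print(foreign_share(10))   # share after about a decade of mixing
```

With an assumed annual exchange rate of 4.5%, the modelled share climbs from 5% to roughly a third in a decade, in the ballpark of the 34% the surveys actually found, which illustrates just how slowly purses approach the perfect mix.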
Geographic differentiation
The slowness of the mixing process has a definite geographic angle: in northeastern France, close to Germany, over half of coins are foreign, while in Brittany, in France's far west, three quarters are still French. In 2003, the coins were 15% and 5% foreign in the respective regions.
The relative slowness may be attributable to the two different types of foreign coin diffusion: exchange in border regions, where coin infiltration is intense but inefficient, since coins move back across the border just as quickly; and long-distance travel, which is more effective but relatively limited in scope.
Coin mobility
Coin mobility differs by denomination. The 1 euro coin is the best mixer, with foreign coins representing 60% of the French total. Foreign 2 euro and 50 cent coins mix almost as well, representing 56% and 55% of the French total, respectively.
Smaller denominations are less 'sociable', with 20 cent at 45%, 10 cent at 34% and 5 cent at 23%. The worst mixers are foreign 1 and 2 cent-coins, both at 12%.
Coin 'nationality'
The most frequent 'nationalities' of foreign coins found in French purses in December 2011 were, in descending order: Spanish, German, Belgian and Italian - all direct neighbours of France [3]. Spain and Germany each represent just over 25% of foreign coins in France, Belgium and Italy each just under 15%.
As the relative weight of the eurozone countries' demographics and economies takes more effect, the Belgian and Spanish share should continue to decrease, while the German and Italian share keeps increasing.
Penetration of neighbours' euro coins into France (December 2011)
Cross-border commerce
The degree of penetration of foreign coins into the French hinterland is a good indication of the strength of cross-border commerce.
Many thanks to Jean-Pierre Muyl, who alerted me to the updated study. It’s viewable in its entirety here (in French and English versions).
______________
[1] Yes, there’s an app for that too.
[2] See Strange Maps #359 for an overview of the 2002 study.
[3] Luxembourg also mints its own euros, but since each country does so proportionally, the tiny Grand Duchy's coins don't make much of an impact - it comes in at 9th place, with about 2%. Andorra, in between Spain and France, has adopted the euro but does not mint its own. Switzerland, on France's eastern border, is outside the European Union and the eurozone, holding on to its Swiss franc.
What the world will look like 4°C warmer
Will your grandchildren live in cities on Antarctica?
Micronesia is gone – sunk beneath the waves. Pakistan and South India have been abandoned. And Europe is slowly turning into a desert. This is the world, 4°C warmer than it is now.
But there is also good news: Western Antarctica is no longer icy and uninhabitable. Smart cities thrive in newly green and pleasant lands. And Northern Canada, Scandinavia and Siberia produce bountiful harvests to feed the hundreds of millions of climate refugees who now call those regions home.
This map, which shows some of the effects a 4°C rise in average temperature could have on the planet, is 13 years old, but it seems to get more contemporary as it ages (and the planet warms). Antarctica is white with snow and ice, on the ground and, traditionally, on most maps. This map has turned the continent's western end incongruously green. And recent reports confirm that Antarctica is indeed turning green.
Few serious scientists doubt that climate change is happening, or that it is man-made. But many people still have a hard time grasping global warming, perhaps partly as a convenient way of ignoring the destructive impact it is predicted to have.
Those on the fact-based side of this argument should realise that continuously bombarding the opposition with doom and gloom is likely to reinforce their resistance to accepting the new paradigm.
This map offers an alternative: lots of misery and disaster, but also plenty of hope and solutions. Not solutions that will lead us back to the climate of a few decades ago – costly and pointless – but solutions that work for the world as it will be, when it will be much warmer than it is now.
First, the bad news. Brown indicates 'Uninhabitable due to floods, drought or extreme weather'. Say goodbye to the Eastern Seaboard of the U.S., to Mexico and Central America, to the middle third of South America. In Africa, Mozambique and Madagascar are gone; Asia loses much of the Indian subcontinent, including all of Pakistan; Indochina is abandoned, as is most of Indonesia. As the map mentions, “The last inhabitants of (the South-west U.S. are) migrating north. The Colorado river is a mere trickle"; “Deglaciation means (Peru) is dry and uninhabitable"; and “Bangladesh is largely abandoned, as is South India. (In) Pakistan, isolated communities remain in pockets".
Orange is not much better: 'Uninhabitable desert'. That's most of the U.S. and the rest of South America, almost the entirety of Africa and the southern halves of Europe and Asia. “Deserts have encroached on (Southern Europe), rivers have dried up and the Alps are now snow-free. Goats and other hardy animals are kept at the fringes", the map predicts.
Red is for lands lost to the rising tide (assuming +4°C adds two metres to ocean levels). This may not seem like a lot, but these are the areas where populations are concentrated. In the U.S., for instance, counties directly on the shoreline constitute less than 10% of the total land area (not including Alaska), but account for 40% of the total population.

A warmer climate could even lead to reforestation in certain areas of the world, including the Sahel and Western Australia. The regions abandoned to desertification are empty, but not useless: they will be used for solar farming (green dots) and geothermal energy (red dots). Giant wind farms off the coasts of South America, Alaska and in the North Sea will generate the remainder of the planet's energy needs.
This map was first published by New Scientist, and republished by Parag Khanna for his book Connectography. Khanna speculates: “The entire population of the Arctic region today is less than 4 million. Could it be 400 million within the coming 20 years?"
Now is the time to buy property in Greenland – before it too turns green...
Map found here at Parag Khanna.
Strange Maps #842
Got a strange map? Let me know at strangemaps@gmail.com.
Twenty years after 9/11, hindsight is 20/20
Hindsight is 20/20, particularly when you have had 20 years to think about what happened.
- The 20th anniversary of 9/11 arrives with pessimism and a sense of defeat.
- 9/11 caused a national trauma that lasted for years.
- We should remember this when analyzing the mistakes that America made in its war on terrorism during the subsequent 20 years.
As the 20th anniversary of the 9/11 terrorist attack approached, I was disappointed to see how negative the media coverage was. The overall reportage was defeatist, focusing almost exclusively on how America made countless mistakes and accomplished little if anything — perhaps even making global problems worse.
An article by Garrett Graff in The Atlantic was typical of the tone. Titled "After 9/11, the U.S. Got Almost Everything Wrong," it concluded the following, each conclusion shown as a subheadline: (1) "As a society, we succumbed to fear." (2) "We chose the wrong way to seek justice." (3) "At home, we reorganized the government the wrong way." (4) "Abroad, we squandered the world's goodwill." (5) "We picked the wrong enemies."
For the sake of argument, let's assume that everything in that article is exactly correct. While there are plenty of lessons to be learned from America's many foreign (mis)adventures pre- and post-9/11, we should also remember this: hindsight is 20/20, particularly when you have had 20 years to think about what happened.
So, let's rewind the tape two decades. I can tell you exactly where I was, what I was doing, and what I was thinking on September 11, 2001. Every one of us can.
*****
The phone rang sometime around 7:50 am Central Time. My father was on the other end. He told me that an airplane had crashed into the World Trade Center.
I was not interested. Surely, it was an accident. Besides, I was a sophomore in college and had far more important things to worry about: Tuesdays were my busy days. From 10 am to noon, I had a microbiology laboratory. Then from 1 pm to 5 pm, I had an organic chemistry lab. Groggily, I hung up the phone and went back to sleep.
About 15 minutes later, the phone rings again. It's my dad. "The second tower has been hit. You need to wake up. We're under attack." I got up this time. I went upstairs and turned on the TV. My mouth dropped in disbelief. I called a friend and told her to wake up, too.
Classes at my university were not canceled, so I got into my car and headed to school. I turned on the radio and listened as reporters described how the first tower of the World Trade Center had just collapsed. Since I had never been to New York City, I distinctly remember thinking, "At least one tower will be there if I ever get to visit." Then the other tower collapsed.
When I arrived at the microbiology lab, one of the professors had pulled a TV into the hallway so that we could listen to the latest news. The teaching assistant reminded us that, even though none of us felt like working, we still had assignments that needed to get done. All of us sat in silence as we worked. In the top right corner of my lab notebook, where I always wrote down the date, I added the following line: "WTC Disaster."
After lab, I headed to the student center for lunch. People were crowded around televisions. In the hallway, I recall one student saying, "This is what we get for electing George Bush" — a rather odd sentiment given that, up until that point in his presidency, Bush was focused on education policy.
Naturally, students started discussing ideas about who might have done this. Iraq? Iran? Palestinians? Nobody knew. What we did believe was this: we were going to get attacked again. It was not a matter of if but when and where.
*****
My experience was not unique. Just about anyone who is old enough to remember 9/11 can recall the exact details of that day. How many other days are etched into your memory like that? Very few, if any. The point is this: we experienced a collective trauma that day. And the effects of that trauma lasted a very long time.
The truth is we were scared. The people in Bush's inner circle were scared — as in they believed that the president might be assassinated with a missile while aboard Air Force One. This fact comes through very clearly in a new Apple TV+ documentary, called 9/11: Inside the President's War Room. Former national security advisor Condoleezza Rice also notes that nearly 3,000 people were murdered on their watch. Naturally, they felt a responsibility never to allow something like 9/11 to happen again.
So, that is why the U.S. reacted the way that it did. More than three years after 9/11, we were still worried about terrorism — so much so, that Bush ran on a platform of beating it, and he won re-election. It was not until 2006 — more than five years after the attack — that Americans started to realize that things were not going according to plan, particularly in Iraq. As a result, the American people handed Congress to the Democrats, and in 2008, the presidency to Barack Obama.
But even then, the war on terrorism did not end. Obama made sure to hunt down Osama bin Laden, who was finally killed on May 2, 2011. (I remember exactly where I was when I heard that news, too.) After his death was reported, thousands of Americans cheered in New York City and in front of the White House.
That is the emotional toll that 9/11 took on America. It is worth remembering that when we examine the past 20 years of foreign policy and war. Without a doubt, we made many terrible mistakes. But let's also have a bit of humility and empathy as we analyze those mistakes, remembering why we made them in the first place.
As Rice asks in the aforementioned documentary, "What would you have done?"
The term 'AI' overpromises: Here's how to make it work for humans instead
Is there actually anything deserving of the term AI?
One of the popular memes in literature, movies and tech journalism is that man's creations will rise up and destroy their maker.
Lately, this has taken the form of a fear of AI becoming omnipotent, rising up and annihilating mankind.
Business has jumped on the AI bandwagon; for a certain period, if you did not have "AI" in your investor pitch, you could forget about funding. (Tip: if you are just using a Google service to tag some images, you are not doing AI.)
However, is there actually anything deserving of the term AI? I would like to make the point that there isn't, and that our current thinking is too focused on working on systems without thinking much about the humans using them, robbing us of the true benefits.
What companies currently employ in the wild are nearly exclusively statistical pattern recognition and replication engines. Basically, all those systems follow the "monkey see, monkey do" pattern: They get fed a certain amount of data and try to mimic some known (or fabricated) output as closely as possible.
When used to provide value, you give them some real-life input and read the predicted output. What if they encounter things never seen before? Well, you better hope that those "new" things are sufficiently similar to previous things, or your "intelligent" system will give quite stupid responses.
But there is not the slightest shred of understanding, reasoning or context in there, just simple re-creation of things seen before. An image recognition system trained to detect sheep in a picture does not have the slightest idea what "sheep" actually means. However, those systems have become so good at recreating the output that they sometimes look like they know what they are doing.
Isn't that good enough, you may ask? Well, for some limited cases, it is. But it is not "intelligent", as it lacks any ability to reason and needs informed users to identify less obvious outliers with possibly harmful downstream effects.
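The "monkey see, monkey do" behaviour described above can be sketched in a few lines. This toy 1-nearest-neighbour labeller (the data and the single 'woolliness' feature are invented for illustration) can only replay labels it has seen before, and it answers just as confidently on inputs unlike anything in its training data:

```python
# A minimal sketch of 'monkey see, monkey do': a 1-nearest-neighbour
# "classifier" that can only replay labels it has already seen.
# Training data and the 'woolliness' feature are invented for illustration.

def nearest_label(examples, query):
    """Return the label of the stored example closest to `query`.
    There is no reasoning here -- only distance to past data."""
    return min(examples, key=lambda ex: abs(ex[0] - query))[1]

# Things seen during 'training', scored by how woolly they look.
seen = [(0.9, "sheep"), (0.8, "sheep"), (0.1, "rock")]

print(nearest_label(seen, 0.85))  # close to past data: "sheep"

# A woolly-looking cloud (score 0.7) is nothing like anything seen before,
# yet the system still gives a confident answer -- it cannot say "I don't know".
print(nearest_label(seen, 0.7))   # "sheep"
```

The point of the sketch: nothing in the system represents what "sheep" means; it only measures similarity to the past, which is exactly why informed users are needed to catch the outliers.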
The ladder of thinking has three rungs, pictured in the graph below:
Imitation: You imitate what you have been shown. For this, you do not need any understanding, just correlations. You are able to remember and replicate the past. Lab mice or current AI systems are on this rung.
Intervention: You understand causal connections and are able to figure out what would happen if you were to do this now, based on what you have learned about the world in the past. This requires a mental model of the part of the world you want to influence and the most relevant of its downstream dependencies. You are able to imagine a different future. You meet dogs and small children on that rung, so it is not a bad place to be.
Counterfactual reasoning: The highest rung, where you wonder what would have happened, had you done this or that in the past. This requires a full world model and a way to simulate the world in your head. You are able to imagine multiple pasts and futures. You meet crows, dolphins and adult humans here.
In order to ascend from one rung to the next, you need to develop a completely new set of skills. You can't just make an imitation system larger and expect it to suddenly be able to reason. Yet this is what we are currently doing with our ever-increasing deep learning models: We think that by giving them more power to imitate, they will at some point magically develop the ability to think. Apart from self-delusional hope and selling nice stories to investors and newspapers, there is little reason to believe that.
And we haven't even touched the topic of computational complexity and economical and ecological impact of ever-growing models. We might simply not be able to grow our models to the size needed, even if the method worked (which it doesn't, so far).
Whatever those systems create is the mere semblance of intelligence and in pursuing the goal of generating artificial intelligence by imitation, we are following a cargo cult.
Instead, we should get comfortable with the fact that the current ways will not achieve real AI, and we should stop calling it that. Machine learning (ML) is a perfectly fitting term for a tool with awesome capabilities in the narrow fields where it can be applied. And with any tool, you should not try to make the entire world your nail, but instead find out where to use it and where not.
Machines are strong when it comes to quickly and repeatedly performing a task with minimal uncertainty. They are the ruling class of the first rung.
Humans are strong when it comes to context, understanding and making sense with very little data at hand and high uncertainties. They are the ruling class of the second and third rung.
So what if we focused our efforts away from the current obsession with removing the human element from everything and thought about combining both strengths? There is an enormous potential in giving machine learning systems the optimal, human-centric shape, in finding the right human-machine interface, so that both can shine. The ML system prepares the data, does some automatable tasks and then hands the results to the human, who further handles them according to context.
ML can become something like good staff to a CEO, a workhorse to a farmer or a good user interface to an app user: empowering, saving time, reducing mistakes.
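One way to picture such a human-centric division of labour is a confidence-based triage loop, in which the model handles only the predictions it is sure about and escalates the rest to a person. This is an illustrative sketch, not a real system; the threshold and the toy scoring function are assumptions:

```python
# A hedged sketch of one human-machine interface: the model automates only
# predictions it is confident about and routes the rest to a human.
# The threshold and toy scorer are assumptions chosen for illustration.

def triage(items, score, threshold=0.9):
    """Split items into (automated, needs_human) by model confidence."""
    automated, needs_human = [], []
    for item in items:
        label, confidence = score(item)
        if confidence >= threshold:
            automated.append((item, label))
        else:
            needs_human.append(item)  # context-heavy call: hand to a human
    return automated, needs_human

# Toy scorer: invoice amounts; the model is unsure about large ones.
def toy_score(amount):
    return ("approve", 0.95) if amount < 1000 else ("approve", 0.6)

auto, manual = triage([120, 450, 5000], toy_score)
print(auto)    # small invoices handled automatically
print(manual)  # [5000] -- escalated for human judgment
```

The design choice here is that the machine owns the first rung (fast, repetitive, low-uncertainty work) while the human keeps the second and third (context and judgment), which is exactly the combination of strengths argued for above.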
Building an ML system for a given task is rather easy and will become ever easier. But finding a robust, working integration of the data and its pre-processed results with the decision-maker (i.e. the human) is a hard task. There is a reason why most ML projects fail at the stage of adoption and integration with the organization seeking to use them.
Solving this is a creative task: It is about domain understanding, product design and communication. Instead of going ever bigger to serve, say, more targeted ads, the true prize is in connecting data and humans in clever ways to make better decisions and be able to solve tougher and more important problems.
Republished with permission of the World Economic Forum. Read the original article.
The secret to how scorpions, spiders, and ants puncture tough skin
These animals grow scalpel-sharp and precisely shaped tools that are resistant to breaking.


