Can Silicon Valley save the world? Or are social networks ruining everything?
"At times, it seems as if we are condemned to try to understand our own time with conceptual frameworks more than half a century old." Historian Niall Ferguson says it's time for an update.
At times, it seems as if we are condemned to try to understand our own time with conceptual frameworks more than half a century old. Since the financial crisis, many economists have been reduced to recycling the ideas of John Maynard Keynes, who died in 1946. Confronted with populism, writers on American and European politics repeatedly confuse it with fascism, as if the era of the world wars is the only history they have ever studied. Analysts of international relations seem to be stuck with terminology that dates from roughly the same period: realism or idealism, containment or appeasement, deterrence or disarmament. George Kennan's 'Long Telegram' was dispatched just two months before Keynes's death; Hugh Trevor-Roper's The Last Days of Hitler was published the following year. Yet all this was seventy years ago. Our own era is profoundly different from the mid-twentieth century. The near-autarkic, commanding and controlling states that emerged from the Depression, the Second World War and the early Cold War exist today, if at all, only as pale shadows of their former selves. The bureaucracies and party machines that ran them are defunct or in decay. The administrative state is their final incarnation. Today, the combination of technological innovation and international economic integration has created entirely new forms of network—ranging from the criminal underworld to the rarefied 'overworld' of Davos—that were scarcely dreamt of by Keynes, Kennan or Trevor-Roper.
Winston Churchill famously observed, 'The longer you can look back, the farther you can look forward.' We, too, must look back longer and ask ourselves the question: is our age likely to repeat the experience of the period after 1500, when the printing revolution unleashed wave after wave of revolution? Will the new networks liberate us from the shackles of the administrative state as the revolutionary networks of the sixteenth, seventeenth and eighteenth centuries freed our ancestors from the shackles of spiritual and temporal hierarchy? Or will the established hierarchies of our time succeed more quickly than their imperial predecessors in co-opting the networks, and enlist them in their ancient vice of waging war?
A libertarian utopia of free and equal netizens—all interconnected, sharing all available data with maximum transparency and minimal privacy settings—has a certain appeal, especially to the young. It is romantic to imagine these netizens, like the workers in [Fritz] Lang's "Metropolis", spontaneously rising up against the world's corrupt elites, then unleashing the might of artificial intelligence to liberate themselves from the drudgery of work, too. Those who try to look forward without looking back very easily fall into the trap of such wishful thinking. Since the mid-1990s, computer scientists and others have fantasized about the possibility of a 'global brain'—a self-organizing 'planetary superorganism'. In 1997 Michael Dertouzos looked forward to an era of 'computer-aided peace'. 'New information technologies open up new vistas of non-zero sumness,' wrote one enthusiast in 2000. Governments that did not react swiftly by decentralizing would be 'swiftly...punished'. N. Katherine Hayles was almost euphoric. 'As inhabitants of globally interconnected networks,' she wrote in 2006, 'we are joined in a dynamic coevolutionary spiral with intelligent machines as well as with the other biological species with whom we share the planet.' This virtuous upward spiral would ultimately produce a new 'cognisphere'. Three years later, Ian Tomlin envisioned 'infinite forms of federations between people...that overlook...differences in religion and culture to deliver the global compassion and cooperation that is vital to the survival of the planet'. 'The social instincts of humans to meet and share ideas,' he declared, 'might one day be the single thing that saves our race from its own self destruction.' 'Informatization,' wrote another author, would be the third wave of globalization. 'Web 3.0' would produce 'a contemporary version of a "Cambrian explosion"' and act as 'the power-steering for our collective intelligence'.
The masters of Silicon Valley have every incentive to romanticize the future. Balaji Srinivasan conjures up heady visions of the millennial generation collaborating in computer 'clouds', freed from geography, and paying each other in digital tokens, emancipated from state payment systems. Speaking at the 2017 Harvard Commencement, Mark Zuckerberg called on the new graduates to help 'create a world where everyone has a sense of purpose: by taking on big meaningful projects together, by redefining equality so everyone has the freedom to pursue purpose, and by building community across the world'. Yet Zuckerberg personifies the inequality of superstar economics. Most of the remedies he envisages for inequality—'universal basic income, affordable childcare, healthcare that [isn't] tied to one company...continuous education'—cannot be achieved globally but are only viable as national policies delivered by the old twentieth-century welfare state. And when he says that 'the struggle of our time' is between 'the forces of freedom, openness and global community against the forces of authoritarianism, isolationism and nationalism,' he seems to have forgotten just how helpful his company has been to the latter.
Histories of futurology give us little reason to expect much, if any, of the Silicon Valley vision of utopia to be realized. Certainly, if Moore's Law continues to hold, computers should be able to simulate the human brain by around 2030. But why would we expect this to have the sort of utopian outcomes imagined in the preceding paragraph? Moore's Law has been in operation at the earliest since Charles Babbage's 'Analytical Engine' was (partly) built before his death in 1871, and certainly since the Second World War. It cannot be said that there has been commensurate exponential improvement in our productivity, much less our moral conduct as a species. There is a powerful case to be made that the innovations of the earlier industrial revolutions were of more benefit to mankind than those of the most recent one. And if the principal consequence of advanced robotics and artificial intelligence really is going to be large-scale unemployment, the chances are surely quite low that a majority of mankind will uncomplainingly devote themselves to harmless leisure pursuits in return for some modest but sufficient basic income. Only the sedative-based totalitarianism imagined by Aldous Huxley would make such a social arrangement viable. A more likely outcome is a repeat of the violent upheavals that ultimately plunged the last great Networked Age into the chaos that was the French Revolution.
Moreover, the suspicion cannot be dismissed that, despite all the utopian hype, less benign forces have already learned how to use and abuse the 'cognisphere' to their advantage. In practice, the Internet depends for its operation on submarine cables, fibre-optic wires, satellite links and enormous warehouses full of servers. There is nothing utopian about the ownership of that infrastructure, nor the oligopolistic arrangements that make ownership of major web platforms so profitable. Vast new networks have been made possible but, like the networks of the past, they are hierarchical in structure, with small numbers of super-connected hubs towering over the mass of sparsely connected nodes. And it is no longer a mere possibility that this network can be instrumentalized by corrupt oligarchs or religious fanatics to wage a new and unpredictable kind of war in cyberspace. That war has commenced. Indices of geopolitical risk suggest that conventional and even nuclear war may not be far behind. Nor can it be ruled out that a 'planetary superorganism' created by the Dr Strangeloves of artificial intelligence may one day run amok, calculating—not incorrectly—that the human race is by far the biggest threat to the long-run survival of the planet itself and exterminating the lot of us.
'I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,' said Evan Williams, one of the co-founders of Twitter, in May 2017. 'I was wrong about that.' The lesson of history is that trusting in networks to run the world is a recipe for anarchy: at best, power ends up in the hands of the Illuminati, but more likely it ends up in the hands of the Jacobins. Some today are tempted to give at least 'two cheers for anarchism'. Those who lived through the wars of the 1790s and 1800s learned an important lesson that we would do well to re-learn: unless one wishes to reap one revolutionary whirlwind after another, it is better to impose some kind of hierarchical order on the world and to give it some legitimacy. At the Congress of Vienna, the five great powers agreed to establish such an order, and the pentarchy they formed provided a remarkable stability for the better part of the century that followed. Just over 200 years later, we confront the same choice they faced. Those who favour a world run by networks will end up not with the interconnected utopia of their dreams but with a world divided between FANG and BAT and prone to all the pathologies discussed above, in which malignant sub-networks exploit the opportunities of the World Wide Web to spread virus-like memes and mendacities.
The alternative is that another pentarchy of great powers recognizes their common interest in resisting the spread of jihadism, criminality and cyber-vandalism, to say nothing of climate change. In the wake of the 2017 WannaCry episode, even the Russian government must understand that no state can hope to rule Cyberia for long: that malware exploited EternalBlue, a cyber weapon developed by the American NSA that was stolen and leaked by a group calling itself the Shadow Brokers. It took a British researcher to find its 'kill switch', but only after hundreds of thousands of computers had been infected, including American, British, Chinese, French and Russian machines. What could better illustrate the common interest of the great powers in combating Internet anarchy? Conveniently, the architects of the post-1945 order created the institutional basis for such a new pentarchy in the form of the permanent members of the UN Security Council, an institution that retains the all-important ingredient of legitimacy. Whether or not these five great powers can make common cause once again, as their predecessors did in the nineteenth century, is the great geopolitical question of our time.
Six centuries ago, in Siena, the Torre del Mangia of the Palazzo Pubblico cast a long shadow over the Piazza del Campo, the fan-like space that was by turns a marketplace, a meeting place and, twice a year, a racetrack. The tower's height made a point: it reached exactly the same elevation as the city's cathedral, which stood on Siena's highest hill, symbolizing the parity of temporal and spiritual hierarchies. A century ago, in Lang's "Metropolis", hierarchical power was symbolized by the skyscrapers of Manhattan, which still keep the south and east of Central Park in shade for large portions of the day. When the first great towers were built in New York, they seemed to be appropriately imposing accommodation for the hierarchical corporations that dominated the US economy.
By contrast, today's dominant technology companies eschew the vertical. Facebook’s headquarters in Menlo Park, designed by Frank Gehry, is a sprawling campus of open-plan offices and play-areas—a 'single room that fits thousands of people', in Mark Zuckerberg’s words, or (perhaps more accurately) an immense kindergarten for geeks. The main building at the new 'Apple Park' in Cupertino is a gigantic circular spaceship with only four storeys (above ground)—'a centre for creativity and collaboration', designed by the late Steve Jobs, Norman Foster and Jonathan Ive as if to host a lattice-like network, each node an equal, with a uniform number of edges, but just one restaurant. Google’s new headquarters in Mountain View, set amid 'trees, landscaping, cafes, and bike paths', will be made of 'lightweight block-like structures which can be moved around easily', as if constructed from Lego and located in a nature reserve: an office without foundations or a floor-plan, mimicking the constantly evolving network it hosts. Silicon Valley prefers to lie low, and not only for fear of earthquakes. Its horizontal architecture reflects the reality that it is the most important hub of a global network: the world's town square.
On the other side of the United States, however—on New York City's 5th Avenue—there looms a fifty-eight-storey building that represents an altogether different organizational tradition. And no single individual in the world has a bigger say in the choice between networked anarchy and world order than the absent owner of that dark tower.
Adapted from THE SQUARE AND THE TOWER: Networks and Power, from Freemasons to Facebook by Niall Ferguson, published by Penguin Press, an imprint of Penguin Publishing Group, a division of Penguin Random House, LLC. Copyright © 2017 by Niall Ferguson.
Whether or not women think beards are sexy has to do with "moral disgust"
- A new study found that women perceive men with facial hair to be more attractive, as well as more physically and socially dominant.
- Women tend to associate more masculine faces with physical strength, social assertiveness, and formidability.
- Women who display higher levels of "moral disgust," or feelings of repugnance toward taboo behaviors, are more likely to prefer hairy faces.
Beards and perceptions of masculinity

The study used 919 American (mostly white) women, ages 18-70, who rated 30 pictures of men at various stages of facial-hair growth. The photographs depicted men whose faces had been digitally altered to look more feminine or more masculine, with a beard and without one. The women rated the men on perceived attractiveness for long-term and short-term relationships. The study found that the more facial hair the men had, the higher they were rated on attractiveness, particularly on suitability for a long-term relationship.

Part of this might be attributed to facial masculinity—i.e., a protruding brow ridge, wide cheekbones, a thick jawline, and deep-set narrow eyes—which conveys information to a woman about a man's underlying health and formidability. Women tend to associate more masculine faces with physical strength and social assertiveness. Such features can also indicate a superior immune response. The researchers suggested that their findings favoring bearded men could be due to facial hair enhancing the masculine features of a man's face, for instance by creating the illusion of a thicker jawline. This could signal direct benefits to women, such as resources and protection, that would enhance survival among mothers and their infants. In other words, while a beard doesn't mean superior genetics in and of itself, it might be a primitive, ornamental way of saying, "Hey girl, I'm a testosterone-fueled lean, mean, pathogen-fighting machine."

It could also be that a beard becomes its own destiny. The researchers in this study cite prior research that found that by growing a beard, men felt more masculine and had higher levels of serum testosterone, which was linked to a higher level of social dominance. Bearded men also tended to subscribe to more old-school beliefs about gender roles in their relationships with women than men with clean-shaven faces did.
What does disgust have to do with beard preference?

Obviously, not all women dig beards. The researchers were particularly interested in which traits make a woman prefer bearded men over clean-shaven faces. They looked into several factors, including a woman's disgust levels toward various concepts, her desire to become pregnant, and her exposure to facial hair in her personal life.

According to the study, women who were not into facial hair were turned off by the potential parasites or other critters they imagined could be living in the hair or skin. Women ranking high on this "ectoparasite disgust" scale might have viewed beards as a sign of poor grooming habits. However, women who ranked higher in "pathogen disgust" did find the bearded men desirable, possibly because they perceived beards as a signal of good health and immune function. An intriguing discovery in the study was a link to morality: women who displayed higher levels of "moral disgust," or feelings of repugnance toward taboo behaviors, were more likely to prefer hairy faces. The authors opined that this could reflect a link between beardedness, politically conservative outlooks, and traditional views regarding performances of masculinity in heterosexual relationships.
Additional findings

The correlations between married and single women's ratings of the attractiveness of beards were not particularly clear, although the researchers noted that single and married women who wanted children tended to find beards more attractive than women who didn't want children. They also found that women with bearded husbands rated beards as more attractive, which might indicate that social exposure to beards influences how desirable they are perceived to be. Or it could simply be that men whose wives like beards grow beards.

It's important to note that culture plays a huge role in how attractive women perceive certain male characteristics to be. This study looked at a small, culturally specific group of American women, so no big, universal claims should be made about masculinity, facial hair, and male desirability to women. However, research like this is important in highlighting how human grooming decisions are driven by much more than fashion trends. Sociobiological, economic, and ecological factors all play a part in the way we choose to present ourselves.