Researchers use algorithms to find the longest straight-line distances on land and at sea
- What links a small town in Portugal and a huge port city in China?
- The answer may surprise even inhabitants of both places: the world's longest straight line over land
- That line and its maritime equivalent were determined not by exploration but by calculation
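The article doesn't spell out the method, but searches like this typically score candidate paths by great-circle geometry on a sphere. A minimal sketch of the great-circle (haversine) distance such a search would evaluate between endpoints; the function name and the mean Earth radius of 6371 km are my assumptions, not from the source:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points (degrees),
    treating the Earth as a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine formula: a is the squared half-chord length between the points.
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    # Clamp guards against tiny floating-point overshoot past 1.0.
    return 2 * radius_km * math.asin(min(1.0, math.sqrt(a)))
```

A real search over the globe would trace many sample points along each candidate great circle and test every point against a land/sea mask; the distance formula above is just the measuring stick.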
Why do some smart folk spout such bad ideas? Marilynne Robinson says it's because we teach them "higher twaddle." She's right, but the situation is worse than she fears.
You mad, bro? The way that Facebook (and Twitter) manipulates your brain should be the very thing that outrages us the most.
Social media has been, without a doubt, one of the biggest explosions in connectivity in human history. That's the good part. The bad part is that the people inside these companies have engineered users into an addictive cycle. You're already familiar with it: post content, receive rewards (likes, comments, etc.). But the unpredictable timing of those rewards is the habit-forming part, and the reason most moderately heavy social media users check their apps or newsfeeds some 10 to 50 times a day. To make matters worse, these algorithms have been tuned to show you more and more outrageous content. It genuinely depletes your ability to be outraged by things in real life (for instance, a sexual predator for a President). Molly Crockett posits that we should all be aware of the dangers of these algorithms... and that we might have to start using them a lot less if we want to have a normal society back.
We don't need better media literacy to beat fake news; we need social media to be frank about its commercial interests.
Wikipedia has come a long, long way. Back when teachers and education institutions were banning it as an information source for students, did anyone think that by 2017 "the encyclopedia that anyone can edit" would gain global trust? Wikipedia had a rough start and some very public embarrassments, explains Katherine Maher, the executive director at the Wikimedia Foundation, but it has been a process of constant self-improvement. Maher attributes its success to the Wikimedia community who are doggedly committed to accuracy, and are genuinely thankful to find errors — both factual and systemic ones — so they can evolve away from them. So what has Wikimedia gotten right that social platforms like Facebook haven't yet? "The whole conversation around fake news is a bit perplexing to Wikimedians, because bad information has always existed," says Maher. The current public discourse focuses on the age-old problem of fake news, rather than the root cause: the commercial interests that create a space where misinformation doesn't just thrive — it's rewarded. Why doesn't Facebook provide transparency and context for its algorithms? An explanation for 'why am I seeing this news?' could allow users to make good decisions based on where that information comes from, and what its motive is. "We [at Wikimedia] hold ourselves up to scrutiny because we think that scrutiny and transparency creates accountability and that accountability creates trust," says Maher.
Here's one use for all that harvested personal data that you might not object to. Algorithms and big data are no longer just for profit; they can bring us self-awareness and growth.
Who knows more about you than anyone else? Perhaps it’s not so much who, but what. Our intimacy with our devices has surpassed our closeness with most of our friends and family, says Nichol Bradford, and an algorithm never forgets – it will remember everything you ever typed into a search box, how you voted, when you were sick, where your scroll slowed down on a page, how quickly you clicked a picture that it mathematically knew you would like. Until now, big data like this has been used purely for profit, so that media companies can sell advertising and e-commerce sites can move units. But that’s about to change, explains Bradford. Tech is emerging that can not only track your external behavior, wishes, and desires but also read your inner biological signals and interpret micro-expressions on your face to accurately assess your psychological state. If you put this technology into the hands of individuals, not just companies, it could help us manage our habits. This technology could first show us who we really are – objectively, with none of our ego-protective denial or projection – then be a tool to change our behavior and thinking patterns for the better. Nichol Bradford is the author of The Sisterhood.
SMARTER FASTER trademarks owned by The Big Think, Inc. All rights reserved.