HP Wants to Build a Personal Data Stock Exchange
Looking to make a cut of the billions of dollars in annual revenue Internet companies make by selling personal data, Hewlett-Packard wants to patent its idea for an open data market.
What's the Latest Development?
HP Labs, the research arm of Hewlett-Packard, is seeking to patent its model for a personal data stock exchange. In the model, Internet users would set prices on bundles of personal information, from age to weight to drink preferences, seeking to attract buyers from the open market. Those buyers would most likely be businesses, such as pharmaceutical companies, looking to use personal and demographic information to advance a particular product or vein of research. "A trusted market operator could take a small cut of each transaction and help arrive at a realistic price for a sale."
What's the Big Idea?
With Internet companies making billions of dollars each year by selling your personal information, new business models that promise more user control are gaining momentum. "Startups like Personal and Singly are working on these challenges already. The World Economic Forum recently called an individual's data an emerging 'asset class'." Ultimately, "giving people control on a trustworthy market could encourage more and new kinds of data to be shared. ... For example, a person might feel comfortable putting an anonymous health record on the market after visiting a hospital, or sharing automobile GPS locations to help cities decide where to build new roads."
We explore the history of blood types and how they are classified to find out what makes the Rh-null type important to science and dangerous for those who live with it.
- Fewer than 50 people worldwide have 'golden blood' — or Rh-null.
- Blood is considered Rh-null if it lacks all of the 61 possible antigens in the Rh system.
- It's also very dangerous to live with this blood type, as so few people have it.
Golden blood sounds like the latest in medical quackery. As in, get a golden blood transfusion to balance your tantric midichlorians and receive a free charcoal ice cream cleanse. Don't let the New-Agey moniker throw you. Golden blood is actually the nickname for Rh-null, the world's rarest blood type.
As Mosaic reports, the type is so rare that only about 43 people have been reported to have it worldwide, and until 1961, when it was first identified in an Aboriginal Australian woman, doctors assumed embryos with Rh-null blood would simply die in utero.
But what makes Rh-null so rare, and why is it so dangerous to live with? To answer that, we'll first have to explore why hematologists classify blood types the way they do.
A (brief) bloody history
Our ancestors understood little about blood. Even the most basic of blood knowledge — blood inside the body is good, blood outside is not ideal, too much blood outside is cause for concern — escaped humanity's grasp for an embarrassing number of centuries.
Absent this knowledge, our ancestors devised less-than-scientific theories as to what blood was, theories that varied wildly across time and culture. To pick just one, the physicians of Shakespeare's day believed blood to be one of four bodily fluids, or "humors" (the others being black bile, yellow bile, and phlegm).
Handed down from ancient Greek physicians, humorism stated that these bodily fluids determined someone's personality. Blood was considered hot and moist, resulting in a sanguine temperament. The more blood people had in their systems, the more passionate, charismatic, and impulsive they would be. Teenagers were considered to have a natural abundance of blood, and men had more than women.
Humorism led to all sorts of poor medical advice. Most famously, Galen of Pergamum used it as the basis for his prescription of bloodletting. Sporting a "when in doubt, let it out" mentality, Galen declared blood the dominant humor and bloodletting an excellent way to balance the body. Blood's relation to heat also made it a go-to treatment for fever.
While bloodletting remained common until well into the 19th century, William Harvey's discovery of the circulation of blood in 1628 would put medicine on its path to modern hematology.
Soon after Harvey's discovery, the earliest blood transfusions were attempted, but it wasn't until 1665 that the first successful transfusion was performed by British physician Richard Lower. Lower's operation was between dogs, and his success prompted physicians like Jean-Baptiste Denis to try to transfuse blood from animals to humans, a process called xenotransfusion. The deaths of human patients ultimately led to the practice being outlawed.
The first successful human-to-human transfusion wouldn't be performed until 1818, when British obstetrician James Blundell managed it to treat postpartum hemorrhage. But even with a proven technique in place, in the following decades many blood-transfusion patients continued to die mysteriously.
Enter Austrian physician Karl Landsteiner. In 1901, he began his work to classify blood groups. Building on the work of Leonard Landois — the physiologist who showed that when the red blood cells of one animal are introduced into a different animal's, they clump together — Landsteiner suspected a similar reaction might occur in human-to-human transfusions, which would explain why transfusion success was so spotty. By 1909, he had classified the A, B, AB, and O blood groups, and for his work he received the 1930 Nobel Prize in Physiology or Medicine.
What causes blood types?
It took us a while to grasp the intricacies of blood, but today, we know that this life-sustaining substance consists of:
- Red blood cells — cells that carry oxygen and remove carbon dioxide throughout the body;
- White blood cells — immune cells that protect the body against infection and foreign agents;
- Platelets — cells that help blood clot; and
- Plasma — a liquid that carries salts and enzymes.
Each component has a part to play in blood's function, but the red blood cells are responsible for our differing blood types. These cells have proteins* covering their surface called antigens, and the presence or absence of particular antigens determines blood type — type A blood has only A antigens, type B only B, type AB both, and type O neither. Red blood cells sport another antigen called the RhD protein. When it is present, a blood type is said to be positive; when it is absent, it is said to be negative. The typical combinations of A, B, and RhD antigens give us the eight common blood types (A+, A-, B+, B-, AB+, AB-, O+, and O-).
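The rule described above is mechanical enough to sketch in code. The following is an illustrative example (not from the article): a small function that maps the presence or absence of the A, B, and RhD antigens to the eight common blood-type labels.

```python
def blood_type(has_a: bool, has_b: bool, has_rhd: bool) -> str:
    """Return the common blood-type label for a red cell's surface antigens."""
    if has_a and has_b:
        abo = "AB"
    elif has_a:
        abo = "A"
    elif has_b:
        abo = "B"
    else:
        abo = "O"       # neither A nor B antigens present
    rh = "+" if has_rhd else "-"
    return abo + rh

# The eight common types are exactly the 2 x 2 x 2 antigen combinations:
types = {blood_type(a, b, r)
         for a in (True, False) for b in (True, False) for r in (True, False)}
print(sorted(types))  # → ['A+', 'A-', 'AB+', 'AB-', 'B+', 'B-', 'O+', 'O-']
```

As the article notes below, this is a simplification: real typing involves many more antigen systems than ABO and RhD.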
Blood antigen proteins play a variety of cellular roles, but recognizing foreign cells in the blood is the most important for this discussion.
Think of antigens as backstage passes to the bloodstream, while our immune system is the doorman. If the immune system recognizes an antigen, it lets the cell pass. If it does not recognize an antigen, it initiates the body's defense systems and destroys the invader. So, a very aggressive doorman.
While our immune systems are thorough, they are not too bright. If a person with type A blood receives a transfusion of type B blood, the immune system won't recognize the new substance as a life-saving necessity. Instead, it will consider the red blood cells invaders and attack. This is why so many people either grew ill or died during transfusions before Landsteiner's brilliant discovery.
This is also why people with O-negative blood are considered "universal donors." Since their red blood cells lack A, B, and RhD antigens, immune systems have no way to recognize these cells as foreign and so leave them alone.
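The "doorman" logic can be stated precisely: in this simplified ABO/RhD-only model (ignoring the many other antigen systems the article discusses), a red-cell transfusion is safe when the donor's antigens are a subset of the recipient's, so the immune system sees nothing unfamiliar. A minimal sketch, not from the article:

```python
# Surface antigens carried by red cells of each common blood type.
ANTIGENS = {
    "O-": set(),           "O+": {"RhD"},
    "A-": {"A"},           "A+": {"A", "RhD"},
    "B-": {"B"},           "B+": {"B", "RhD"},
    "AB-": {"A", "B"},     "AB+": {"A", "B", "RhD"},
}

def can_donate(donor: str, recipient: str) -> bool:
    """Safe if every donor antigen is one the recipient's immune system knows."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

print(all(can_donate("O-", r) for r in ANTIGENS))  # True: O- is the universal donor
print(can_donate("B+", "A+"))                      # False: foreign B antigen → attack
```

By the same subset rule, AB+ carriers (whose cells bear all three antigens) can receive from any of the eight types.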
How is Rh-null the rarest blood type?
Let's return to golden blood. In truth, the eight common blood types are an oversimplification of how blood types actually work. As Smithsonian.com points out, "[e]ach of these eight types can be subdivided into many distinct varieties," resulting in millions of different blood types, each classified by a multitude of antigen combinations.
Here is where things get tricky. The RhD protein mentioned earlier is only one of 61 potential antigens in the Rh system, and blood is considered Rh-null only if it lacks all 61. This not only makes Rh-null rare; it also means it can be accepted by anyone with a rare blood type within the Rh system.
This is why it is considered "golden blood." It is worth its weight in gold.
As Mosaic reports, golden blood is incredibly important to medicine but also very dangerous to live with. If an Rh-null carrier needs a blood transfusion, they can find it difficult to locate a donor, and blood is notoriously difficult to transport internationally. Rh-null carriers are encouraged to donate blood as insurance for themselves, but with so few donors spread across the world and limits on how often they can donate, this places an altruistic burden on the select few who agree to donate for others.
Some bloody good questions about blood types
There remain many mysteries regarding blood types. For example, we still don't know why humans evolved the A and B antigens. Some theories point to these antigens as a byproduct of the diseases various populations contracted throughout history. But we can't say for sure.
In this absence of knowledge, various myths and questions have grown around the concept of blood types in the popular consciousness. Here are some of the most common and their answers.
Do blood types affect personality?
Japan's blood type personality theory is a contemporary resurrection of humorism. The idea states that your blood type directly affects your personality, so type A blood carriers are kind and fastidious, while type B carriers are optimistic and do their own thing. However, a 2003 study sampling 180 men and 180 women found no relationship between blood type and personality.
The theory makes for a fun question on a Cosmopolitan quiz, but that's as accurate as it gets.
Should you alter your diet based on your blood type?
Remember Galen of Pergamum? In addition to bloodletting, he also prescribed certain foods to his patients, depending on which humors needed to be balanced. Wine, for example, was considered a hot and dry drink, so it was prescribed to treat colds. In other words, the belief that your diet should complement your blood type is yet another holdover of humorism.
Created by Peter J. D'Adamo, the Blood Type Diet argues that one's diet should match one's blood type. Type A carriers should eat a meat-free diet of whole grains, legumes, fruits, and vegetables; type B carriers should eat green vegetables, certain meats, and low-fat dairy; and so on.
However, a study from the University of Toronto analyzed the data from 1,455 participants and found no evidence to support the theory. While people can lose weight and become healthier on the diet, it probably has more to do with eating all those leafy greens than blood type.
Are there links between blood types and certain diseases?
There is evidence to suggest that different blood types may increase the risk of certain diseases. One analysis suggested that type O blood decreases the risk of having a stroke or heart attack, while AB blood appears to increase it. With that said, type O carriers have a greater chance of developing peptic ulcers and skin cancer.
None of this is to say that your blood type seals your medical fate. Many factors, such as diet and exercise, influence your health, likely to a greater extent than blood type does.
What is the most common blood type?
In the United States, the most common blood type is O+; roughly one in three people has it. Of the eight well-known blood types, the least common is AB-. Only one in 167 people in the U.S. has it.
Do animals have blood types?
They most certainly do, but they are not the same as ours. This difference is why those 17th-century patients who thought, "Animal blood, now that's the ticket!" ultimately had their tickets punched. In fact, blood types are distinct between species. Unhelpfully, scientists sometimes use the same nomenclature to describe these different types. Cats, for example, have A and B antigens, but these are not the same A and B antigens found in humans.
Interestingly, xenotransfusion is making a comeback. Scientists are working to genetically engineer the blood of pigs to potentially produce human compatible blood.
Scientists are also looking into creating synthetic blood. If they succeed, they may be able to ease the current blood shortage while also devising a way to create blood for rare blood type carriers. While this may make golden blood less golden, it would certainly make it easier to live with.

*While antigens are typically proteins, they can be other molecules as well, such as polysaccharides.
Economic inequality today is near its historical peak. As bad as this sounds, a new essay suggests that we live in a surprisingly egalitarian age.
- A new essay depicts 700 years of economic inequality in Europe.
- The only stretch of time more egalitarian than today was the period between 1350 and roughly 1700.
- Data suggest that, without intervention, inequality does not decrease on its own.
Economic inequality is a constant topic. No matter the cycle — boom or bust — somebody is making a lot of money, and the question of fairness is never far behind.
A recently published essay in the Journal of Economic Literature by Professor Guido Alfani adds an intriguing perspective to the discussion by showing the evolution of income inequality in Europe over the last several hundred years. As it turns out, we currently live in a comparatively egalitarian epoch.
Seven centuries of economic history
Figure 8 from Guido Alfani, Journal of Economic Literature, 2021.
This graph shows the amount of wealth controlled by the top ten percent in certain parts of Europe over the last seven hundred years. Archival documentation similar to — and often of a similar quality as — modern economic data allows researchers to get a glimpse of what economic conditions were like centuries ago. Sources like property tax records and documents listing the rental value of homes can be used to determine how much a person's estate was worth. (While these methods leave out those without property, the data is not particularly distorted.)
The first part of the line, shown in black, is based on work by Prof. Alfani and represents the average inequality level of the Sabaudian State in Northern Italy, the Florentine State, the Kingdom of Naples, and the Republic of Venice. The latter part, in gray, is based on the work of French economist Thomas Piketty and represents the average inequality level in France, the United Kingdom, and Sweden during that period.
Despite the shift in location, the level of inequality and rate of increase are very similar between the two data sets.
Apocalyptic events cause decreases in inequality
Note that there are two substantial declines in inequality, both tied to truly apocalyptic events. The first is the Black Death, the common name for the bubonic plague pandemic of the 14th century, which killed anywhere between 30 and 50 percent of Europe's population. The second, at the dawn of the 20th century, was the result of World War I and the many major events in its aftermath.
The 20th century as a whole was a time of tremendous economic change, and the periods not featuring major wars are notable for having large experiments in distributive economic policies, particularly in the countries Piketty considers.
The slight stall in the rise of inequality during the 17th century is the result of the Thirty Years' War, a terrible religious conflict that ravaged Europe and left eight million people dead, and of major plagues that affected southern Europe. However, the recurrent outbreaks of plague after the Black Death no longer had much effect on inequality. This was due to a number of factors, not least the adaptation of European institutions to handle pandemics without causing such a shift in wealth.
In 2010, the last year covered by the essay, inequality levels were similar to those of 1340, with 66 percent of society's wealth held by the top ten percent. Inequality was also continuing to rise, and that trend has not abated since. As Prof. Alfani explained in an email to Big Think:
"During the decade preceding the Covid pandemic, economic inequality has shown a slow tendency towards further inequality growth. The Great Recession that began in 2008 possibly contributed to slow down inequality growth, especially in Europe, but it did not stop it. However, the expectation is that Covid-19 will tend to increase inequality and poverty. This, because it tends to create a relatively greater economic damage to those having unstable occupations, or who need physical strength to work (think of the effects of the so-called "long-Covid," which can prove physically invalidating for a long time). Additionally, and thankfully, Covid is not lethal enough to force major leveling dynamics upon society."
Can only disasters change inequality?
That is the subject of some debate. While inequality can occur in any economy, even one that doesn't grow all that much, some things appear to make it more likely to rise or fall.
Thomas Piketty suggested that the cause of changes in inequality levels is the difference in the rate of return on capital and the overall growth rate of the economy. Since the return on capital is typically higher than the overall growth rate, this means that those who have capital to invest tend to get richer faster than everybody else.
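A toy calculation makes the mechanism concrete. The numbers below are made up for illustration (they are not from Piketty or Alfani): if capital owners' wealth compounds at the rate of return r while everyone else's grows with the economy at g, and r > g, the owners' share of total wealth rises steadily.

```python
r, g = 0.05, 0.02            # assumed annual return on capital vs. economic growth
rich, rest = 30.0, 70.0      # starting wealth: capital owners vs. everyone else

shares = []
for year in (25, 50, 100):
    w_rich = rich * (1 + r) ** year   # capital compounds at r
    w_rest = rest * (1 + g) ** year   # the rest grows with the economy at g
    shares.append(w_rich / (w_rich + w_rest))

for year, share in zip((25, 50, 100), shares):
    print(f"after {year} years, capital owners hold {share:.0%} of wealth")
```

Starting from a 30 percent share, the gap of just three percentage points between r and g pushes the owners' share toward 90 percent within a century, which is why the persistent excess of r over g is offered as an engine of rising inequality.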
While this does explain a great deal of the graph after 1800, his model fails to explain why inequality fell after the Black Death. Indeed, since the plague destroyed human capital and left material goods alone, we would expect the ratio of wealth to income to increase and inequality to rise. His model can, however, explain the decline in inequality in the decades after the pandemic: it is possible that the abundance of capital lowered returns over a longer time span.
The catastrophe theory put forth by Walter Scheidel suggests that the only force strong enough to wrest economic power from those who have it is a world-shattering event like the Black Death, the fall of the Roman Empire, or World War I. While each event changed the world in a different way, they all had a tremendous leveling effect on society.
But not even this explains everything in the above graph. Pandemics subsequent to the Black Death had little effect on inequality, and inequality continued to fall for decades after World War II ended. Prof. Alfani suggests that we remember the importance of human agency through institutional change. He attributes much of the post-WWII decline in inequality to "the redistributive policies and the development of the welfare states from the 1950s to the early 1970s."
What does this mean for us now?
As Professor Alfani put it in his email:
"[H]istory does not necessarily teach us whether we should consider the current trend toward growth in economic inequality as an undesirable outcome or a problem per se (although I personally believe that there is some ground to argue for that). Nor does it teach us that high inequality is destiny. What it does teach us, is that if we do not act, we have no reason whatsoever to expect that inequality will, one day, decline on its own. History also offers abundant evidence that past trends in inequality have been deeply influenced by our collective decisions, as they shaped the institutional framework across time. So, it is really up to us to decide whether we want to live in a more, or a less unequal society."
Our love-hate relationship with browser tabs drives all of us crazy. There is a solution.
- A new study suggests that tabs can cause people to be flustered as they try to keep track of every website.
- The reason is that tabs are unable to properly organize information.
- The researchers are plugging a browser extension that aims to fix the problem.
A lot of ideas that people had about the internet in the 1990s have fallen by the wayside as technology and our usage patterns evolved. Long gone are things like GeoCities, BowieNet, and the belief that letting anybody post whatever they are thinking whenever they want is a fundamentally good idea with no societal repercussions.
While these ideas have been abandoned and the tools that made them possible often replaced by new and improved ones, not every outdated part of our internet experience is gone. A new study by a team at Carnegie Mellon makes the case that the use of tabs in a web browser is one of these outdated concepts that we would do well to get rid of.
How many tabs do you have open right now?
We didn't always have tabs. Introduced in the early 2000s, tabs are now included in all major web browsers, and most users have had access to them for well over a decade. They have remained much the same since their debut, despite the ever-changing nature of the internet. So, in this new study, researchers interviewed and surveyed 113 people about their use of — and feelings toward — the ubiquitous tabs.
Most people use tabs for the short-term storage of information, particularly information they will need again soon. Some keep tabs that they know they'll never get around to reading. Others use them as a sort of external memory bank. One participant described this to the researchers:
"It's like a manifestation of everything that's on my mind right now. Or the things that should be on my mind right now... So right now, in this browser window, I have a web project that I'm working on. I don't have time to work on it right now, but I know I need to work on it. So it's sitting there reminding me that I need to work on it."
You suffer from tab overload
Unfortunately, trying to use tabs this way can cause a number of problems. A quarter of the interview subjects reported having caused a computer or browser to crash because they had too many tabs open. Others reported feeling flustered by having so many tabs open — a situation called "tab overload" — or feeling ashamed that they appeared disorganized by having so many tabs up at once. More than half of participants reported having problems like this at least two or three times a week.
However, people can become emotionally invested in their tabs. One participant explained, "[E]ven when I'm not using those tabs, I don't want to close them. Maybe it's because it took efforts [sic] to open those tabs and organize them in that way."
So, we have a tool that inefficiently saves web pages that we might visit again while simultaneously reducing our productivity, increasing our anxiety, and crashing our machines. And yet we feel oddly attached to them.
Either the system is crazy or we are.
Skeema: The anti-tab revolution
The researchers concluded that at least part of the problem is caused by tabs not being an ideal way of organizing the work we now do online. They propose a new model that better compartmentalizes tabs by task and subtask, reflects users' mental models, and helps manage the users' attention on what is important right now rather than what might be important later.
To that end, the team also created Skeema, an extension for Google Chrome that treats tabs as tasks and offers a variety of ways to organize them. Users of an early version reported having fewer tabs and windows open at one time and were better able to manage the information those tabs contained.
Tabs were an improvement over having multiple windows open at the same time, but they may have outlived their usefulness. While it might take a paradigm shift to fully replace the concept, the study suggests that taking a different approach to tabs might be worth trying.
And now, excuse me, while I close some of the 87 tabs I currently have open.