Artificial intelligence will both disrupt and benefit the workplace, Stanford scholar says
Visiting scholar James Timbie says that the artificial intelligence revolution will involve humans and machines working together, with the best results coming from humans supported by intelligent machines.
Artificial intelligence offers both promise and peril as it revolutionizes the workplace, the economy and personal lives, says James Timbie of the Hoover Institution, who studies artificial intelligence and other technologies.
In tomorrow’s workplace, many routine jobs now performed by workers will increasingly be assumed by machines, leaving more complicated tasks to humans who see the big picture and possess interpersonal skills, a Stanford scholar says.
Artificial intelligence and other advancing technologies promise advances in health, safety and productivity, but large-scale economic disruptions are inevitable, said James Timbie, an Annenberg Distinguished Visiting Fellow at the Hoover Institution. He trained at Stanford as a physicist, served as a senior advisor at the State Department from 1983 to 2016 where he played a key role in arms control and disarmament, and now studies the impact of emerging technologies such as artificial intelligence.
Timbie discussed what the future may hold for workers in a chapter in the new book, Beyond Disruption: Technology’s Challenge to Governance, which he co-edited with Hoover’s George P. Shultz and Jim Hoagland. He was recently interviewed on the subject.
How will the emergence of artificial intelligence affect individual workers in the future?
Artificial intelligence combined with other advancing technologies – such as robotics and 3D printing – will lead to more efficient production of goods and services. Machines can be trained to perform a wide range of non-routine cognitive tasks, and advanced robotics can increasingly perform manual tasks. Society as a whole will benefit from increased productivity and lower costs, but many individual workers will be adversely affected. Research indicates that on the order of half of today’s workers are in industries vulnerable to disruption in the near term. In some cases – truck drivers – machines will replace workers. In other fields – education and medicine – work will be transformed, with machines assuming some tasks in close coordination with skilled humans performing other tasks.
Will well-paying “cognitive” jobs be lost to automation?
Many well-paying “cognitive” jobs are vulnerable to disruption, perhaps more over time than the well-paying factory jobs that were lost to globalization. A wide range of vulnerable occupations traditionally filled by well-educated, well-paid workers includes tax preparers, radiologists, paralegals, loan underwriters, insurance adjusters, financial analysts, translators, and even some journalists and software engineers.
How can humans and machines work together for greater efficiency and productivity?
One example is medical diagnosis. A diagnosis is a determination of how information on a patient fits into a pattern characteristic of a disease. This is something machines do well. Machines trained with the digital records and outcomes of millions of previous patients can produce a diagnosis for a sick patient, along with recommendations for treatment and perhaps further tests. Machines can take into account far more data and keep up with the latest research better than any doctor. The doctor’s primary role would be to convey the outcome to the patient, and help the patient understand and accept it, so the patient follows through with the treatment plan.
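To make the pattern-matching idea concrete, here is a minimal, hypothetical sketch of the kind of statistical model such a system could rest on: a classifier fit to past patient records and their outcomes, then asked to score a new patient. The feature columns, data, and library choice are illustrative assumptions, not a description of any system Timbie discusses.

```python
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sketch only: the feature columns (age, a lab value, a symptom
# flag) and the records themselves are invented for illustration.
past_patients = [
    [54, 1.2, 1],
    [61, 2.8, 1],
    [38, 0.9, 0],
    [45, 1.1, 0],
    [70, 3.1, 1],
    [29, 0.7, 0],
]
outcomes = [1, 1, 0, 0, 1, 0]  # 1 = the disease was present in the historical record

# Fit the model to the historical records, then score a new patient.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(past_patients, outcomes)

new_patient = [[58, 2.5, 1]]
print(model.predict_proba(new_patient))  # probabilities for a clinician to interpret
```

In practice the training data would be millions of records with far richer features, but the division of labor is the same: the machine scores how well a new case fits past patterns, and the clinician interprets and communicates the result.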
Research indicates that in many fields, the best results will come from humans supported by intelligent machines – a combination of a doctor and a machine, a teacher and a machine, etc. In the workplace of the near future, machines would continue to do the computational work they do well, while leaving other tasks to humans who see the big picture and have interpersonal skills.
How is the artificial intelligence revolution different from 20th-century labor and tech disruptions?
One big difference is the rate of change. The transition from manual labor to steam power, and the subsequent transition from steam to electricity, played out over decades. The mechanization of agriculture took a generation, so it was sufficient to educate the children of farmers with the new skills necessary for new occupations. Today the changes are coming so fast that many workers themselves will need to learn new skills for new jobs.
Another problem concerns inequality. Advancing technology increases national wealth and income, and the GDP grows. But these benefits are distributed unevenly. This growing inequality is a continuation of a long-term trend. According to Census Bureau data, median household income is about what it was in 1999, while GDP is up 38 percent. Most of the gains have gone to the upper end. The spread of automation contributes to this growing inequality in wealth and income.
Consider tax preparation software. A lot of people benefit because it is cheap and easy and they can do their taxes themselves. But many people who earned their living as tax preparers now find their jobs and income threatened.
How can society best protect workers and prepare them for this new future?
The challenge is to facilitate transitions to new occupations with new skills.
In addition, new jobs will be created even as traditional jobs disappear. Over the 200 years since the Luddite rebellion, a movement led by workers in 19th-century England who opposed the introduction of weaving technology, gains in productivity through advancing technology have led over time to new industries and new jobs. That could continue, or this time could be different.
In addition, there are more than 6 million job openings unfilled today, according to the Department of Labor. Employers cannot find qualified candidates for many well-paying jobs, which means there are potential opportunities for displaced workers with appropriate training.
These new jobs will not necessarily be in nearby locations, nor are they likely to pay as well, at least initially. New jobs require new skills.
Some advocate a guaranteed basic income. My view is that there is no shortage of work that needs to be done. Money is not the only thing; a sense of self-worth and standing in the community are also important. So, rather than pay people not to work, better to support transitions to new jobs.
The existing adjustment assistance program did not do much to counter the impact of job losses attributed to globalization; it could be expanded to provide income and assistance for training and relocation for layoffs due to automation as well as foreign competition.
Finally, the rapid pace of change reinforces the benefit of a habit of life-long education. Community colleges and internet courses provide low-cost education and training on a wide variety of subjects.
A Harvard professor's study identifies the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun, as well as the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been among the worst in the lives of many people around the globe: a rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I, when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also be a contender for this morbid list, as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. Early that year, a volcanic eruption took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." Creepier still, he noted: "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5 to 2.5 °C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
Quantum theory has weird implications. Trying to explain them just makes things weirder.
- The weirdness of quantum theory flies in the face of what we experience in our everyday lives.
- Quantum weirdness quickly created a split in the physics community, with each side championed by a giant: Albert Einstein on one side and Niels Bohr on the other.
- As two recent books espousing opposing views show, the debate still rages on nearly a century afterward. Each "resolution" comes with a high price tag.
Albert Einstein and Niels Bohr, two giants of 20th century science, espoused very different worldviews.
To Einstein, the world was ultimately rational. Things had to make sense. They should be quantifiable and expressible through a logical chain of cause-and-effect interactions, from what we experience in our everyday lives all the way to the depths of reality. To Bohr, we had no right to expect any such order or rationality. Nature, at its deepest level, need not follow any of our expectations of well-behaved determinism. Things could be weird and non-deterministic, so long as they become more like what we expect as we travel from the world of atoms to our world of trees, frogs, and cars. Bohr divided the world into two realms, the familiar classical world and the unfamiliar quantum world. They should be complementary to one another but with very different properties.
The two scientists spent decades arguing about the impact of quantum physics on the nature of reality. Each had groups of physicists as followers, all of them giants in their own right. Einstein's group of quantum weirdness deniers included quantum physics pioneers Max Planck, Louis de Broglie, and Erwin Schrödinger, while Bohr's group had Werner Heisenberg (of uncertainty principle fame), Max Born, Wolfgang Pauli, and Paul Dirac.
Almost a century afterward, the debate rages on.
Einstein vs. Bohr, Redux
Two books — one authored by Sean Carroll and published last fall and another published very recently and authored by Carlo Rovelli — perfectly illustrate how current leading physicists still cannot come to terms with the nature of quantum reality. The opposing positions still echo, albeit with many modern twists and experimental updates, the original Einstein-Bohr debate.
I summarized the ongoing dispute in my book The Island of Knowledge: Are the equations of quantum physics a computational tool that we use to make sense of the results of experiments (Bohr), or are they supposed to be a realistic representation of quantum reality (Einstein)? In other words, are the equations of quantum theory the way things really are or just a useful map?
Einstein believed that quantum theory, as it stood in the 1930s and 1940s, was an incomplete description of the world of the very small. There had to be an underlying level of reality, still unknown to us, that made sense of all its weirdness. De Broglie and, later, David Bohm, proposed an extension of the quantum theory known as hidden variable theory that tried to fill in the gap. It was a brilliant attempt to appease the urge Einstein and his followers had for an orderly natural world, predictable and reasonable. The price — and every attempt to deal with the problem of figuring out quantum theory has a price tag — was that the entire universe had to participate in determining the behavior of every single electron and all other quantum particles, implying the existence of a strange cosmic order.
Later, in the 1960s, physicist John Bell proved a theorem that put such ideas to the test. A series of remarkable experiments starting in the 1970s and still ongoing have essentially disproved the de Broglie-Bohm hypothesis, at least if we restrict their ideas to what one would call "reasonable," that is, theories that have local interactions and causes. Omnipresence — what physicists call nonlocality — is a hard pill to swallow in physics.
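For readers who want the flavor of what Bell-type experiments actually check, the most common formulation is the CHSH inequality; the statement below is standard textbook material rather than anything specific to the books under discussion.

```latex
% E(a,b) denotes the correlation of paired measurement outcomes for detector
% settings a and b. Any local hidden-variable theory obeys the CHSH bound
\[
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2,
\]
% while quantum mechanics allows values up to $2\sqrt{2} \approx 2.83$.
% Experiments repeatedly find $|S| > 2$, ruling out local hidden variables.
```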
Yet, the quantum phenomenon of superposition insists on keeping things weird. Here's one way to picture quantum superposition. In a kind of psychedelic dream state, imagine that you had a magical walk-in closet filled with identical shirts, the only difference between them being their color. What's magical about this closet? Well, as you enter this closet, you split into identical copies of yourself, each wearing a shirt of a different color. There is a you wearing a blue shirt, another a red, another a white, etc., all happily coexisting. But as soon as you step out of the closet or someone or something opens the door, only one you emerges, wearing a single shirt. Inside the closet, you are in a superposition state with your other selves. But in the "real" world, the one where others see you, only one copy of you exists, wearing a single shirt. The question is whether the inside superposition of the many yous is as real as the one you that emerges outside.
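As a rough numerical cartoon of the closet analogy, here is a short sketch of the standard quantum recipe (amplitudes, then Born-rule probabilities when the "door opens"); the colors and numbers are invented, and the code is not drawn from either book.

```python
import numpy as np

# Invented amplitudes for three shirt colors, normalized so probabilities sum to 1.
colors = ["blue", "red", "white"]
amplitudes = np.array([0.6, 0.8j, 0.0])
amplitudes = amplitudes / np.linalg.norm(amplitudes)

# Born rule: the probability of each outcome is the squared magnitude of its amplitude.
probabilities = np.abs(amplitudes) ** 2

# "Opening the door": a measurement yields a single color, sampled with those weights.
outcome = np.random.choice(colors, p=probabilities)
print(dict(zip(colors, probabilities.round(2))), "->", outcome)
```

Whether the full list of amplitudes describes something just as real as the single color that comes out is exactly what the two camps dispute.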
The (modern version of the) Einstein team would say yes. The equations of quantum physics must be taken as the real description of what's going on, and if they predict superposition, so be it. The so-called wave function that describes this superposition is an essential part of physical reality. This point is most dramatically exposed by the many-worlds interpretation of quantum physics, espoused in Carroll's book. For this interpretation, reality is even weirder: the closet has many doors, each to a different universe. Once you step out, all of your copies step out together, each into a parallel universe. So, if I happen to see you wearing a blue shirt in this universe, in another, I'll see you wearing a red one. The price tag for the many-worlds interpretation is to accept the existence of an uncountable number of non-communicating parallel universes that enact all possibilities from a superposition state. In a parallel universe, there was no COVID-19 pandemic. Not too comforting.
Bohr's team would say take things as they are. If you stepped out of the closet and someone saw you wearing a shirt of a given color, then this is the one. Period. The weirdness of your many superposing selves remains hidden in the quantum closet. Rovelli defends his version of this worldview, called the relational interpretation, in which events are defined by the interactions between the objects involved, be they observers or not. In this example, the color of your shirt is the property at stake, and when I see it, I am entangled with this specific shirt of yours. It could have been another color, but it wasn't. As Rovelli puts it, "Entanglement… is the manifestation of one object to another, in the course of an interaction, in which the properties of the objects become actual." The price to pay here is to give up the hope of ever truly understanding what goes on in the quantum world. What we measure is what we get, and that is all we can say about it.
What should we believe?
Both Carroll and Rovelli are master expositors of science to the general public, with Rovelli being the more lyrical of the pair.
There is no resolution to be expected, of course. I, for one, am more inclined to Bohr's worldview and thus to Rovelli's, although the interpretation I am most sympathetic to, called QBism, is not properly explained in either book. It is much closer in spirit to Rovelli's, in that relations are essential, but it places the observer on center stage, given that information is what matters in the end. (Although, as Rovelli acknowledges, information is a loaded word.)
We create theories as maps for us human observers to make sense of reality. But in the excitement of research, we tend to forget the simple fact that theories and models are not nature but our representations of nature. Unless we nurture hopes that our theories are really how the world is (the Einstein camp) and not how we humans describe it (the Bohr camp), why should we expect much more than this?
Maybe eyes really are windows into the soul — or at least into the brain, as a new study finds.
- Researchers find a correlation between pupil size and differences in cognitive ability.
- The larger the pupil, the higher the intelligence.
- The explanation for why this happens lies within the brain, but more research is needed.
What can you tell by looking into someone's eyes? You can spot a glint of humor, signs of tiredness, or maybe that they don't like something or someone.
But outside of assessing an emotional state, a person's eyes may also provide clues about their intelligence, suggests new research. A study carried out at the Georgia Institute of Technology shows that pupil size is "closely related" to differences in intelligence between individuals.
The scientists found that larger pupils may be connected to higher intelligence, as demonstrated by tests that gauged reasoning skills, memory, and attention. In fact, the researchers claim that the relationship between intelligence and pupil size is so pronounced that it showed up in their two previous studies as well and can be spotted with the naked eye, without any additional scientific instruments. You should be able to tell who scored the highest or the lowest on the cognitive tests just by looking at them, say the researchers.
The pupil-IQ link
The connection was first noticed across memory tasks, looking at pupil dilations as signs of mental effort. The studies involved more than 500 people aged 18 to 35 from the Atlanta area. The subjects' pupil sizes were measured by eye trackers, which use a camera and a computer to capture light reflecting off the pupil and cornea. As the scientists explained in Scientific American, pupil diameters range from two to eight millimeters. To determine average pupil size, they took measurements of the pupils at rest when the participants were staring at a blank screen for a few minutes.
Another part of the experiment involved having the subjects take a series of cognitive tests that evaluated "fluid intelligence" (the ability to reason when confronted with new problems), "working memory capacity" (how well people could remember information over time), and "attention control" (the ability to keep focusing attention even while being distracted). An example of the latter is a test that tries to divert a person's attention from a disappearing letter by showing a flickering asterisk on another part of the screen. If a person pays too much attention to the asterisk, they might miss the letter.
The conclusions of the research were that having a larger baseline pupil size was related to greater fluid intelligence, better attention control, and, to a smaller extent, greater working memory capacity. In an email exchange with Big Think, author Jason Tsukahara pointed out, "It is important to consider that what we find is a correlation — which should not be confused with causation."
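For readers unfamiliar with what reporting "a correlation" amounts to in practice, here is a minimal sketch of the kind of calculation involved; the numbers are made up and are not the study's data.

```python
import numpy as np

# Invented example values: baseline pupil diameter (mm) and a cognitive test score.
pupil_mm = np.array([3.1, 4.0, 4.6, 5.2, 5.9, 6.3])
test_score = np.array([48, 55, 61, 60, 72, 75])

# Pearson correlation coefficient between the two variables.
r = np.corrcoef(pupil_mm, test_score)[0, 1]
print(f"Pearson r = {r:.2f}")
```

A positive coefficient only says that the two quantities tend to rise together across people; on its own it says nothing about which, if either, causes the other.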
The researchers also found that pupil size seemed to decrease with age. Older people had more constricted pupils, but when the scientists standardized for age, the pupil-size-to-intelligence connection still remained.
Why are pupils linked to intelligence?
The connection between pupil size and IQ likely resides within the brain. Pupil size has been previously connected to the locus coeruleus, a part of the brain that's responsible for synthesizing the hormone and neurotransmitter norepinephrine (noradrenaline), which mobilizes the brain and body for action. Activity in the locus coeruleus affects our perception, attention, memory, and learning processes.
As the authors explain, this region of the brain "also helps maintain a healthy organization of brain activity so that distant brain regions can work together to accomplish challenging tasks and goals." Because it is so important, loss of function in the locus coeruleus has been linked to conditions like Alzheimer's disease, Parkinson's, clinical depression, and attention deficit hyperactivity disorder (ADHD).
The researchers hypothesize that people who have larger pupils while in a restful state, like staring at a blank computer screen, have "greater regulation of activity by the locus coeruleus." This leads to better cognitive performance. More research is necessary, however, to truly understand why having larger pupils is related to higher intelligence.
In an email to Big Think, Tsukahara shared, "If I had to speculate, I would say that it is people with greater fluid intelligence that develop larger pupils, but again at this point we only have correlational data."
Do other scientists believe this?
As the scientists point out in the beginning of their paper, their conclusions are controversial and, so far, other researchers haven't been able to duplicate their results. The research team addresses this criticism by explaining that other studies had methodological issues and examined only memory capacity but not fluid intelligence, which is what they measured.