The Fourth Industrial Revolution is here. We need a new education model.
The job market of tomorrow will require people to develop their technical capacity in tandem with human-only skills.
Kevin Dickinson has been an independent writing consultant since 2011. During that time, he's worked as an educator, editor, journalist, and researcher, and written on subjects ranging from religion to Dr. Seuss, film history to Mars' surplus of iron oxide.
- Technological advancements are predicted to take as many as 75 million jobs from humans worldwide by 2022. However, 133 million new jobs are expected to be created in that same time.
- Software developer jobs are growing more than 4x faster than other occupations, a demand that translates to a median wage of $105,590 per year (or $50.77 per hour).
- Kenzie Academy, an online software and UX engineering school with an innovative tuition model, teaches technical skills along with soft skills like problem-solving, critical thinking, and team collaboration.
Every now and then, seismic shifts remap the economic landscape. While these afford opportunities for some, they can also swallow the jobs people and communities rely on to support careers and livelihoods. Just ask any lamplighter, log driver, or switchboard operator.
Even jobs that are the staples of history—our butchers, bakers, and candlestick makers—feel the aftershocks. Not long ago, these professions were the linchpins of any community. Today, they are split between small, artisanal craftspeople and mega-factories where a handful of people produce enough supply to provision several communities.
And we're already charting the tremors of the next shift. Called the Fourth Industrial Revolution by Klaus Schwab, founder and executive chairman of the World Economic Forum, it will see artificial intelligence, digital technology, and advancements in automation supplant vast swaths of the human workforce across many industries.
The Fourth Industrial Revolution is already underway. Image: Shutterstock
Can we future-proof our careers and livelihoods for this enormous change? Yes, and organizations like Kenzie Academy are moving quickly to help workers develop the skills that will remain firmly in demand in the Fourth Industrial Revolution.
Don't go the way of the lamplighter
Lamplighters went extinct because electric lines and power grids made their jobs obsolete. Switchboard operators suffered a similar fate. As noted by the World Economic Forum in The Future of Jobs Report 2018: "There are complex feedback loops between new technology, jobs and skills. New technologies can drive business growth, job creation and demand for specialist skills but they can also displace entire roles when certain tasks become obsolete or automated."
According to that report, 75 million current jobs are potentially on the line in the upcoming revolution. Unsurprisingly, manufacturing is forecast to continue hemorrhaging jobs. Despite greater overall output, the U.S. has lost about 7.5 million manufacturing jobs since 1980. Many blame global trade and shifts in competition for the losses. While those have certainly been catalytic, so have automation and other technological advances.
Other industries that could automate a substantial portion of their workforces include agriculture, food services, transportation, and other forms of manual labor.
At first blush, this places the report in line with folk knowledge that sees a lack of high-level education as the common denominator for occupations in decline. However, the World Economic Forum also predicts that occupations such as paralegals, accountants, administration managers, executive secretaries, and data entry clerks will contract.
That's because the common denominator isn't education; it's job-ready skills.
Precision and manual labor can be performed better, and more safely, by a machine. Similarly, as artificial intelligence advances, digital technology will be able to outperform people in speed and accuracy when it comes to many mental labors. To name a few: memory, mathematics, data collection, time management, and pattern recognition. And the more repetitive an occupation's core functions, the higher the risk it can be automated or computerized.
Hard skills, meet soft skills
The World Economic Forum has defined a new set of skills (left) most required for the jobs of the future. Importantly, they're a mix of hard and soft skills. On the right are the 10 skills that are becoming less important.
Source: Future of Jobs Report 2018, World Economic Forum
So, is the future job market some judgment-day scenario where technology and artificial intelligence take all the jobs to render humans obsolete? Hardly. The bleak picture above is only half the prognosis. The World Economic Forum's report also foresees 133 million new jobs emerging by 2022 to offset the losses.
The catch? Those jobs require tech skills that many working-age people aren't currently trained for.
Schools like Kenzie Academy understand that in-demand soft skills including creativity, innovation, active learning, critical thinking, emotional intelligence, and problem-solving—that is, "human skills"—are not easily duplicated by an app. That is why they aim to teach hard skills like technical design and programming alongside the ability to work with a team, problem-solving, and even interpersonal skills like interviewing and networking.
Millions of new jobs will emerge in the technology sector: data analysts, machine-learning specialists, software and application developers, and new-technology specialists. Kenzie is taking the lead in making people job-ready.
The fastest-growing occupation in America
Photo: Kenzie Academy
Software developers are already enjoying the windfall of the Fourth Industrial Revolution. The Bureau of Labor Statistics projects software development to be among the United States' fastest-growing occupations from 2018–28, increasing at the "much faster than average" rate of 21 percent. In 2018, that demand translated to a median wage of $105,590 per year (or $50.77 per hour).
Kenzie Academy, a campus-based and online software and UX engineering school, focuses its educational model on software development and UX design to prepare its students for that future. Co-founder and CEO Chok Ooi explains the school's philosophy: "Students learn by building projects and solving problems daily under the guidance of industry practitioners. We teach technical skills along with workplace skills like problem-solving, critical thinking, and team collaboration which are equally important for students to master."
Notice the overlap of both hard and soft skills that match the World Economic Forum's analysis. Kenzie teaches students the technical skills and the soft, human skills that are not reproducible in the digital space. Both are essential to the 21st-century marketplace and thriving in a world community bound by shared, interconnected technology.
"It's not just a skill; it's a new language that controls a majority of our world and knowing it will give you opportunities to work in new fields and be ready for the future of work. It is a language that transcends borders and can allow people to work with organizations around the world," says Steven Miller, team member at Kenzie Academy.
Speeding up adaptation
The solution seems easy enough: adaptation. If the skillsets of the current workforce are no longer marketable, we need to develop ways to build new ones or upskill old ones. If only it were so simple. Unfortunately, many social and economic barriers stand between large portions of the population and the education and networks necessary for entry into these occupations.
"Our current education system adapts to change too slowly and operates too ineffectively for this new world," Stephane Kasriel, former CEO of Upwork, writes in an article for the World Economic Forum.
Kasriel argues that our education system must be overhauled to meet the future's challenges. It should be a lifelong pursuit, one accessible to citizens regardless of social and economic status. It should also be rewired to equip people with the "meta-skills" machines aren't good at yet, like entrepreneurship, teamwork, and curiosity—not designed toward rote memorization of facts on a test.
He adds: "Skills, not college pedigree, will be what matters for the future workforce—so while we should make sure college is affordable, we should also make sure higher education is still worth the cost, or revisit it entirely and leverage more progressive approaches to skills training. Skills-focused vocational programmes, as well as other ways to climb the skill ladder (such as apprenticeships), should be widely accessible and affordable."
Rethinking student debt
Another barrier is financial. Few people can afford to pay for a bachelor's degree outright, and those who can't often take on immense debt to try. This leads to an untenable pattern where the debt, not the learning, becomes the lifelong pursuit.
Kenzie Academy's solution is a unique income share agreement under which students don't begin repaying their tuition until they earn at least $40,000 a year. Once they do, they repay 13 percent of their income for up to four years. The school also secured $100 million in financing to help further reduce the financial burden.
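To make the shape of that repayment model concrete, here is a minimal sketch in Python. The specific mechanics (whether the 13 percent applies to gross income, how the $40,000 threshold is assessed, any payment caps) are simplifying assumptions for illustration, not the terms of Kenzie's actual contract.

```python
def monthly_isa_payment(annual_income: float,
                        income_share: float = 0.13,
                        baseline: float = 40_000) -> float:
    """Monthly payment under a simplified income share agreement (ISA).

    Assumption: nothing is owed while annual income sits below the
    baseline; above it, a fixed share of gross income is repaid.
    """
    if annual_income < baseline:
        return 0.0
    # Round to cents for a clean dollar figure.
    return round(annual_income * income_share / 12, 2)


# A graduate earning $60,000 owes 13% of income, spread monthly;
# one earning $35,000 owes nothing until their income rises.
print(monthly_isa_payment(60_000))  # → 650.0
print(monthly_isa_payment(35_000))  # → 0.0
```

The upshot of the design: repayment risk shifts from the student to the school, which only gets paid when its graduates do.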
"There are millions of Americans who are barred from high-quality post-secondary education because of where they live and their financial situation. And many who are 'lucky enough' to go to college find themselves buried in debt and without a job," said Ooi in a release announcing the funding. "This $100 million will level the playing field, enabling deserving individuals, regardless of their background, to access high-quality training that leads to a high paying job in tech for only $100 upfront."
Is the future secure?
The jobs landscape in 2022. Source: Future of Jobs Report 2018, World Economic Forum
Will software development and other emerging jobs one day go the way of log drivers and lamplighters? Will Silicon Valley become tomorrow's Rust Belt? While possible, that future is incredibly unlikely or, at the very least, far off.
In a 2013 study out of Oxford University, researchers used a Gaussian process classifier to estimate the probability that occupations could be computerized. The researchers assigned a probability to each of 702 jobs. The probability that software development would be computerized was 4.2 percent. The top 10 emerging job roles listed by the World Economic Forum in its Future of Jobs Report 2018 carried similarly low probabilities. (For the record, the researchers found that occupations such as telemarketers, insurance underwriters, and mathematical technicians all faced a 99 percent probability of computerization.)
Because the fields are so closely related, artificial intelligence and programming jobs are certainly interconnected. Even so, the trend today is for A.I.-powered tools to take on programming's busywork, leaving the programmer time to solve novel and complex problems in creative ways.
Of course, no one can divine the future. Some paradigm shift may one day produce an app that's better at being human than, well, humans. Until then, the future of work looks to value the very skills that make us human—and some technical know-how too.
Ready to learn the skills needed for the future of work? Click here to learn more: Kenzie.Academy
These Roman Emperors were infamous for their debauchery and cruelty.
- Roman Emperors were known for their excesses and violent behavior.
- From Caligula to Elagabalus, the emperors exercised total power in the service of their often-strange desires.
- Most of these emperors met violent ends themselves.
We rightfully complain about many of our politicians and leaders today, but historically speaking, humanity has seen much worse. Arguably no set of rulers has been as debauched, ingenious in their cruelty, and prone to excess as the Roman Emperors.
While this list is certainly not exhaustive, here are seven Roman rulers who were perhaps the worst of the worst in what was one of the largest empires that ever existed, lasting for over a thousand years.
Officially known as Gaius (Gaius Caesar Augustus Germanicus), Caligula was the third Roman Emperor, ruling from 37 to 41 AD. He acquired the nickname "Caligula" (meaning "little [soldier's] boot") from his father's soldiers during a campaign.
While recognized for some positive measures in the early days of his rule, he became famous throughout the ages as an absolutely insane emperor, who killed anyone when it pleased him, spent exorbitantly, was obsessed with perverse sex, and proclaimed himself to be a living god.
Caligula gives his horse Incitatus a drink during a banquet. Credit: An engraving by Persichini from a drawing by Pinelli, from "The History of the Roman Emperors" from Augustus to Constantine, by Jean Baptiste Louis Crevier. 1836.
Among his litany of misdeeds, according to the accounts of Caligula's contemporaries Philo of Alexandria and Seneca the Younger, he slept with whomever he wanted, brazenly taking other men's wives (even on their wedding nights) and publicly talking about it.
He also had an insatiable thirst for blood, killing for mere amusement. Once, as the historian Suetonius reports, when the bridge across the sea at Puteoli was being blessed, he had a number of spectators who were there to inspect it thrown off into the water. When some tried to cling to the ships' rudders, Caligula had them dislodged with hooks and oars so they would drown. On another occasion, he got so bored that he had his guards throw a whole section of the audience into the arena during the intermission so they would be eaten by wild beasts. He also allegedly executed two consuls who forgot his birthday.
Suetonius relayed further atrocities of the mad emperor's character, writing that Caligula "frequently had trials by torture held in his presence while he was eating or otherwise enjoying himself; and kept an expert headsman in readiness to decapitate the prisoners brought in from gaol." One particular form of torture associated with Caligula involved having people sawed in half.
He caused mass starvation and purposefully wasted money and resources, like making his troops stage fake battles just for theater. If that wasn't enough, he turned his palace into a brothel and was accused of incest with his sisters, Agrippina the Younger, Drusilla, and Livilla, whom he also prostituted to other men. Perhaps most famously, he was planning to appoint his favorite horse Incitatus a consul and went as far as making the horse into a priest.
In early 41 AD, Caligula was assassinated by a conspiracy of Praetorian Guard officers, senators, and other members of the court.
Fully named Nero Claudius Caesar, Nero ruled from 54 to 68 AD and was arguably an even worse madman than his uncle Caligula. He had his step-brother Britannicus killed, his wife Octavia executed, and his mother Agrippina stabbed and murdered. He personally kicked his lover Poppaea to death while she was pregnant with his child — a horrific action the Roman historian Tacitus depicted as "a casual outburst of rage."
He spent exorbitantly and built a 100-foot-tall bronze statue of himself called the Colossus Neronis.
He is also remembered for being strangely obsessed with music. He sang and played the lyre, although it's not likely he really fiddled as Rome burned in what is a popular myth about this crazed tyrant. As misplaced retribution for the fire which burned down a sizable portion of Rome in the year 64, he executed scores of early Christians, some of them outfitted in animal skins and brutalized by dogs, with others burned at the stake.
He died by suicide.
Roman Emperor Nero in the burning ruins of Rome. July 64 AD. Credit: From an original painting by S.J. Ferris. (Photo by Kean Collection / Getty Images)
Like some of his counterparts, Commodus (a.k.a. Lucius Aelius Aurelius Commodus) thought he was a god — in his case, a reincarnation of the Greek demigod Hercules. Ruling from 176 to 192 AD, he was also known for his debauched ways and strange stunts that seemed designed to affirm his divine status. Numerous statues around the empire showed him as Hercules, a warrior who fought both men and beasts. He fought hundreds of exotic animals in an arena like a gladiator, confusing and terrifying his subjects. Once, he killed 100 lions in a single day.
Emperor Commodus (Joaquin Phoenix) questions the loyalty of his sister Lucilla (Connie Nielsen) In Dreamworks Pictures' and Universal Pictures' Oscar-winning drama "Gladiator," directed by Ridley Scott. Credit: Photo By Getty Images
His burning desire to kill living creatures as a gladiator during the New Year's Day celebrations in 193 AD brought about his demise. After Commodus shot hundreds of animals with arrows and javelins every morning as part of the Plebeian Games leading up to New Year's, his fitness coach (aptly named Narcissus) choked the emperor to death in his bath.
Officially named Marcus Aurelius Antoninus II, Elagabalus's nickname comes from his priesthood in the cult of the Syrian god Elagabal. Ruling as emperor from 218 to 222 AD, he was so devoted to the cult, which he tried to spread in Rome, that he had himself circumcised to prove his dedication. He further offended the religious sensitivities of his compatriots by essentially replacing the main Roman god Jupiter with Elagabal as the chief deity. In another nod to his convictions, he installed on Palatine Hill a cone-like fetish made of black stone as a symbol of the Syrian sun god Sol Invictus Elagabalus.
His sexual proclivities were also not well received at the time. He was likely transgender (wearing makeup and wigs), had five marriages, and was quite open about his male lovers. According to the Roman historian (and the emperor's contemporary) Cassius Dio, Elagabalus prostituted himself in brothels and taverns and was one of the first historical figures on record to seek sex reassignment surgery.
He was eventually murdered in 222 in an assassination plot engineered by his own grandmother Julia Maesa.
Emperor for just eight months, from April 19th to December 20th of the year 69 AD, Vitellius made some key administrative contributions to the empire but is ultimately remembered as a cruel glutton. He was described by Suetonius as overly fond of eating and drinking, to the point where he would eat at banquets four times a day while sending out the Roman navy to get him rare foods. He also had little social grace, inviting himself over to the houses of different noblemen to eat at their banquets, too.
Vitellius dragged through the streets of Rome. Credit: Georges Rochegrosse. 1883.
He was also quite vicious and reportedly either had his own mother starved to death or approved a poison with which she committed suicide.
Vitellius was ultimately murdered in brutal fashion by supporters of the rival emperor Vespasian, who dragged him through Rome's streets, then likely beheaded him and threw his body into the Tiber river. "Yet I was once your emperor," were supposedly his last words, wrote historian Cassius Dio.
Marcus Aurelius Antoninus I ruled Rome from 211 to 217 AD on his own (having previously co-ruled with his father Septimius Severus from 198). "Caracalla" was his nickname, referencing a hooded coat from Gaul that he brought into Roman fashion.
He started off his rise to individual power by murdering his younger brother Geta, who was named co-heir by their father. Caracalla's bloodthirsty tyranny didn't stop there. He wiped out Geta's supporters and was known to execute any opponents to his or Roman rule. For instance, he slaughtered up to 20,000 citizens of Alexandria after a local theatrical satire dared to mock him.
Geta Dying in His Mother's Arms. Credit: Jacques Pajou (1766-1828)
One of the positive outcomes of his rule was the Edict of Caracalla, which gave Roman citizenship to all free men in the empire. He was also known for building gigantic baths.
Like others on this list, Caracalla met a brutal end, being assassinated by army officers, including the Praetorian prefect Opellius Macrinus, who installed himself as the next emperor.
As the second emperor, Tiberius (ruling from 14 to 37 AD) is known for a number of accomplishments, especially his military exploits. He was one of the Roman Empire's most successful generals, conquering Pannonia, Dalmatia, Raetia, and parts of Germania.
He was also remembered by his contemporaries as a rather sullen, perverse, and angry man. In the chapter on his life from The Lives of the Twelve Caesars, the historian Suetonius writes that Tiberius was disliked from an early age for his personality, even by his own family, and records the jibe that he was "an abortion of a man, that had been only begun, but never finished, by nature."
"Orgy of the Times of Tiberius on Capri". Painting by Henryk Siemiradzki. 1881.
Suetonius also paints a damning picture of Tiberius after he retreated from public life to the island of Capri. His years on the island would put Jeffrey Epstein to shame. A horrendous pedophile, Tiberius had a reputation for "depravities that one can hardly bear to tell or be told, let alone believe," Suetonius wrote, describing how "in Capri's woods and groves he arranged a number of nooks of venery where boys and girls got up as Pans and nymphs solicited outside bowers and grottoes: people openly called this 'the old goat's garden,' punning on the island's name."
There's much, much more — far too salacious and, frankly, disgusting to repeat here.
After he died, Tiberius was fittingly succeeded in emperorship by his grandnephew and adopted grandson Caligula.
New studies stretch the boundaries of physics, achieving quantum entanglement in larger systems.
- New experiments with vibrating drums push the boundaries of quantum mechanics.
- Two teams of physicists create quantum entanglement in larger systems.
- Critics question whether the study gets around the famous Heisenberg uncertainty principle.
Recently published research pushes the boundaries of key concepts in quantum mechanics. Studies from two different teams used tiny drums to show that quantum entanglement, an effect generally linked to subatomic particles, can also be applied to much larger macroscopic systems. One of the teams also claims to have found a way to evade the Heisenberg uncertainty principle.
One question that the scientists were hoping to answer pertained to whether larger systems can exhibit quantum entanglement in the same way as microscopic ones. Quantum mechanics proposes that two objects can become "entangled," whereby the properties of one object, such as position or velocity, can become connected to those of the other.
An experiment performed at the U.S. National Institute of Standards and Technology in Boulder, Colorado, led by physicist Shlomi Kotler and his colleagues, showed that a pair of vibrating aluminum membranes, each about 10 micrometers long, can be made to vibrate in sync, in such a way that they can be described as quantum entangled. Kotler's team amplified the signal from their devices to "see" the entanglement much more clearly. Repeated measurements of the membranes' positions and velocities returned correlated values, indicating that they were indeed entangled.
Tiny aluminum membranes used by Kotler's team. Credit: Florent Lecoq and Shlomi Kotler/NIST
Evading the Heisenberg uncertainty principle?
Another experiment with quantum drums — each one-fifth the width of a human hair — by a team led by Prof. Mika Sillanpää at Aalto University in Finland, attempted to find what happens in the area between quantum and non-quantum behavior. Like the other researchers, they also achieved quantum entanglement for larger objects, but they also made a fascinating inquiry into getting around the Heisenberg uncertainty principle.
The team's theoretical model was developed by Dr. Matt Woolley of the University of New South Wales. Photons at microwave frequencies were employed to create a synchronized vibrating pattern as well as to gauge the positions of the drums. The scientists managed to make the drums vibrate in opposite phases to each other, achieving "collective quantum motion."
The study's lead author, Dr. Laure Mercier de Lepinay, said: "In this situation, the quantum uncertainty of the drums' motion is canceled if the two drums are treated as one quantum-mechanical entity."
This effect allowed the team to measure both the positions and the momentum of the virtual drumheads at the same time. "One of the drums responds to all the forces of the other drum in the opposing way, kind of with a negative mass," Sillanpää explained.
Theoretically, this should not be possible under the Heisenberg uncertainty principle, one of the most well-known tenets of quantum mechanics. Proposed in the 1920s by Werner Heisenberg, the principle generally says that when dealing with the quantum world, where particles also act like waves, there's an inherent uncertainty in measuring both the position and the momentum of a particle at the same time. The more precisely you measure one variable, the more uncertainty in the measurement of the other. In other words, it is not possible to simultaneously pinpoint the exact values of the particle's position and momentum.
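For readers who want the principle in symbols, its standard textbook form bounds the product of the uncertainties in position and momentum (this formula is added here for reference; it does not appear in the original studies discussed above):

```latex
\Delta x \, \Delta p \geq \frac{\hbar}{2}
```

Here \(\Delta x\) and \(\Delta p\) are the uncertainties in position and momentum, and \(\hbar\) is the reduced Planck constant. Squeezing \(\Delta x\) toward zero forces \(\Delta p\) to grow, and vice versa, which is why simultaneously pinpointing both values is impossible for a single quantum object.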
Heisenberg's Uncertainty Principle Explained. Credit: Veritasium / Youtube.com
Big Think contributor astrophysicist Adam Frank, known for the 13.8 podcast, called this "a really fascinating paper as it shows that it's possible to make larger entangled systems which behave like a single quantum object. But because we're looking at a single quantum object, the measurement doesn't really seem to me to be 'getting around' the uncertainty principle, as we know that in entangled systems an observation of one part constrains the behavior of other parts."
Ethan Siegel, also an astrophysicist, commented, "The main achievement of this latest work is that they have created a macroscopic system where two components are successfully quantum mechanically entangled across large length scales and with large masses. But there is no fundamental evasion of the Heisenberg uncertainty principle here; each individual component is exactly as uncertain as the rules of quantum physics predicts. While it's important to explore the relationship between quantum entanglement and the different components of the systems, including what happens when you treat both components together as a single system, nothing that's been demonstrated in this research negates Heisenberg's most important contribution to physics."

The papers, published in the journal Science, could help create new generations of ultra-sensitive measuring devices and quantum computers.
"The question is which are okay, which are not okay."
- As the material that makes all living things what/who we are, DNA is the key to understanding and changing the world. British geneticist Bryan Sykes and Francis Collins (director of the Human Genome Project) explain how, through gene editing, scientists can better treat illnesses, eradicate diseases, and revolutionize personalized medicine.
- But existing and developing gene editing technologies are not without controversies. A major point of debate deals with the idea that gene editing oversteps natural and ethical boundaries. Just because they can, does that mean that scientists should edit DNA?
- Harvard professor Glenn Cohen introduces another subcategory of gene experiments: mixing human and animal DNA. "The question is which are okay, which are not okay, why can we generate some principles," Cohen says of human-animal chimeras and arguments concerning improving human life versus morality.