The revolution is already happening.
The following article was originally published by our sister site, Big Think Edge.
Business leaders know they must prepare for technological upheavals in the years ahead. But keeping up to date on new technologies—to say nothing of understanding their complexities and forecasting those shifts—is an overwhelming task.
To help organizations find their footing, the CompTIA Emerging Technology Community releases an annual list of the top 10 emerging technologies. What makes this list special is that it focuses on "which emerging technologies have the most potential for near-term business impact."
Here are CompTIA's picks along with a quick encapsulation of each technology and some potential business use cases.
Artificial Intelligence

The holy grail of artificial intelligence research is general AI, a machine that is self-aware and commands intelligence equal to a person's. These theoretical systems would be our intellectual equals—well, until v2.0 drops and we fall to a distant second.
Until then we have narrow AI: systems that perform very specific tasks. That may seem too limited, but narrow AI already powers systems like spam filters, Google Maps, and virtual assistants such as Siri. And its use cases are projected to diversify even more.
As Max Tegmark, physicist and machine-learning researcher, told Big Think in an interview: "What we're seeing now is that machine intelligence is spreading out a little bit from those narrow peaks and getting a bit broader."
Chatbots, logistics, self-driving cars, virtual nursing assistants, personalized textbooks and tutors, and even artificial creativity: These are just a few of the applications that narrow AI can improve or bring to light in the coming years.
5G and the Internet of Things
5G may not seem very exciting. We already have 4G, so what's another G? But the difference will be dramatic. 5G networks may ultimately be 100 times faster than 4G, allowing many more devices to connect, reducing latency to practically zero, and providing more reliable signals.
This wireless technology will provide the backbone for the internet of things (IoT), which will expand the power of the internet beyond computers and across a wide range of objects, processes, and environments. The IoT is the keystone technology for such futuristic scenes as smart cities, robot-driven agriculture, and self-driving highway systems.
For businesses, this one-two combo will continue recent trends and power them to the next level. Remote offices become more dependable under the 5G paradigm, and real-time data sharing of, say, live events or desktop captures will be seamless. As for the IoT, it helps remove intermediate steps that bog down productivity. Why have someone waste their time collecting data from the factory floor when the factory floor can collect, curate, and send it to them?
Serverless Computing

Serverless computing isn't truly "serverless." Sans tapping into some seriously dark arts, it's impossible to provide computational resources without a physical server somewhere. Instead, this technology distributes those resources more effectively. When an application is not in use, no resources are allocated. When they are needed, the computing power auto-scales.
This technological shift means companies no longer need to worry over infrastructure or reserving bandwidth, which in turn promises the golden ticket of ease of use and cost savings.
As Eric Knorr, editor in chief of International Data Group Enterprise, writes: "One of the beauties of this architecture is that you get charged by the cloud provider only when a service runs. You don't need to pay for idle capacity—or even think about capacity. Basically, the runtime sits idle waiting for an event to occur, whereupon the appropriate function gets swapped into the runtime and executes. So you can build out a big, complex application without incurring charges for anything until execution occurs."
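To make that concrete, here is a minimal sketch of what a pay-per-execution function can look like, written in Python against the AWS Lambda handler convention. The event fields and business logic are hypothetical placeholders; the point is that nothing runs, and nothing is billed, until an event arrives.

```python
# A minimal function-as-a-service sketch: the cloud provider keeps this
# function idle (and unbilled) until an event occurs, then swaps it into
# a runtime, executes it, and bills only for that execution.
# The handler signature follows the AWS Lambda convention; the "order_id"
# field is a hypothetical example payload, not any real API's schema.

import json

def handler(event, context):
    """Runs only when triggered; no server is provisioned in between."""
    order_id = event.get("order_id", "unknown")
    # ... business logic would go here ...
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": order_id}),
    }

# Local smoke test; in production the platform invokes the handler.
print(handler({"order_id": "A-123"}, None))
```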
Biometrics

Biometrics allows a system to recognize users by biological markers such as their face, voice, or fingerprint. Many people already have one or several of these on their laptops and smartphones, but as the technology improves and becomes more ubiquitous, it may finally end the password paradigm.
Because most people have weak passwords, use the same one for every account, and never change them, hackers typically need only one hit to enjoy carte blanche over someone's personal and professional data. Even those who do passwords correctly can find managing the system a nightmare.
For these reasons, biometrics promises much-needed security for sensitive data. A fingerprint is much more difficult to crack with raw computational power than a password, and the difficulty grows by orders of magnitude when multiple markers are used in tandem.
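A back-of-the-envelope sketch in Python shows why layering markers helps. The false-acceptance rates below are invented placeholders rather than vendor specifications, and the arithmetic assumes the factors fail independently, which is optimistic; even so, the multiplicative effect is the point.

```python
# Illustration: if each independent biometric factor admits an impostor
# with some small probability, combining factors multiplies those
# probabilities. Rates are made-up examples, not measured values, and
# real-world factors are rarely fully independent.

false_accept = {
    "fingerprint": 1e-5,  # hypothetical false-acceptance rate
    "face": 1e-4,
    "voice": 1e-3,
}

combined = 1.0
for marker, rate in false_accept.items():
    combined *= rate
    print(f"after adding {marker}: 1 in {1 / combined:,.0f}")
```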
Augmented and Virtual Reality

With hardware costs lowering, processing power increasing, and high-profile players such as Google and Facebook entering the game, virtual reality's day may have finally come. And the more widespread acceptance of augmented reality apps in smartphones may make such technologies an easier sell moving forward.
The recently announced Microsoft Mesh and its competitors hope to capitalize on our new remote-work era. The concept combines these "mixed-reality" technologies to create virtual shared spaces that business teams can use to hold meetings or work on projects.
And Peter Diamandis, chairman and CEO of the XPRIZE Foundation, imagines this technology can revolutionize the customer experience in retail. Customers could, for example, try clothes on a virtual avatar or sit in their amphitheater seats before making a purchase.
Blockchain

It may be surprising that Bitcoin, the much-hyped cryptocurrency, didn't make the list. But the technology's online ledger, the blockchain, has supplanted the digital denomination as the rising business star.
Unlike traditional, centralized records, a blockchain is decentralized. The permanent record is not stored in one location but exists on nodes spread across the system. This design makes it difficult to lose records or tamper with them.
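Here is a minimal Python sketch of that tamper-evidence, assuming nothing beyond a standard hash function: each block stores the previous block's hash, so editing an old record invalidates every link that follows. Real blockchains add consensus protocols, signatures, and replication across nodes, all of which this toy omits.

```python
# Toy hash chain: each block commits to the previous block's hash,
# making retroactive edits detectable. Not a real blockchain protocol.

import hashlib
import json

def make_block(record: dict, prev_hash: str) -> dict:
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block({"patient": "A", "note": "intake"}, prev_hash="0" * 64)]
chain.append(make_block({"patient": "A", "note": "lab results"}, chain[-1]["hash"]))

# Tampering with block 0 breaks the link stored in block 1:
chain[0]["record"]["note"] = "falsified"
recomputed = make_block(chain[0]["record"], chain[0]["prev_hash"])["hash"]
print(recomputed == chain[1]["prev_hash"])  # False: the edit is detectable
```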
As tech entrepreneur Elad Gil told Big Think in an interview: "[Blockchain] systems are effectively censorship proof or seizure resistant. In other words, the government can't come and take your asset if you're in a country that has very bad governance, or it means that no third party can suddenly, accidentally erase your data, or you can't hack a third party to access your data (although obviously, you can still hack a blockchain)."
This is why blockchain has caught the attention of organizations that need to store records (i.e., all organizations). And the potential use cases are impressive. Blockchain could be used by hospitals to store and share health records. It could underpin a secure online voting platform. It could track logistics across international supply chains. And, of course, there are numerous applications for cybersecurity, too.
Robotics

The first industrial robot punched the clock in 1962. Technological advancements have steadily widened robotics' workforce representation since, and in the coming years, robots will continue moving from factories to Main Street to perform rudimentary tasks such as cleaning and delivery.
Such advancements have kept the Luddite fires burning for more than a century now, so one challenge faced by organization leaders will be reassuring their teams that the robots aren't here to replace them. In fact, as more people move into soft-skilled, human-focused jobs, they'll likely find the transition a beneficial one.
"Introducing robots into a workplace can be a complex and dynamic undertaking. While it may start with workers feeling like their jobs are being threatened, the end result is a warehouse full of happier, healthier humans who remain the centerpiece of a competitive business," writes Melonee Wise, CEO of Fetch Robotics, for the World Economic Forum.
Natural Language Processing
Natural language processing is a subfield of AI that aims to develop systems that can analyze and communicate through human language. Sound easy? If so, it's only because you're reading these words with a mind endowed by evolution with the gift of language.
Algorithms aren't so lucky. They have trouble parsing the eclectic hodgepodge of symbols, gestures, sounds, and cultural cues that we use to express meaning and ideas.
"There's an obvious problem with applying deep learning to language. It's that words are arbitrary symbols, and as such they are fundamentally different from imagery. Two words can be similar in meaning while containing completely different letters, for instance; and the same word can mean various things in different contexts," writes Will Knight for MIT Technology Review.
When algorithms finally crack language, the business use cases will be substantial. Think chatbots, virtual editors, market analysis, instant translation of live conversations, resume readers, and phone auto-attendants that don't send every caller into a rage.
Quantum Computing

Quantum computing is "the exploitation of collective properties of quantum states, such as superposition and entanglement, to perform computation." Translation: it can solve certain problems dramatically faster than classical machines—in some cases, problems that stump even modern supercomputers.
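For a feel of what superposition means computationally, here is a minimal state-vector sketch in Python. It is ordinary linear algebra simulating a single qubit, the same transformation quantum hardware performs natively; entanglement extends this algebra to joint states of multiple qubits.

```python
# Applying a Hadamard gate to a qubit in state |0> yields an equal
# superposition of |0> and |1>: a 50/50 measurement outcome.

import numpy as np

ket0 = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

superposed = H @ ket0
print(superposed)               # [0.707 0.707]
print(np.abs(superposed) ** 2)  # [0.5 0.5] measurement probabilities
```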
While we shouldn't expect the quantum PC any time soon, we can expect quantum computers to become the backbone for the emerging technologies listed above. These machines already exist today, and IBM has announced plans to build a 1,000-qubit version by 2023, a milestone physicist Jay Gambetta told Science would reflect an "inflection point."
Adoption of this technology could make big data more manageable. It could cut costly and complex development time through speedy simulations and solve multivariable optimization problems with ease. Finally, it may make currently intractable problems manageable, such as those faced in the processing of natural language.
Quantum computing also illustrates why it's important that organizational leaders don't develop tunnel vision. To focus on one emerging technology or one model of the future is to risk your company's well-being. It's not a question of which technology will dominate, but the potentials each technology brings and how they may work together.
"The innovation that will be delivered by these technologies, especially as I said, when they're leveraged in tandem, will be staggering over the next few years and will enable customer solutions that will actually have paradigm shifting impact for those that act on them," Mike Haines, chair of the Emerging Technology Community's executive council, said on the CompTIA Biz Tech podcast.
Navigating these technological shifts will certainly challenge business leaders for years to come. But by keeping an open mind to the possibilities, they can chart a path that anticipates the dangers and capitalizes on these emerging technologies.
How our fantasy world of the past has become everyday reality.
This article was originally published on our sister site, Freethink. Freethink has partnered with the Build for Tomorrow podcast, to go inside new episodes each month. Subscribe here to learn more about the crazy, curious things from history that shaped us, and how we can shape the future.
Jason Feifer, Entrepreneur Magazine Editor-in-Chief and host of the Build for Tomorrow podcast, has a fun hobby: he combs through newspaper archives to discover how people who lived 100 years ago envisioned life in the 21st century.
One pipe dream of yore: climate-controlled housing.
"When heating is done all electrically, and I want 70 degrees in my home, I shall set the thermostat at 70 and the temperature will not rise above that point. This temperature will be maintained uniformly regardless of the weather outside," predicted Charles Steinmetz from 1921, an electrical engineer in Schenectady (who apparently went by the enviable nickname "Forger of Thunderbolts").
And that's only looking at life 100 years ago, a blip in the full scope of human history. When you rewind further, Feifer explains, you realize just how much amazing stuff populates our everyday life.
"We live in a fantasy world, and we barely pause to appreciate it," Feifer says.
Here are 5 ways that we, mere commoners of today, live better than the royalty of the past.
1. Royal Purple
You may already know that purple was a royal color — but do you know why?
For starters, purple was exceptionally expensive because of what it took to make it: it had to be distilled from the dehydrated mucous glands that lie just behind the rectum of a particular snail. Per the BBC:
"It took tens of thousands of desiccated hypobranchial glands, wrenched from the calcified coils of spiny murex sea snails before being dried and boiled, to colour even a single small swatch of fabric, whose fibres, long after staining, retained the stench of the invertebrate's marine excretions."
Adding insult to olfactory injury, these snails had to be imported to Europe from Lebanon (the name "Tyrian purple" refers to Tyre, Lebanon).
Then, of course, there was the law — it was illegal for commoners to wear purple. Only royals could wear it.
"In the 11th and 12th Centuries, Europe begins to develop dense urban spaces as we'd recognize them today. And this creates a social problem," Feifer says.
"Suddenly you need a way of distinguishing who the different people are in the city ... A serf had to wear the clothing of a serf. Lords had to wear the clothing of a lord. Everyone is identifiable, so you're never able to pass as someone you're not."
2. Peace and quiet... and private sex
The next time you curl up in bed to read a book in private (wearing purple for good measure), be sure to remind yourself of your royal stature...at least compared to Europe in the Middle Ages.
"For most people in the middle ages, the concept of personal space just literally didn't exist," Feifer says.
"You worked and ate and lived squished up against other people. And at night, entire families would share a bed. Sometimes, strangers or travelers would hop in bed with them too, to keep warm. This wasn't weird for them. It's just... how it was."
In fact, Feifer explains, the privacy we experience during intimate moments was a luxury even royalty weren't afforded.
In societies where family lineage determines who rules the land, proving lineage is of the utmost importance. But since DNA tests were many centuries away, proving that the man and woman in question were the true parents required a...notary, of sorts.
"When you're royalty, sex is not just sex — it is an official act of extending the royal bloodline, which is a matter of the state," Feifer says. "So it must be...confirmed. This meant that royal sex required a witness."
3. Private bathrooms

Speaking of privacy, yet another way that we have it better than royals of yesterday is our bathrooms — thanks to modern plumbing, many of us can safely assume nobody can watch us do our business.
"If you're in the aristocracy, living in a castle, what we would call the toilet or the bathroom or the water closet, it's called the garter robe," Andrew Rabin, a professor of English at the University of Louisville, explains on the podcast.
"And basically what it is, is a hole ... that goes outside the castle, so that you would sit on this hole and do your business. And it would literally drip down the side of the castle."
Royals may have had to suffer the injustice of possibly being spotted doing their business high in their castles, but what about common folk? The premise is similar, but much more...down to earth. Fortunately, the engineers of their day figured out an important hack.
"In medieval cities, the second floor of a house would jut out onto the street. This was for two reasons — one, because it enabled them to build wider streets, which was useful because streets could be crowded places full of animals," Feifer says.
"But two, because that way, people could walk underneath these overhanging second floors."
If these people happened to walk too close to the edge of these overhangs, Rabin explains, they risked ending up with "a surprising new way of styling (their) hair."
But even when their hair stayed urine- and feces-free, they were still affronted by a panoply of scents from the street — a potent blend of human and animal excrement and body odors. Rabin says it's a misconception that people in the Middle Ages didn't bathe — they did bathe, it just didn't do all that much to mitigate the onslaught of noxious smells.
Royals were lucky to have access to perfume, but the same can't be said of all those in their employ. So take a deep breath and rest assured that what you smell is the stuff of dreams to medieval royalty.
4. Speaking your ruler's language

Literacy is one thing, but in England at the turn of the last millennium, a commoner wasn't even allowed to speak the same language as their rulers.
"Following the Norman conquest of 1066, when various groups from France invaded and occupied England, the ruling class in England spoke a language called Norman French," Feifer said.
"In fact, several generations of rulers would pass before any of them could speak the language of their people. You know Richard the Lionhart, aka Richard the First of England, who shows up in countless movies like King of Heaven, and is portrayed with an English accent? Nah. He didn't speak English."
The average person wasn't likely to have an occasion to speak to the King anyway, but the implications of this differentiation between royals and common people extended into legal affairs. This remained true even after Norman French was a distant memory.
"Even after the monarchy and the court had abandoned French and were speaking English, if you were a barrister, if you were a lawyer, you still had to learn how to speak this 'Law French,'" Rabin said.
So, in this sense, all you need to do to live like royalty today is speak the same language as your elected leader and be able to read a legal document (even if the text itself might feel more like Law French than English).
5. Cheap sugar

The final way to live like a royal is to indulge your sweet tooth.
"Sugar cane is a modern invention. Sugar beets: modern invention. Corn syrup: modern invention — and it takes a lot of factory processing to get that sweetness," says Kara Cooney, a professor of Egyptian Art and Architecture, and Chair of the Department of Near Eastern Languages and Cultures at UCLA, in the episode.
"Sugar in the ancient world was from fruit. If you had access to fruit ... and if you juiced it...you could get sugar. But it was a hard thing to get your hands on."
Like the color purple, sugar was rare because it was hard to procure and to make. Modern society has managed to flip the script.
"Back then, consuming sugar was a sign of status," Cooney says, "Now, thousands of years later, industrial sugar is one of the cheapest substances available. So the marker of status has flipped."
The next time you treat yourself to your favorite dessert, enjoy it as the royal delight it really is.
For more, be sure to check out the Build for Tomorrow episode here.
Say hello to your new colleague, the Workplace Environment Architect.
As some countries begin to pull out of pandemic-induced lockdown, and the corporate engines of "return to the office" begin to whir, an open question hangs: What kind of jobs will people return to following months of work-from-home exile in "Remotopia"?
Will the online "big-bang" of the 2020s (when everything that could go online did go online) accelerate digitally enabled jobs? And which jobs will top the post-pandemic jobs list, in the next, new future of work?
Over the past several years, the Cognizant Center for the Future of Work has published a series of reports on the Jobs of the Future that propose new roles which will emerge over the next decade and be central to businesses and employees everywhere. Because of the virus, time has compressed, resulting in a handful of these jobs of the future becoming 'jobs of the now'.
And the top jobs are...
The following is a top-ten summary of professions emerging in the wake of the pandemic.
1. Work from Home Facilitator – Prior to 2020, fewer than an estimated 5% of companies had remote-work policies. Now, with the full post-pandemic expectation that remote work remains the norm, companies want to apply lessons learned to optimize the work-from-home experience. Far from being a futuristic job of tomorrow, WFH facilitators have become undeniable "jobs of the now."
2. Fitness Commitment Counsellor – We cringe at the extra kilos, pounds and stones packed on during months of pandemic-induced lockdown. To remedy the situation, predictive and preventative approaches to counselling, paired with digital wearables like the Apple Watch and Fitbit dashboards, couple human accountability with maintaining fitness. And per the Cognizant Jobs of the Future (CJoF) Index, it's a role that grew 28.7% in Q1 '21.
3. Smart Home Design Manager – A lasting lesson of the virus for many will be that "everyone's home is their castle." Demand for smart home design managers will boom as homes are built – or retrofitted – with dedicated home office spaces, replete with routers in the right place, soundproofing, separate voice-driven entrances, and even Gorilla Glass wall screens.
4. XR Immersion Counsellor – As Zoom-intensive "Remotopia" inexorably gives way to 3D realms of virtual space, XR immersion counsellors will work with technical artists and with software engineering, training, and workforce-collaboration leads to massively scale the rollout of best-in-class AR and VR for learn-by-doing workforce training and collaboration (using platforms like Strivr) or apprenticeships (such as Mursion) to get employees productive – fast.
5. Workplace Environment Architect – Everything from health screenings to "elevator commutes" in post-pandemic office architecture is about to go through a major rethink. The importance of employee well-being, and how human-centered design of a company's real estate holdings can impact it, are now crucial to the future of work.
6. Algorithm Bias Auditor – "All online, all the time" lifestyles for work and leisure accelerated the competitive advantage that digital firms everywhere derive from algorithms. But from Brussels to Washington, statutory scrutiny of data is increasing, so it's a near certainty that audits of how algorithms are built will help ensure the future workforce is also a fair workforce.
7. Data Detective – Openings for data scientists remain the fastest-growing job in the tech-heavy "Algorithms, Automation and AI" family of the CJoF Index since its inception, and they continued to see 42% growth in Q1 '21. Given this high demand, data scientists are also scarce; that's where data detectives bridge the gap, helping companies investigate the mysteries in big data.
8. Cyber Calamity Forecaster – Aside from COVID-19, it's arguable that the other big catastrophe of 2020 was the continued onslaught of cyberattacks, from massive state-sponsored operations like SolarWinds down to individual bad actors promulgating ransomware exploits. The ability to forecast events like these is critical to forewarning of future attacks. The CJoF Index bears this out: openings for Cyber Calamity Forecasters grew 28% in Q1 '21.
9. Tidewater Architect – Climate change and sea-level rise will remain omnipresent global challenges. Tidewater architects will work with nature – not against it – in some of the biggest civil engineering projects of the 21st century. And per the CJoF Index, openings for these jobs grew 37% in Q1 '21.
10. Human-Machine Teaming Manager – Pandemic or no, the rise of robots in the workplace continues unabated. Human-machine teaming managers will operate at the intersection of people and robots to create seamless collaboration. Already, openings for forerunner roles like robotics technicians grew 50% in the Q1 '21 CJoF Index.
While it is impossible to predict exactly how global labour markets will rebound in the wake of the virus, leaders can and should use the future of work as a prism for their own organizations to plan ahead. If there's one lesson the pandemic has taught us, it's to anticipate change.
Leaders need to see how the future of work will play out in real time through leading indicators that reveal how the jobs market is adapting in the face of technology-based innovation and disruption. The CJoF Index uses real data on US job openings to see the imagined possibilities of jobs of the future starting to emerge.
By combining strategic planning resources like "21 Jobs of the Future" and the CJoF Index, it's possible to get a look into the not-too-distant future to see which roles are the top contenders in the post-COVID future.
2021 will be a reset moment, a period where more examples of the theoretical become "jobs-made-real". Before they can be built, however, jobs of the future have to be dreamed – and this requires vision and some imagination.
U.S. officials suspect a foreign adversary is targeting American personnel with some form of "directed-energy" weapon.
- In recent history, the first reports of a potential directed-energy attack on U.S. personnel came in 2016 from American diplomats working in Cuba.
- There's no "smoking gun" evidence of who's behind the attacks, but some U.S. officials suspect the Russians.
- Supporting that claim is the history of the so-called Moscow Signal, an event in which the Soviets blasted microwaves at the U.S. embassy in Moscow from 1953 to 1976.
Since 2016, more than 130 U.S. government personnel have suffered symptoms linked to Havana syndrome, an acute illness marked by sudden headache, nausea, and the hearing of loud sounds, akin to swarming cicadas. The cause of the illness remains a mystery. But a growing number of U.S. intelligence personnel and researchers fear that some form of "directed-energy" weapon — possibly firing microwave radiation — is to blame.
One of the first cases was reported in Havana, Cuba in 2017. The victim, if it indeed was an attack, was a U.S. Foreign Service officer who was living in a quiet Havana neighborhood among other American personnel. One night she was cleaning her kitchen. If it had been daytime, her kitchen window would have offered a view of a booth outside where Cuban police monitored foreigners like herself.
But at night, the kitchen's interior lights obstructed her view of the booth, The New Yorker reported. As she was cleaning, she suddenly felt a painful burst of pressure inside her head. The pain grew. She had heard rumors of U.S. personnel suffering strange "sonic attacks," and she remembered that a security officer had once advised: to protect yourself, step away from your current position. She did. The pain decreased. But for weeks she suffered headaches, dizziness, and confusion.
Over the past five years, at least 130 U.S. personnel have reported similar symptoms while working in places like China, Russia, and Washington, D.C. The cases vary in severity, but almost all involve sudden headaches and nausea. Some victims may have brain injuries.
A 2019 study published in JAMA found that victims had "significantly smaller" white matter volume and other "significant differences" in brain structure, though it's impossible to determine whether these differences were pre-existing or stem from a directed-energy attack.
What's causing Havana syndrome?
The U.S. hasn't reported a definitive cause of these cases, but intelligence agencies are actively investigating the possibility that bad actors are using some type of directed-energy weapon against U.S. personnel.
A December 2020 report from the National Academies of Sciences found that pulsed radiofrequency energy, which includes microwave radiation, "appears to be the most plausible mechanism in explaining these cases among those that the committee considered." (Other potential causes included infection and chemicals.)
A microwave weapon seems like a fitting culprit. One reason is that sufferers of Havana syndrome often hear loud noises, a phenomenon known to occur when people are bombarded with high-powered microwaves. In the 1960s, the American neuroscientist Allan H. Frey demonstrated that exposing people to microwaves can make them hear buzzing, clicking, hissing, and speech — even though the microwave device produced no sound waves. It was all, quite literally, in their heads.
(Video: "What Americans Heard in Cuba Attacks: The Sound," via YouTube)
How is that possible? Researchers have hypothesized that the noises are induced by thermoelastic expansion of bones and soft tissue in the body: As microwaves strike people, they slightly warm the body, which causes expansion. This expansion might produce sound waves that travel to the ear. Frey and other researchers have proposed different theories about which parts of the body are expanding — those in the head, or those in the ear — but the principle is the same.
To induce auditory effects, a pulsed-microwave weapon needs to transmit 40 joules per square centimeter, according to a U.S. Army report. How much energy is that? Here's how astrophysicist Dr. Ethan Siegel explained it to the American Council on Science and Health (ACSH): "If you are talking about 40 J/cm2 over the entire human body, that's about as much energy as a fully loaded Harley Davidson going 100 mph."
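As a rough sanity check on that comparison, the arithmetic below uses loudly stated assumptions: an adult body surface area of roughly 1.8 square meters and a loaded touring motorcycle plus rider of roughly 650 kilograms. Both figures are ballpark guesses for illustration, not values taken from the Army report or from Siegel.

```python
# Back-of-the-envelope check: 40 J/cm^2 over a whole body vs. the
# kinetic energy of a heavy motorcycle at 100 mph. Assumed inputs only.

body_area_cm2 = 18_000                 # ~1.8 m^2, assumed adult surface area
pulse_energy = 40 * body_area_cm2      # joules over the whole body
print(f"{pulse_energy:,} J")           # 720,000 J

mass_kg = 650                          # loaded bike + rider, assumed
speed_ms = 100 * 0.44704               # 100 mph in m/s
kinetic = 0.5 * mass_kg * speed_ms**2
print(f"{kinetic:,.0f} J")             # ~650,000 J: the same ballpark
```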
The Moscow Signal
We know such weapons exist, or at least did at one time. During the Cold War, the Soviet Union fired microwaves at the U.S. embassy in Moscow from a nearby apartment building for more than two decades, from 1953 to 1976. The event was dubbed the Moscow Signal.
U.S. intelligence officials initially thought the Soviets were firing the microwaves in an attempt to control the minds of American personnel, but they later reasoned that the Soviets were trying to activate espionage devices inside the building or interfere with the health of the diplomats. To this day, "many questions remain unanswered" about the long-term health effects incurred by Americans who worked at the U.S. embassy in Moscow, according to a 2019 review.
U.S. Embassy in Moscow, Russia. (Credit: Dzerod)
A more recent example of a directed-energy weapon is the active denial system, an American technology that uses non-lethal millimeter waves for crowd control. These waves, which the U.S. says are not classified as microwaves, cause a painful heating sensation on the skin. The U.S. is also developing or has developed stronger directed-energy weapons, including microwave weapons that can destroy electronic systems from a distance.
Still, if directed-energy weapons are indeed causing Havana syndrome, what they look like and how they operate remains a mystery.
Who's behind the directed-energy attacks?
There's currently no "smoking gun" evidence for who's responsible for the attacks. But in December 2020, the CIA established a task force to investigate the more than 130 reported cases of Havana syndrome among U.S. personnel. And in April 2021, President Joe Biden's administration released a statement:
"The White House is working closely with departments and agencies to address unexplained health incidents and ensure the safety and security of Americans serving around the world. Given that we are still evaluating reported incidents and that we need to protect the privacy of individuals reporting incidents, we cannot provide or confirm specific details at this time."
Although the U.S. hasn't officially announced suspects, an anonymous former national security official involved in investigations recently told Politico that Russia is likely behind the attacks. Specifically, the official pointed to Russia's foreign military intelligence agency, commonly called the GRU, whose operatives were present in the locations where American personnel have reported Havana syndrome.
"It looks, smells, and feels like the GRU," said the official. "When you are looking at the landscape, there are very few people who are willing, capable and have the technology. It's pretty simple forensics."
Why Russia would carry out these attacks remains unclear. But the cases have already had a measurable impact on U.S. foreign policy, namely a 50-percent personnel withdrawal from the American embassy in Cuba, a nation that's long been allied with Russia.
U.S. Embassy in Cuba. (Credit: U.S. State Department)
While U.S. intelligence agencies now seem to be taking these threats seriously, that wasn't always the case. In the first few years after Americans first reported Havana syndrome, some officials were skeptical of the idea that a foreign adversary would launch such brazen attacks, especially on U.S. soil. Some current and former officials say this skepticism has come at the expense of U.S. personnel.
Marc Polymeropoulos, a former CIA officer who was struck by Havana syndrome in a Moscow hotel room in 2017, told the New York Times about a painting created by a fellow CIA officer and Havana syndrome victim. Called "The Gunshot," the painting depicts a red splatter on a black background.
"It signified his feeling that we all wished we had been shot, a visible injury, so that our colleagues would more readily believe us."
A new agricultural revolution could forever change the planet.
- Vertical farming leverages cutting-edge technology to grow food in a new and better way.
- One of its many benefits is that it can increase crop yield by 700 percent.
- Vertical farming can help relieve pressure on scarce resources and boost Earth's biodiversity.
One day soon, you could eat bananas grown in downtown Manhattan.
That's the promise of vertical farming, a way of growing food that turns traditional agriculture on its head. With the required technologies now rapidly maturing, vertical farming is sprouting across the globe.
While there are still unresolved issues with this marriage of technology and agriculture, its promise may be irresistible. If it gets off the ground — literally — in a major way, it could solve the problem of feeding the Earth's 7.9 billion people. And that's just one of the benefits its proponents promise.
(Video: "Vertical farms could take over the world," Hard Reset by Freethink, via YouTube)
Agriculture through time
When humankind began planting crops for nutrition about 12,000 years ago, the nature of our hunter-gatherer species fundamentally shifted. For the first time, it's believed, people began staying put.
With agriculture as their central mission, communities formed, with the now-familiar arrangement of residential areas surrounded by land dedicated to growing food. Even today, with modern transportation making the widespread consumption of non-local foods common, this land-allocation model largely survives: population centers surrounded by large areas for growing vegetables and fruit and raising livestock.
Challenges facing traditional agriculture
Credit: Genetics4Good / Wikimedia
As our population has grown, traditional agriculture has begun facing some big challenges:
- Farmland takes up a lot of space and destroys biodiversity. Our World in Data reports that half of all habitable land is used for agriculture. As Nate Storey of Plenty, Inc., a vertical farming startup, puts it, "It is probably one of the most defining acts of humanity: We literally changed the ecosystem of the entire planet to meet our dietary needs."
- The demand for farmland — both for produce and livestock — has led to dangerous deforestation in several parts of the world. This also results in biodiversity loss and contributes to an increase in the greenhouse gases that drive climate change.
- Degradation of farmland, such as through soil erosion, poses a threat to agricultural productivity.
- Agriculture consumes copious amounts of water, which exacerbates water shortages. (Obviously, water shortages also reduce agricultural productivity.)
- Fertilizer run-off causes substantial environmental damage, such as algal blooms and fish kills.
- Pesticides can degrade the environment by affecting non-target organisms.
- The effects of climate change are already making agriculture more challenging due to significant shifts in weather, changes to growing seasons, and realignment of water supplies. Our climate is continuing to change in unexpected ways, and the only predictable aspect of what lies ahead is unpredictability.
Vertical farming proponents expect that a re-think of how we grow food can ultimately solve these problems.
What is vertical farming?
Credit: Freethink Media / Plenty, Inc.
Vertical farming is a form of agriculture that grows plants indoors in floor-to-ceiling, tower-like walls of plant-holding cells. Instead of growing plants in horizontal fields on the ground, as in traditional farming, you can think of vertical farming's "fields" as standing on edge and extending upward toward the ceiling. The plants need no soil or other aggregate medium in which to grow; their roots are typically held in a cell lining, often composed of coconut fiber.
Vertical flora is grown either aeroponically, in which water and nutrients are delivered to plants via misting, or hydroponically, in which plants are grown in nutrient-rich water. These systems are incredibly efficient, requiring 95% less water than irrigating soil-grown crops. With vertical farming, Storey says, 99 percent of the moisture transpired by plants can be recaptured, condensed, and recirculated.
Plants, of course, also need light to grow, and vertical farms use increasingly efficient LED bulbs to keep plants thriving.
Vertical farms can increase crop yields by 700 percent
Credit: pressmaster / Adobe Stock
If vertical farming takes off the way its supporters believe it should and will, it may solve many of the aforementioned challenges facing agriculture.
Crop yields with vertical farming far exceed what's possible with traditional agriculture. Plenty, Inc.'s Shireen Santosham notes that the highly controlled growing environment of vertical farming has allowed her company to reduce the growing time for some crops to as little as 10 days. Freed from weather and even sunlight, and able to operate 365 days a year, the system increases potential annual yield by about 700 percent.
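Plenty doesn't publish the arithmetic behind that figure, but a hypothetical decomposition shows how fast indoor cycles compound. In the sketch below, only the 10-day growing time comes from the article; the number of outdoor crop cycles is an assumed placeholder.

```python
# One hypothetical decomposition of a ~700% yield increase (an 8x
# multiplier). The outdoor baseline is an illustrative assumption.

field_harvests_per_year = 3                           # assumed outdoor cycles
indoor_cycle_days = 10                                # from the article
indoor_harvests_per_year = 365 / indoor_cycle_days    # ~36 cycles

cycle_gain = indoor_harvests_per_year / field_harvests_per_year
print(f"{cycle_gain:.1f}x from faster, year-round cycles alone")
# Before counting vertical stacking, faster cycles alone can exceed the
# headline 8x; real-world losses pull the net figure back down.
```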
The land requirement for vertical farming is a mere fraction of that for traditional agriculture. Santosham says it can be done in a building the size of a big-box retail store that can be built pretty much anywhere that has adequate utilities, including within major urban centers. The tightly controlled environment of a vertical farm should also eliminate the need for applied pesticides.
Yet another benefit of vertical farming is the return of land currently needed for food production back to the planet. This could help facilitate Earth's recovery from deforestation and return much needed habitat to threatened or endangered species. Of course, if we ever colonize the moon or Mars, vertical farming will be the go-to option for feeding the colonists.
Several vertical farming company pioneers are already getting their high-quality crops into the hands, and mouths, of consumers. Plenty, Inc. has an eponymous line of greens, and Aerofarms has their FlavorSpectrum line. Both companies claim that their products are exceptionally tasty, a result of their carefully controlled growing environments in which computer-controlled lighting can be optimized to bring out the most desirable qualities of each crop.
Credit: Alesia Berlezova / Adobe Stock
The history of vertical farming
The idea of vertical farming isn't new, and experts have been questioning its viability since the term was coined in 1915 by Gilbert Ellis Bailey, who was obviously way ahead of the available technology of his time. The first attempt to grow produce in a constructed environment was a Danish farmhouse factory built to grow cress, a peppery green related to mustard, in the 1950s.
The modern concept of a vertical farm arose in the New York classroom of Columbia University's Dickson Despommier in 1999. He presented the idea as a theoretical construct, a mental/mathematical exercise imagining how to farm in an environmentally sound manner. His class began with the notion of a rooftop garden before considering a "high-rise" version that might theoretically be able to grow enough rice to feed two percent of Manhattan's population at the time. The eureka moment was a question Despommier asked: "If it can't be done using rooftops, why don't we just grow the crops inside the buildings? We already know how to cultivate and water plants indoors."
With the technological advances of the last few decades, vertical farming is now a reality. Our sister site, Freethink, recently paid Plenty, Inc. a visit. (See video above.)
Vertical farming today
Credit: Nelea Reazanteva / Adobe Stock
Today, growers across the globe are developing vertical farms. While the U.S. has more vertical farms than any other country, the industry is blooming everywhere.
There are currently over 2,000 vertical farms in the U.S. While more than 60 percent of these are owned by small growers, there are a few heavyweights as well. In addition to Wyoming's Plenty, Inc. and Newark's Aerofarms, there's also New York's Bowery Farming. There are also companies such as edengreen, based in Texas, whose mission is to help new entrants construct and operate vertical farms.
Japan comes in second, with about 200 vertical farms currently in operation. The largest vertical farming company there is SPREAD. Across Asia, vertical farms are operating in China, South Korea, Singapore, Thailand, and Taiwan. In Europe, vertical growers are in Germany, France, the Netherlands, and the U.K. Germany is also home to the Association for Vertical Farming, "the leading global, non-profit organization that enables international exchange and cooperation in order to accelerate the development of the indoor/vertical farming industry."

In the Middle East, whose desert land and scarcity of water present a particularly challenging agricultural environment, vertical farming is taking root, so to speak. The United Arab Emirates' Badia Farms is now producing more than 3,500 kilograms of high-quality produce each day and expects to increase that yield going forward. In Kuwait, NOX Management launched in the summer of 2020 with plans to produce 250 types of greens, with a daily output of 550 kg of salads, herbs, and cresses.
The economics of vertical farming
Credit: meryll / Adobe Stock
Building and operating a vertical farm is a costly endeavor, requiring a substantial initial investment in state-of-the-art technology, real estate, and construction. AgFunderNews (AFN) estimates that it can cost $15 million to construct a modern vertical farm. Fortunately, investors see the potential in vertical farming, and the industry has attracted more than $1 billion in investments since 2015. That includes $100 million for Aerofarms. Plenty, Inc. raised $200 million in 2017 from a fund backed by such respected forward-thinkers as Jeff Bezos and Alphabet chairman Eric Schmidt.
AFN is particularly excited by the potential of what they call second-generation vertical farming technology. They cite advances in LED technology — expected to increase energy efficiency by 70 percent by 2030 — and increasingly sophisticated automation that can streamline the operation of vertical farms. AFN anticipates operating cost reduction of 12 percent due to improvements in lighting and another 20 percent from advances in automation.
BusinessWire says that the vertical farming produce market was valued at nearly $240 million in 2019, and they expect it to grow 20 percent annually to over $1 billion by 2027.
A welcome disruption
Vertical farming will be disruptive.
Vertical farming would eliminate the need for the arduous work of harvesting crops by hand from vast tracts of farmland. Current picking jobs, Plenty says, can be replaced by better-paying, full-time jobs available 365 days a year in better working conditions — and in the variety of geographic locations in which vertical farms can operate.
There are two caveats, however. First, the number of people needed to manage and harvest vertical farm crops will be far fewer than the many farmworkers required for less efficiently planted traditional fields. Second, with automation becoming ever-more capable — and perhaps a key to eventual profitability — one wonders just how many new jobs ultimately will be created.
But the societal benefits far outweigh any costs. As Plenty's Storey muses, "Like most everything in the world, we can only save our species if it makes economic sense." Thankfully, it does make economic sense.