Will coding become a basic life skill? Yes and no, say experts
Almost all experts agree that coding will become nearly as ubiquitous as literacy. But the nature of coding itself may look very different in the future.
- Coding is increasingly being taught in high schools, and it's become a desirable skill even outside of the tech industry.
- Experts argue that coding is becoming the new literacy: a skill so fundamental that everyone should possess it to some degree.
- However, the nature of coding in the future is likely to be wildly different than it is today.
Coding is one of the most sought-after skills out there, and for good reason. Learning to program is difficult, despite what advocates of the "Learn to Code" movement may say. Human minds are a confluence of assumptions, biases, and irrational fantasies, and forcing these fickle things to speak in the rigorous language of computer programming takes work. Programming is difficult, but it's also extremely valuable and — increasingly — necessary.
Many believe that just as basic computer skills went from the realm of specialists to a life skill everyone possesses, so too will programming become ubiquitous. Learning to code might become as commonplace as learning to read. Will this really be the case? And if so, what will the programmers of the future look like?
Teaching students to code
In 2016, Gallup and Google partnered to quantify exactly how prevalent programming classes were in K–12 education. They found that 40 percent of all schools offered at least one coding class; the truly illuminating figure, though, was that just a year earlier that number had been 25 percent. One can only imagine how quickly coding has grown in the years since the 2016 report.
Apple CEO Tim Cook underscored the importance of learning to code during a March 2019 meeting of the American Workforce Policy Advisory Board at the White House: "We believe strongly that it should be a requirement in the United States for every kid to have coding before they graduate from K–12 and become somewhat proficient at it." The city of Chicago appears to have listened to Cook: it recently made one credit of computer science a high school graduation requirement. Other municipalities and states are likely to follow suit.
There's a very clear trend here: coding is becoming an increasingly central part of a modern education. It seems to check all the boxes: not only does it train children to think logically and rigorously, it's also a skill that can help secure them a lucrative job in the future. Coding is clearly being adopted at a high rate, but how far will this adoption spread?
Will knowing how to code be as common as knowing how to read?
English professor Annette Vee certainly thinks so. In her book, Coding Literacy: How Computer Programming is Changing Writing, Vee compares the role of programming in society with the role that literacy has had historically. Vee notes that in the Middle Ages, "Writing was a specialized skill and people became defined by their writing." As time went on, however, literacy became increasingly common and increasingly necessary. "If you couldn't read, you were left out." Vee argues that the computationally illiterate will increasingly have to rely on others to navigate daily life in a way that will seriously hamper their prospects. "If you don't know how to program, you can carry on with a perfectly fine life. But this is going to change soon."
"Programming is too important to be left just to computer science departments," said Vee. "It can be taught effectively outside of computer science. If we assume that those who learn to write need to be English majors, we would be in trouble." This observation is also being reflected in the workplace. The tech industry isn't the only place where coding skills are valuable. Programming is an increasingly desired skill in the healthcare and finance industries, among others.
The impact of low-code platforms and machine learning
While the breadth of programming skill may increase in the future, its depth is likely to decrease. More people will become fluent programmers, but the share of expert programmers probably won't grow to the same degree. That share might even shrink as experts become less necessary and as programming tools become more advanced and powerful.
Part of this is due to the rise of low-code platforms. As defined by Forrester Research, low-code platforms "enable rapid delivery of business applications with a minimum of hand-coding and minimal upfront investment in setup, training, and deployment." Platforms such as Salesforce and AgilePoint either simplify a specific technical domain (customer relationship management, in Salesforce's case) or serve as general-purpose tools for quickly building applications (as AgilePoint does).
Low-code platforms will make it easier for non-experts to contribute to software development in the near future, but they represent part of a larger trend, too. Automation and machine learning are quickly transforming the nature of work, and software development is no exception. An automated future might mean that nobody will really need to know how to program anymore. Google AI researcher Pete Warden believes this change will come quickly. "There will be a long ramp-up as knowledge diffuses through the developer community," wrote Warden in a 2017 blog post, "but in ten years I predict most software jobs won't involve programming."
In order for a machine-learning algorithm to work correctly, it needs access to the right kind of data. An algorithm that automatically identifies people's faces from photographs, for instance, needs to be trained on a dataset where people's faces are tagged, so it can know what to look for. Warden thinks that tasks like this will become the software developer's primary job in the future: "Instead of writing and maintaining intricate, layered tangles of logic, the developer has to become a teacher, a curator of training data and an analyst of results."
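To make Warden's point concrete, here is a minimal, hypothetical sketch of what "curating training data" looks like in practice. The feature vectors and labels below are invented placeholders, and the example uses scikit-learn's off-the-shelf logistic regression rather than any real face-detection pipeline; it simply illustrates that the developer supplies labeled examples and analyzes predictions instead of hand-writing detection logic.

```python
# A hypothetical sketch: behavior comes from curated, labeled examples,
# not hand-written rules. All numbers below are invented placeholders.
from sklearn.linear_model import LogisticRegression

# Curated training data: each row stands in for features extracted from
# an image region; each label records whether that region shows a face.
features = [
    [0.9, 0.1, 0.8],   # region labeled by a human as a face
    [0.8, 0.2, 0.7],   # face
    [0.1, 0.9, 0.2],   # not a face
    [0.2, 0.8, 0.1],   # not a face
]
labels = [1, 1, 0, 0]  # 1 = face, 0 = not a face

# Training replaces explicit logic: the model infers the rule from labels.
model = LogisticRegression()
model.fit(features, labels)

# Analyzing results is the other half of the job Warden describes.
print(model.predict([[0.85, 0.15, 0.75]]))  # expected output: [1]
```

If such a model misclassifies, the fix is usually better or more plentiful labeled data rather than new code, which is exactly the "teacher and curator" role Warden predicts.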
Investor and entrepreneur Mark Cuban also believes that this will be the case. He predicts that for this very reason, people who are experts in fields outside of computer science will become indispensable to software development. "Because it's just math and so, whatever we're defining the AI to do, someone's got to know the topic," he said on an episode of Recode Decode. "If you're doing an AI to emulate Shakespeare, somebody better know Shakespeare [...] The coding major who graduates this year probably has better short-term opportunity than the liberal arts major that's a Shakespeare expert, but long term, it's like people who learned COBOL or Fortran and thought that was the future and they were going to be covered forever."
Altogether, it looks as though coding will indeed become a basic life skill similar to literacy, but the nature of coding and computer science is also going to change in significant and unpredictable ways. As the need for expertise diminishes due to machine learning, everyone will likely become a novice programmer, familiar with coding just to the extent that it is relevant for their job. Everyone can read and write today, but not everyone can write a best-selling novel or a nuanced critique of Jane Austen. In the future, this relationship will likely hold true for programming as well; the masses will know enough about programming and computer science to make use of flexible, smart, and robust software tools, while a handful of experts will continue to push the field forward.
Health officials in China reported that a man was infected with bubonic plague, the infectious disease that caused the Black Death.
- The case was reported in the city of Bayannur, which has issued a level-three plague prevention warning.
- Modern antibiotics can effectively treat bubonic plague, which spreads mainly by fleas.
- Chinese health officials are also monitoring a newly discovered type of swine flu that has the potential to develop into a pandemic virus.
Image: Bacteria under a microscope (needpix.com)

Today, bubonic plague can be treated effectively with antibiotics.

"Unlike in the 14th century, we now have an understanding of how this disease is transmitted," Dr. Shanthi Kappagoda, an infectious disease physician at Stanford Health Care, told Healthline. "We know how to prevent it — avoid handling sick or dead animals in areas where there is transmission. We are also able to treat patients who are infected with effective antibiotics, and can give antibiotics to people who may have been exposed to the bacteria [and] prevent them [from] getting sick."
Image: A plague patient displaying a swollen, ruptured inguinal lymph node, or bubo (Centers for Disease Control and Prevention)

Still, hundreds of people develop bubonic plague every year. In the U.S., a handful of cases occur annually, particularly in New Mexico, Arizona and Colorado, where habitats allow the bacteria to spread more easily among wild rodent populations. But these cases are very rare, mainly because you need to be in close contact with rodents in order to get infected. And though plague can spread from human to human, this only occurs with pneumonic plague, and even then transmission is rare.
A new swine flu in China

Last week, researchers in China also reported another public health concern: a new virus that has "all the essential hallmarks" of a pandemic virus.

In a paper published in the Proceedings of the National Academy of Sciences, researchers say the virus was discovered in pigs in China and descended from the H1N1 virus, commonly called "swine flu." That virus was able to transmit from human to human, and it killed an estimated 151,700 to 575,400 people worldwide from 2009 to 2010, according to the Centers for Disease Control and Prevention.

There's no evidence showing that the new virus can spread from person to person. But the researchers did find that 10 percent of swine workers had been infected by the virus, called G4 reassortant EA H1N1. This level of infectivity is worrying because it "greatly enhances the opportunity for virus adaptation in humans and raises concerns for the possible generation of pandemic viruses," the researchers wrote.
So far, 30 student teams have entered the Indy Autonomous Challenge, scheduled for October 2021.
- The Indy Autonomous Challenge will task student teams with developing self-driving software for race cars.
- The competition requires cars to complete 20 laps within 25 minutes, meaning they would need to average about 110 mph.
- The organizers say they hope to advance the field of driverless cars and "inspire the next generation of STEM talent."
Image credit: Indy Autonomous Challenge

Completing the race in 25 minutes means the cars will need to average about 110 miles per hour. So, while the race may end up being a bit slower than a typical Indy 500 competition, in which winners average speeds of over 160 mph, it's still set to be the fastest autonomous race featuring full-size cars.

"There is no human redundancy there," Matt Peak, managing director for Energy Systems Network, a nonprofit that develops technology for the automation and energy sectors, told the Pittsburgh Post-Gazette. "Either your car makes this happen or smash into the wall you go."
Image: Illustration of the Indy Autonomous Challenge (Indy Autonomous Challenge)

The Indy Autonomous Challenge describes itself as a "past-the-post" competition, which "refers to a binary, objective, measurable performance rather than a subjective evaluation, judgement, or recognition."

This competition design was inspired by the 2004 DARPA Grand Challenge, which tasked teams with developing driverless cars and sending them along a 150-mile route in Southern California for a chance to win $1 million. But that prize went unclaimed, because within a few hours after starting, all the vehicles had suffered some kind of critical failure.
Image: Indianapolis Motor Speedway (Indy Autonomous Challenge)

One factor that could prevent a similar outcome in the upcoming race is the ability to test-run cars on a virtual racetrack. The simulation software company Ansys Inc. has already developed a model of the Indianapolis Motor Speedway on which teams will test their algorithms as part of a series of qualifying rounds.

"We can create, with physics, multiple real-life scenarios that are reflective of the real world," Ansys President Ajei Gopal told The Wall Street Journal. "We can use that to train the AI, so it starts to come up to speed."

Still, the race could reveal that self-driving cars aren't quite ready to race at speeds of over 110 mph. After all, regular self-driving cars already face enough logistical and technical roadblocks, including crumbling infrastructure, communication issues and the fateful moral decisions driverless cars will have to make in split seconds. But the Indy Autonomous Challenge says its main goal is to advance the industry by challenging "students around the world to imagine, invent, and prove a new generation of automated vehicle (AV) software and inspire the next generation of STEM talent."
A new Harvard study finds that the language you use affects patient outcomes.
- A study at Harvard's McLean Hospital claims that using the language of chemical imbalances worsens patient outcomes.
- Though psychiatry has largely abandoned DSM categories, professor Joseph E. Davis writes that the field continues to strive for a "brain-based diagnostic system."
- Chemical explanations of mental health appear to benefit pharmaceutical companies far more than patients.
Video: Challenging the Chemical Imbalance Theory of Mental Disorders: Robert Whitaker, Journalist

This is a far cry from Howard Rusk's 1947 NY Times editorial calling for mental health disorders to be treated similarly to physical diseases (such as diabetes and cancer). This mindset—not attributable to Rusk alone; he was merely relaying the psychiatric currency of the time—has dominated the field for decades: mental anguish is a genetic and/or chemical-deficiency disorder that must be treated pharmacologically.

Even as psychiatry untethered itself from DSM categories, the field still used chemistry to validate its existence. Psychotherapy, arguably the most efficient means for managing much of our anxiety and depression, is time- and labor-intensive. Counseling requires an empathetic and wizened ear to guide the patient to do the work. Ingesting a pill to do that work for you is more seductive, and easier. As Davis writes, even though the industry abandoned the DSM, it continues to strive for a "brain-based diagnostic system."

That language has infiltrated public consciousness. The team at McLean surveyed 279 patients seeking acute treatment for depression. As they note, the causes of psychological distress have constantly shifted over the millennia: humoral imbalance in the ancient world; spiritual possession in medieval times; early childhood experiences around the time of Freud; maladaptive thought patterns in the latter half of the last century. While the team found that psychosocial explanations remain popular, biogenetic explanations (such as the chemical imbalance theory) are becoming more prominent.

Interestingly, the 80 people Davis interviewed for his book predominantly relied on biogenetic explanations. Instead of doctors diagnosing patients, as you might expect, they increasingly serve to confirm what patients come in suspecting. Patients arrive at medical offices confident in their self-diagnoses. They believe a pill is the best course of treatment, largely because they saw an advertisement or listened to a friend. Doctors too often oblige without further curiosity as to the reasons for their distress.
Image credit: Illustration Forest / Shutterstock

While medicalizing mental health softens the stigma of depression—if a disorder is inheritable, it was never really your fault—it also disempowers the patient. The team at McLean writes:

"More recent studies indicate that participants who are told that their depression is caused by a chemical imbalance or genetic abnormality expect to have depression for a longer period, report more depressive symptoms, and feel they have less control over their negative emotions."

Davis points out the language used by direct-to-consumer advertising, which is prevalent in America. Doctors, media, and advertising agencies converge around common messages: everyday blues is a "real medical condition," everyone is susceptible to clinical depression, and drugs correct underlying somatic conditions that you never consciously control. He continues:

"Your inner life and evaluative stance are of marginal, if any, relevance; counseling or psychotherapy aimed at self-insight would serve little purpose."

The McLean team discovered a similar phenomenon: patients expect little from psychotherapy and a lot from pills. When depression is treated as the result of an internal and immutable essence instead of environmental conditions, behavioral changes are not expected to make much difference. Chemistry rules the popular imagination.