Living with ADHD: how I learned to make distraction work for me
Could better teaching practices make paying attention easier for everyone?
I'm prone to experiencing 'blank' patches in conversation, when I suddenly realise I have no recollection of the past 30 or so seconds of what's been said, as if someone has skipped forward through the video feed of my life (occasionally, I resort to 'masking', or feigning comprehension – which is embarrassing). When watching television, I struggle not to move, often rising to pace and fidget, and I dread being the 'owner' of complicated documents and spreadsheets, as I'm very likely to miss some crucial detail.
This year, I twice missed a doctor's appointment because the surgery would send reminders only by paper mail. My reliance on to-do lists and prompts is unceasing, vigilant – else even the most essential tasks might be entirely forgotten. Occasionally I 'hyperfocus': the incessant flicker and hum of everyday life recedes as I lose track of time, pouring myself steadily into one topic, reading hundreds of pages or writing thousands of words.
I used to see all this primarily as a deficit but, having built a career that helped me better understand what I struggled with and which put those selfsame 'deficits' to good purpose, I no longer view things that way. Instead, these days I see my own distracted nature as a source of keen awareness for the fragility of all attention.
I work in instructional design, which is the practice of developing engaging and effective educational products and experiences to help others learn. In creating interactive classes and workshops, my aim is to cultivate the learners' attention and focus, but one of the first things I learned was that this is incredibly difficult for everyone – neurotypical or otherwise. In fact, there are common rules of thumb that reflect how universally short attention spans really are: one is that even 10 minutes of lecturing is too long for some people to follow (think of the number of times that you've caught yourself, or someone near you, wilting during a long meeting, presentation or conference paper). The trick is to intersperse lectures with exercises and discussions. Moreover, research increasingly suggests that people are more likely to take in new ideas and information when they relate to something they already care about. All of this is magnified for people diagnosed with ADHD, who struggle to focus unless there's a strong and clear connection to their immediate concerns, but who can nonetheless focus profoundly when that element of deep interest is present.
Working in instructional design has convinced me that our education system is poorly suited to nearly everyone, not just those diagnosed with ADHD. Most curricula lack a preliminary phase of collectively exploring students' existing interests, before introducing them to material in a way that will be relevant to what they already care about. Most classes, especially in secondary school and higher education, still rely on lectures of (far) more than five minutes straight. In contrast, notice how social media, video games and so many other aspects of our lives accommodate and exploit our fleeting attention spans, customising their design and content to fit our interests and grab hold of our attention. Many parents of children with ADHD despair over their children's greater interest in video games than mathematics, but perhaps they should be concerned with why the maths problems and classes cannot more commonly be made just as engaging as the games.
Some games and even a few special classrooms are indeed like this: GCSE coursework for maths in the UK has taken the lead, with gamified online homework. But why, in an age when we know that learning can be made nearly addictive, is this type of format not one of the standard ways we engage young (and older) minds? Redesigning curricula is a relatively inexpensive educational intervention, compared with revamping technology or adding classroom instructors.
Until this happens, the distracted can always practise 'learning to learn', as my psychologists used to call it. For me, this began in the 1990s with colour-coded folders and a planner, and has since grown into a sprawling Google calendar. Meticulously, I track each hour of my working life (and many personal hours, too). Obsessively, I declutter to avoid visual distraction. I return to my to-do lists over and over during the day.
I have also learned to make space for distraction – which can, after all, also mean being alive to one's surroundings, curious about new possibilities, and multifaceted in one's interests. Getting distracted (even taking note of which interesting distractions to return to later) has helped me think about learning differently: not all learning requires sustained focus; some forms of creative and conceptual thinking benefit from repeatedly returning to a topic so as to view it differently each time.
Therefore, in learning, as in life, it might be wise not merely to redirect the attention of those with ADHD but also to help them reflect on what draws their interest and why, using, for example, the age-old business of play – only with a reflective stage where children might come to recognise and learn from their own thought patterns, and develop the skill of 'metacognition', or thinking about their own thinking. This reflexive process is a core part of managing our attention, and of learning about the world and oneself, especially in an age that offers constant distractions.
I am keenly aware that I managed my ADHD in large part due to enormous privileges: financial resources, an excellent US public school system, and deeply motivated and switched-on parents. Few people with ADHD have these privileges, and many who are diagnosed end up on drugs that, when taken in childhood, can stunt physical growth, and which can be addictive, sometimes without long-term benefits. While it might be best for some to take medication for ADHD, it is troubling that so many get little else in the way of help and intervention, generally because medication is cheaper and more accessible than other educational support.
We can certainly continue to study and debate whether ADHD is biologically rooted, the product of our attention-fractured society, or more likely a complex result of interdependent social and biological factors. Yet so many debates on this topic remain stuck on the ills of the internet or the merits of medication, instead of redirecting our focus to the wider issues around attention and learning that concern us all. Better forms of pedagogy, reflective practice and communication will not resolve every problem related to human attention, but they could help everyone learn much better – not just those of us with this particular diagnosis.
Sarah Stein Lubrano
Construction of the $500 billion tech city-state of the future is moving ahead.
- The futuristic megacity Neom is being built in Saudi Arabia.
- The city will be fully automated, leading in health, education and quality of life.
- It will feature an artificial moon, cloud seeding, robotic gladiators and flying taxis.
The Red Sea area where Neom will be built:
Saudi Arabia Plans Futuristic City, "Neom" (Full Promotional Video)
Are we genetically inclined for superstition or just fearful of the truth?
- From secret societies to faked moon landings, one thing that humanity seems to have an endless supply of is conspiracy theories. In this compilation, physicist Michio Kaku, science communicator Bill Nye, psychologist Sarah Rose Cavanagh, skeptic Michael Shermer, and actor and playwright John Cameron Mitchell consider the nature of truth and why some groups believe the things they do.
- "I think there's a gene for superstition, a gene for hearsay, a gene for magic, a gene for magical thinking," argues Kaku. The theoretical physicist says that science goes against "natural thinking," and that the superstition gene persists because, one out of ten times, it actually worked and saved us.
- Other theories shared include the idea of cognitive dissonance, the dangerous power of fear to inhibit critical thinking, and Hollywood's romanticization of conspiracies. Because conspiracy theories are so diverse and multifaceted, combating them has not been an easy task for science.
A growing body of research suggests COVID-19 can cause serious neurological problems.
- The new study seeks to track the health of 50,000 people who have tested positive for COVID-19.
- The study aims to explore whether the disease causes cognitive impairment and other conditions.
- Recent research suggests that COVID-19 can, directly or indirectly, cause brain dysfunction, strokes, nerve damage and other neurological problems.
Brain images of a patient with acute disseminated encephalomyelitis.
COVID-19 and the brain<p>A growing body of research reveals alarming neurological complications among COVID-19 patients. On Wednesday, for example, researchers from University College London published a <a href="https://academic.oup.com/brain/article/doi/10.1093/brain/awaa240/5868408" target="_blank">study</a> in the journal Brain that describes how some patients have suffered temporary brain dysfunction, strokes, nerve damage, and other neurological problems concurrent with COVID-19.</p><p>Some patients suffered brain inflammation as a result of a rare disease called acute disseminated encephalomyelitis, which can cause numbness, seizures, and confusion. One patient in the study even hallucinated monkeys and lions in her home.</p>
Photo by Mario Tama/Getty Images<p>A separate study published in the <a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7198407/" target="_blank">Journal of Clinical Neuroscience</a> notes that some COVID-19 patients have also suffered neurological complications like impaired consciousness and acute cerebrovascular disease. The study notes that past viruses like MERS and SARS also seemed to cause neurological problems.</p><p>A troubling finding among this growing body of research is that some patients seem to suffer neurological damage even when respiratory symptoms aren't obvious. Additionally, scientists aren't sure whether damage from the disease will be permanent.</p><p style="margin-left: 20px;">"Given that the disease has only been around for a matter of months, we might not yet know what long-term damage COVID-19 can cause," Dr. Ross Paterson, joint first author of the University College London study, said in a <a href="https://www.eurekalert.org/pub_releases/2020-07/ucl-iid070620.php" target="_blank">press release</a>. "Doctors need to be aware of possible neurological effects, as early diagnosis can improve patient outcomes."</p><p>If you've been diagnosed with COVID-19 and want to enroll in the study, visit <a href="https://www.cambridgebrainsciences.com/studies/covid-brain-study" target="_blank">cambridgebrainsciences.com/studies/covid-brain-study</a>.</p>
Coronavirus layoffs are a glimpse into our automated future. We need to build better education opportunities now so Americans can find work in the economy of tomorrow.