The Neuroscience of Creativity
Archimedes in the bathtub, Newton and the apple, Einstein's theory of special relativity: Eureka! moments are what happens when hours of work come together in a single creative flash. In his new blog for Big Think, Sam McNerney will be dissecting these moments of genius, asking: What is the nature of creativity? How do we arrive at new insights?
A philosophy-student-turned-science-writer whose work has appeared in Scientific American, McNerney will explore how we can use the empirical discoveries of cognitive science to enhance our lives, with a special focus on creativity, decision-making, and social psychology. Big Think asked him to tell us about his own writing inspiration.
Who are you?
Hello BigThink.com! My name is Sam McNerney and I’m moderately awesome. I’m originally from Minneapolis, Minnesota, but I currently live in Manhattan. I consider myself a philosopher who is inspired and informed by the empirical side of cognitive science. I look forward to sharing my interests and ideas here at Big Think.
My passion for cognitive science started in college when I began taking philosophy classes and reading any popular psychology book I could get my hands on. After graduating, I started a cognitive science blog and began freelancing. Now, I want to use my background as a cognitive science enthusiast and blogger to figure out what makes humans tick. Hopefully, by exploring cutting-edge research and original ideas about the brain I can write interesting blog posts worth reading and sharing. I also look forward to interacting with my readership.
What are some of the topics you're going to be exploring in this blog?
Anything cognitive science. However, I will focus on creativity, decision-making and social psychology. The idea behind the blog is to use cognitive science to inform the creative professional. Specifically, it’s for the businessperson, the artist, or anybody in between who could benefit from findings in cognitive science. It’s about translating findings in psychology and neuroscience so we can be more productive, make better decisions, be more creative, collaborate efficiently and solve problems effectively.
Why should creativity matter to everyone?
For most of human history creativity was something that came from the muses; it was otherworldly. Cognitive science shows this to be false. Creativity can be understood as mental processes that occur in that 3-pound organ we call the brain.
The 21st century still maintains its fair share of myths, however. First, while some people might be naturally more creative, little evidence supports the idea that people are simply either creative or not. Instead, the science is showing that, with the exception of special cases, creativity is a skill that people can work on.
Second, let’s not forget that creativity is a catch-all term for a number of distinct cognitive processes. The creative process is about hard work, tinkering, daydreaming, other people, where you live and much more. Knowledge of any aspect of this process will certainly help us in our jobs, no matter what we do.
Why brain science and creativity? What's the connection?
You can’t understand creativity without understanding the brain science behind creativity! Well, that’s not entirely true. In fact, the great scientists, writers and inventors throughout history can teach us a lot about creativity and they didn’t know very much about how the brain really works. At the end of the day we can learn a lot about creativity from science and the humanities.
Where does your own inspiration come from? What's your writing process like?
I guess I have two sources of inspiration. The first is what seems to be my natural philosophical mind. For as long as I can remember, I’ve been interested in human behavior and asking the big questions. Most of the time I was horribly wrong with my theories, but my passion was there from the beginning.
The second source is other people and their ideas. I am constantly reading the latest pop psych books, interesting studies, other cognitive science blogs and watching TED talks and other lectures. Spending a few hours musing on these sources usually sparks a topic to write about.
I’m not sure what my writing process is like! It seems to change every week as I am still trying to identify a few useful strategies. That being said, I find it helps to write everything first and edit second. As John Steinbeck said: “Write freely and as rapidly as possible and throw the whole thing on paper. Never correct or rewrite until the whole thing is down. Rewrite in process is usually found to be an excuse for not going on. It also interferes with flow and rhythm which can only come from a kind of unconscious association with the material.”
A Harvard professor's study discovers the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been one of the worst in the lives of many people around the globe. A rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most had never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also be considered on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." What's even creepier, he described, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of coldness, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and worldwide hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
A new paper reveals that the Voyager 1 spacecraft detected a constant hum coming from outside our Solar System.
Voyager 1, humanity's most faraway spacecraft, has detected an unusual "hum" coming from outside our solar system. Fourteen billion miles away from Earth, Voyager 1's instruments picked up a droning sound that may be caused by plasma (ionized gas) in the vast emptiness of interstellar space.
Launched in 1977, the Voyager 1 space probe — along with its twin Voyager 2 — has been traveling farther and farther into space for over 44 years. It has now breached the edge of our solar system, exiting the heliosphere, the bubble-like region of space influenced by the sun. Now, the spacecraft is moving through the "interstellar medium," where it recorded the peculiar sound.
Stella Koch Ocker, a doctoral student in astronomy at Cornell University, discovered the sound in the data from the Voyager's Plasma Wave System (PWS), which measures electron density. Ocker called the drone coming from plasma shock waves "very faint and monotone," owing to its narrow frequency bandwidth.
While they think the persistent background hum may be coming from interstellar gas, the researchers don't yet know what exactly is causing it. It might be produced by "thermally excited plasma oscillations and quasi-thermal noise."
The new paper from Ocker and her colleagues at Cornell University and the University of Iowa, published in Nature Astronomy, also proposes that this is not the last we'll hear of the strange noise. The scientists write that "the emission's persistence suggests that Voyager 1 may be able to continue tracking the interstellar plasma density in the absence of shock-generated plasma oscillation events."
Video: Voyager Captures Sounds of Interstellar Space (www.youtube.com)
The researchers think the droning sound may hold clues to how interstellar space and the heliopause, which can be thought of as the solar system's border, may be affecting each other. When it first entered interstellar space, the PWS instrument reported disturbances in the gas caused by the sun. But in between such eruptions is where the researchers spotted the steady signature made by the near-vacuum.
Senior author James Cordes, a professor of astronomy at Cornell, compared the interstellar medium to "a quiet or gentle rain," adding that "in the case of a solar outburst, it's like detecting a lightning burst in a thunderstorm and then it's back to a gentle rain."
More data from Voyager over the next few years may hold crucial information about the origins of the hum. The findings are already remarkable considering the space probe is functioning on technology from the mid-1970s. The craft has about 70 kilobytes of computer memory. It also carries a Golden Record created by a committee chaired by the late Carl Sagan, who taught at Cornell University. The 12-inch gold-plated copper disk record is essentially a time capsule, meant to tell the story of Earthlings to extraterrestrials. It contains sounds and images that showcase the diversity of Earth's life and culture.
A team of scientists managed to install onto a smartphone a spectrometer that's capable of identifying specific molecules — with cheap parts you can buy online.
- Spectroscopy provides a non-invasive way to study the chemical composition of matter.
- These techniques analyze the unique ways light interacts with certain materials.
- If spectrometers become a common feature of smartphones, it could someday potentially allow anyone to identify pathogens, detect impurities in food, and verify the authenticity of valuable minerals.
The quality of smartphone cameras has improved dramatically over the past decade. Today's smartphone cameras can not only capture photos that rival those of stand-alone camera systems but also offer practical applications, like heart-rate measurement, foreign-text translation, and augmented reality.
What's the next major functionality of smartphone cameras? It could be the ability to identify chemicals, drugs, and biological molecules, according to a new study published in the Review of Scientific Instruments.
The study describes how a team of scientists at Texas A&M turned a common smartphone into a "pocket-sized" Raman and emission spectral detector by modifying it with just $50 worth of extra equipment. With the added hardware, the smartphone was able to identify chemicals in the field within minutes.
The technology could have a wide range of applications, including diagnosing certain diseases, detecting the presence of pathogens and dangerous chemicals, identifying impurities in food, and verifying the authenticity of valuable artwork and minerals.
Raman and fluorescence spectroscopy
Raman and fluorescence spectroscopies are techniques for discerning the chemical composition of materials. Both strategies exploit the fact that light interacts with certain types of matter in unique ways. But there are some differences between the two techniques.
As the name suggests, fluorescence spectroscopy measures the fluorescence — that is, the light emitted by a substance when it absorbs light or other electromagnetic radiation — of a given material. It works by shining light on a material, which excites the electrons within the molecules of the material. The electrons then emit fluorescent light toward a filter that measures fluorescence.
The particular spectra of fluorescent light that's emitted can help scientists detect small concentrations of particular types of biological molecules within a material. But some biomolecules, such as RNA and DNA, don't emit fluorescent light, or they only do so at extremely low levels. That's where Raman spectroscopy comes into play.
Raman spectroscopy involves shooting a laser at a sample and observing how the light scatters. When light hits molecules, the atoms within the molecules vibrate and photons get scattered. Most of the scattered light is of the same wavelength and color as the original light, so it provides no information. But a tiny fraction of the light gets scattered differently; that is, the wavelength and color are different. Known as Raman scattering, this is extremely useful because it provides highly precise information about the chemical composition of the molecule. In other words, all molecules have a unique Raman "fingerprint."
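The "fingerprint" described above is conventionally expressed as a Raman shift in wavenumbers (cm⁻¹): the difference between the inverse of the laser wavelength and the inverse of the scattered wavelength. A minimal sketch of that arithmetic, using illustrative wavelength values that are not taken from the study:

```python
def raman_shift_cm1(laser_nm: float, scattered_nm: float) -> float:
    """Raman shift in wavenumbers (cm^-1):
    1/lambda_laser - 1/lambda_scattered, with wavelengths
    converted from nanometers to centimeters (1 nm = 1e-7 cm)."""
    return 1.0 / (laser_nm * 1e-7) - 1.0 / (scattered_nm * 1e-7)

# Illustrative example: a 532 nm green laser and a photon
# Stokes-scattered to 630 nm.
shift = raman_shift_cm1(532.0, 630.0)
print(f"Raman shift: {shift:.0f} cm^-1")  # about 2924 cm^-1
```

A shift near 2,900 cm⁻¹, for example, falls in the region typical of C–H bond stretches, which is the kind of molecular signature a Raman spectrometer reads off.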
Creating an affordable, pocket-sized spectrometer
To build the spectrometer, the researchers connected a smartphone to a laser and a series of plastic lenses. The smartphone camera was placed facing a transmission diffraction grating, which splits incoming light into its constituent wavelengths and colors. After a laser is fired into a sample, the scattered light is diffracted through this grating, and the smartphone camera analyzes the light on the other side.
Schematic diagram of the designed system. Credit: Dhankhar et al.
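The grating can separate wavelengths because each one leaves it at a different angle, per the grating equation d·sin(θ) = m·λ, where d is the groove spacing and m the diffraction order. A minimal sketch of that geometry, assuming an illustrative groove density (the study's actual grating specification is not reproduced here):

```python
import math

def diffraction_angle_deg(wavelength_nm: float,
                          lines_per_mm: float,
                          order: int = 1) -> float:
    """Diffraction angle from the grating equation d*sin(theta) = m*lambda.

    Groove spacing d is derived from the groove density
    (1 mm = 1e6 nm); the result is in degrees."""
    d_nm = 1e6 / lines_per_mm
    return math.degrees(math.asin(order * wavelength_nm / d_nm))

# Illustrative: a 1200 lines/mm grating sends 532 nm and 630 nm
# light off at visibly different first-order angles.
for wl in (532.0, 630.0):
    print(f"{wl} nm -> {diffraction_angle_deg(wl, 1200.0):.1f} deg")
```

Because the angular separation between wavelengths is large, the camera sensor on the far side of the grating can resolve the scattered spectrum spatially.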
To test the spectrometer, the researchers analyzed a range of sample materials, including carrots and bacteria. The laser used in the spectrometer emits a wavelength that's readily absorbed by the pigments in carrots and bacteria, which is why these materials were chosen.
The results showed that the smartphone spectrometer was able to correctly identify the materials, but it wasn't quite as effective as the best commercially available Raman spectrometers. The researchers noted that their system might be improved by using specific High Dynamic Range (HDR) smartphone camera applications.
Ultimately, the study highlights how improving the fundamentals of a technology, like smartphone cameras, can lead to a surprisingly wide range of useful applications.
"This inexpensive yet accurate recording pocket Raman system has the potential of being an integral part of ubiquitous cell phones that will make it possible to identify chemical impurities and pathogens, in situ within minutes," the researchers concluded.