Lab-grown brain organoids mature like real infant brains
After 20 months, scientists find lab-dish brain cells matured at a similar rate to those of an actual infant.
- Scientists have found that lab-grown brain organoids mature over 20 months at roughly the same rate as an infant's brain cells.
- Researchers have looked to such cell structures, called "organoids," as potential models for understanding the human body's biological mechanisms.
- Their study validates the use of lab-dish organoids for research.
Scientists have been growing cell cultures that resemble natural human cells in dishes for a while now, but their usefulness for research has been limited by concerns that they never mature enough to provide insights into human development beyond the womb. Now, scientists from UCLA and Stanford have genetically analyzed dish-grown brain organoids that matured over 20 months on roughly the same timetable as an infant's brain cells.
"This will be an important boost for the field. We've shown that these organoids can mature and replicate many aspects of normal human development — making them a good model for studying human disease in a dish," says senior author Daniel Geschwind of UCLA.
The research is published in the journal Nature Neuroscience.
What organoids really are and aren't
A brain organoid
Credit: NIH Image Gallery/Wikimedia
Organoids are tissue cultures grown from stem cells. These start off as induced pluripotent stem (iPS) cells, drawn from skin or blood cells and reprogrammed to revert to an embryonic-stem-cell-like state. From there, they can be exposed to chemicals that cause them to behave like a specific type of human cell.
In the case of this study, the chemicals caused them to become cerebral organoids, self-organized 3D cell structures that behave similarly to natural human brain cells. They don't grow to become full mini-brains.
The promise of organoids
Credit: David Matos/Unsplash
The hope for organoids has been that they would give researchers a way of observing human biological processes in a benign, non-invasive manner. Insights into how human cells and organs develop a disease, progress through its stages, and respond (or fail to respond) to medication, all without involving human subjects or animal analogues, could revolutionize research.
In the case of brain organoids, researchers have been hoping they can somehow be used to reveal the secrets of neurological and neurodevelopmental disorders, including epilepsy, autism and schizophrenia.
Organoids could be useful in all these ways, but only if they do not permanently remain embryonic cells, as many researchers feared they would. This study is a first indication that organoids' larger promise can actually be fulfilled.
An answer scientists have been hoping for
"This is novel," says Geschwind. "Until now, nobody has grown and characterized these organoids for this amount of time, nor shown they will recapitulate human brain development in a laboratory environment for the most part."
Now, says first author Aaron Gordon, "We show that these 3D brain organoids follow an internal clock, which progresses in a laboratory environment in parallel to what occurs inside a living organism. This is a remarkable finding — we show that they reach post-natal maturity around 280 days in culture, and after that begin to model aspects of the infant brain, including known physiological changes in neurotransmitter signaling."
With the study verifying a 20-month maturation process, it remains to be seen how long, or how far, maturation in organoids goes. Can their cells continue to mature for years? Decades?
Even without an answer to that question, the study, says Geschwind, "represents an important milestone by showing which aspects of human brain development are modeled with the highest fidelity and which specific genes are behaving well in vitro and when best to model them. Equally important, we provide a framework based on unbiased genomic analyses for assessing how well in vitro models model in vivo development and function."
With iPS cells able to take on the roles of so many types of cells in the human body, and with the new knowledge that they do in fact mature beyond their embryonic stage, researchers can feel more confident in the insights into biological mechanisms that organoids seem to reveal. And researchers are now better equipped to solve some of the human body's vexing mysteries.
A Harvard professor's study identifies the worst year to be alive.
- Harvard professor Michael McCormick argues the worst year to be alive was 536 AD.
- The year was terrible due to cataclysmic eruptions that blocked out the sun and the spread of the plague.
- 536 ushered in the coldest decade in thousands of years and started a century of economic devastation.
The past year has been nothing but the worst in the lives of many people around the globe. A rampaging pandemic, dangerous political instability, weather catastrophes, and a profound change in lifestyle that most have never experienced or imagined.
But was it the worst year ever?
Nope. Not even close. In the eyes of the historian and archaeologist Michael McCormick, the absolute "worst year to be alive" was 536.
Why was 536 so bad? You could certainly argue that 1918, the last year of World War I when the Spanish Flu killed up to 100 million people around the world, was a terrible year by all accounts. 1349 could also be considered on this morbid list as the year when the Black Death wiped out half of Europe, with up to 20 million dead from the plague. Most of the years of World War II could probably lay claim to the "worst year" title as well. But 536 was in a category of its own, argues the historian.
It all began with an eruption...
According to McCormick, Professor of Medieval History at Harvard University, 536 was the precursor year to one of the worst periods of human history. It featured a volcanic eruption early in the year that took place in Iceland, as established by a study of a Swiss glacier carried out by McCormick and the glaciologist Paul Mayewski from the Climate Change Institute of The University of Maine (UM) in Orono.
The ash spewed out by the volcano likely led to a fog that brought an 18-month-long stretch of daytime darkness across Europe, the Middle East, and portions of Asia. As the Byzantine historian Procopius wrote, "For the sun gave forth its light without brightness, like the moon, during the whole year." He also recounted that it looked like the sun was always in eclipse.
Cassiodorus, a Roman politician of that time, wrote that the sun had a "bluish" color, the moon had no luster, and "seasons seem to be all jumbled up together." What's even creepier, he described, "We marvel to see no shadows of our bodies at noon."
...that led to famine...
The dark days also brought a period of cold, with summer temperatures falling by 1.5°C to 2.5°C. This started the coldest decade in the past 2,300 years, reports Science, leading to the devastation of crops and widespread hunger.
...and the fall of an empire
In 541, the bubonic plague added considerably to the world's misery. Spreading from the Roman port of Pelusium in Egypt, the so-called Plague of Justinian caused the deaths of up to one half of the population of the eastern Roman Empire. This, in turn, sped up its eventual collapse, writes McCormick.
Between the environmental cataclysms, with massive volcanic eruptions also in 540 and 547, and the devastation brought on by the plague, Europe was in for an economic downturn for nearly all of the next century, until 640 when silver mining gave it a boost.
Was that the worst time in history?
Of course, the absolute worst time in history depends on who you were and where you lived.
Native Americans can easily point to 1520, when smallpox, brought over by the Spanish, killed millions of indigenous people. By 1600, up to 90 percent of the population of the Americas (about 55 million people) was wiped out by various European pathogens.
Like all things, the grisly title of "worst year ever" comes down to historical perspective.
A new paper reveals that the Voyager 1 spacecraft detected a constant hum coming from outside our Solar System.
Voyager 1, humanity's most faraway spacecraft, has detected an unusual "hum" coming from outside our solar system. Fourteen billion miles away from Earth, the Voyager's instruments picked up a droning sound that may be caused by plasma (ionized gas) in the vast emptiness of interstellar space.
Launched in 1977, the Voyager 1 space probe — along with its twin Voyager 2 — has been traveling farther and farther into space for over 44 years. It has now breached the edge of our solar system, exiting the heliosphere, the bubble-like region of space influenced by the sun. Now, the spacecraft is moving through the "interstellar medium," where it recorded the peculiar sound.
Stella Koch Ocker, a doctoral student in astronomy at Cornell University, discovered the sound in data from Voyager's Plasma Wave System (PWS), which measures electron density. Ocker called the drone "very faint and monotone," likely because it is confined to a narrow frequency bandwidth.
While they think the persistent background hum may be coming from interstellar gas, the researchers don't yet know what exactly is causing it. It might be produced by "thermally excited plasma oscillations and quasi-thermal noise."
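The PWS does not record sound directly; instruments like it infer the local electron density from the frequency of plasma oscillations. A minimal sketch using the standard cold-plasma relation (the 3 kHz figure below is illustrative, not a number taken from the paper):

```python
import math

def plasma_frequency_hz(n_e_per_cm3):
    """Electron plasma frequency (Hz) from electron density (cm^-3).

    Standard cold-plasma relation: f_pe ~ 8980 * sqrt(n_e).
    """
    return 8980.0 * math.sqrt(n_e_per_cm3)

def density_from_frequency_cm3(f_hz):
    """Invert the relation: electron density (cm^-3) from an observed plasma line."""
    return (f_hz / 8980.0) ** 2

# An emission near 3 kHz (an illustrative figure) would imply a density of
# roughly 0.11 electrons per cubic centimeter -- the thin interstellar plasma.
print(density_from_frequency_cm3(3000.0))
```

The square-root dependence is why even a faint, narrow-band hum is scientifically valuable: tracking its frequency over time amounts to continuously tracking the density of the interstellar medium.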
The new paper from Ocker and her colleagues at Cornell University and the University of Iowa, published in Nature Astronomy, also proposes that this is not the last we'll hear of the strange noise. The scientists write that "the emission's persistence suggests that Voyager 1 may be able to continue tracking the interstellar plasma density in the absence of shock-generated plasma oscillation events."
Voyager Captures Sounds of Interstellar Space www.youtube.com
The researchers think the droning sound may hold clues to how interstellar space and the heliopause, which can be thought of as the solar system's border, affect each other. When Voyager 1 first entered interstellar space, the PWS instrument recorded disturbances in the gas caused by the sun. But in between such eruptions is where the researchers spotted the steady signature of the near-vacuum.
Senior author James Cordes, a professor of astronomy at Cornell, compared the interstellar medium to "a quiet or gentle rain," adding that "in the case of a solar outburst, it's like detecting a lightning burst in a thunderstorm and then it's back to a gentle rain."
More data from Voyager over the next few years may hold crucial clues to the origins of the hum. The findings are already remarkable considering the space probe is functioning on technology from the mid-1970s. The craft has about 70 kilobytes of computer memory. It also carries a Golden Record created by a committee chaired by the late Carl Sagan, who taught at Cornell University. The 12-inch gold-plated copper disk is essentially a time capsule, meant to tell the story of Earthlings to extraterrestrials. It contains sounds and images that showcase the diversity of Earth's life and culture.
A team of scientists managed to install onto a smartphone a spectrometer that's capable of identifying specific molecules — with cheap parts you can buy online.
- Spectroscopy provides a non-invasive way to study the chemical composition of matter.
- These techniques analyze the unique ways light interacts with certain materials.
- If spectrometers become a common feature of smartphones, they could someday allow anyone to identify pathogens, detect impurities in food, and verify the authenticity of valuable minerals.
The quality of smartphone cameras has improved dramatically over the past decade. Today's smartphone cameras can not only capture photos that rival those of stand-alone camera systems but also support practical applications, like heart-rate measurement, foreign-text translation, and augmented reality.
What's the next major functionality of smartphone cameras? It could be the ability to identify chemicals, drugs, and biological molecules, according to a new study published in the Review of Scientific Instruments.
The study describes how a team of scientists at Texas A&M turned a common smartphone into a "pocket-sized" Raman and emission spectral detector by modifying it with just $50 worth of extra equipment. With the added hardware, the smartphone was able to identify chemicals in the field within minutes.
The technology could have a wide range of applications, including diagnosing certain diseases, detecting the presence of pathogens and dangerous chemicals, identifying impurities in food, and verifying the authenticity of valuable artwork and minerals.
Raman and fluorescence spectroscopy
Raman and fluorescence spectroscopies are techniques for discerning the chemical composition of materials. Both strategies exploit the fact that light interacts with certain types of matter in unique ways. But there are some differences between the two techniques.
As the name suggests, fluorescence spectroscopy measures the fluorescence of a given material, that is, the light emitted by a substance after it absorbs light or other electromagnetic radiation. It works by shining light on a material, which excites the electrons within its molecules. The electrons then re-emit light, which passes through a filter to a detector that measures the fluorescence.
The particular spectra of fluorescent light that's emitted can help scientists detect small concentrations of particular types of biological molecules within a material. But some biomolecules, such as RNA and DNA, don't emit fluorescent light, or they only do so at extremely low levels. That's where Raman spectroscopy comes into play.
Raman spectroscopy involves shooting a laser at a sample and observing how the light scatters. When light hits molecules, the atoms within the molecules vibrate and photons get scattered. Most of the scattered light is of the same wavelength and color as the original light, so it provides no information. But a tiny fraction of the light gets scattered differently; that is, the wavelength and color are different. Known as Raman scattering, this is extremely useful because it provides highly precise information about the chemical composition of the molecule. In other words, all molecules have a unique Raman "fingerprint."
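That fingerprint is conventionally reported as a Raman shift in wavenumbers (cm⁻¹): the difference between the reciprocal wavelengths of the laser and the scattered light. A minimal sketch of the conversion (the 532 nm laser and 573 nm scattered wavelength below are illustrative values, not the ones used in the study):

```python
def raman_shift_cm1(laser_nm, scattered_nm):
    """Raman shift in wavenumbers: 1/lambda_laser - 1/lambda_scattered.

    Wavelengths are given in nm; the factor 1e7 converts nm^-1 to cm^-1.
    """
    return 1e7 / laser_nm - 1e7 / scattered_nm

# A 532 nm laser whose light is scattered at 573 nm corresponds to a
# Raman shift of about 1345 cm^-1.
print(raman_shift_cm1(532.0, 573.0))
```

Because the shift depends only on the molecule's vibrational modes, not on the laser wavelength itself, the same fingerprint can be matched against reference spectra regardless of the light source used.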
Creating an affordable, pocket-sized spectrometer
To build the spectrometer, the researchers connected a smartphone to a laser and a series of plastic lenses. The smartphone camera was placed facing a transmission diffraction grating, which splits incoming light into its constituent wavelengths and colors. After a laser is fired into a sample, the scattered light is diffracted through this grating, and the smartphone camera analyzes the light on the other side.
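The grating's behavior follows the standard grating equation, d·sin(θ) = m·λ, which determines the angle at which each wavelength emerges. A minimal sketch at normal incidence (the 1,200 lines/mm groove density is an illustrative value; the paper's grating specification is not given here):

```python
import math

def diffraction_angle_deg(wavelength_nm, lines_per_mm, order=1):
    """Diffraction angle (degrees) from the grating equation d*sin(theta) = m*lambda."""
    d_nm = 1e6 / lines_per_mm          # groove spacing in nm
    s = order * wavelength_nm / d_nm   # sin(theta)
    if abs(s) > 1.0:
        raise ValueError("this diffraction order does not exist at this wavelength")
    return math.degrees(math.asin(s))

# With a 1,200 lines/mm grating, 532 nm light diffracts near 40 degrees in
# first order, and slightly longer (Raman-shifted) wavelengths emerge at
# slightly larger angles -- which is how the camera separates them spatially.
print(diffraction_angle_deg(532.0, 1200.0))
```

This angular spread is what lets an ordinary camera sensor act as the detector: each pixel column effectively samples a different wavelength.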
Schematic diagram of the designed system. Credit: Dhankhar et al.
To test the spectrometer, the researchers analyzed a range of sample materials, including carrots and bacteria. The laser used in the spectrometer emits a wavelength that's readily absorbed by the pigments in carrots and bacteria, which is why these materials were chosen.
The results showed that the smartphone spectrometer was able to correctly identify the materials, but it wasn't quite as effective as the best commercially available Raman spectrometers. The researchers noted that their system might be improved by using specific High Dynamic Range (HDR) smartphone camera applications.
Ultimately, the study highlights how improving the fundamentals of a technology, like smartphone cameras, can lead to a surprisingly wide range of useful applications.
"This inexpensive yet accurate recording pocket Raman system has the potential of being an integral part of ubiquitous cell phones that will make it possible to identify chemical impurities and pathogens, in situ within minutes," the researchers concluded.