Health care: Information tech must catch up to medical marvels
Michael Dowling, Northwell Health's CEO, believes we're entering the age of smart medicine.
- The United States health care system has much room for improvement, and big tech may be laying the foundation for those improvements.
- Technological progress in medicine is coming from two fronts: medical technology and information technology.
- As information technology develops, patients will become active participants in their health care, and value-based care may become a reality.
In his book Health Care Reboot, Michael Dowling, Northwell Health's CEO, argues that "[the United States] is constructing a solid foundation upon which the new American health care is being erected." To those steeped in news of health care's administrative bloat, under-performing primary care, and low levels of insurance coverage, such a thesis may seem bold, wishful, or downright delusional.
But Dowling does not ignore the health care system's need for improvement. Rather, he believes that contemporary trends can foster such improvement if we recognize their value. He cites advances and disruptions in areas such as consolidation, education, payment reform, and mental health to support his progressive view that "better, safer, and more accessible care" is coming.
Among those trends is big tech's move into health care, or as Dowling puts it, technology may soon move us into the age of smart medicine.
Medical tech marvels
Dowling sees big tech's stride into health care as coming from two fronts: medical technology and information technology. On the medical technology front, the technology available to doctors has accelerated at an unprecedented pace, resulting in tools and techniques that are "the stuff of Star Wars."
"Some of the most advanced technology tools ever developed in any field are in use to care for patients. Look at any modern operating room or intensive care unit, and the technology to treat patients and keep them alive is remarkable," writes Dowling.
To pick one of many examples, Northwell Health's Cohen Children's Medical Center was the first pediatric program on Long Island to institute ROSA, a "robotic operating surgical assistant." Before ROSA, children suffering from epilepsy had to undergo a full craniotomy to target and monitor areas of seizure activity. With ROSA's assistance, surgeons can get the same results through a minimally invasive procedure, reducing the risk of infection and strain on the patient.
Even technology not designed for therapy has been co-opted to play small, yet supportive, roles in quotidian treatment. A study out of Children's Hospital Los Angeles found that virtual reality can help reduce a child's anxiety and stress during basic procedures such as a blood draw.
Information tech plays catch-up
Dowling characterizes the information technology front as "less impressive," pointing to the well-known difficulties of onboarding electronic health records. Beyond concerns about cybersecurity and interoperability, such systems have caused widespread burnout and dissatisfaction among practitioners because they are time-consuming and impose complicated workflows.
But progress is being made. Apple recently added a Health Records feature to the iPhone's Health app, giving patients from 39 health systems access to their medical records.
"This existing new reality is that a fat file, that until recently was stored away unavailable to the patient, now sits in its entirety on the patient's phone," writes Dowling. "For patients with chronic conditions who make frequent use of medical services, this leap forward enables them, whether a mile from their doctor's office or a thousand miles, to track and share with their doctor essential data on blood pressure, heart rate, glucose levels, and scores of other important clinical markers."
But to succeed, this information must be gatherable, accessible, and understandable to any patient. Big tech will need to streamline such systems for maximum user-friendliness, all while keeping operations on a device with which patients and practitioners are intimately familiar.
Those devices are the smartphone and the tablet. Seventy-seven percent of Americans own smartphones. Among Americans over 65 years of age — the demographic most in need of such advancements — 46 percent own a smartphone, a number that is likely to climb.
Big tech's vision of integrating information technology with health care is some ways off. Much experimenting must be done, and big tech needs to better collaborate with traditional health care stakeholders. Even so, these incipient steps may lead to a framework where practitioners can gather more data more quickly and with greater ease, while patients become partners, not passive recipients, of their health care team.
Accelerating value-based care
In the United States, value-based health care exists today as a should-we, could-we debate topic. Big tech's entry into the field could push value-based care closer to practice. As noted on the health care blog Tech Prescribed, integrating improved data acquisition with AI-powered platforms could turn value-based care into a manageable venture.
"As a result, we will see the move to VBC accelerate even further as more firms turn a profit through this business model. Good news for docs — this will make you the primary customer for provider technology and really improve your user experience as a side effect," writes Colton Ortolf of Tech Prescribed.
Northwell Health created its Pharma Ventures entity both to collaborate with big pharma and to promote value-based care. Pharma Ventures was designed "to link drug prices to drug performance" and "to serve as a super-site for clinical trials." The goal is to drive down costs while simultaneously improving patient experience. Such an initiative is only possible thanks to Northwell's integrated systems and system-wide electronic health records.
Entering the smart age of medicine
For Dowling, health care in the United States is laying an important foundation for the medicine of tomorrow. We're moving away from the view that health care is something the patient receives at a medical facility. Soon, health care will see the patient take an active role alongside a team of health care providers.
"The new American medicine is proactive and has physicians working in teams with nurses and other caregivers to reach out to patients and guide them along a pathway to health and wellbeing," writes Dowling.
By creating new machines, proliferating information, and making that information easier to obtain, big tech's dive into health care will be a fundamental element in this upcoming paradigm shift.
A clever new design introduces a way to image the vast ocean floor.
- Neither light- nor sound-based imaging devices can penetrate the deep ocean from above.
- Stanford scientists have invented a new system that incorporates both light and sound to overcome the challenge of mapping the ocean floor.
- Deployed from a drone or helicopter, the system may finally let us see what lies beneath our planet's seas.
The challenge
3D image of a submerged object reconstructed using reflected ultrasound waves in tests of the new system
Credit: Aidan Fitzpatrick/Stanford University<p>"Airborne and spaceborne radar and laser-based, or LIDAR, systems have been able to map Earth's landscapes for decades. Radar signals are even able to penetrate cloud coverage and canopy coverage. However, seawater is much too absorptive for imaging into the water," says lead study author, electrical engineer <a href="https://web.stanford.edu/~arbabian/Home/Welcome.html" target="_blank">Amin Arbabian</a> of Stanford's School of Engineering, in <a href="https://news.stanford.edu/2020/11/30/combining-light-sound-see-underwater/" target="_blank"><em>Stanford News</em></a>.</p><p>One of the most reliable ways to map a terrain is by using sonar, which deduces the features of a surface by analyzing sound waves that bounce off it. However, if one were to project sound waves from above into the sea, more than 99.9% of those sound waves would be lost as they passed into water. If they managed to reach the seabed and bounce upward out of the water, another 99.9% would be lost.</p><p>Electromagnetic devices — using light, microwaves, or radar signals — are also fairly useless for ocean-floor mapping from above. Says first author <a href="https://profiles.stanford.edu/aidan-fitzpatrick" target="_blank">Aidan Fitzpatrick</a>, "Light also loses some energy from reflection, but the bulk of the energy loss is due to absorption by the water." (Ever try to get phone service underwater? Not gonna happen.)</p>
PASS
The PASS system
Credit: Aidan Fitzpatrick/Stanford University<p>The solution presented in the study is the Photoacoustic Airborne Sonar System (PASS). Its core idea is to combine sound and light to get the job done. "If we can use light in the air, where light travels well, and sound in the water, where sound travels well, we can get the best of both worlds," says Fitzpatrick.</p><p>An imaging session begins with a laser fired down to the water from a craft above the area to be mapped. When it hits the ocean surface, it's absorbed and converted into fresh sound waves that travel down to the target. When these bounce back up to the surface and out into the air and back to PASS technicians, they do still suffer a loss. However, using light on the way in and sound only on the way out cuts that loss in half.</p><p>This means that the PASS transducers that ultimately retrieve the sound waves have plenty to work with. "We have developed a system," says Arbabian, "that is sensitive enough to compensate for a loss of this magnitude and still allow for signal detection and imaging." From there, software assembles a 3D image of the submerged target from the acoustic signals.</p><p>PASS was initially designed to help scientists image underground plant roots.</p>
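The "cuts that loss in half" claim is easiest to see in decibels, where losses at each crossing add rather than multiply. Here is a minimal sketch of the arithmetic, assuming the article's figure of roughly 99.9% energy loss per air-water crossing (applying that same figure to both crossings, and the decibel framing itself, are this sketch's assumptions, not numbers from the study):

```python
import math

# Assumed figure from the article: more than 99.9% of acoustic energy
# is lost at each air-water crossing, i.e. only ~0.1% is transmitted.
transmitted_per_crossing = 0.001

def loss_db(transmitted_fraction):
    """Express an energy loss as decibels (larger = more loss)."""
    return -10 * math.log10(transmitted_fraction)

# Conventional airborne sonar: sound crosses the surface going in AND
# coming out, so the transmitted fractions multiply.
round_trip = transmitted_per_crossing ** 2   # 1e-6 of the energy survives

# PASS: the laser carries the signal in, so only the outbound sound
# wave pays the crossing penalty.
one_way = transmitted_per_crossing           # 1e-3 survives

print(loss_db(round_trip))  # 60.0 dB
print(loss_db(one_way))     # 30.0 dB -> half the loss, in dB terms
```

Under these assumptions, halving the loss in decibel terms means the returning signal is a thousand times stronger in linear terms than a pure sound round trip, which is what leaves the PASS transducers "plenty to work with."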
Next steps<p>Although its developers are confident that PASS will be able to see down thousands of meters into the ocean, so far it's only been tested in an "ocean" about the size of a fish tank. Tiny and obviously free of real-world ocean turbulence.</p><p>Fitzpatrick reports, "Current experiments use static water but we are currently working toward dealing with water waves. This is a challenging, but we think feasible, problem."</p><p>Scaling up, says Fitzpatrick, "Our vision for this technology is on-board a helicopter or drone. We expect the system to be able to fly at tens of meters above the water."</p>
Philosophers have been asking the question for hundreds of years. Now neuroscientists are joining the quest to find out.
- The debate over whether or not humans have free will is centuries old and ongoing. While studies have confirmed that our brains perform many tasks without conscious effort, there remains the question of how much we control and when it matters.
- According to Dr. Uri Maoz, the answer depends on how you define free will, and on learning more about how we make decisions versus when it is acceptable for our brains to control our actions and movements subconsciously.
- "If we understand the interplay between conscious and unconscious," says Maoz, "it might help us realize what we can control and what we can't."
Puerto Rico's iconic telescope facilitated important scientific discoveries while inspiring young scientists and the public imagination.
- The Arecibo Observatory's main telescope collapsed on Tuesday morning.
- Although officials had been planning to demolish the telescope, the accident marked an unceremonious end to a beloved astronomical tool.
- The Arecibo radio telescope has facilitated many discoveries in astronomy, including the mapping of near-Earth asteroids and the detection of exoplanets.
<p>In 1963, the concave dish was built into a natural sinkhole on the northern coast of Puerto Rico. The location was <a href="https://www.space.com/20984-arecibo-observatory.html" target="_blank">picked because it was near the equator,</a> providing scientists a clear view of planets passing overhead, and also of the ionosphere, which is the uniquely reactive layer of Earth's upper atmosphere where the northern lights form.</p><p>Since its construction, scientists have used the Arecibo telescope to map near-Earth asteroids, detect gravitational waves, study pulsars, detect exoplanets and <a href="https://www.seti.org/goodbye-arecibo" target="_blank">search for alien civilizations</a>, among other projects. Here's a brief look at some of the discoveries and accomplishments made using the Arecibo telescope:</p><ul><li>1964: Astronomer <a href="https://en.wikipedia.org/wiki/Gordon_Pettengill" target="_blank" rel="noopener noreferrer">Gordon Pettengill</a> discovers that Mercury's rotation period is 59 days, significantly shorter than the previous prediction of 88 days.</li><li>1974: Physicists Russell Alan Hulse and Joseph Hooton Taylor Jr. discover the first binary pulsar, for which they won a Nobel Prize in Physics.</li><li>1974: Scientists use the telescope to transmit the "Arecibo message" to <a href="https://en.wikipedia.org/wiki/Great_Globular_Cluster_in_Hercules" target="_blank" rel="noopener noreferrer">globular star cluster M13</a>. The message, when translated into image form, contains basic information about humanity and human knowledge: the numbers one to 10, a map of our solar system, an illustration of a human being, and the atomic numbers of certain elements.</li><li>1989: Scientists use the telescope to image an asteroid for the first time.</li><li>1992: Astronomers Alex Wolszczan and Dale Frail become the first to discover exoplanets.</li></ul>