AI bests humans in medical diagnosis
Artificial intelligence has proven to be as good as, and in some cases better than, humans at making certain diagnoses.
- A review of studies found that AI is at least equal to human healthcare workers in making diagnoses.
- The conclusion applies to cases where AI looked at images.
- More real-world tests are necessary to further develop artificial intelligence in medicine.
A review found that artificial intelligence is at least as good as, and sometimes slightly better than, human doctors at making medical diagnoses from images.
Researchers carried out a thorough analysis of studies published since 2012. The AI systems in those studies used deep learning to classify medical images according to disease-relevant features.
The study's co-author, Professor Alastair Denniston of University Hospitals Birmingham, is both optimistic and cautious about what AI can do. As the Guardian reports, that sentiment is shared by the study's lead author, Dr. Xiaoxuan Liu, of the same institution.
"There are a lot of headlines about AI outperforming humans, but our message is that it can at best be equivalent," said Liu.
To reach these conclusions, the scientists whittled an initial pool of roughly 20,000 studies, many of poor quality, down to the 14 with the highest-quality data. In these studies, the deep learning systems classified images from datasets separate from the ones used to train them, and the same images were then shown to human experts.
The researchers found that the AI systems correctly detected disease 87% of the time, compared with 86% for healthcare professionals, and correctly ruled out disease 93% of the time, versus 91% for the human experts. One caveat: the healthcare workers in these studies were not given the additional patient information they would have had in real-world settings.
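Those two figures are, in statistical terms, sensitivity and specificity. A minimal sketch of how they are computed, using hypothetical counts chosen only to match the percentages above:

```python
# Sensitivity and specificity from a confusion matrix.
# The counts below are hypothetical, chosen only to illustrate
# the 87% / 93% figures reported in the review.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Share of actual disease cases correctly flagged."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Share of healthy cases correctly cleared."""
    return true_neg / (true_neg + false_pos)

# Hypothetical results for 100 diseased and 100 healthy images:
print(sensitivity(87, 13))  # 0.87 -> detects disease 87% of the time
print(specificity(93, 7))   # 0.93 -> rules out disease 93% of the time
```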
Using AI in medicine has a number of potential advantages, from freeing up human doctors to spend more time with patients to customizing treatments and clearing backlogs of necessary diagnoses. AI can also be particularly helpful in settings that lack enough human experts to do this work.

The next step in assessing the usefulness of AI in medicine would be clinical trials that test whether patient outcomes actually improve when deep learning systems are used.
The father of all giant sea bugs was recently discovered off the coast of Java.
- A new species of isopod with a resemblance to a certain Sith lord was just discovered.
- It is the first known giant isopod from the Indian Ocean.
- The finding extends the list of giant isopods even further.
Humanity knows surprisingly little about the ocean depths. An often-repeated bit of evidence for this is the fact that humanity has done a better job mapping the surface of Mars than the bottom of the sea. The creatures we find lurking in the watery abyss often surprise even the most dedicated researchers with their unique features and bizarre behavior.
A recent expedition off the coast of Java discovered a new isopod species remarkable for its size and resemblance to Darth Vader.
The ocean depths are home to many creatures that some consider to be unnatural.
According to LiveScience, the Bathynomus genus is sometimes referred to as the "Darth Vader of the Seas" because the crustaceans are shaped like the character's menacing helmet. Named Bathynomus raksasa ("raksasa" meaning "giant" in Indonesian), this cockroach-like creature can grow to over 30 cm (12 inches). It is one of several known species of giant ocean-going isopod. Like the other members of its order, it has compound eyes, seven body segments, two pairs of antennae, and four sets of jaws.
The incredible size of this species is likely a result of deep-sea gigantism. This is the tendency for creatures that inhabit deeper parts of the ocean to be much larger than closely related species that live in shallower waters. B. raksasa appears to make its home between 950 and 1,260 meters (3,117 and 4,134 ft) below sea level.
Perhaps fittingly for so creepy-looking a creature, that range lies in the lower reaches of what is commonly called the twilight zone, named for the scarcity of light at such depths.
B. raksasa is far from the only giant isopod. Other ocean-going isopod species can reach 50 cm (20 inches) and look just as nightmarish. These are the outliers, though; most isopods stay at much more modest sizes.
From an Instagram post by LKCNHM (@lkcnhm): "During an expedition, there are some animals you find unexpectedly, while there are others you hope to find. One of the animals we hoped to find was a deep-sea cockroach affectionately known as the Darth Vader isopod. The staff on our expedition team could not contain their excitement when they finally saw one, holding it triumphantly in the air!" #SJADES2018
What benefit does this find have for science? And is it as evil as it looks?
The discovery of a new species is always a cause for celebration in zoology. That this is the discovery of an animal that inhabits the deeps of the sea, one of the least explored areas humans can get to, is the icing on the cake.
Helen Wong of the National University of Singapore, who co-authored the species' description, explained the importance of the discovery:
"The identification of this new species is an indication of just how little we know about the oceans. There is certainly more for us to explore in terms of biodiversity in the deep sea of our region."
The animal's visual similarity to Darth Vader is a result of its compound eyes and the curious shape of its head. However, given the location of its discovery, the bottom of the remote seas, it may be associated with all manner of horrifically evil Elder Things and Great Old Ones.
Virtual reality continues to blur the line between the physical and the digital, and it will change our lives forever.
- Extended reality technologies — which include virtual reality, augmented reality, and mixed reality — have long captivated the public imagination, but have yet to become mainstream.
- Extended reality technologies are quickly becoming better and cheaper, suggesting they may soon become part of daily life.
- Over the long term, these technologies may usher in the "mirror world" — a digital layer "map" that lies atop the physical world and enables us to interact with internet-based technologies more seamlessly than ever.
Video: What will the Disneyland of the future look like? | Hard Reset by Freethink (www.youtube.com)
Immersive technology aims to overlay a digital layer of experience atop everyday reality, changing how we interact with everything from medicine to entertainment. What that future will look like is anyone's guess. But immersive technology is certainly on the rise.
The extended reality (XR) industry — which includes virtual reality (VR), augmented reality (AR), and mixed reality (MR), which involves both virtual and physical spaces — is projected to grow from $43 billion in 2020 to $333 billion by 2025, according to a recent market forecast. Much of that growth will be driven by consumer technologies, such as VR video games, which are projected to be worth more than $90 billion by 2027, and AR glasses, which Apple and Facebook are currently developing.
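For context, those forecast figures imply a compound annual growth rate of roughly 50 percent. A quick back-of-the-envelope check, taking the 2020 and 2025 numbers at face value:

```python
# Implied compound annual growth rate (CAGR) for the XR forecast:
# $43 billion in 2020 growing to $333 billion by 2025 (5 years).

def cagr(start: float, end: float, years: int) -> float:
    """Annualized growth rate that turns `start` into `end` over `years`."""
    return (end / start) ** (1 / years) - 1

rate = cagr(43e9, 333e9, 5)
print(f"{rate:.1%}")  # about 50.6% per year
```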
But other sectors are adopting immersive technologies, too. A 2020 survey found that 91 percent of businesses are currently using some form of XR or plan to use it in the future. The range of XR applications seems endless: Boeing technicians use AR when installing wiring in airplanes. H&R Block service representatives use VR to boost their on-the-phone soft skills. And KFC developed an escape-room VR game to train employees how to make fried chicken.
XR applications not only train and entertain; they also have the unique ability to transform how people perceive familiar spaces. Take theme parks, which are using immersive technology to add a new experiential layer to their existing rides, such as roller coasters where riders wear VR headsets. Some parks, like China's $1.5 billion VR Star Theme Park, don't have physical rides at all.
One of the most novel innovations in theme parks is Disney's Star Wars: Galaxy's Edge attraction, which has multiple versions: physical locations in California and Florida and a near-identical virtual replica within the "Tales from the Galaxy's Edge" VR game.
"That's really the first instance of anything like this that's ever been done, where you can get a deeper dive, and a somewhat different view, of the same location by exploring its digital counterpart," game designer Michael Libby told Freethink.
Libby now runs Worldbuildr, a company that uses game-engine software to prototype theme park attractions before construction begins. The prototypes provide a real-time VR preview of everything riders will experience during the ride. That raises the question: with VR technology constantly improving, will there come a point when there's no need for the physical ride at all?
Maybe. But probably not anytime soon.
"I think we're more than a few minutes from the future of VR," Sony Interactive Entertainment CEO Jim Ryan told the Washington Post in 2020. "Will it be this year? No. Will it be next year? No. But will it come at some stage? We believe that."
It could take years for XR to become mainstream. But that growth period is likely to be a brief chapter in the long history of XR technologies.
The evolution of immersive technology
The first crude example of XR technology came in 1838 when the English scientist Charles Wheatstone invented the stereoscope, a device through which people could view two images of the same scene but portrayed at slightly different angles, creating the illusion of depth and solidity. Yet it took another century before anything resembling our modern conception of immersive technology struck the popular imagination.
In 1935, the science fiction writer Stanley G. Weinbaum wrote a short story called "Pygmalion's Spectacles," which describes a pair of goggles that enables one to perceive "a movie that gives one sight and sound [...] taste, smell, and touch. [...] You are in the story, you speak to the shadows (characters) and they reply, and instead of being on a screen, the story is all about you, and you are in it."
The 1950s and 1960s saw some bold but crude forays into XR, such as the Sensorama, an "experience theater" that featured a movie screen complemented by fan-generated wind, a vibrating chair, and a machine that produced scents. There was also the Telesphere Mask, which packed most of the same features into a headset whose design presciently resembled modern models.
The first functional AR device came in 1968 with Ivan Sutherland's The Sword of Damocles, a heavy headset through which viewers could see basic shapes and structures overlaid on the room around them. The 1980s brought interactive VR systems featuring goggles and gloves, like NASA's Virtual Interface Environment Workstation (VIEW), which let astronauts control robots from a distance using hand and finger movements.
Video: 1980's Virtual Reality - NASA Video (youtu.be)
That same technology led to new XR devices in the gaming industry, like Nintendo's Power Glove and Virtual Boy. But despite a ton of hype over XR in the 1980s and 1990s, these flashy products failed to sell. The technology was too clunky and costly.
In 2012, the gaming industry saw a more successful run at immersive technology when Oculus VR raised $2.5 million on Kickstarter to develop a VR headset. Unlike previous headsets, the Oculus model offered a 90-degree field of view, was priced reasonably, and relied on a personal computer for processing power.
In 2014, Facebook acquired Oculus for $2 billion, and the following years brought a wave of new VR products from companies like Sony, Valve, and HTC. The most recent market evolution has been toward standalone wireless VR headsets that don't require a computer, like the Oculus Quest 2, which last year received five times as many preorders as its predecessor did in 2019.
Also notable about the Oculus Quest 2 is its price: $299, which is $100 cheaper than the first version. For years, market experts have said cost is the primary barrier to VR adoption; the Valve Index headset, for example, starts at $999, and that price doesn't include games, which can cost $60 apiece. But as hardware gets better and prices fall, immersive technology might become a staple in homes and industry.
Advancing XR technologies
Over the short term, it's unclear whether the recent wave of interest in XR technologies is just hype. But there's reason to think it's not. In addition to surging sales of VR devices and games, particularly amid the COVID-19 pandemic, Facebook's heavy investments in XR suggest there's plenty of room for these technologies to grow.
A report from The Information published in March found that roughly 20 percent of Facebook personnel work in the company's AR/VR division called Facebook Reality Labs, which is "developing all the technologies needed to enable breakthrough AR glasses and VR headsets, including optics and displays, computer vision, audio, graphics, brain-computer interface, haptic interaction."
What would "breakthroughs" in XR technologies look like? It's unclear exactly what Facebook has in mind, but there are some well-known points of friction that the industry is working to overcome. For example, locomotion is a longstanding problem in VR games. Sure, some advanced systems — that is, ones that cost far more than $300 — include treadmill-like devices on which you move through the virtual world by walking, running, or tilting your center of gravity.
But for consumer-grade devices, the options are currently limited to using a joystick, walking in place, leaning forward, or pointing and teleporting. (There are also electronic boots that keep you in place as you walk, for what it's worth.) These solutions usually work fine, but they produce an inherent sensory contradiction: your avatar moves through the virtual world while your body remains still. The locomotion problem is why most VR games don't require swift character movements and why designers often compensate by having the player sit in a cockpit or otherwise limiting the game environment to a confined space.
For AR, one key hurdle is fine-tuning the technology to ensure that the virtual content you see through, say, a pair of smart glasses is optically consistent with physical objects and spaces. Currently, AR often appears clunky, unrooted from the real world. Incorporating LiDAR (Light Detection and Ranging) into AR devices may do the trick. The futurist Bernard Marr elaborated on his blog:
"[LIDAR] is essentially used to create a 3D map of surroundings, which can seriously boost a device's AR capabilities. It can provide a sense of depth to AR creations — instead of them looking like a flat graphic. It also allows for occlusion, which is where any real physical object located in front of the AR object should, obviously, block the view of it — for example, people's legs blocking out a Pokémon GO character on the street."
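The occlusion Marr describes boils down to a per-pixel depth test: draw the virtual object only where it is closer than the real surface the sensor measured along the same line of sight. A minimal sketch of that decision, with illustrative names and distances rather than any real AR API:

```python
# Minimal sketch of depth-based occlusion, the idea Marr describes.
# The function name and the flat depth values are illustrative
# assumptions, not part of any real AR framework.

def visible(virtual_depth_m: float, lidar_depth_m: float) -> bool:
    """True if the virtual object is nearer than the real surface
    the LiDAR measured along the same ray, so it should be drawn."""
    return virtual_depth_m < lidar_depth_m

# A virtual character placed 4 m away, with a person's leg at 2.5 m:
print(visible(4.0, 2.5))  # False -> the leg occludes the character
print(visible(4.0, 6.0))  # True  -> open street behind, so draw it
```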
Another broad technological upgrade to XR technologies, especially AR, is likely to be 5G, which will boost the transmission rate of wireless data over networks.
"The adoption of 5G will make a difference in terms of new types of content being able to be viewed by more people," Irena Cronin, CEO of Infinite Retina, a research and advisory firm that helps companies implement spatial computing technologies, said in a 2020 XR survey report. "5G is going to make a difference for more sophisticated, heavy content being viewed live when needed by businesses."
Beyond technological hurdles, the AR sector still has to answer some more abstract questions on the consumer side: From a comfort and style perspective, do people really want to walk around wearing smart glasses or other wearable AR tech? (The failure of Google Glass suggests people were not quite ready to in 2014.) What is the value proposition of AR for consumers? How will companies handle the ethical dilemmas associated with AR technology, such as data privacy, motion sickness, and the potential safety hazards created by tinkering with how users see, say, a busy intersection?
Despite the hurdles, it seems likely that the XR industry will steadily — if clumsily — continue to improve these technologies, weaving them into more aspects of our personal and professional lives. The proof is in your pocket: Smartphones can already run AR applications that let you see prehistoric creatures, true-to-size IKEA furniture in your living room, navigation directions overlaid on real streets, paintings at the Vincent Van Gogh exhibit, and, of course, Pokémon. So, what's next?
The future of immersive experiences
When COVID-19 struck, it not only brought a surge in sales of XR devices and applications but also made a case for rethinking how workers interact in physical spaces. Zoom calls quickly became the norm for office jobs. But for some, prolonged video calls became annoying and exhausting; the term "Zoom fatigue" caught on and was even researched in a 2021 study published in Technology, Mind, and Behavior.
The VR company Spatial offered an alternative to Zoom. Instead of talking to 2D images of coworkers on a screen, Spatial virtually recreates office environments where workers — more specifically, their avatars — can talk and interact. The experience isn't perfect: your avatar, which is created by uploading a photo of yourself, looks a bit awkward, as do the body movements. But the experience is good enough to challenge the idea that working in a physical office is worth the trouble.
Cyberspace illustration (tampatra via Adobe Stock)
That's probably the most relatable example of an immersive environment people may soon encounter. But the future is wide open. Immersive environments may also be used on a wide scale to:
- Conduct job interviews, potentially with gender- and race-neutral avatars to eliminate possibilities of discriminatory hiring practices
- Ease chronic pain
- Help people overcome phobias through exposure therapy
- Train surgeons to conduct complex procedures, which may be especially beneficial to doctors in nations with weaker healthcare systems
- Prepare inmates for release into society
- Educate students, particularly in ways that cut down on distractions
- Enable people to go on virtual dates
But the biggest transformation XR technologies are likely to bring us is a high-fidelity connection to the "mirror world." The mirror world is essentially a 1:1 digital map of our world, created by fusing data collected through satellite imagery, cameras, and other modeling techniques. It already exists in crude form: if you needed directions on the street, you could open Google Maps AR, point your camera in a given direction, and your screen would show you that Main Street is 223 feet ahead. But the mirror world will likely become far more sophisticated than that.
Through the looking glass of AR devices, the outside world could be transformed in any number of ways. Maybe you are hiking through the woods and you notice a rare flower; you could leave a digital note suspended in the air so the next passerby can check it out. Maybe you encounter something like an Amazon Echo in public and, instead of it looking like a cylindrical tube, it appears as an avatar. You could be touring Dresden in Germany and choose to see a flashback representation of how the city looked after the bombings of WWII. You might also run into your friends — in digital avatar form — at the local bar.
Of course, this future poses no shortage of troubling aspects, ranging from privacy concerns and pollution from virtual advertisements to the currently impossible-to-answer psychological consequences of creating such an immersive environment. But despite all the uncertainties, the foundations of the mirror world are being built today.
As for what may lie beyond it? Ivan Sutherland, the creator of The Sword of Damocles, once described his idea of an "ultimate" immersive display:
"...a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked."
Hospitals often deal with the aftermath of gun violence, but they can play a key role in preventing it.
- Approximately 41,000 people are killed each year due to gun violence. That's more lives lost to guns than to car accidents. So why do we devote more attention (and money) to car safety than to gun safety?
- As Northwell Health CEO Michael Dowling points out, the deaths are not the whole story. The physical, emotional, and psychological trauma reverberates through communities and the public at large. "This is just not about guns," says Dowling. "This is a serious public health issue, and we've got to look at it that way."
- Hospitals often deal with the aftermath of gun violence, but they can play a key role in preventing it. Medical staff are trained to assess health risk factors. Dowling argues that a similar approach is needed for guns. "We have to be much more holistic in our approach."