"Courage" Is Just Part of Why Apple’s Ditching the Old Headphone Jack

Apple’s removal of the iPhone’s 3.5 mm headphone jack is causing an uproar as part of technology’s inexorable march forward.

Tech fans are in an uproar over Apple’s decision to forgo a standard 3.5 mm headphone jack on its just-announced iPhone 7. Really, it’s all much ado about nothing in practical terms. The iPhone 7 will still ship with earbuds — using Apple’s Lightning connector — as well as an adapter into which you can plug your current 3.5 mm pair. (Should you lose the adapter, it’s just $9 to get a new one.) The iPhone 7’s audio system is also Bluetooth-compatible, so you can use any Bluetooth earbuds or headphones you like. You don’t have to buy Apple’s new $159 wireless AirPods. Apple isn’t even the first company to get rid of the jack: That honor goes to Android-phone manufacturers Lenovo (the Moto Z and Moto Z Force) and LeEco.

Moto Z and Moto Z Force (LENOVO)

The switch to the Lightning connector has a downside: It’s proprietary. This narrows the selection of earbuds consumers can plug directly into their iPhones without an adapter, and it seems unlikely that manufacturers of audio products will move to Apple’s proprietary standard. For one thing, the Android phones mentioned above send audio out through a completely different connector, USB Type-C.

But it also has an upside for audio lovers. It marks a move for earbuds from the analog domain to the digital, where there’s no loss of sound quality before the audio reaches the digital-to-analog converter in your earbuds. This should theoretically mean better sound, and since it’s all digital, there are interesting signal-processing possibilities for audio manufacturers who do want to make the move.

Apple’s marketing chief Phil Schiller justified the removal of the jack during the phone’s announcement: “The reason to move on: courage. The courage to move on and do something new that betters all of us.” Lots of people find this pretty funny.

Courage. (FORBES)

The fact is Apple’s been down this road before, often divining what customers wanted before they did. When the first iMacs came out without floppy drives, it was a big deal. Do you even remember floppy drives? Likewise, Apple forced USB connectors down our throats, but remember that prior to USB, we had to shut down our computers just to plug in and unplug peripherals. Apple was right.

What Schiller is hinting at is one of Apple’s core beliefs: that the company’s dreamers and engineers have the time, resources, and imagination to look around the corner and see the future in a way that consumers can’t. And it’s that vision, rather than a list of current wants and desires, that points the way toward breakthrough products that people will love.


Many people credit Steve Jobs with having been the sole source of the company’s vision, but, brilliant as he clearly was, that was never really so. The basic idea for the Macintosh came from Jef Raskin, the iMac came from Jonathan Ive’s design group, and Kane Kramer made the first iPod-like music player. Apple’s always been packed with very smart people. Jobs’ genius seems to have been his impatience with complexity, which continually drove Apple’s teams to distill products to their simplest, easiest-to-use form just to satisfy him. Apple’s secret sauce has always been its insistence that products “just work” without requiring effort on the customer’s part. It doesn’t always succeed, of course, but that insistence is the reason behind the product ecosystem others deride as a “walled garden.” Apple sees it as a way to build products that work together seamlessly.

Of course, it requires tremendous internal discipline — Jobs always said one of his best skills was saying “no” — to stay focused on everyone’s needs, and not just come up with something you and your colleagues want. The Power Mac G4 Cube, for example, was more about a fun engineering challenge than something anyone really needed, and it sold poorly. (It wasn’t wasted R&D, though, since that form factor lives on in Apple TV and other products.)

Cube, not toaster. (APPLE)

Apple succeeds most when it solves a genuine problem. The genius is in seeing the problem before everyone else does. The iPhone was a smash because people universally hated their previous phones, and here, finally, was a device to enjoy. There were other music players before the iPod, but nothing fun, and getting music was difficult — with the iPod and iTunes, those problems were solved. iMacs were computers that fit in a home. iPads were perfect for couch potatoes, salespeople, and doctors alike.

On the other hand, this is probably the reason that Apple Watch isn’t doing so well. Do people really hate their current watches? Don’t think so. Apple Watch doesn’t solve a problem most people actually have. It’s a solution in search of a problem.

The real reason Apple ditched the jack on the iPhone is real estate. SVP of Hardware Engineering Dan Riccio explained, “We've got this 50-year-old connector — just a hole filled with air — and it's just sitting there taking up space, really valuable space. It was holding us back from a number of things we wanted to put into the iPhone. It was fighting for space with camera technologies and processors and battery life.” Apple also says the jack’s removal aids in the iPhone’s water resistance. Riccio added, “And frankly, when there's a better, modern solution available, it's crazy to keep it around.”

Companies that follow their customers’ wishes will always be doing just that: following. Leading the way to the future and “putting a dent in reality,” as Jobs put it, requires vision and — don’t laugh — courage.

After death, you’re aware that you’ve died, say scientists

Some evidence attributes a certain neurological phenomenon to the near-death experience.

Credit: Petr Kratochvil. PublicDomainPictures.net.

Time of death is marked when a person goes into cardiac arrest: the cessation of the electrical impulses that drive the heartbeat. As a result, the heart locks up, and the moment it stops is considered the time of death. But does death overtake our mind immediately afterward, or does it slowly creep in?


Yale scientists restore brain function to 32 clinically dead pigs

Researchers hope the technology will further our understanding of the brain, but lawmakers may not be ready for the ethical challenges.

Still from John Stephenson's 1999 rendition of Animal Farm.
  • Researchers at the Yale School of Medicine successfully restored some functions to pig brains that had been dead for hours.
  • They hope the technology will advance our understanding of the brain, potentially developing new treatments for debilitating diseases and disorders.
  • The research raises many ethical questions and puts to the test our current understanding of death.

The image of an undead brain coming back to life is the stuff of science fiction. Not just any science fiction: specifically, B-grade sci-fi. What instantly springs to mind are the black-and-white horrors of films like Fiend Without a Face. Bad acting. Plastic monstrosities. Visible strings. And a spinal cord that, for some reason, is also a tentacle?

But like any good science fiction, it's only a matter of time before some version of it seeps into our reality. This week, Nature published the findings of researchers who managed to restore function to pig brains that were clinically dead. At least, what we once thought of as dead.

What's dead may never die, it seems

The researchers did not hail from House Greyjoy — "What is dead may never die" — but came largely from the Yale School of Medicine. They connected 32 pig brains to a system called BrainEx. BrainEx is an artificial perfusion system — that is, a system that takes over the functions normally regulated by the organ. Think a dialysis machine for the mind. The pigs had been killed four hours earlier at a U.S. Department of Agriculture slaughterhouse; their brains had been completely removed from their skulls.

BrainEx pumped an experimental solution into the brain that essentially mimics blood flow. It brought oxygen and nutrients to the tissues, giving brain cells the resources to resume many normal functions. The cells began consuming and metabolizing sugars. The brains' immune systems kicked in. Neuron samples could carry an electrical signal. Some brain cells even responded to drugs.

The researchers have managed to keep some brains alive for up to 36 hours, and they do not yet know whether BrainEx could have sustained the brains longer. "It is conceivable we are just preventing the inevitable, and the brain won't be able to recover," said Nenad Sestan, Yale neuroscientist and the lead researcher.

As a control, other brains received either a fake solution or no solution at all. None showed restored brain activity, and all deteriorated as normal.

The researchers hope the technology can enhance our ability to study the brain and its cellular functions. One of the main avenues of such studies would be brain disorders and diseases. This could point the way to new treatments for the likes of brain injuries, Alzheimer's, Huntington's, and other neurodegenerative conditions.

"This is an extraordinary and very promising breakthrough for neuroscience. It immediately offers a much better model for studying the human brain, which is extraordinarily important, given the vast amount of human suffering from diseases of the mind [and] brain," Nita Farahany, a bioethicist at the Duke University School of Law who wrote a commentary accompanying the study, told National Geographic.

An ethical gray matter

Before anyone gets an Island of Dr. Moreau vibe, it's worth noting that the brains did not show neural activity anywhere near consciousness.

The BrainEx solution contained chemicals that prevented neurons from firing. To be extra cautious, the researchers also monitored the brains for any such activity and were prepared to administer an anesthetic should they have seen signs of consciousness.

Even so, the research signals a massive debate to come regarding medical ethics and our definition of death.

Most countries define death, clinically speaking, as the irreversible loss of brain or circulatory function. This definition was already at odds with some folk- and value-centric understandings, but where do we go if it becomes possible to reverse clinical death with artificial perfusion?

"This is wild," Jonathan Moreno, a bioethicist at the University of Pennsylvania, told the New York Times. "If ever there was an issue that merited big public deliberation on the ethics of science and medicine, this is one."

One possible consequence involves organ donations. Some European countries require emergency responders to use a process that preserves organs when they cannot resuscitate a person. They continue to pump blood throughout the body, but use a "thoracic aortic occlusion balloon" to prevent that blood from reaching the brain.

The system is already controversial because it raises concerns about what caused the patient's death. But what happens when brain death becomes readily reversible? Stuart Younger, a bioethicist at Case Western Reserve University, told Nature that if BrainEx were to become widely available, it could shrink the pool of eligible donors.

"There's a potential conflict here between the interests of potential donors — who might not even be donors — and people who are waiting for organs," he said.

It will be a while before such experiments go anywhere near human subjects. A more immediate ethical question relates to how such experiments harm animal subjects.

Ethical review boards evaluate research protocols and can reject any that cause undue pain, suffering, or distress. Since dead animals feel no pain and suffer no trauma, they are typically approved as subjects. But how do such boards make a judgment regarding the suffering of a "cellularly active" brain? The distress of a partially alive brain?

The dilemma is unprecedented.

Setting new boundaries

Another science fiction tale that comes to mind here is, of course, Frankenstein. As Farahany told National Geographic: "It is definitely has [sic] a good science-fiction element to it, and it is restoring cellular function where we previously thought impossible. But to have Frankenstein, you need some degree of consciousness, some 'there' there. [The researchers] did not recover any form of consciousness in this study, and it is still unclear if we ever could. But we are one step closer to that possibility."

She's right. The researchers undertook their research for the betterment of humanity, and we may one day reap some unimaginable medical benefits from it. The ethical questions, however, remain as unsettling as the stories they remind us of.