How Do You Get Hired at Apple? Don’t Cover All Your Bases.
“Be yourself” can seem like risky advice in a competitive job market. But you know what’s riskier? Being nobody. Ken Segall explains how he became an ad man for Apple.
What’s the Big Idea?
You know that little ‘i’ that’s in everything from iPod to iCarly? Ken Segall came up with that. The “think different” campaign? Ken was a big part of that, too. He worked closely with Steve Jobs for 10 years, and as the creative director of advertising for Apple, was instrumental in defining the brand’s aesthetic of stunning simplicity.
How do you get there from ad school? In Ken’s case, by ignoring all of your professors’ good advice. Diversify, they told him. Build a portfolio that shows off your range. But after six unsuccessful months on the job market in New York, Segall decided to try a different strategy: totally geeking out. He was passionate about videotape (a new technology at the time), so he created twenty different fictional ad campaigns about videotape. Suddenly he was getting job offers from tech-focused ad agencies, including the company with the Canon camera account.
Successful salespeople say it’s easiest to sell a product you believe in. What Segall did was to refine his mission. He loved the creative challenge of advertising, but his interests didn’t end there. So why limit his professional identity to Advertising:General? By applying his talents to content he cared about, he created ads that immediately announced him as the right guy for the kind of job that was right for him.
Ken Segall, longtime creative director of advertising for Apple, on doing work you're passionate about.
What’s the Significance?
“Be yourself” can seem like risky advice in a competitive job market. But you know what’s riskier? Being nobody. Once you’ve got a grasp of the professional etiquette of your trade (e.g. don’t show up for a job interview on Wall Street in a tie-dye t-shirt – or maybe at Apple in a business suit...), the way to stand out among those stacks of resumes, and to get the job you want, is to highlight the work that you love to do.
Everybody knows these are tough times. And maybe you’ll decide to game the system a bit – to sell yourself for a job you know you’re not right for, as a stop-gap measure until something better comes along. That’s one approach. But more likely than not you’ll find it a costly detour in terms of time and energy, and one that leaves you five or six years down the road back where Ken Segall started – trying to figure out where your talents and passions really lie, and how best to express them.
There are 5 eras in the universe's lifecycle. Right now, we're in the second era.
Astronomers find these five chapters to be a handy way of conceiving the universe's incredibly long lifespan.
Image based on logarithmic maps of the Universe put together by Princeton University researchers, and images produced by NASA based on observations made by their telescopes and roving spacecraft
- We're in the middle, or thereabouts, of the universe's Stelliferous era.
- If you think there's a lot going on out there now, the first era's drama makes things these days look pretty calm.
- Scientists attempt to understand the past and present by bringing together the last couple of centuries' major schools of thought.
If you're fortunate enough to get yourself beneath a clear sky in a dark place on a moonless night, a gorgeous space-scape of stars awaits. If you have binoculars and point them upward, you're treated to a mind-bogglingly dense backdrop of countless specks of light absolutely everywhere, stacked atop each other, burrowing outward and backward through space and time. Such is the universe of the cosmological era in which we live. It's called the Stelliferous era, and there are four others.
The 5 eras of the universe
There are many ways to consider and discuss the past, present, and future of the universe, but one in particular has caught the fancy of many astronomers. First published in 1999 in their book The Five Ages of the Universe: Inside the Physics of Eternity, Fred Adams and Gregory Laughlin divided the universe's life story into five eras:
- Primordial era
- Stelliferous era
- Degenerate era
- Black Hole era
- Dark era
The book was last updated according to current scientific understandings in 2013.
It's worth noting that not everyone subscribes to the book's structure. Popular astrophysics writer Ethan C. Siegel, for example, published an article on Medium last June called "We Have Already Entered The Sixth And Final Era Of Our Universe." Nonetheless, many astronomers find the quintet a useful way of discussing such an extraordinarily vast amount of time.
The Primordial era

Image source: Sagittarius Production/Shutterstock
This is where the universe begins, though what came before it and where it came from are certainly still up for discussion. The era opens with the Big Bang, about 13.8 billion years ago.
For the first little, and we mean very little, bit of time, spacetime and the laws of physics are thought not yet to have existed. That weird, unknowable interval is the Planck Epoch, which lasted for 10⁻⁴⁴ seconds, or one hundred-millionth of a trillionth of a trillionth of a trillionth of a second. Much of what we currently believe about the Planck Epoch is theoretical, based largely on a hybrid of general-relativity and quantum theories called quantum gravity. And it's all subject to revision.
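That 10⁻⁴⁴-second figure isn't arbitrary: the Planck time falls straight out of three fundamental constants. As a quick sketch (this is the standard textbook formula, not a calculation from the article itself):

```python
import math

# Planck time: t_P = sqrt(hbar * G / c^5), the timescale below which
# known physics (general relativity + quantum mechanics) breaks down.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8        # speed of light, m/s

planck_time = math.sqrt(HBAR * G / C**5)
print(f"Planck time ~ {planck_time:.3e} s")  # on the order of 1e-44 s
```

Running this gives roughly 5.4 × 10⁻⁴⁴ seconds, which is why the epoch is usually quoted as lasting about 10⁻⁴⁴ seconds.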
That having been said, within a second after the Big Bang finished Big Banging, inflation began, a sudden ballooning of the universe into 100 trillion trillion times its original size.
Within minutes, the plasma began cooling, and subatomic particles began to form and stick together. In the 20 minutes after the Big Bang, atomic nuclei started forming in the super-hot, fusion-fired universe. Cooling proceeded apace, leaving us with a universe of roughly 75% hydrogen and 25% helium, similar to the mix we see in the Sun today. Free electrons scattered photons constantly, leaving the universe opaque.
About 380,000 years after the Big Bang, the universe had cooled enough that the first stable atoms capable of surviving began forming. With electrons thus occupied in atoms, photons were released as the background glow that astronomers detect today as cosmic background radiation.
Inflation is inferred from the remarkable overall consistency astronomers measure in the cosmic background radiation. Astronomer Phil Plait suggests that inflation was like pulling on a bedsheet, suddenly pulling the universe's energy smooth. The smaller irregularities that survived eventually enlarged, pooling in denser areas of energy that served as seeds for star formation: their gravity pulled in dark matter and ordinary matter that eventually coalesced into the first stars.
The Stelliferous era

Image source: Casey Horner/unsplash
This is the era we know: the age of stars, the active period in which most of the matter in the universe takes the form of stars and galaxies.
A star is formed when a gas pocket becomes denser and denser until it, along with nearby matter, collapses in on itself, producing enough heat to trigger nuclear fusion in its core, the source of most of the universe's energy now. The first stars were immense, eventually exploding as supernovas and seeding the formation of many more, smaller stars. These coalesced, thanks to gravity, into galaxies.
One axiom of the Stelliferous era is that the bigger the star, the more quickly it burns through its energy, and then dies, typically in just a couple of million years. Smaller stars that consume energy more slowly stay active longer. In any event, stars — and galaxies — are coming and going all the time in this era, burning out and colliding.
Scientists predict that our Milky Way galaxy, for example, will crash into and combine with the neighboring Andromeda galaxy in about 4 billion years to form a new one astronomers are calling the Milkomeda galaxy.
Our solar system may actually survive that merger, amazingly, but don't get too complacent. About a billion years later, the Sun will start running out of hydrogen and begin enlarging into its red giant phase, eventually subsuming Earth and its companions, before shrinking down to a white dwarf star.
The Degenerate era

Image source: Diego Barucco/Shutterstock/Big Think
Next up is the Degenerate era, which will begin about 1 quintillion years after the Big Bang and last until about 1 duodecillion years after it. This is the period during which the remains of the stars we see today will dominate the universe. Were we to look up — we'll assuredly be outta here long before then — we'd see a much darker sky with just a handful of dim pinpoints of light remaining: white dwarfs, brown dwarfs, and neutron stars. These "degenerate stars" are much cooler and emit far less light than what we see up there now. Occasionally, star corpses will pair off into orbital death spirals that result in a brief flash of energy as they collide, and their combined mass may become low-wattage stars that will last for a little while in cosmic-timescale terms. But mostly the skies will be bereft of light in the visible spectrum.
During this era, small brown dwarfs will wind up holding most of the available hydrogen, and black holes will grow and grow and grow, fed on stellar remains. With so little hydrogen around for the formation of new stars, the universe will grow duller and duller, colder and colder.
And then the protons, having been around since the beginning of the universe, will start dying off, dissolving matter and leaving behind a universe of subatomic particles, unclaimed radiation…and black holes.
The Black Hole era

Image source: Vadim Sadovski/Shutterstock/Big Think
For a considerable length of time, black holes will dominate the universe, pulling in what mass and energy still remain.
Eventually, though, black holes evaporate, albeit super-slowly, leaking small bits of their contents as they do. Plait estimates that a small black hole 50 times the mass of the sun would take about 10⁶⁸ years to dissipate. A massive one? A 1 followed by 92 zeros.
When a black hole finally drips to its last drop, a small pop of light occurs, letting out some of the only remaining energy in the universe. At that point, around 10⁹² years after the Big Bang, the universe will be pretty much history, containing only low-energy, very weak subatomic particles and photons.
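The vast spread between those two evaporation estimates comes from how steeply the timescale grows with mass: Hawking evaporation time scales as the cube of a black hole's mass. A rough sketch of that scaling (the prefactor below is the common textbook order-of-magnitude value, not a figure from the article, and exact constants vary by source):

```python
# Hawking evaporation time scales as the cube of mass:
#   t_evap ~ K * (M / M_sun)^3
# with K on the order of 1e67 years for a one-solar-mass black hole.
K_YEARS = 2.1e67  # rough prefactor, in years (assumed order-of-magnitude value)

def evaporation_time_years(solar_masses: float) -> float:
    """Rough Hawking evaporation time for a black hole of the given mass."""
    return K_YEARS * solar_masses**3

stellar = evaporation_time_years(50)       # stellar-mass black hole
supermassive = evaporation_time_years(1e8) # supermassive black hole
print(f"stellar: {stellar:.1e} yr, supermassive: {supermassive:.1e} yr")
```

Doubling the mass multiplies the lifetime by eight, which is why the biggest holes outlast the small ones by dozens of orders of magnitude.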
The Dark era

Image source: Big Think
We can sum this up pretty easily. Lights out. Forever.
Tonight, if it's clear, maybe you want to step outside, take a nice deep breath, and look up, grateful that we are where we are, and when we are, in spite of all the day's hardships. We've got a serious amount of temporal elbow room here, far more than we need, so not to worry, and those stars aren't going anywhere for a long, long time.
Greed and the philosophy of wealth
When does a healthy desire for wealth morph into greed? And how can we stop it?
- It's common wisdom that most things in life are best in moderation.
- Most of us agree that owning property is okay but are hard-pressed to say why and when it has gone too far.
- Greed dominates your life if the pursuit of wealth is a higher priority than charity, kindness, and solidarity with others.
The great Greek poet Hesiod wrote, "Observe due measure; moderation is best in all things." It's a wisdom that finds support across all ages, stages, and aspects of life. Drinking water is a good thing, but drinking too much is dangerous. A shot of vodka won't kill you, but a gallon probably will. Working hard is good, but burning yourself out is not. Being nice is great, but being a sycophant is creepy. Moderation in all things.
But, it's not always easy to determine where that line falls, and a great example of this concerns property and wealth.
Most of us agree that owning things, or at least having the right to own things, is good. It's okay to buy a phone, to own a car, or to have your own clothes. But equally true is that most people feel uneasy about a world that has both billionaires in vast mansions and children dying of malnutrition. Greed, avarice, envy, and venality are considered vices. To be obsessively driven for material things is still, in the main, considered either misguided or, at its worst, utterly immoral. So, when does wealth become greed?
John Locke and the philosophy of property

It's hard to pinpoint exactly when humans first called a thing "mine," but the philosophy and law of property is much easier to track. One of the biggest names to consider the issue was the 17th century English philosopher John Locke.
Locke's political philosophy is famously cited as a major influence on the U.S. Declaration of Independence but also fed heavily into the French Revolution and the Great Reform movements of Britain. His work on property is perhaps one of his most important contributions.
Although subject to a fair bit of debate — what isn't in philosophy? — it's generally accepted that Locke adopted a "fair usage" view of property. He argued that one can hold any property that meets the following criteria:
- It can be used before it spoils (e.g., we don't have huge stores of food that just rots).
- It leaves "good and enough" for everyone else (e.g., one person cannot own all the land in a country).
- The property must come from your own work and effort or what he calls "mixing your labor" with that thing (e.g., if you farm a field, the field and its produce become yours).
If we were to follow these rules, it seems hard to envisage a world of greed and inequality. Everyone can have and get what they want, so long as enough is left for everyone else to get what they want, as well.
But, there's a lot of ambiguity in these rules, and money rather changes things. Money, especially modern money in the form of digital numbers on a screen, does not spoil. And, thanks to modern banking, there is no hard limit to the amount of money there could be — a bank can, and does, literally create money each time it gives you a credit card or a loan (although, in practice, countries place regulatory limits on how much money banks can create). So, no matter how many billions someone creates, there will always be "good and enough" money for others, too.
(Of course, in practice, constantly creating huge new pools of money will lead to hyperinflation, devaluing the money for everyone. Yet, even if we were to ban all new money creation today, a Lockean could argue that there's more than enough already for a generous distribution around the world.)
So, money changes things for Locke's account. It won't spoil and there will always be at least some money for everyone else. It's even been argued that Locke, far from advocating an equal and distributive philosophy, can easily support rampant capitalist accumulation of wealth. Locke wrote that, because of money, "Now one man could have… a disproportionate and unequal possession of the earth… and fairly possess more land than he himself can use."
It's the philosophy of greed.
Too much greed
The idea that greed is an essential part of being human (or at least an animal) goes back at least to Plato and has a rich philosophical history from there. Today, it often takes the form of evolutionary psychology or genetics, exemplified by Richard Dawkins' The Selfish Gene.
One thinker who has challenged this is Peter Singer. Singer acknowledges the fact that evolution does work on a certain competitiveness, that is, the fittest will pass on their genes. But he also believes that it's wrong to associate this wholly with greed or selfishness. Cooperation and productive relationships are just as vital to survival.
Singer argues that the desire to do good, to work hard, and to succeed are admirable parts of the human condition, but when they are taken to excess, they turn into greed. That line comes when the want of more — particularly, the desire for material wealth — becomes the sole focus of a life. It's when working late or constantly looking for that promotion is prioritized over family, friends, and common human compassion.
The fact is that, in the West, most people have enough. Even poor people generally have TVs, smartphones, and automobiles. The average person in the West lives far better than royalty did for millennia. Singer asks us to get a sense of perspective. We spend more on bottled water than some families in developing countries live off for a day. We're so fixated on our current day-to-day condition, that we lose sight of how much we really have.
Greed über alles
Singer's argument helps us identify the point at which drive and success insidiously morph into greed: It's when we are loath to spend our money and devote all of our waking lives to determinedly accumulating more and more at the expense of our relationships. It's when we think of little else than increasing our experiences and material possessions. This is the point at which greed has come to dominate your life.
But it's also when greed replaces our common sense of compassion. It's when property and wealth become virtues greater than charity, kindness, and solidarity with others. It's when dollar signs and fast cars matter more than people dying in the street. It's when getting a pay raise matters more than someone else getting fired.
Nobody likes to think of themselves as greedy, but if you examine yourself closely, you will probably find some aspects of your life that are at least tainted by greed. We should all check ourselves from time to time.
This programmable fiber has memories and can sense temperature
Researchers were even able to store and read a 767-kilobit full-color short movie file in the fabric.
MIT researchers have created the first fiber with digital capabilities, able to sense, store, analyze, and infer activity after being sewn into a shirt.
Yoel Fink, who is a professor in the departments of materials science and engineering and electrical engineering and computer science, a Research Laboratory of Electronics principal investigator, and the senior author on the study, says digital fibers expand the possibilities for fabrics to uncover the context of hidden patterns in the human body that could be used for physical performance monitoring, medical inference, and early disease detection.
Or, you might someday store your wedding music in the gown you wore on the big day — more on that later.
Fink and his colleagues describe the features of the digital fiber today in Nature Communications. Until now, electronic fibers have been analog — carrying a continuous electrical signal — rather than digital, where discrete bits of information can be encoded and processed in 0s and 1s.
"This work presents the first realization of a fabric with the ability to store and process data digitally, adding a new information content dimension to textiles and allowing fabrics to be programmed literally," Fink says.
MIT PhD student Gabriel Loke and MIT postdoc Tural Khudiyev are the lead authors on the paper. Other co-authors include MIT postdoc Wei Yan; MIT undergraduates Brian Wang, Stephanie Fu, Ioannis Chatziveroglou, Syamantak Payra, Yorai Shaoul, Johnny Fung, and Itamar Chinn; John Joannopoulos, the Francis Wright Davis Chair Professor of Physics and director of the Institute for Soldier Nanotechnologies at MIT; Harrisburg University of Science and Technology master's student Pin-Wen Chou; and Rhode Island School of Design Associate Professor Anna Gitelson-Kahn. The fabric work was facilitated by Professor Anais Missakian, who holds the Pevaroff-Cohn Family Endowed Chair in Textiles at RISD.
Memory and more
The new fiber was created by placing hundreds of square silicon microscale digital chips into a preform that was then used to create a polymer fiber. By precisely controlling the polymer flow, the researchers were able to create a fiber with continuous electrical connection between the chips over a length of tens of meters.
The fiber itself is thin and flexible and can be passed through a needle, sewn into fabrics, and washed at least 10 times without breaking down. According to Loke, "When you put it into a shirt, you can't feel it at all. You wouldn't know it was there."
Making a digital fiber "opens up different areas of opportunities and actually solves some of the problems of functional fibers," he says.
For instance, it offers a way to control individual elements within a fiber, from one point at the fiber's end. "You can think of our fiber as a corridor, and the elements are like rooms, and they each have their own unique digital room numbers," Loke explains. The research team devised a digital addressing method that allows them to "switch on" the functionality of one element without turning on all the elements.
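The paper's actual protocol isn't described here, but the "corridor of rooms" idea can be sketched as a simple addressed bus: every element sees every message, yet only the element whose "room number" matches acts on it. The class and message format below are hypothetical, purely for illustration:

```python
class FiberElement:
    """Hypothetical element ('room') sitting on the shared fiber line."""

    def __init__(self, address: int):
        self.address = address  # the element's unique digital room number
        self.active = False

    def receive(self, target_address: int, command: str) -> None:
        # Every element hears every broadcast, but only the addressed
        # one responds; the rest stay dormant.
        if target_address == self.address:
            self.active = (command == "ON")

# One shared line, many elements, each with a unique address.
elements = [FiberElement(addr) for addr in range(8)]

def broadcast(target: int, command: str) -> None:
    """Send a message down the fiber; all elements see it."""
    for element in elements:
        element.receive(target, command)

broadcast(3, "ON")  # switches on element 3 and nothing else
```

This is the essence of digital addressing: one physical channel, with selectivity handled in logic rather than in wiring.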
A digital fiber can also store a lot of information in memory. The researchers were able to write, store, and read information on the fiber, including a 767-kilobit full-color short movie file and a 0.48 megabyte music file. The files can be stored for two months without power.
When they were dreaming up "crazy ideas" for the fiber, Loke says, they thought about applications like a wedding gown that would store digital wedding music within the weave of its fabric, or even writing the story of the fiber's creation into its components.
Fink notes that the research at MIT was in close collaboration with the textile department at RISD led by Missakian. Gitelson-Kahn incorporated the digital fibers into a knitted garment sleeve, thus paving the way to creating the first digital garment.

On-body artificial intelligence
The fiber also takes a few steps forward into artificial intelligence by including, within the fiber memory, a neural network of 1,650 connections. After sewing it around the armpit of a shirt, the researchers used the fiber to collect 270 minutes of surface body temperature data from a person wearing the shirt, and analyze how these data corresponded to different physical activities. Trained on these data, the fiber was able to determine with 96 percent accuracy what activity the person wearing it was engaged in.
Adding an AI component to the fiber further increases its possibilities, the researchers say. Fabrics with digital components can collect a lot of information across the body over time, and these "lush data" are perfect for machine learning algorithms, Loke says.
"This type of fabric could give quantity and quality open-source data for extracting out new body patterns that we did not know about before," he says.
With this analytic power, the fibers someday could sense and alert people in real-time to health changes like a respiratory decline or an irregular heartbeat, or deliver muscle activation or heart rate data to athletes during training.
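As an illustration of the kind of inference described above — though not the paper's actual model, which is a 1,650-connection neural network stored in the fiber itself — here is a minimal activity classifier that labels a body-temperature reading by the nearest per-activity average. The activities and temperature values are invented for the sketch:

```python
# Toy nearest-centroid classifier: label a surface body-temperature
# reading by whichever activity's average reading it sits closest to.
# All training values below are made up for illustration.
training_data = {
    "resting": [33.1, 33.0, 33.2, 32.9],
    "walking": [34.0, 34.2, 33.9, 34.1],
    "running": [35.3, 35.1, 35.4, 35.2],
}

# One centroid (mean temperature) per activity.
centroids = {
    activity: sum(temps) / len(temps)
    for activity, temps in training_data.items()
}

def classify(temperature: float) -> str:
    """Return the activity whose mean temperature is nearest the reading."""
    return min(centroids, key=lambda a: abs(centroids[a] - temperature))

print(classify(33.05))  # resting
print(classify(35.25))  # running
```

A real on-fiber model learns far subtler temporal patterns than a single mean, but the principle is the same: map a stream of sensor readings to a discrete activity label.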
The fiber is controlled by a small external device, so the next step will be to design a new chip as a microcontroller that can be connected within the fiber itself.
"When we can do that, we can call it a fiber computer," Loke says.
This research was supported by the U.S. Army Institute of Soldier Nanotechnologies, National Science Foundation, the U.S. Army Research Office, the MIT Sea Grant, and the Defense Threat Reduction Agency.
Reprinted with permission of MIT News. Read the original article.
No news is good news? Think again
Information economics suggests that "no news" means somebody is hiding something. But people are bad at noticing that.