Creation without consequence: How Silicon Valley made a hot mess of progress
At the dawn of the AI era, where decisions made now could affect the future of mankind, regulation over tech giants is needed now more than ever.
Joanna Bryson is a Reader (tenured Associate Professor) at the University of Bath, and an affiliate of Princeton's Center for Information Technology Policy (CITP). She has broad academic interests in the structure and utility of intelligence, both natural and artificial. Venues for her research range from Reddit to Science. She is best known for her work in systems AI and AI ethics, both of which she began during her Ph.D. in the 1990s, but she and her colleagues publish broadly, in biology, anthropology, sociology, philosophy, cognitive science, and politics. Current projects include “The Limits of Transparency for Humanoid Robotics” funded by AXA Research, and “Public Goods and Artificial Intelligence” (with Alin Coman of Princeton University’s Department of Psychology and Mark Riedl of Georgia Tech) funded by Princeton’s University Center for Human Values. Other current research includes understanding the causality behind the correlation between wealth inequality and political polarization, generating transparency for AI systems, and research on machine prejudice deriving from human semantics. She holds degrees in Psychology from Chicago and Edinburgh, and in Artificial Intelligence from Edinburgh and MIT. At Bath, she founded the Intelligent Systems research group (one of four in the Department of Computer Science) and heads their Artificial Models of Natural Intelligence.
Joanna Bryson: If we're coding AI and we understand that there are moral consequences, does that mean the programmer has to understand it? It isn't only the programmer, although we do really think we need to train programmers to be watching out for these kinds of situations, knowing how to whistle-blow and when to whistle-blow. There is a problem of people being over-reactive, and that costs companies, and I understand that, but we also have sort of a Nuremberg situation where we need everybody to be responsible. But ultimately it isn't just about the programmers: the programmers work within the context of a company, and the companies work in the context of regulation, so it is about the law, it's about society. One of the papers that came out in 2017, by Professor Alan Winfield, was about how, if legislatures can't be expected to keep up with the pace of technological change, what they can keep up with is which professional societies they trust. They already do this in various disciplines; it's just new for AI. You say you have to meet the moral standards of at least one professional organization, following whatever rules it gives about what's okay. That allows you a kind of loose coupling, because it's wrong for professional organizations to enforce the law, to go after people, to sue them, whatever. That's not what professional organizations are for. What professional organizations are for is keeping up with their own field and setting things like codes of conduct. So that's why you want to bring those two things together, the executive government and the professional organizations, and you can have the legislature join the two.
This is what I'm working hard to keep in the regulations: that it's always people in organizations who are accountable, so that they will be motivated to make sure they can demonstrate they followed due process. That goes for both the people who are operating the AI and the people who developed it. Because it's like a car: when there's a car accident, normally the driver is at fault; sometimes the person they hit is at fault because they did something completely unpredictable. But sometimes the manufacturer did something wrong with the brakes, and that's a real problem. So we need to be able to show either that the manufacturer followed good practice and it really is the fault of the driver, or sometimes that there isn't a fact of the matter at all: it was unforeseeable in the past, but of course now it's happened, so in the future we'll be more careful.
That just happened recently in Europe. There was a case where somebody was driving... it wasn't a totally driverless car, but it had cruise control or something, some extra AI, and unfortunately the driver had a stroke. Now, what happens a lot, and what automobile manufacturers have to look for, is drivers falling asleep at the wheel, but this man had a stroke, which is different from falling asleep. So he was still kind of holding on, semi in control, but couldn't see anything, and he hit a family and killed two of the three family members. The survivor was the father, and he said he wasn't happy only to get money from insurance or liability or whatever; he wanted to know that whoever had caused this accident was being held accountable. So there was a big lawsuit, and that car manufacturer was absolutely able to show they followed due process; they had been incredibly careful; it was just a really unlikely thing, to have that kind of stroke where you'd still be holding onto the steering wheel and all these other things. And so it was decided that nobody was actually at fault. But it could have been different. If Facebook is really moving fast and breaking things, then they're going to have a lot of trouble proving they were doing due diligence when Cambridge Analytica got the data that it got. And so they are very likely to be held to account for anything that's found to have been a negative consequence of that behavior. It's something computer scientists should want, that tech companies should want: to be able to show that they've done the right thing.
So everybody who's ever written any code at all knows there are two different pressures. One is that you want clean, beautiful code that you can understand, that's well documented and everything, and that's good because if you ever need to change it or extend it, or you want to write some more software, you can reuse it, other people can reuse it, maybe you'll get famous for your great code. The other is that you want to put stuff out as soon as you can, because you can sell it faster, you don't have time, you want to go do something else, or maybe you don't even understand what you're really doing and you've just barely got it to work. Whatever. Those two pressures are always working against each other, and it's really, really important for all of our benefit, for society's benefit, that we put weight on the side of the nice clean code, so we can answer questions like the one I just mentioned: who's at fault if data goes the wrong place? Right now that's not the way it's been going. It has been completely Wild West, and nobody can tell where the data is going. But with a few lawsuits, with a few big failures, I think everyone is going to be motivated to say, no, I want to show that the AI did the right thing and it was the owner's fault, or that we followed due diligence and that was an unforeseeable consequence. They're going to want to prove that stuff. And like I said, that's going to benefit not just the companies and not just the owners or operators, but all of us, because we want liability to sit with the people who are making the decisions. That's the right way around, and that's why we want to maintain human accountability even though the autonomous system is sometimes taking decisions.
The thing that drives me crazy, that organizations do wrong about AI right now, is when they go in and try to fight regulation by saying you'll lose the magic juice: that deep learning is the only reason we've got AI, and if you regulate us you can't use it, because nobody knows what the weights are doing in deep learning. This is completely false. First of all, when you audit a company you don't go and try to figure out how the synapses are connected in its accountants' brains; you just look at the accounts. So in the worst case, we can do the same thing with AI that we already do with humans. Now, this goes back to what I was saying earlier about due diligence: if you have accountants and the accounts are wrong, you can put the accountant on the stand, ask why the accounts are wrong, and try to establish whether they were doing the right thing at the right time. You can't do that with AI systems, but if you want to be able to prove that the mistakes were honest, you can look at how the system was trained, how it was being run. There are ways you can audit whether the system was built appropriately. So I think we should be out looking for those, because that also allows us to improve our systems. The most important thing is just not believing the whole magic line. One of the companies I've heard give the magic line in a regulatory setting, in front of government representatives, was Microsoft, and that was in early 2017, and now they've completely reversed that. They've sat down, they've thought about it, and now they've said accountability and transparency are absolutely core to what we should be doing with AI. I think Microsoft is making really strong efforts to be the adults in the room right now, which is interesting. Like I said, literally within one year there was that change. So I think everybody should be thinking that AI is not this big exception. Don't look for a way to get out of responsibility.
Joanna Bryson isn't a fan of companies that won't hold themselves responsible for their actions. Too many tech companies, she argues, think that they're above the law and that they should create what they want, no matter who it hurts, and have society pick up the pieces later. This libertarian attitude might be fine if the company happens to be a young startup. But if the company is a massive behemoth like Facebook that could easily manipulate 2 billion people worldwide — or influence an election — perhaps there should be some oversight. Tech companies, she argues, could potentially create something catastrophic that they can't take back. And at the dawn of the AI era, where decisions made now could affect the future of mankind, regulation over these tech giants is needed now more than ever.
How would the ability to genetically customize children change society? Sci-fi author Eugene Clark explores the future on our horizon in Volume I of the "Genetic Pressure" series.
- A new sci-fi book series called "Genetic Pressure" explores the scientific and moral implications of a world with a burgeoning designer baby industry.
- It's currently illegal to implant genetically edited human embryos in most nations, but designer babies may someday become widespread.
- While gene-editing technology could help humans eliminate genetic diseases, some in the scientific community fear it may also usher in a new era of eugenics.
Tribalism and discrimination<p>One question the "Genetic Pressure" series explores: What would tribalism and discrimination look like in a world with designer babies? As designer babies grow up, they could be noticeably different from other people, potentially being smarter, more attractive and healthier. This could breed resentment between the groups—as it does in the series.</p><p>"[Designer babies] slowly find that 'everyone else,' and even their own parents, becomes less and less tolerable," author Eugene Clark told Big Think. "Meanwhile, everyone else slowly feels threatened by the designer babies."</p><p>For example, one character in the series who was born a designer baby faces discrimination and harassment from "normal people"—they call her "soulless" and say she was "made in a factory," a "consumer product." </p><p>Would such divisions emerge in the real world? The answer may depend on who's able to afford designer baby services. If it's only the ultra-wealthy, then it's easy to imagine how being a designer baby could be seen by society as a kind of hyper-privilege, which designer babies would have to reckon with. </p><p>Even if people from all socioeconomic backgrounds can someday afford designer babies, people born designer babies may struggle with tough existential questions: Can they ever take full credit for things they achieve, or were they born with an unfair advantage? To what extent should they spend their lives helping the less fortunate? </p>
Sexuality dilemmas<p>Sexuality presents another set of thorny questions. If a designer baby industry someday allows people to optimize humans for attractiveness, designer babies could grow up to find themselves surrounded by ultra-attractive people. That may not sound like a big problem.</p><p>But consider that, if designer babies someday become the standard way to have children, there'd necessarily be a years-long gap in which only some people are having designer babies. Meanwhile, the rest of society would be having children the old-fashioned way. So, in terms of attractiveness, society could see increasingly apparent disparities in physical appearances between the two groups. "Normal people" could begin to seem increasingly ugly.</p><p>But ultra-attractive people who were born designer babies could face problems, too. One could be the loss of a unique body image. </p><p>When designer babies grow up in the "Genetic Pressure" series, men look like all the other men, and women look like all the other women. This homogeneity of physical appearance occurs because parents of designer babies start following trends, all choosing similar traits for their children: tall, athletic build, olive skin, etc. </p><p>Sure, facial traits remain relatively unique, but everyone's more or less equally attractive. And this causes strange changes to sexual preferences.</p><p>"In a society of sexual equals, they start looking for other differentiators," Clark said, noting that violet-colored eyes become a rare trait that genetically engineered humans find especially attractive in the series.</p><p>But what about sexual relationships between genetically engineered humans and "normal" people? In the "Genetic Pressure" series, many "normal" people want to have kids with (or at least have sex with) genetically engineered humans. But a minority of engineered humans oppose breeding with "normal" people, and this leads to an ideology that considers engineered humans to be racially supreme. </p>
Regulating designer babies<p>On a policy level, there are many open questions about how governments might legislate a world with designer babies. But it's not totally new territory, considering the West's dark history of eugenics experiments.</p><p>In the 20th century, the U.S. conducted multiple eugenics programs, including immigration restrictions based on supposed genetic inferiority and forced sterilizations. In 1927, for example, the Supreme Court ruled that forcibly sterilizing the mentally handicapped didn't violate the Constitution. Supreme Court Justice Oliver Wendell Holmes wrote, "… three generations of imbeciles are enough." </p><p>After the Holocaust, eugenics programs became increasingly taboo and regulated in the U.S. (though some states continued forced sterilizations <a href="https://www.uvm.edu/~lkaelber/eugenics/" target="_blank">into the 1970s</a>). In recent years, some policymakers and scientists have expressed concerns about how gene-editing technologies could reanimate the eugenics nightmares of the 20th century. </p><p>Currently, the U.S. doesn't explicitly ban human germline genetic editing on the federal level, but a combination of laws effectively renders it <a href="https://academic.oup.com/jlb/advance-article/doi/10.1093/jlb/lsaa006/5841599#204481018" target="_blank" rel="noopener noreferrer">illegal to implant a genetically modified embryo</a>. Part of the reason is that scientists still aren't sure of the unintended consequences of new gene-editing technologies. </p><p>But there are also concerns that these technologies could usher in a new era of eugenics. After all, the function of a designer baby industry, like the one in the "Genetic Pressure" series, wouldn't necessarily be limited to eliminating genetic diseases; it could also work to increase the occurrence of "desirable" traits. </p><p>If the industry did that, it'd effectively signal that the opposites of those traits are undesirable. As the International Bioethics Committee <a href="https://academic.oup.com/jlb/advance-article/doi/10.1093/jlb/lsaa006/5841599#204481018" target="_blank" rel="noopener noreferrer">wrote</a>, this would "jeopardize the inherent and therefore equal dignity of all human beings and renew eugenics, disguised as the fulfillment of the wish for a better, improved life."</p><p><em>"Genetic Pressure Volume I: Baby Steps" by Eugene Clark is <a href="http://bigth.ink/38VhJn3" target="_blank">available now.</a></em></p>
Scientists discover burrows of giant predator worms that lived on the seafloor 20 million years ago.
- Scientists in Taiwan find the lair of giant predator worms that inhabited the seafloor 20 million years ago.
- The worm is possibly related to the modern bobbit worm (Eunice aphroditois).
- The creatures can reach several meters in length and famously ambush their prey.
A three-dimensional model of the feeding behavior of Bobbit worms and the proposed formation of Pennichnus formosae.
Credit: Scientific Reports
Beware the Bobbit Worm!<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="1f9918e77851242c91382369581d3aac"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/_As1pHhyDHY?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span>
The idea behind the law was simple: make it more difficult for online sex traffickers to find victims.
- SESTA (Stop Enabling Sex Traffickers Act) and FOSTA (Allow States and Victims to Fight Online Sex Trafficking Act) started as two separate bills that were both created with a singular goal: curb online sex trafficking. They were signed into law by former President Trump in 2018.
- The implementation of this law in America has left an international impact, as websites attempt to protect themselves from liability by closing down the sections of their sites that sex workers use to arrange safe meetings with clientele.
- While supporters of this bill have framed FOSTA-SESTA as a vital tool that could prevent sex trafficking and allow sex trafficking survivors to sue those websites for facilitating their victimization, many other people are strictly against the bill and hope it will be reversed.
What is FOSTA-SESTA?<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="723125b44601d565a7c671c7523b6452"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/WBaqDjPCH8k?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span><p>SESTA (Stop Enabling Sex Traffickers Act) and FOSTA (Allow States and Victims to Fight Online Sex Trafficking Act) were signed into law by former President Trump in 2018. There was some argument that the law may be unconstitutional because it could potentially violate the <a href="https://constitution.congress.gov/constitution/amendment-1/" target="_blank">First Amendment</a>. A criminal defense lawyer explains the law in depth in <a href="https://www.youtube.com/watch?v=RoWx2hYg5uo&t=38s" target="_blank" rel="nofollow">this video</a>. </p><p><strong>What did FOSTA-SESTA aim to accomplish?</strong></p><p>The idea behind the law was simple: make it more difficult for online sex traffickers to find victims. FOSTA-SESTA started as two separate bills, both created with a singular goal: curb online sex trafficking. Targeting websites like Backpage and Craigslist, where sex workers would often arrange meetings with their clientele, FOSTA-SESTA aimed to stop illegal sex-trafficking activity conducted online. While the aim of FOSTA-SESTA was to keep people safer, these laws have drawn international scrutiny and become quite controversial. </p><p><a href="https://www.businesswire.com/news/home/20180321006214/en/National-Anti-Trafficking-Coalition-Celebrates-Survivors-Senate-Passes" target="_blank" rel="noopener noreferrer">According to BusinessWire</a>, many people are in support of this bill, including the National Center for Missing and Exploited Children and World Without Exploitation (WorldWE). 
</p><p>"With the growth of the Internet, human trafficking that once happened mainly on street corners has largely shifted online. According to the National Center for Missing & Exploited Children, 73 percent of the 10,000 child sex trafficking reports it receives from the public each year involve ads on the website Backpage.com."</p><p>As soon as this bill was <a href="https://www.pivotlegal.org/sesta_fosta_censoring_sex_workers_from_websites_sets_a_dangerous_precedent" target="_blank" rel="noopener noreferrer">signed into law</a>, websites where sex workers often vetted and arranged meetings with their clients could now be held liable for the actions of the millions of people that used their sites. This meant websites could be prosecuted if they engaged in "the promotion or facilitation of prostitution" or "facilitate traffickers in advertising the sale of unlawful sex acts with sex trafficking victims." </p><p><strong>The bill's effects were felt around the world — from Canadians being unhappy with the impact of this American bill to U.K. politicians considering the implementation of similar laws in the future.</strong> </p><p>Heather Jarvis, the program coordinator of the Safe Harbour Outreach Project (SHOP), which supports sex workers in the St. John's area, <a href="https://www.cbc.ca/news/canada/newfoundland-labrador/heather-jarvis-website-shutdown-1.4667018" target="_blank" rel="noopener noreferrer">explained to CBC in an interview</a> that the American bill is impacting everyone, everywhere: "When laws impact the internet — the internet is often borderless — it often expands across different countries. 
So although these are laws in the United States, what we've seen is they've been shutting down websites in Canada and other countries as well."</p><p>Jarvis suggests in her interview that, rather than improving the safety of victims of sex trafficking or sexual exploitation as the bill intended, the website shutdowns are actually making sex workers less safe. </p><p>While <a href="https://gizmodo.com/the-uk-wants-its-own-version-of-fosta-sesta-that-could-1827420794" target="_blank" rel="noopener noreferrer">one U.K. publication</a> refers to FOSTA-SESTA as "well-intentioned but ultimately deeply-flawed laws," it also mentions that politicians in the United Kingdom are hoping to pursue similar laws in the near future. </p>
Has FOSTA-SESTA done more harm than good?<img type="lazy-image" data-runner-src="https://assets.rebelmouse.io/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNTUxMzY5Ny9vcmlnaW4uanBnIiwiZXhwaXJlc19hdCI6MTY2ODUyNDc4OX0.dSEEzcflJJUTnUCFmuwmPAIA0f754eW7rN8x6L7fcCc/img.jpg?width=1245&coordinates=-68%2C595%2C-68%2C595&height=700" id="69d99" class="rm-shortcode" data-rm-shortcode-id="734759fa254b5a33777536e0b4d7b511" data-rm-shortcode-name="rebelmouse-image" alt="sex worker looking online for a job" data-width="1245" data-height="700" />
Is this really going to help, or is this bill simply pushing sex work and sex-related content further into the dark?
Credit: Евгений Вершинин on Adobe Stock<p>While <a href="https://www.businesswire.com/news/home/20180321006214/en/National-Anti-Trafficking-Coalition-Celebrates-Survivors-Senate-Passes" target="_blank">supporters of this bill</a> have framed FOSTA-SESTA as a vital tool that could prevent sex trafficking and allow sex trafficking survivors to sue those websites for facilitating their victimization, many other people are strictly against the bill and hope it will be reversed.</p><p><strong>One of the biggest problems many people have with this bill is that it forces sex workers into an even more dangerous situation, which is quite the opposite of what the bill had intended to do.</strong> </p><p>According to <a href="https://www.theglobeandmail.com/canada/article-anti-trafficking-activists-cheer-but-sex-workers-bemoan-shutdown-of/" target="_blank" rel="noopener noreferrer">The Globe and Mail</a>, there has been an upswing in pimps sending sex workers messages that promise work, which puts sex workers on the losing end of a skewed power dynamic, when before they could attempt to safely arrange their own meetings online. </p><p><strong>How dangerous was online sex work before FOSTA-SESTA? </strong></p><p><a href="https://www.beyond-the-gaze.com/wp-content/uploads/2018/01/BtGbriefingsummaryoverview.pdf" target="_blank" rel="noopener noreferrer">The University of Leicester Department of Criminology</a> conducted an online survey that focused on the relative safety of internet-based sex work compared with outdoor sex work. According to the results, 91.6 percent of participants had not experienced a burglary in the past 5 years, 84.4 percent had not experienced physical assault in the same period, and only 5 percent had experienced physical assault in the last 12 months. 
</p><p><a href="https://www.pivotlegal.org/sesta_fosta_censoring_sex_workers_from_websites_sets_a_dangerous_precedent" target="_blank" rel="noopener noreferrer">PivotLegal</a> expresses concerns about this: "It is resoundingly clear, both from personal testimony and data, that attacking online sex work is an assault on the health and safety of people in the real world. In a darkly ironic twist, SESTA/FOSTA, legislation aimed at protecting victims of and preventing human trafficking for the purposes of sexual exploitation, will do the exact opposite."</p><p><strong>Websites are also being hypervigilant (and censoring more content than needed) because they can't possibly police every single user's activity on their platform.</strong> </p><p>Passing this bill meant any website (not just the ones that are commonly used by sex traffickers) could be held liable for their user's posts. Naturally, this saw a general "tightening of the belt" when it came to what was allowed on various platforms. In late 2018, shortly after the FOSTA-SESTA bill was passed, companies like Facebook slowly began to alter their terms and conditions to protect themselves. 
</p><p>Facebook notably added sections that expressly prohibited certain sexual content and messages:</p><p style="margin-left: 20px;"><em>"Content that includes an implicit invitation for sexual intercourse, which can be described as naming a sexual act and other suggestive elements including (but not limited to):</em></p><p style="margin-left: 20px;"><em>– vague suggestive statements such as: 'looking forward to an enjoyable evening'</em></p><p style="margin-left: 20px;"><em>– sexual use of language […]</em></p><p style="margin-left: 20px;"><em>– content (self-made, digital or existing) that possibly portrays explicit sexual acts or a suggestively positioned person/suggestively positioned persons."<br><br> </em></p><p>Additionally, sections like this were also added, prohibiting things that could allude to sexual activity: </p><p style="margin-left: 20px;"><em>"Content in which other acts committed by adults are requested or offered, such as:</em></p><p style="margin-left: 20px;"><em>– commercial pornography</em></p><p style="margin-left: 20px;"><em>– partners that share fetishes or sexual interests"</em></p><p>Facebook wasn't the only website to crack down on its policies — Craigslist removed its personals section, and Reddit banned quite a large number of sex-worker-related subreddits. </p><p><strong>Is FOSTA-SESTA really helpful?</strong> </p><p>This is the question many people are facing with the FOSTA-SESTA acts being passed just a few years ago. Is this really going to help, or is this bill simply pushing sex work and sex-related content further into the dark? Opinions seem to be split down the middle on this — what do you think?</p>
A leading British space scientist thinks there is life under the ice sheets of Europa.
- A British scientist named Professor Monica Grady recently came out in support of extraterrestrial life on Europa.
- Europa, the sixth largest moon in the solar system, may have favorable conditions for life under its miles of ice.
- The moon is one of Jupiter's 79.
Neil deGrasse Tyson wants to go ice fishing on Europa<div class="rm-shortcode" data-media_id="GLGsRX7e" data-player_id="FvQKszTI" data-rm-shortcode-id="f4790eb8f0515e036b24c4195299df28"> <div id="botr_GLGsRX7e_FvQKszTI_div" class="jwplayer-media" data-jwplayer-video-src="https://content.jwplatform.com/players/GLGsRX7e-FvQKszTI.js"> <img src="https://cdn.jwplayer.com/thumbs/GLGsRX7e-1920.jpg" class="jwplayer-media-preview" /> </div> <script src="https://content.jwplatform.com/players/GLGsRX7e-FvQKszTI.js"></script> </div>
Water Vapor Above Europa’s Surface Detected for First Time<span style="display:block;position:relative;padding-top:56.25%;" class="rm-shortcode" data-rm-shortcode-id="9c4abc8473e1b89170cc8941beeb1f2d"><iframe type="lazy-iframe" data-runner-src="https://www.youtube.com/embed/WQ-E1lnSOzc?rel=0" width="100%" height="auto" frameborder="0" scrolling="no" style="position:absolute;top:0;left:0;width:100%;height:100%;"></iframe></span>
Answering the question of who you are is not an easy task. Let's unpack what culture, philosophy, and neuroscience have to say.
- Who am I? It's a question that humans have grappled with since the dawn of time, and most of us are no closer to an answer.
- Trying to pin down what makes you you depends on which school of thought you subscribe to. Some argue that the self is an illusion, while others believe that finding one's "true self" is about sincerity and authenticity.
- In this video, author Gish Jen, Harvard professor Michael Puett, psychotherapist Mark Epstein, and neuroscientist Sam Harris discuss three layers of the self, looking through the lens of culture, philosophy, and neuroscience.