Who's in the Video
Ramesh Srinivasan is Professor of Information Studies and Design Media Arts at UCLA. He makes regular appearances on NPR, The Young Turks, MSNBC, and Public Radio International, and his writings[…]

Beyond the Valley: How Innovators around the World are Overcoming Inequality and Creating the Technologies of Tomorrow (The MIT Press)

RAMESH SRINIVASAN: What has occurred is that the internet was a publicly funded infrastructure, and the web was also a nonprofit initiative. These technologies took all of our public input, monetized it, and directed that content in ways that suited corporate interests. The intention wasn't necessarily to threaten our democracy or the other institutions that we rely on as Americans, and that many citizens across the world rely on, but that's been the effect of all of this. And the reason why is that the principles of democracy rely on a very open media environment and also on factuality. What has occurred through these technology platforms, specifically Facebook in this case, is that not only have they become the places we go to communicate democratically and to access news, but what is made visible to us depends on choices that are hidden from view. They are choices that are not even made by the technology companies themselves. They are made by complex computer systems that are optimized for one output and one output only: to keep us glued, to hold our attention, to keep us online.

So that actually has a splintering effect on a democracy, because in a democracy all of us expect some sort of common baseline of information. We expect some exposure to facts. We expect some exposure to a range of different opinions. We also expect a certain civil sense of dialogue, though that has eroded; it still exists, but less so than before. What we don't expect, and what represents a threat to democracy, are invisible silos where we are exposed to inflammatory content, trolling, gaslighting, and at times deliberately false material. But that's the new normal when it comes to algorithmic platforms, and that's why this all needs to be dealt with right now.

So a digital economy and a world that work for the 99 percent are ones where technologies don't support the interests of some at the cost of others. That is a sort of zero-sum mentality that can end up costing all of us at the end of the day. Such a digital economy produces prosperity and value for all. It does support business interests. It does support the great developments for consumers that a lot of digital platforms have provided, but that doesn't come at the cost of economic security, of worker security, of diverse opinions, of racial minorities, of indigenous peoples, of women. The issue is that right now our digital world, through the technologies that have globalized across the world, is more or less structured, influenced, and dominated by a few technology companies located in a small sliver of the world: Silicon Valley, Seattle, and also China. And they all have different kinds of outcomes. But the people who are leading these companies not only develop technologies for their private interests, never mind the effects on the rest of us, but they also tend, in terms of demographics, not to be representative of the vast majority of their users. We don't see many women. We don't see many racial minorities. We do see some Asian and white males.

And so as a result, intentionally or not, they are coding into the digital world outcomes that generate greater inequality. And it's really important to situate this on top of what our world, and even our country, looks like right now. Roughly three people in this country hold the equivalent wealth of 195 million people. Who would ever have imagined that? That has all happened in the past few decades. Globally, seven or eight people, depending on which estimates you look at, hold the equivalent wealth of approximately 3.9 to 4 billion people; there are different estimates on this. That wasn't created by the internet and digital technology. But the internet and digital technology are amplifying these problems. So how could we have arrived at such a world? It's bad for everybody. And again, that's why we have to do something about these issues right now. There's a lot we can do, and that's what I'm trying to argue for in this book.

So what can we do about these inequalities that we face right now? On the one hand, we can see these inequalities as reasons to be upset, concerned, anxious, nervous, and critical. And that's fine; I understand where that comes from. But to me they represent alternatives and opportunities for us to actually engage in productive, progressive, pragmatic action, and some possible pathways can emerge from that. So first of all, every single person who is in danger of losing their job or their economic security, which is already happening, needs to be acknowledged, addressed, and humanized, not just through lip service but by actually presenting economic opportunities to those people. In other words, what I'm getting at on the economic level are jobs that are shifting to the gig economy, like Uber drivers and so on, which many studies are showing are likely the gateway to an automated world. Those people need to be protected. They either need to be presented with new types of jobs that are dignified and economically secure, or we need to figure out other outcomes.

Imagine if Uber, or an Uber-type model, were at least partially, if not completely, owned by its drivers, by its laborers. The entire model of technology corporations right now is to make labor and costs of all forms an afterthought, to basically disregard those sorts of costs to maximize profit and valuation. And that's a very toxic model on a social level. So that's one; that's on the economic level. Politically, there's no question in my mind that what we need are not just independent auditors, and we can't just say "independent." We have to actually make transparent who the journalists are who are in charge of the algorithmic systems that people use to access news, for example on Facebook. So we should actually bring in reputable journalists from across the political spectrum to design and audit these algorithms alongside engineers. So there have to be public-private partnerships. That's the only way it's not going to turn into a complete implosion for Facebook, for example, which is getting so much criticism right now.

But this is symptomatic of a larger problem and an opportunity for us to actually develop real solutions to these issues. So that's a second issue, on the political and democratic level. I think a third element, which is both economic and political, is the question of making sure that we support small businesses in the digital economy, and even alternative technology platforms, to create a more competitive environment. That's going to allow what we see now, which is horizontal integration across the board and monopolistic behavior by Facebook, but also Google, Amazon, and so on, to actually be stemmed a little bit. Don't claim the language of a marketplace without actually supporting an open marketplace. An open marketplace, just like free speech, can't simply be presumed; everybody has to have an equal opportunity to participate in these shifts. And then the last point I want to make, which I think is very important, is that vulnerable people and vulnerable communities in our world who have been historically discriminated against need to be first and foremost part of these solutions, part of where we go and what we consider moving forward.

Workers should have power over designing the platforms that define the future of work. Black and brown communities, including Black Lives Matter-type communities, are victims of AI and algorithmic systems that are turning out to be racist across the board, whether it's predictive policing or courtroom algorithms. They should be designing those systems, or even making decisions about whether those systems should exist. So what I'm getting at is that we have to completely open up the palette, socially, politically, and culturally, over who has power and governance over technologies. That can coexist with Silicon Valley. That can coexist with Amazon, but not with this blindness. What has basically happened is that we've all become more or less blind lemmings in a socially engineered game that is now disrupting all of our lives in a very negative way. And that's why we have to do something about it right now.

