3 Ways to Solve the ‘Social Dilemma’ That Netflix Doc Is Warning Us About | PRO Insight


In the words of tech executives: We are heading towards “the destruction of civilization, the end of democracy and a checkmate on humanity”


“The Social Dilemma” is a terrifying documentary about the toxic combination of social media and surveillance capitalism — and how together they’re harming our lives and our society. It’s the story of how the tech giants discovered they could optimize addiction by leveraging techniques of psychological manipulation, in order to earn billions. It’s told by some of the very people who created and scaled these companies — leaders from Facebook, Google, Twitter, and others. None of them set out with bad intentions; they were just trying to build popular products and make money along the way. But as the movie shows, things got out of control, and the unintended consequences have been devastating. In the words of these tech executives themselves, we are heading towards “the destruction of civilization, the end of democracy, and a checkmate on humanity.”

Addiction for profit

The interviews with these tech execs, along with social scientists, educators, and others, are set against a fictional story of a family we see living a screen-time-based life that most will find familiar. To them, and even to us watching, the importance they place on their online interactions, and their dependence on them, seem unfortunate but certainly not tragic. The film, though, is incredibly effective in taking us on a tour of the addictive tricks the social media companies use and the destructive way surveillance capitalism pays for and justifies the resulting carnage. None of this will be new for those who’ve been paying attention, but as big tech’s magic tricks are revealed, the filmmakers zoom in on them and slow them down in a way that makes them much easier to understand and much more repulsive to contemplate. We all know that these apps get us to log in and scroll by exploiting our psychology, and that the money is made by selling our eyeballs to advertisers. But the movie drills down effectively on how and why those practices are more advanced and insidious than most understand:
  • Technologist Jaron Lanier clarifies that advertisers aren’t paying to show you ads, but rather for “imperceptible changes in your behavior.” The targeting we hear about is only the precursor to the main event: getting you to think differently and take action.
  • Professor Shoshana Zuboff highlights that because these companies are getting paid for results, what they’re really selling is your future behavior. This gives them a stake in both predicting and manipulating that behavior, and the better they get at both, the greater the harm they can inflict. (A minimal sketch of this kind of behavior prediction follows this list.)
  • Investor Roger McNamee points out that we, as humans, are pitted against an algorithm, and that is not a fair fight. They have massive computing power, which has increased a trillion-fold in just a few decades, and machine learning that turns our willpower into a blade of grass hoping to fend off a bulldozer.
  • Ex-Googler Tristan Harris laments that while we’ve been worried about when computers will exceed human strengths — and therefore potentially subjugate us — they have in some cases already overtaken human weaknesses, which gives them the power to harm us at will right now.
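To see what “selling future behavior” means mechanically, here is a minimal, purely hypothetical sketch (in Python) of cohort-based prediction: given a handful of users whose behavior is already known, estimate what a similar new user will do. Real platforms use enormous datasets and far more sophisticated models; every name and number below is invented for illustration.

    # Toy sketch of cohort-based behavior prediction, purely for illustration.
    from math import dist  # Euclidean distance (Python 3.8+)

    # Known users: an inferred interest profile and whether they took some action.
    cohort = [
        ([0.9, 0.1, 0.3], True),   # acted
        ([0.8, 0.2, 0.4], True),   # acted
        ([0.1, 0.9, 0.7], False),  # did not act
        ([0.2, 0.8, 0.6], False),  # did not act
    ]

    def predict_action(profile, k=3):
        """Estimate how likely a user is to act, from their k most similar 'cohort' members."""
        nearest = sorted(cohort, key=lambda user: dist(profile, user[0]))[:k]
        return sum(acted for _, acted in nearest) / k

    # A new user who closely resembles the first two known users:
    print(predict_action([0.85, 0.15, 0.35]))  # ~0.67 -> worth targeting

The unnerving step, as the film stresses, comes after prediction: once a platform can forecast behavior, it can test and then nudge that behavior at scale.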
The aggregate case that’s made, quite convincingly, is that the biggest of these firms have effectively built voodoo dolls of us, based on the information we’ve given them or allowed them to collect. They run predictive algorithms on our data to tell them what we’re likely to do next, and they find groups of people similar to us (cohorts) to help sharpen those predictions. These companies know how we’re likely to react to various stimuli, so they increasingly manipulate us via our feeds. When they don’t know, they run small tests to find out, then execute at scale. They know how to get us to visit, then to stay, then to act, and even how to make us feel – with subtlety and increasing precision. We’re largely unaware we’re being manipulated, we’re generally without defenses, and we often don’t realize what’s been done even after it’s happened.

But the problem isn’t that they want more of us to use their products more often — that’s only an intermediate goal. The problem is that they have a unique set of tools (human relationships, cognitive models, powerful algorithms, endless content, etc.) to pursue that goal incredibly effectively (it could fairly be called asymmetrical psychological warfare), and they’re optimizing for their own profit by getting us to take actions regardless of what those actions do to us.

Who is getting hurt?

The impact all of this is having on us is bad individually and worse as a society. A word cloud of the movie’s script would show that addiction, alienation, assault (on democracy), cyber-attacks, destroy, dysmorphia, dystopia, erosion (of social fabric), existential threat, fake news, harassment, loneliness, manipulation, outrage, polarization, radicalization, self-harm, surveillance capitalism, and survival all figure prominently. This is how the people who created these platforms, the people who study them, and the people who clean up after them see the impacts.

It’s no longer just about using a little personalization to optimize sales. It’s a path that starts with personalization, then gets pulled toward radicalization, then falsification, then antagonism. It works like this: the platforms have a natural desire to maximize your use (as does any business), so they personalize what you see. This is where they leverage what they know about you, and about people similar to you, to give you a feed full of things you like and agree with, while over time hiding anything they think you may dislike or disagree with. It’s called a filter bubble, and we all live in them: our news, search results, entertainment choices, ads, offers, and the opinions we see on most of the sites we use are all ‘personalized’ in this way.

The good part of a filter bubble is that it’s safe and feels comfortable. It reassures us that we’ve made good choices, because it seems like ‘the world’ agrees with our tastes and our opinions. We like it, so we come back often. We engage with the content because it aligns with our existing goals and models. It’s very hard to keep in mind that we’re seeing a view optimized for an audience of one.

The bad part is that life in a filter bubble inevitably strengthens the thoughts and beliefs you already had, often supercharging them, because all you see are variations and extensions of them. You may even get ‘pulled down rabbit holes,’ shown more and more content that extends those ideas a little further each time. The toy sketch below shows how quickly that loop can lock in.
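To make that feedback loop concrete, here is a deliberately tiny, hypothetical sketch of an engagement-optimized feed, again in Python. No platform publishes its real ranking code; the catalog, affinity scores, and engagement model below are all invented. The point is only to show how a loop that rewards whatever gets engagement drifts toward a single topic at its highest intensity.

    import random

    # Hypothetical content items: (topic, intensity). Higher-intensity content
    # tends to hold attention longer.
    catalog = [(topic, intensity)
               for topic in ("politics", "sports", "health")
               for intensity in (1, 2, 3, 4, 5)]

    # The platform's current guess at what this user likes.
    affinity = {"politics": 0.1, "sports": 0.1, "health": 0.1}

    def top_item():
        # Score = how much the user already likes the topic x how gripping it is.
        return max(catalog, key=lambda item: affinity[item[0]] * item[1])

    def simulate_session(steps=25):
        for _ in range(steps):
            topic, intensity = top_item()                        # show the top-ranked item
            engaged = random.random() < 0.5 + 0.08 * intensity   # intensity drives engagement
            if engaged:
                affinity[topic] += 0.1                           # reinforce: more of the same

    simulate_session()
    print(affinity)  # one topic now dwarfs the others: a bubble, built by the loop

Nobody in this toy system chose a topic or a viewpoint for the user; the optimization loop did. That is the dynamic the film’s subjects describe, minus a few billion parameters.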
This kind of increasingly extreme but compelling content tends to be very engaging and can quickly lead to some level of radicalization. It’s worth noting that the platforms don’t actually intend to lead you in any particular direction; they’re just taking advantage of the reality that more intense content is generally more effective at keeping your attention, earning them more time and more clicks. YouTube is the undisputed king of this technique, with well-documented and horrific ramifications.

Particularly when political interests are involved, falsified content (deliberate or not) often creeps into the filter bubbles or defines the rabbit holes. The platforms rarely prevent this, because they don’t take responsibility for verifying or confirming much of anything. As we’ve seen before, false content, constantly reinforced in a narrow and intensified experience, tends to breed extremism. And extremism often leads to antagonism toward both contrary ideas and the people who hold them. Apply this formula to enough issues on a broad enough scale, and you arrive in a world that lacks the kind of shared reality a society needs to grapple with issues, make decisions, move forward, and remain cohesive. Sound familiar?

Where is the solution?

Where the film disappoints is in offering solutions. What changes are needed now? How can they be made? Who should or will make them? These questions and others are left almost entirely unanswered, by both the key players on screen and the filmmakers. They spend an hour and forty minutes making a devastatingly thorough case for a massive, complex societal problem, then wrap it up with a few mumbled calls for ambiguous ‘government regulation’ and an exasperated, almost pleading realization that ‘we have to do something.’ It feels like a colossal wasted opportunity.

Solutions obviously aren’t easy, but there are at least three options worth pursuing. Each can have a meaningful impact, and taken together perhaps a substantial one. None is simple, and each has a different cost, trajectory, and set of responsible parties – so one would hope work would begin on all of them.

1. Stop feeding the beast. The film makes clear that the platforms and their algorithms run on the data we voluntarily share with them. But most people share vastly more data, far more freely, than they need to. By changing settings, learning to make better choices, and using a few simple tools or utilities, each of us can massively reduce what these firms know about us, and thereby weaken their ability to use our own data against us. Helping people make that change is the mission of The Privacy Co. and its Priiv app, which launched in August to help people protect themselves against data theft. The free app is designed to make security easy to manage and covers nearly every aspect of a digital user’s life. The core of the product is the PriivScore, fashioned much like a credit score, which measures how well protected a person is and provides a target score along with ways to improve their online privacy.

2. Governmental regulation — or self-imposed industry policies. Both GDPR and CCPA are minimal first steps, but they put government in the game in a material way. They have also put industry on notice: re-tool for compliance, and start thinking seriously about data minimization, better disclosures, and the broader implications of your actions, or you may not like the next regulatory round.
Of course, the idea of governmental regulation solving anything is fraught with risk, uncertainty, and potential unintended consequences.

3. Fix the system rather than merely reform it. While most of the executives in the film do suggest that government regulation is needed, they also point out that the business models behind these companies are the ultimate problem. Let’s be clear: these firms earn their revenue by harming their customers and damaging the society in which they live. Advertising-driven businesses, at least in their modern form, are naturally driven to promote addiction and leverage manipulation.

So why not change business models? To the executives in the film, apparently, that’s unimaginable. But why? None of these businesses has to use advertising to drive revenue. Each provides services that consumers obviously value. Facebook earned $112 per user (US/Canada) in 2019, while Google’s advertising business earned $256 per user that year. Twitter earned about $25 and Pinterest about $15 for the year. Netflix, for comparison, sits in between at $131. Roughly speaking, then, for the price of two Netflix subscriptions (2 × $131 = $262) users could get a Facebook, Twitter, Instagram, Snapchat, and probably one or two more. For another two Netflix subscriptions, users could get Google Search, Gmail, Google Maps, and YouTube. All that technology and talent could work on user delight instead of user abuse. All while these companies remain rapidly growing and hugely profitable.

Of course, this switch isn’t simple. Users are conditioned to get these things for free. More affluent users generate more revenue, so it doesn’t work if only they switch to a paid model. But the problems are worth solving when you consider the result: the platforms could then focus on serving their customers rather than serving them up to advertisers.

“The Social Dilemma” is an understated title. What the film lays out is “The Social Catastrophe.” It’s impossible to spend two hours with this presentation and not find yourself awed by the scope of the problem and petrified for our collective future. It’s been almost 10 years since former Facebook employee Jeff Hammerbacher said, “The best minds of my generation are thinking about how to make people click ads. That sucks.” It remains so. The people who built these companies, and many more like them, often say they want to solve hard problems and do work with meaning. Well, here you go. Save us and save yourself. Move fast and break these things.
