Roger McNamee

Zucked

  • Masha Samartsava made a quote 5 years ago
    As I noted in chapter 3, one of the most manipulative reciprocity tricks played by Facebook relates to photo tagging. When users post a photo, Facebook offers an opportunity to tag friends—the message “[Friend] has tagged you in a photo” is an appealing form of validation—which initiates a cycle of reciprocity, with notifications to users who have been tagged and an invitation to tag other people in the photo. Tagging was a game changer for Facebook because photos are one of the main reasons users visit Facebook daily in the first place. Each tagged photo brings with it a huge trove of data and metadata about location, activity, and friends, all of which can be used to target ads more effectively. Thanks to photo tagging, users have built a giant database of photos for Facebook, complete with all the information necessary to monetize it effectively.
  • Masha Samartsava made a quote 5 years ago
    The persuasive technology tricks espoused by Fogg include several related to social psychology: a need for approval, a desire for reciprocity, and a fear of missing out. Everyone wants to feel approved of by others. We want our posts to be liked. We want people to respond to our texts, emails, tags, and shares. The need for social approval is what made Facebook’s Like button so powerful. By controlling how often a user experiences social approval, as evaluated by others, Facebook can get that user to do things that generate billions of dollars in economic value. This makes sense because the currency of Facebook is attention. Users manicure their image in the hope of impressing others, but they soon discover that the best way to get attention is through emotion and conflict. Want attention online? Say something outrageous. This phenomenon first emerged decades ago in online forums such as The WELL, which often devolved into mean-spirited confrontation, and it has reappeared in every generation of tech platform since then.

    Social approval has a twin: social reciprocity. When we do something for someone else, we expect them to respond in kind. Likewise, when a person does something for us, we feel obligated to reciprocate. When someone “follows” us on Instagram, we feel obligated to “follow” them in return. When we see an “Invitation to Connect” on LinkedIn from a friend, we may feel guilty if we do not reciprocate the gesture and accept it. It feels organic, but it is not. Millions of users reciprocate one another’s Likes and friend requests all day long, not aware that platforms orchestrate all of this behavior upstream, like a puppet master.
  • Masha Samartsava made a quote 5 years ago
    Notifications are another way that platforms exploit the weakest elements of human psychology. Notifications exploit an old sales technique, called the “foot in the door” strategy, that lures the prospect with an action that appears to be low cost but sets in motion a process that leads to bigger costs. Who wouldn’t want to know they have just received an email, text, friend request, or Like? As humans, we are not good at forecasting the true cost of engaging with a foot-in-the-door strategy. Worse yet, we behave as though notifications are personal to us, completely missing that they are automatically generated, often by an algorithm tied to an artificial intelligence that has concluded that the notification is just the thing to provoke an action that will serve the platform’s economic interests.
  • Masha Samartsava made a quote 5 years ago
    Another tool from the Fogg tool kit is the “bottomless bowl.” News Feeds on Facebook and other platforms are endless. In movies and television, scrolling credits signal to the audience that it is time to move on, providing what Tristan would call a “stopping cue.” Platforms with endless news feeds and autoplay remove that signal, ensuring that users maximize their time on site for every visit. Endless news feeds work on dating apps. They work on photo sites like Instagram. And they work on Facebook. YouTube, Netflix, and Facebook use autoplay on their videos because it works, too. Next thing you know, millions of people are sleep deprived from binging on videos, checking Instagram, or browsing Facebook.
  • Masha Samartsava made a quote 5 years ago
    Platforms like Facebook would have you believe that the user is always in control, but as I have said before, user control is an illusion. Maintaining that illusion is central to every platform’s success, but with Facebook, it is especially disingenuous. Menu choices limit user actions to things that serve Facebook’s interests. In addition, Facebook’s design teams exploit what are known as “dark patterns” in order to produce desired outcomes. Wikipedia defines a dark pattern as “a user interface that has been carefully crafted to trick users into doing things.” The company tests every pixel to ensure it produces the desired response. Which shade of red best leads people to check their notifications? For how many milliseconds should notification bubbles appear in the bottom left before fading away, to most effectively keep users on site? Based on what measures of closeness should we recommend new friends for you to “add”?
  • Masha Samartsava made a quote 5 years ago
    Research suggests that people who accept one conspiracy theory have a high likelihood of accepting a second one. The same is true of inflammatory disinformation.
  • Masha Samartsava made a quote 5 years ago
    As users, we sometimes adopt an idea suggested by the platform or by other users on the platform as our own. For example, if I am active in a Facebook Group associated with a conspiracy theory and then stop using the platform for a time, Facebook will do something surprising when I return. It may suggest other conspiracy theory Groups to join because they share members with the first conspiracy Group. And because conspiracy theory Groups are highly engaging, they are very likely to encourage reengagement with the platform. If you join the Group, the choice appears to be yours, but the reality is that Facebook planted the seed. It does so not because conspiracy theories are good for you but because conspiracy theories are good for them.
  • Masha Samartsava made a quote 5 years ago
    For example, former YouTube algorithm engineer Guillaume Chaslot created a program to take snapshots of what YouTube would recommend to users. He learned that when a user watches a regular 9/11 news video, YouTube will then recommend 9/11 conspiracies; if a teenage girl watches a video on dietary habits, YouTube will recommend videos that promote anorexia-related behaviors. It is not for nothing that the industry jokes about YouTube’s “three degrees of Alex Jones,” referring to the notion that no matter where you start, YouTube’s algorithms will often surface a Jones conspiracy theory video within three recommendations.
  • Masha Samartsava made a quote 5 years ago
    Extreme views attract more attention, so platforms recommend them. News Feeds with filter bubbles do better at holding attention than News Feeds that don’t have them. If the worst thing that happened with filter bubbles was that they reinforced preexisting beliefs, they would be no worse than many other things in society. Unfortunately, people in a filter bubble become increasingly tribal, isolated, and extreme. They seek out people and ideas that make them comfortable.
  • Masha Samartsava made a quote 5 years ago
    Social media has enabled personal views that had previously been kept in check by social pressure—white nationalism is an example—to find an outlet. Before the platforms arrived, extreme views were often moderated because it was hard for adherents to find one another. Expressing extreme views in the real world can lead to social stigma, which also keeps them in check. By enabling anonymity and/or private Groups, the platforms removed the stigma, enabling like-minded people, including extremists, to find one another, communicate, and, eventually, to lose the fear of social stigma.
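The sketches below illustrate, in simplified form, a few of the mechanisms the quotes above describe. First, the “bottomless bowl”: one quote notes that endless News Feeds and autoplay remove the stopping cue that tells a user it is time to move on. This is a minimal Python sketch of that difference, built on an entirely hypothetical content source; it illustrates the pattern and is not any platform’s actual feed code.

```python
import itertools

# Hypothetical content source; stands in for a platform's ranked inventory.
def fetch_ranked_items(page: int, page_size: int = 10) -> list[str]:
    return [f"post-{page}-{i}" for i in range(page_size)]

def finite_feed(pages: int):
    """A feed with a stopping cue: it eventually runs out and says so."""
    for page in range(pages):
        yield from fetch_ranked_items(page)
    yield "-- You're all caught up --"   # the signal that it is time to move on

def bottomless_feed():
    """A 'bottomless bowl': scrolling past the end just fetches another page."""
    for page in itertools.count():       # no termination condition at all
        yield from fetch_ranked_items(page)

if __name__ == "__main__":
    feed = bottomless_feed()
    # Simulate a user scrolling; there is never a cue to stop.
    for _ in range(25):
        print(next(feed))
```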
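The dark-patterns quote says the company “tests every pixel,” asking, for example, which shade of red best leads people to check their notifications. Mechanically, that kind of tuning is ordinary A/B testing optimized for an engagement metric. The sketch below is a generic, hypothetical version: the variants, the hash-based bucketing, and the simulated click rates are invented for illustration and are not Facebook’s actual experiment system.

```python
import hashlib
import random
from collections import defaultdict

# Hypothetical interface variants being tuned, e.g. the shade of red used
# for the notification badge.
VARIANTS = ["red_#FF0000", "red_#E02020", "red_#C62828"]

def assign_variant(user_id: str) -> str:
    """Deterministically bucket each user into one variant (hash-based split)."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % len(VARIANTS)
    return VARIANTS[bucket]

# Simulated experiment log: views of the badge and clicks on it, per variant.
views = defaultdict(int)
clicks = defaultdict(int)

random.seed(0)
for i in range(10_000):
    variant = assign_variant(f"user-{i}")
    views[variant] += 1
    # Stand-in for observed behavior; in a real system this is measured, not simulated.
    if random.random() < 0.10 + 0.01 * VARIANTS.index(variant):
        clicks[variant] += 1

# Ship whichever shade produced the highest click-through rate.
for v in VARIANTS:
    print(f"{v}: CTR = {clicks[v] / views[v]:.3f}")
print("winner:", max(VARIANTS, key=lambda v: clicks[v] / views[v]))
```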
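The Groups quote describes Facebook suggesting new conspiracy Groups “because they share members with the first conspiracy Group.” Stripped of everything else, that is a co-membership heuristic: score candidate groups by how many members they share with a group the user already joined. Here is a toy version with invented data, making no claim about how Facebook’s real recommender works.

```python
from collections import Counter

# Toy membership data: group name -> set of member ids (purely illustrative).
group_members = {
    "group_a": {"u1", "u2", "u3", "u4"},
    "group_b": {"u3", "u4", "u5"},
    "group_c": {"u4", "u5", "u6"},
    "group_d": {"u7", "u8"},
}

def recommend_groups(joined: str, top_n: int = 2) -> list[str]:
    """Suggest groups that share the most members with a group the user joined."""
    base = group_members[joined]
    overlap = Counter()
    for name, members in group_members.items():
        if name == joined:
            continue
        shared = len(base & members)
        if shared:
            overlap[name] = shared
    return [name for name, _ in overlap.most_common(top_n)]

# Groups with the most shared members come first.
print(recommend_groups("group_a"))
```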
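Finally, the quote about Guillaume Chaslot describes a program that took snapshots of what YouTube would recommend next, and the “three degrees of Alex Jones” joke is about how few recommendation hops it takes to reach such content. The sketch below shows the general shape of that kind of measurement as a breadth-first walk over a recommendation graph. The graph data and the get_recommendations function are placeholders; this is neither Chaslot’s code nor the YouTube API.

```python
from collections import deque

# Placeholder recommendation graph: video id -> ranked list of recommended ids.
# In a real crawl this would come from scraping the watch page's "Up next" list.
RECOMMENDATIONS = {
    "news_9_11": ["doc_9_11", "conspiracy_9_11"],
    "doc_9_11": ["conspiracy_9_11", "history_clip"],
    "conspiracy_9_11": ["alexjones_clip", "deep_state_clip"],
    "history_clip": ["news_9_11"],
}

def get_recommendations(video_id: str) -> list[str]:
    return RECOMMENDATIONS.get(video_id, [])

def degrees_to(seed: str, is_target, max_depth: int = 3) -> int | None:
    """Breadth-first walk over recommendations; return how many hops it takes
    to reach a video matching is_target, or None if not reached in max_depth."""
    seen = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        video, depth = frontier.popleft()
        if is_target(video):
            return depth
        if depth == max_depth:
            continue
        for rec in get_recommendations(video):
            if rec not in seen:
                seen.add(rec)
                frontier.append((rec, depth + 1))
    return None

hops = degrees_to("news_9_11", lambda v: v.startswith("alexjones"))
print(f"reached a flagged video in {hops} recommendation hops")
```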