Tags: filter bubble*

70 bookmark(s)

  1. As the visualization below shows, Michele, the bot pretending to be a fascist, enjoyed a radically different news feed experience from the others:
    Total number of posts seen by each bot (the wider the bar, the more posts), grouped by how many times the posts were repeated (the higher up the bar, the more times).

    Fash-bot Michele is shown a much smaller variety of posts, repeated way more often than normal — it saw some posts as often as 29 times in the 20 days represented in the data set.

    I mostly agree with studies such as Political polarization? Don’t blame the web, especially because the opposite belief is a kind of techno-determinism I feel doesn’t take into account a lot of political complexity. But the data above displays a frightening situation: the Michele bot has been segregated by the algorithm, and only receives content from a very narrow political area. Sure, let’s make fun of fascists because they see mostly pictures or because they are ill-informed, but these people will vote in 31 hours. Not cool.

    If you were curious what posts the Facebook algorithm deemed so essential that they had to be shown 29 times each (once a day or more, on average — each), here they are, all three of them. The third is peculiar, with its message that “mass media does not give us a platform, they never even mention our name, but people still declare they will vote for us. Mass media is a scam, spread the word”.
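    The grouping the visualization uses (posts per bot, bucketed by how many times each post was repeated) can be sketched in a few lines of Python. This is only an illustration of the counting, not the study's actual code, and the feed log below is invented:

```python
from collections import Counter

def repetition_distribution(view_log):
    """Given a list of (bot, post_id) feed impressions, return, per bot,
    a mapping {times_repeated: number_of_distinct_posts} -- the same
    grouping the visualization describes."""
    per_bot = {}
    for bot, post_id in view_log:
        per_bot.setdefault(bot, Counter())[post_id] += 1
    # For each bot, count how many distinct posts were seen k times.
    return {bot: Counter(counts.values()) for bot, counts in per_bot.items()}

# Invented toy log: "michele" sees the same post over and over,
# the other bot sees varied content.
log = [
    ("michele", "p1"), ("michele", "p1"), ("michele", "p1"),
    ("michele", "p2"),
    ("other", "a"), ("other", "b"), ("other", "c"), ("other", "a"),
]
print(repetition_distribution(log))
```

    A narrow, highly repetitive feed shows up as mass concentrated in the high-repetition buckets for one bot and in the low buckets for the others.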
  2. Who is doing the targeting?

    Albright: It really depends on the platform and the news event. Just the extensiveness of the far right around the election: I can’t talk about that right this second, but I can say that, very recently, what I’ve tended to see from a linking perspective and a network perspective is that the left, and even to some degree center-left news organizations and journalists, are really kind of isolated in their own bubble, whereas the right have very much populated most of the social media resources and use YouTube extensively. This study I did over the weekend shows the depth of the content and how much reach they have. I mean, they’re everywhere; it’s almost ubiquitous. They’re ambient in the media information ecosystem. It’s really interesting from a polarization standpoint as well, because self-identified liberals and self-identified conservatives have different patterns in unfriending people and in not friending people who have the opposite of their ideology.

    From those initial maps of the ad tech and hyperlink ecosystem of the election-related partisan news realm, I dove into every platform. For example, I did a huge study on YouTube last year. It led me to almost 80,000 fake videos that were being auto-scripted and batch-uploaded to YouTube. They were all keyword-stuffed. Very few of them had even a small number of views, so these weren't really about being watched; they were gaming the system. My guess is that they were meant to skew autocomplete or search suggestions in YouTube. It couldn't have been about monetization: the videos had very few views, and the sheer volume wouldn't have made sense with YouTube's business model.

    Someone had set up a script that detected social signals off of Twitter. It would go out and scrape related news articles, pull the text back in, and read it out in a computer voice, a Siri-type voice. It would pull images from Google Images, create a slideshow, package that up and wrap it, upload it to YouTube, hashtag it and load it with keywords. There were so many of these and they were going up so fast that as I was pulling data from the YouTube API dozens more would go up.

    I worked with The Washington Post on a project where I dug into Twitter and got, for the last week leading up to the election, a more or less complete set of Twitter data for a group of hashtags. I found what were arguably the top five most influential bots through that last week, and we found that the top one was not a completely automated account, it was a person.

    The Washington Post’s Craig Timberg looked around and actually found this person and contacted him and he agreed to an interview at his house. It was just unbelievable. It turns out that this guy was almost 70, almost blind.

    From Timberg’s piece: “Sobieski’s two accounts…tweet more than 1,000 times a day using ‘schedulers’ that work through stacks of his own pre-written posts in repetitive loops. With retweets and other forms of sharing, these posts reach the feeds of millions of other accounts, including those of such conservative luminaries as Fox News’s Sean Hannity, GOP strategist Karl Rove and Sen. Ted Cruz (R-Tex.), according to researcher Jonathan Albright…’Life isn’t fair,’ Sobieski said with a smile. ‘Twitter in a way is like a meritocracy. You rise to the level of your ability….People who succeed are just the people who work hard.'”

    The most dangerous accounts, the most influential accounts, are often accounts that are supplemented with human input, and also a human identity that’s very strong and possibly already established before the elections come in.

    I mean, I do hold that it’s not okay to come in and try to influence someone’s election; when I look at these YouTube videos, I think: Someone has to be funding this. In the case of the YouTube research, though, I looked at this more from a systems/politics perspective.

    We have a problem that’s greater than the one-off abuse of technologies to manipulate elections. This thing is parasitic. It’s growing in size. The last week and a half are some of the worst things I’ve ever seen, just in terms of the trending. YouTube is having to manually go in and take these videos out. YouTube’s search suggestions, especially in the context of fact-checking, are completely counter-productive. I think Russia is a side effect of our larger problems.

    Why is it getting worse?

    Albright: There are more people online, they’re spending more time online, there’s more content, people are becoming more polarized, algorithms are getting better, the amount of data that platforms have is increasing over time.

    I think one of the biggest things that’s missing from political science research is that it usually doesn’t consider the amount of time that people spend online. Between the 2012 election and the 2016 election, smartphone use went up by more than 25 percent. Many people spend all of their waking time somehow connected.

    This is where psychology really needs to come in. There’s been very little psychology work done looking at this from an engagement perspective, looking at the effect of seeing things in the News Feed but not clicking out. Very few people actually click out of Facebook. We really need social psychology, we really need humanities work to come in and pick up the really important pieces. What are the effects of someone seeing vile or conspiracy news headlines in their News Feed from their friends all day?

    Owen: This is so depressing.
  3. ‘Whatever the causes of political polarisation today, it is not social media or the internet.

    ‘If anything, most people use the internet to broaden their media horizons. We found evidence that people actively look to confirm the information that they read online, in a multitude of ways. They mainly do this by using a search engine to find offline media and validate political information. In the process they often encounter opinions that differ from their own and as a result whether they stumbled across the content passively or use their own initiative to search for answers while double checking their “facts”, some changed their own opinion on certain issues.’

    The research shows that respondents used an average of four different media sources, and had accounts on three different social media platforms. The more media outlets people used, the more they tended to avoid echo chambers.

    While age, income, ethnicity and gender were not found to significantly influence the likelihood of being in an echo chamber, political interest significantly did. Those with a keen political interest were most likely to be opinion leaders who others turn to for political information. Compared with the less politically inclined, these people were found to be media junkies, who consumed political content wherever they could find it, and as a result of this diversity they were less likely to be in an echo chamber.

    Dr Elizabeth Dubois, co-author and Assistant Professor at the University of Ottawa, said: ‘Our results show that most people are not in a political echo chamber. The people at risk are those who depend on only a single medium for political news and who are not politically interested: about 8% of the population. However, because of their lack of political engagement, their opinions are less formative and their influence on others is likely to be comparatively small.’
  4. Google tracks you on more than just their search engine. You may realize they also track you on YouTube, Gmail, Chrome, Android, Gmaps, and all the other services they run. For those, we recommend using private alternatives like DuckDuckGo for search. Yes, you can live Google-free. I’ve been doing it for many years.

    What you may not realize, though, is Google trackers are actually lurking behind the scenes on 75% of the top million websites. To give you a sense of how large that is, Facebook is the next closest with 25%. It’s a good bet that any random site you land on the Internet will have a Google tracker hiding on it. Between the two of them, they are truly dominating online advertising, by some measures literally making up 74%+ of all its growth. A key component of how they have managed to do that is through all these hidden trackers.

    Google Analytics is installed on most sites, tracking you behind the scenes, letting website owners know who is visiting their sites, but also feeding that information back to Google. Same for the ads themselves, with Google running three of the largest non-search ad networks installed on millions of sites and apps: Adsense, Admob, and DoubleClick.

    You know those ads that creepily follow you around everywhere? Most of those are actually run through these Google ad networks, where they let advertisers target you against your search history, browsing history, location history and other personal information they collect. Even less well known is they also enable advertisers like airlines to charge you different prices based upon your personal information.

    These ads are not only annoying — they are literally designed to manipulate you through targeting to make you buy more things, and just showing them to you is an act of Google profiting off of your personal information.

    At DuckDuckGo, we’ve expanded beyond our roots in search, to protect you no matter where you go on the Internet. Our DuckDuckGo browser extension and mobile app is available for all major browsers and devices, and blocks these Google trackers, along with the ones from Facebook and countless other data brokers. It does even more to protect you as well, such as providing smarter encryption.

    #3 — Get unbiased results, outside the Filter Bubble.

    When you search, you expect unbiased results, but that’s not what you get on Google. On Google, you get results tailored to what they think you’re likely to click on, based on the data profile they’ve built on you over time from all that tracking I described above.

    That may appear at first blush to be a good thing, but when most people say they want personalization in a search context they actually want localization. They want local weather and restaurants, which can actually be provided without tracking, like we do at DuckDuckGo. That’s because approximate location info is automatically embedded by your computer in the search request, which we can use to serve you local results and immediately throw away without tracking you.
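    The "use location once, then throw it away" approach described here can be sketched as follows. Everything in this sketch is a hypothetical stand-in (the coarse GeoIP table, the backend), not DuckDuckGo's actual implementation:

```python
# Hypothetical coarse IP-prefix -> region table; a real system would use
# a proper GeoIP database, but the privacy property is the same: the
# resolved region is used once and never stored.
COARSE_GEOIP = {"93.184.": "Norwalk, US", "203.0.": "Sydney, AU"}

def local_results(query, client_ip, search_backend):
    """Resolve a coarse region from the request, use it to fetch local
    results, and return them. Nothing about the client is logged."""
    prefix = ".".join(client_ip.split(".")[:2]) + "."
    region = COARSE_GEOIP.get(prefix, "unknown")
    # `region` goes out of scope when this function returns; no profile
    # is built, which is the whole point of localization vs. tracking.
    return search_backend(query, region)
```

    The design choice is that localization needs only the information already present in the incoming request, so no history of past searches ever has to exist.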

    Beyond localization, personalized results are dangerous because to show you results they think you’ll click on, they must filter results they think you’ll skip. That’s why it’s called the Filter Bubble.

    So if you have political leanings one way or another, you’re more likely to get results you already agree with, and less likely to ever see opposing viewpoints. In the aggregate this leads to increased echo chambers that are significantly contributing to our increasingly polarized society.

    This Filter Bubble is especially pernicious in a search context because you have the expectation that you’re seeing what others are seeing, that you’re seeing the “results.” We’ve done studies over the years where we have people search for the same topics on Google at the same time, even in “Incognito” mode, and found that the results they get are significantly tailored.
  5. As problematic as Facebook has become, it represents only one component of a much broader shift into a new human connectivity that is both omnipresent (consider the smartphone) and hypermediated—passing through and massaged by layer upon layer of machinery carefully hidden from view. The upshot is that it’s becoming increasingly difficult to determine what in our interactions is simply human and what is machine-generated. It is becoming difficult to know what is real.

    Before the agents of this new unreality finish this first phase of their work and then disappear completely from view to complete it, we have a brief opportunity to identify and catalogue the processes shaping our drift to a new world in which reality is both relative and carefully constructed by others, for their ends. Any catalogue must include at least these four items:

    the monetisation of propaganda as ‘fake news’;
    the use of machine learning to develop user profiles accurately measuring and modelling our emotional states;
    the rise of neuromarketing, targeting highly tailored messages that nudge us to act in ways serving the ends of others;
    a new technology, ‘augmented reality’, which will push us to sever all links with the evidence of our senses.

    The fake news stories floated past as jetsam on Facebook’s ‘newsfeed’, that continuous stream of shared content drawn from a user’s Facebook contacts, a stream generated by everything everyone else posts or shares. A decade ago that newsfeed had a raw, unfiltered quality, the sense that everyone was seeing everything, but as Facebook has matured it has engaged increasingly opaque ‘algorithms’ to curate (or censor) the newsfeed, producing something that feels much more comfortable and familiar.

    This seems like a useful feature to have, but the taming of the newsfeed comes with a consequence: Facebook’s billions of users compose their world view from what flows through their feeds. Consider the number of people on public transport—or any public place—staring into their smartphones, reviewing their feeds, marvelling at the doings of their friends, reading articles posted by family members, sharing video clips or the latest celebrity outrages. It’s an activity now so routine we ignore its omnipresence.

    Curating that newsfeed shapes what Facebook’s users learn about the world. Some of that content is controlled by the user’s ‘likes’, but a larger part is derived from Facebook’s deep analysis of a user’s behaviour. Facebook uses ‘cookies’ (invisible bits of data hidden within a user’s web browser) to track the behaviour of its users even when they’re not on the Facebook site—and even when they’re not users of Facebook. Facebook knows where its users spend time on the web, and how much time they spend there. All of that allows Facebook to tailor a newsfeed to echo the interests of each user. There’s no magic to it, beyond endless surveillance.

    What is clear is that Facebook has the power to sway the moods of billions of users. Feed people a steady diet of playful puppy videos and they’re likely to be in a happier mood than people fed images of war. Over the last two years, that capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.

    We all like to believe we’re in the right, and when we get some sign from the universe at large that we are correct, we feel better about ourselves. That’s how the curated newsfeed became wedded to the world of profitable propaganda.

    Adding a little art to brighten an otherwise dull wall seems like an unalloyed good, but only if one completely ignores bad actors. What if that blank canvas gets painted with hate speech? What if, perchance, the homes of ‘undesirables’ are singled out with graffiti that only bad actors can see? What happens when every gathering place for any oppressed community gets invisibly ‘tagged’? In short, what happens when bad actors use Facebook’s augmented reality to amplify their own capacity to act badly?

    But that’s Zuckerberg: he seems to believe his creations will only be used to bring out the best in people. He seems to believe his gigantic sharing network would never be used to incite mob violence. Just as he seems to claim that Facebook’s capacity to collect and profile the moods of its users should never be monetised—but, given that presentation unearthed by the Australian, Facebook tells a different story to advertisers.

    Regulating Facebook enshrines its position as the data-gathering and profile-building organisation, while keeping it plugged into and responsive to the needs of national powers. Before anyone takes steps that would cement Facebook in our social lives for the foreseeable future, it may be better to consider how this situation arose, and whether—given what we now know—there might be an opportunity to do things differently.
  6. There are 1.5 billion YouTube users in the world, which is more than the number of households that own televisions. What they watch is shaped by this algorithm, which skims and ranks billions of videos to identify 20 “up next” clips that are both relevant to a previous video and most likely, statistically speaking, to keep a person hooked on their screen.

    Company insiders tell me the algorithm is the single most important engine of YouTube’s growth. In one of the few public explanations of how the formula works – an academic paper that sketches the algorithm’s deep neural networks, crunching a vast pool of data about videos and the people who watch them – YouTube engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence”.
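    At its core, the ranking stage such a system performs scores every candidate video with a learned engagement model and keeps the top 20. The following is a deliberately toy sketch of that idea: a dot product of made-up feature vectors stands in for the deep network, and none of the names or numbers come from YouTube:

```python
def recommend_up_next(user_vec, candidates, k=20):
    """Toy ranking stage: score each candidate video against a user
    profile and return the k highest-scoring video ids as the
    'up next' list. The scoring model here is just a dot product,
    a stand-in for the deep neural network the paper describes."""
    def score(video_vec):
        return sum(u * v for u, v in zip(user_vec, video_vec))
    ranked = sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True)
    return [video_id for video_id, _ in ranked[:k]]

# Invented example: feature axes might be (cuteness, outrage).
candidates = {"cat": (1.0, 0.0), "conspiracy": (0.9, 0.9), "news": (0.0, 1.0)}
user = (0.2, 1.0)  # a user whose watch history skews toward outrage
print(recommend_up_next(user, candidates, k=2))
```

    Whatever the model optimises (predicted watch time, clicks) is what the ranking rewards, which is exactly why, as the quote below notes, nothing in this machinery selects for what is truthful or balanced.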

    The algorithm does not appear to be optimising for what is truthful, or balanced, or healthy for democracy
    Guillaume Chaslot, an ex-Google engineer

    Lately, it has also become one of the most controversial. The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children’s character Peppa Pig eats her father or drinks bleach.

    Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from making advertising revenues from his strange videos – which featured his children receiving flu shots, removing earwax, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm. “That’s what got us out there and popular,” he said. “We learned to fuel it and do whatever it took to please the algorithm.”
  7. CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

    The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

    The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because Mosseri says “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting . . . and seeing . . . what content gets more comments, more likes, more reshares.”
  8. Facebook has turned into a toxic commodity since Mr Trump was elected. Big Tech is the new big tobacco in Washington. It is not a question of whether the regulatory backlash will come, but when and how.

    Mr Zuckerberg bears responsibility for this. Having denied Facebook’s “filter bubble” played any role in Mr Trump’s victory — or Russia’s part in helping clinch it — Mr Zuckerberg is the primary target of the Democratic backlash. He is now asking America to believe that he can turn Facebook’s news feed from an echo chamber into a public square. Revenue growth is no longer the priority. “None of that matters if our services are used in a way that doesn’t bring people closer together,” he says.

    How will Mr Zuckerberg arrange this Kumbaya conversion? By boosting the community ties that only Facebook can offer. Readers will forgive me if I take another lie down. Mr Zuckerberg suffers from two delusions common to America’s new economy elites. They think they are nice people — indeed, most of them are. Mr Zuckerberg seems to be, too. But they tend to cloak their self-interest in righteous language. Talking about values has the collateral benefit of avoiding talking about wealth. If the rich are giving their money away to good causes, such as inner city schools and research into diseases, we should not dwell on taxes. Mr Zuckerberg is not funding any private wars in Africa. He is a good person. The fact that his company pays barely any tax is therefore irrelevant.

    The second liberal delusion is to believe they have a truer grasp of people’s interests than voters themselves. In some cases that might be true. It is hard to see how abolishing health subsidies will help people who live in “flyover” America. But here is the crux. It does not matter how many times Mr Zuckerberg invokes the magic of online communities. They cannot substitute for the real ones that have gone missing. Bowling online together is no cure for bowling offline alone.

    The next time Mr Zuckerberg wants to showcase Facebook, he should invest some of his money in an actual place. It should be far away from any of America’s booming cities — say Youngstown, Ohio. For the price of a couple of days’ Facebook revenues, he could train thousands of people. He might even fund a newspaper to make up for social media’s destruction of local journalism. The effect could be electrifying. Such an example would bring a couple more benefits. First, it would demonstrate that Mr Zuckerberg can listen, rather than pretending to. Second, people will want to drop round to his place for dinner.
    In Slovakia, data from Facebook-owned analytics site CrowdTangle shows that “interactions” – engagement such as likes, shares and comments – fell by 60% overnight for the Facebook pages of a broad selection of the country’s media outlets. Filip Struhárik, a Slovakian journalist with news site Denník N, says the situation has since worsened, falling by a further 5%.

    “Lower reach can be a problem for smaller publishers, citizens’ initiatives, small NGOs,” Struhárik said. “They can’t afford to pay for distribution on Facebook by boosting posts – and they don’t have infrastructure to reach people other ways.”

    Struhárik thinks his employer will survive the change. Denník N has subscription revenue, which means it doesn’t rely on the vast traffic that Facebook can drive for advertising income, and ensures that its most dedicated readers go straight to its homepage for their news. But Fernandez, in Guatemala, is much more concerned.
  10. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.
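    This rich-get-richer dynamic is easy to simulate. The following is my own toy model, not Facebook's actual algorithm: impressions are allocated in proportion to past engagement, and each impression nudges that friend's weight upward, so a small in-group gradually captures the feed:

```python
import random

def simulate_feed(n_friends=10, steps=2000, seed=1):
    """Toy engagement feedback loop: show posts in proportion to each
    friend's weight, and bump the weight of whoever was shown. Returns
    the share of all impressions captured by the most-shown friend."""
    rng = random.Random(seed)
    weights = [1.0] * n_friends      # everyone starts equal
    impressions = [0] * n_friends
    for _ in range(steps):
        total = sum(weights)
        r = rng.uniform(0, total)    # weighted random choice
        acc = 0.0
        for i, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        impressions[i] += 1
        weights[i] += 0.5            # engagement feeds back into ranking
    return max(impressions) / steps

print(simulate_feed())
```

    With ten equally interesting friends, an unbiased feed would give each a 10% share; runs of this model typically concentrate far more than that on a few friends, purely from the feedback term.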

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:

    Of course this doesn’t just apply to Facebook.


Page 1 of 7 — Online Bookmarks of M. Fioretti: tagged with "filter bubble"

About - Propulsed by SemanticScuttle