mfioretti: algorithms* + filter bubble*

Bookmarks on this page are managed by an admin user.

15 bookmark(s), sorted by date, descending.

  1. CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

    The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

    The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because Mosseri says “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting . . . and seeing . . . what content gets more comments, more likes, more reshares.”
    https://techcrunch.com/2018/01/11/facebook-time-well-spent
    Voting 0
  2. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly.

    Of course this doesn’t just apply to Facebook.
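
    The loop described above is easy to reproduce in miniature. Below is a toy sketch (my illustration with made-up parameters, not Facebook’s actual ranking code) in which each interaction raises a friend’s ranking weight, so their posts surface more often, which invites still more interaction.

```python
# Toy model of an engagement feedback loop. All parameters are illustrative.
import random
from collections import Counter

def simulate_feed(n_friends=50, impressions=5000, engage_prob=0.3, seed=0):
    rng = random.Random(seed)
    weights = {f: 1.0 for f in range(n_friends)}  # every friend starts equal
    shown = Counter()
    for _ in range(impressions):
        # "Rank": sample a friend proportionally to accumulated engagement.
        friends, w = zip(*weights.items())
        friend = rng.choices(friends, weights=w, k=1)[0]
        shown[friend] += 1
        # Engagement feeds back into the ranking weight, closing the loop.
        if rng.random() < engage_prob:
            weights[friend] += 1.0
    top5 = sum(count for _, count in shown.most_common(5))
    print(f"Top 5 of {n_friends} friends received {top5 / impressions:.0%} of impressions")

simulate_feed()
```

    Although every friend starts out identical, early random engagements compound, and a handful of friends end up over-represented several-fold: the seed of the “in-group” the excerpt describes.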
    https://techcrunch.com/2017/06/04/whe...m_source=tctwreshare&sr_share=twitter
    Voting 0
  3. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, he argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html
    Voting 0
  4. With increased political polarization, amplified by homophily — our preference to connect to people like us — and algorithmic recommender systems, we’re effectively constructing our own realities.

    Two years ago I wrote about how social networks helped Israelis and Palestinians build a form of personalized propaganda during the last Israel-Gaza war. The shape of conversations and responses to events typically looked something like the graph below, where one frame of the story tends to stay on only one side of the graph, while a completely different take spreads on the other.
    [Graph omitted: typical polarized social networked space for information spreading about the Israeli-Palestinian conflict.]

    In the cases that I was investigating, neither side of the graph’s frame was false per se. Rather, each carefully crafted story on either side omitted important detail and context. When this happens constantly, on a daily basis, it systematically and deeply affects people’s perception of what is real.
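
    For intuition, here is a small, entirely hypothetical simulation of that pattern: when most of each user’s contacts are on their own side (homophily), a frame seeded on one side of the graph almost never crosses to the other.

```python
# Toy model: two communities, two framings of the same event. Illustrative only.
import random

def simulate(n=200, p_same=0.9, rounds=5, contacts=10, seed=1):
    rng = random.Random(seed)
    side = [i < n // 2 for i in range(n)]        # two halves of the graph
    frame = ["A" if s else "B" for s in side]    # frame A seeded left, B right
    for _ in range(rounds):
        new_frames = []
        for i in range(n):
            heard = []
            for _ in range(contacts):
                # Homophily: a contact is on the user's own side with prob p_same.
                same = rng.random() < p_same
                pool = [j for j in range(n)
                        if j != i and side[j] == (side[i] if same else not side[i])]
                heard.append(frame[rng.choice(pool)])
            # Adopt whichever frame dominates what was heard.
            new_frames.append(max(set(heard), key=heard.count))
        frame = new_frames
    crossed = sum(1 for i in range(n) if frame[i] != ("A" if side[i] else "B"))
    print(f"{crossed}/{n} users ended up carrying the other side's frame")

simulate()
```

    Neither frame is false in this toy world either; each simply stays on its own side of the graph, which is the shape the author reports observing.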
    https://points.datasociety.net/fake-n...ot-the-problem-f00ec8cdfcb#.ha2sk7ijs
    Voting 0
  5. I don’t want to drag this out, but back then, as today, I had no control at all over the personal data and information gathered, voluntarily or forcibly, at my every move; what somehow saved me during a difficult adolescence (not always, in truth) was control over the social situation and its context.

    I had no control over data and information with the village butcher, and I cannot expect to have it today on the web with Google, Facebook and, above all, the thousand state agencies afflicted, for various and sometimes commendable reasons, by informational bulimia. But back then I was aware of, and to some extent governed, the mundane technical rules (the village streets, the bus timetables) and the social rules of proximity of my own territory.

    Today I no longer can. And it is not only because of the quantity of data captured and stored at my every step, but because of the total opacity of the context and of the technical and social rules that govern our digital life.

    Unknown algorithms, unfathomable even to their own creators, reconstruct our image, produce scores, and judge relevance and suitability entirely without our knowledge. Banks, insurers, companies of every stripe and shape (soon the Internet of Things will astonish us), but above all the State, with its thousand verification and control agencies, access every piece of information stripped of its context, creating relations and correlations we are not aware of, but whose consequences we suffer every day.

    We cannot prevent all this; big data and open data will save the world, fine. But we can and must demand to know the who, the how and the when. We need to know what the context is and what the rules are; only then can we find strategies, not to commit crimes or evade the law (as part of the judiciary claims), but to exercise the fundamental rights of the person.

    In the physical world we know when the State has the right to enter our home, or under what conditions it may limit our personal freedom, our freedom of movement and of expression; in the digital world we do not know, and do not even ask ourselves, who may take possession of our data, of our devices through hidden software, of our lives, when, and under what conditions. We supinely accept an intolerable opacity.

    I have had something to hide for as long as I can remember: confidences that vary with the interlocutor, the time, the place and the context. And I do not want, for myself and my children, a society stupidly disciplined by constant surveillance and lobotomized by algorithms. I would like a society in which the asymmetry of information is the exact opposite of today’s, where unfortunately the citizen is totally transparent while the State and its rules are opaque and uncertain.
    Carlo Blengino

    A criminal defense lawyer, he deals in the courtroom with the law of new technologies, copyright issues and data protection. He is a fellow of the NEXA Center for Internet & Society at the Politecnico di Torino. @CBlengio on Twitter
    http://www.ilpost.it/carloblengino/2016/11/02/ho-qualcosa-da-nascondere
    Voting 0
  6. Your feed might be one of angst and despair, or of celebrations and “I told you so’s.” It depends on the people you’re friends with and the online community you’ve created with your clicks, likes and shares.

    Facebook's algorithm knows what you like based on the videos you watch, people you talk to, and content you interact with. It then shows you more of the same. This creates something called "filter bubbles." You begin to see only the content you like and agree with, while Facebook hides dissenting points of view.

    This means news on Facebook comes with confirmation bias -- it reinforces what you already think is true -- and people are increasingly frustrated.

    Facebook denies it's a media company, yet almost half of U.S. adults get news from Facebook.

    When Facebook fired its human curators and began to rely on algorithms to surface popular stories earlier this year, fake news proliferated.

    Viral memes and propaganda spread among people with similar beliefs and interests. It's cheaper and easier to create and spread ideological disinformation than deeply researched and reported news. And it comes from all over -- teens in Macedonia are responsible for a large portion of fake pro-Trump news, according to a BuzzFeed analysis.

    Filter bubbles became especially problematic during the presidential election.

    Hyperpartisan news sites and fake websites distributed false stories about voter fraud, election conspiracies, and the candidates' pasts that spread like wildfire on Facebook. It was more prevalent on right-leaning Facebook pages. As CNNMoney's Brian Stelter said in response to the growing number of false viral stories, people should have a "triple check before you share" rule.

    Today, many people are shocked by Trump's victory. Words of fear and sorrow fill their Facebook feeds, and even those with thousands of friends are probably only seeing posts that echo their feelings.

    But if you voted for Trump, chances are your feed reflects the opposite. You might see a cascade of #MakeAmericaGreatAgain hashtags and friends celebrating.
    http://money.cnn.com/2016/11/09/techn...2Fedition_us+%28RSS%3A+CNNi+-+U.S.%29
    Voting 0
  7. Of course, algorithms aren't neutral, which is the real issue. Facebook is a powerful media gatekeeper because of the artificial scarcity of the News Feed — unlike Twitter, which blasts users with a firehose of content, Facebook's News Feed algorithm controls what you see from all the people and organizations you follow. And changes to the News Feed algorithm divert enormous amounts of attention: last year Facebook was sending massive amounts of traffic to websites, but earlier this year Facebook prioritized video and that traffic dipped sharply. This month Facebook is prioritizing live video, so the media started making live videos. When media people want to complain, they complain about having to chase Facebook, because it feels like Facebook has a ton of control over the media. (Disclosure: Facebook is paying Verge parent company Vox Media to create Facebook Live videos.)
    http://www.theverge.com/2016/5/10/116...k-trending-box-bias-conservative-news
    Voting 0
  8. Quit fracking our lives to extract data that’s none of your business and that your machines misinterpret. — New Clues, #58

    That’s the blunt advice David Weinberger and I give to marketers who still make it hard to talk, sixteen years after many of them started failing to get what we meant by Markets are Conversations.

    In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

    Even if our own intelligence is not yet artificialized, what’s feeding it surely is.

    In The Filter Bubble, after explaining Google’s and Facebook’s very different approaches to personalized “experience” filtration, and the assumptions behind both, Eli Pariser says both companies’ approximations are based on “a bad theory of you,” and come up with “pretty poor representations of who we are, in part because there is no one set of data that describes who we are.” He says the ideal of perfect personalization dumps us into what animators, puppetry and robotics engineers call the uncanny valley: a “place where something is lifelike but not convincingly alive, and it gives people the creeps.”

    Sanity requires that we line up many different personalities behind a single first person pronoun: I, me, mine. And also behind multiple identifiers. In my own case, I am Doc to most of those who know me, David to various government agencies (and most of the entities that bill me for stuff), Dave to many (but not all) family members, @dsearls to Twitter, and no name at all to the rest of the world, wherein I remain, like most of us, anonymous (literally, nameless), because that too is a civic grace. (And if you doubt that, ask any person who has lost their anonymity through the Faustian bargain called celebrity.)

    Third, advertising needs to return to what it does best: straightforward brand messaging that is targeted at populations, and doesn’t get personal. For help with that, start reading
    https://medium.com/@dsearls/on-market...bad-guesswork-88a84de937b0#.deu5ue16x
    Voting 0
  9. Younger Internet users like to joke about how Facebook “is the new TV,” but in the case of political news consumption that appears to be literally true, according to a new study from the Pew Research Center for Media and Journalism. More than 60% of millennials who were surveyed said that during the previous week they got their political news from Facebook, compared with 37% who got it from TV.

    Facebook came under fire recently for a study that it funded, done by a number of in-house scientists, which looked at whether news-feed users were subjected to differing political points of view. Although the study said that the decisions of users themselves determined how much they were exposed to different points of view, a number of experts took issue with that explanation.

    These experts pointed out that Facebook’s own data confirmed that for one test group, the algorithmically filtered news-feed did affect the amount of alternative political commentary and news they were exposed to. But even more important than that, Facebook’s study pretended that a user’s experience on the site could be looked at separately from the functioning of the algorithm, when the two are so closely linked that it’s almost impossible to separate them.

    For older members of the “Baby Boom” generation, meanwhile, those figures were almost exactly reversed.
    https://fortune.com/2015/06/01/facebook-algorithm-news-millennials
    Voting 0
  10. Much of the paper is written as if it is about adult U.S. Facebook users in general, but that is not the case. Those included in the study are just those who self-identify their politics on the site. This is a rare behavior, something only 9% of users do. This 9% number is not in the report itself but in a separate supporting-materials appendix, yet it is crucial for interpreting the results. The population number given in the report is 10.1 million people, which yea omg is a very big number, but don’t fall for the Big-N trick: we don’t know how this 9% is different from Facebook in general. We cannot treat this as a sample of “Facebook users” or even “Facebook liberals and conservatives”, as the authors do in various parts of the report, but instead as being about the rare people who explicitly state their political orientation on their Facebook profile.* Descriptive statistics comparing the few who explicitly self-identify, and therefore enter into the study, with those who do not are not provided. Who are they, how are they different from the rest of us, and why are they important to study are all obvious things to discuss that the report doesn’t. We might infer that people who self-identify are more politically engaged, but anecdotally, nearly all my super politically engaged Facebook friends don’t explicitly list their political orientation on the site. Facebook’s report talks about Facebook users, which isn’t accurate. All the findings should be understood to be about Facebook users who also put their political orientation on their profiles, who may or may not be like the rest of Facebook users in lots of interesting and research-confounding ways. The researchers had an obligation to make this limitation much more clear, even if it tempered their grand conclusions.

    So, AMONG THOSE RARE USERS WHO EXPLICITLY SELF-IDENTIFY THEIR POLITICAL ORIENTATION ON THEIR FACEBOOK PROFILES, the study looks at the flow of news stories that are more liberal versus conservative as they are shared on Facebook; how those stories are seen and clicked on as they are shared by liberals to other liberals and conservatives to other conservatives; and, most important for this study, the information that is politically cross-cutting, that is, shared by someone on the right and then seen by someone on the left and vice versa. The measure of conservative or liberal news stories is a simple and, in my opinion, effective one: the degree to which a web domain is shared by people on the right is the degree to which content on that domain is treated as conservative (and the same goes for politically liberal content). And they differentiated between soft (entertainment) and hard (news) content, only including the latter in this study. The important work is seeing whether Facebook, as a platform, is creating a filter bubble where people only see what they’d already agree with, as opposed to more diverse and challenging “cross-cutting” information.
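
    For concreteness, here is a sketch of that share-based measure as I read it (hypothetical domains, invented data): a domain’s alignment is the average self-reported ideology of the users who share it, with -1 for liberals and +1 for conservatives.

```python
# Share-based domain alignment, per the excerpt's description. Data is invented.
from collections import defaultdict

def domain_alignment(shares):
    """shares: iterable of (domain, sharer_ideology) pairs, ideology in [-1, +1]."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for domain, ideology in shares:
        totals[domain] += ideology
        counts[domain] += 1
    return {d: totals[d] / counts[d] for d in totals}

shares = [
    ("example-right.com", +1), ("example-right.com", +1), ("example-right.com", -1),
    ("example-left.com", -1), ("example-left.com", -1),
]
print(domain_alignment(shares))
# {'example-right.com': 0.333..., 'example-left.com': -1.0}
```

    A story hosted on example-right.com would then count as “cross-cutting” when it shows up in a self-identified liberal’s feed, and vice versa.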

    The Facebook researchers looked at how much, specifically, the newsfeed algorithm promotes the filter bubble, that is, showing users what they will already agree with over and above a non-algorithmically-sorted newsfeed. The newsfeed algorithm provided 8% less conservative content for liberals versus a non-algorithmically sorted feed, and 5% less liberal content for conservatives. This is an outcome directly attributable to the structure of Facebook itself.

    Facebook published this finding, that the newsfeed algorithm encourages users to see what they already would agree with more than if the algorithm weren’t there, ultimately because Facebook wants to make the case that its algorithm isn’t as big a factor in this political confirmation bias as people’s individual choices, stating, “individual choice has a larger role in limiting exposure to ideologically cross cutting content.” The researchers estimate that conservatives click on 17% fewer ideologically opposed news stories, and liberals on 6% fewer, than would be expected if users clicked on random links in their feed.
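
    A back-of-the-envelope composition of those two effects, using the percentages quoted above; the 20% baseline share of cross-cutting stories is my illustrative assumption, not a figure from the study.

```python
# Compose the two reductions quoted above. The 0.20 baseline is assumed.
def cross_cutting_exposure(baseline, algo_reduction, click_reduction):
    seen = baseline * (1 - algo_reduction)    # after News Feed ranking
    clicked = seen * (1 - click_reduction)    # after the user's own choices
    return seen, clicked

for label, algo, click in [("liberals", 0.08, 0.06),
                           ("conservatives", 0.05, 0.17)]:
    seen, clicked = cross_cutting_exposure(0.20, algo, click)
    print(f"{label}: {seen:.1%} seen, {clicked:.1%} clicked "
          f"(vs. a 20.0% unranked baseline)")
```

    On these numbers the click-through effect outweighs the ranking effect for conservatives but not for liberals, which is one reason the blanket conclusion quoted below is contested.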

    The report concludes that “we conclusively establish that on average in the context of Facebook, individual choices matter more than algorithms”. Nooo, this just simply isn’t the case.
    http://thesocietypages.org/cyborgolog...2015/05/07/facebook-fair-and-balanced
    Voting 0

Page 1 of 2. Online Bookmarks of M. Fioretti: Tags: algorithms + filter bubble

About - Propulsed by SemanticScuttle