mfioretti: algorithms* + filter bubble*

Bookmarks on this page are managed by an admin user.

19 bookmark(s), sorted by date (newest first)

  1. Which brings us back to Facebook, which to this day seems at best to dimly understand how the news business works, as is evident in its longstanding insistence that it's not a media company. Wired was even inspired to publish a sarcastic self-help quiz for Facebook execs on "How to tell if you're a media company." It included such questions as "Are you the country's largest source of news?"

    The answer is a resounding yes. An astonishing 45 percent of Americans get their news from this single source. Add Google, and more than 70 percent of Americans get their news from just two outlets. The two firms also ate up about 89 percent of the digital-advertising growth last year, underscoring their duopolistic power in this industry.

    Facebook's cluelessness on this front makes the ease with which it took over the press that much more bizarre to contemplate. Of course, the entire history of Facebook is pretty weird, even by Silicon Valley standards, beginning with the fact that the firm thinks of itself as a movement and not a giant money-sucking machine.


    That Facebook saw a meteoric rise without ever experiencing a big dip in users might have something to do with the fact that the site was consciously designed to be addictive, as founding president Sean Parker recently noted at a conference in Philadelphia.

    Facebook is full of features such as "likes" that dot your surfing experience with neuro-rushes of micro-approval – a "little dopamine hit," as Parker put it. The hits might come with getting a like when you post a picture of yourself thumbs-upping the world's third-largest cheese wheel, or flashing the "Live Long and Prosper" sign on International Star Trek day, or whatever the hell it is you do in your cyber-time. "It's a social-validation feedback loop," Parker explained. "Exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."
    https://www.rollingstone.com/politics...e-be-saved-social-media-giant-w518655
    Voting 0
  2. As the visualization below shows, Michele, the bot pretending to be a fascist, enjoyed a radically different news feed experience from the others:
    [Chart: total number of posts seen by each bot (the wider the bar, the more posts), grouped by how many times each post was repeated (the higher up the bar, the more repetitions).]

    Fash-bot Michele is shown a much smaller variety of posts, repeated way more often than normal — it saw some posts as often as 29 times in the 20 days represented in the data set.

    I mostly agree with studies such as Political polarization? Don’t blame the web, especially because the opposite belief is a kind of techno-determinism I feel doesn’t take into account a lot of political complexity. But the data above displays a frightening situation: the Michele bot has been segregated by the algorithm, and only receives content from a very narrow political area. Sure, let’s make fun of fascists because they see mostly pictures or because they are ill-informed, but these people will vote in 31 hours. Not cool.

    If you are curious which posts the Facebook algorithm deemed so essential that they had to be shown 29 times each (more than once a day, on average), here they are, all three of them. The third is peculiar, with its message that “mass media does not give us a platform, they never even mention our name, but people still declare they will vote for us. Mass media is a scam, spread the word”.
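    For readers who want to reproduce that kind of count from raw feed data, here is a minimal Python sketch, assuming a hypothetical log of (bot, post_id) impressions similar to the one the researchers collected; the sample data, bot names and function are invented for illustration.

    ```python
    from collections import Counter

    # Hypothetical feed log: one (bot_profile, post_id) pair per impression,
    # collected over an observation window such as the 20 days described above.
    impressions = [
        ("michele", "post_a"), ("michele", "post_a"), ("michele", "post_b"),
        ("undecided", "post_c"), ("undecided", "post_d"), ("undecided", "post_a"),
    ]

    def repetition_distribution(impressions, bot):
        """How many distinct posts were shown once, twice, three times... to this bot."""
        per_post = Counter(pid for b, pid in impressions if b == bot)
        return Counter(per_post.values())

    for bot in ("michele", "undecided"):
        dist = repetition_distribution(impressions, bot)
        print(bot, "distinct posts:", sum(dist.values()), "repeat profile:", dict(dist))
    ```

    Run against a real export, a narrow feed shows up as few distinct posts with high repeat counts, which is exactly the pattern the excerpt describes for the Michele bot.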
    https://medium.com/@trackingexposed/t...es-fascists-from-reality-d36739b0758b
    Voting 0
  3. As problematic as Facebook has become, it represents only one component of a much broader shift into a new human connectivity that is both omnipresent (consider the smartphone) and hypermediated—passing through and massaged by layer upon layer of machinery carefully hidden from view. The upshot is that it’s becoming increasingly difficult to determine what in our interactions is simply human and what is machine-generated. It is becoming difficult to know what is real.

    Before the agents of this new unreality finish this first phase of their work and then disappear completely from view to complete it, we have a brief opportunity to identify and catalogue the processes shaping our drift to a new world in which reality is both relative and carefully constructed by others, for their ends. Any catalogue must include at least these four items:

    the monetisation of propaganda as ‘fake news’;
    the use of machine learning to develop user profiles accurately measuring and modelling our emotional states;
    the rise of neuromarketing, targeting highly tailored messages that nudge us to act in ways serving the ends of others;
    a new technology, ‘augmented reality’, which will push us to sever all links with the evidence of our senses.



    The fake news stories floated past as jetsam on Facebook’s ‘newsfeed’, that continuous stream of shared content drawn from a user’s Facebook contacts, a stream generated by everything everyone else posts or shares. A decade ago that newsfeed had a raw, unfiltered quality, a sense that everyone was doing everything at once, but as Facebook has matured it has employed increasingly opaque ‘algorithms’ to curate (or censor) the newsfeed, producing something that feels much more comfortable and familiar.

    This seems like a useful feature to have, but the taming of the newsfeed comes with a consequence: Facebook’s billions of users compose their world view from what flows through their feeds. Consider the number of people on public transport—or any public place—staring into their smartphones, reviewing their feeds, marvelling at the doings of their friends, reading articles posted by family members, sharing video clips or the latest celebrity outrages. It’s an activity now so routine we ignore its omnipresence.

    Curating that newsfeed shapes what Facebook’s users learn about the world. Some of that content is controlled by the user’s ‘likes’, but a larger part is derived from Facebook’s deep analysis of a user’s behaviour. Facebook uses ‘cookies’ (invisible bits of data hidden within a user’s web browser) to track the behaviour of its users even when they’re not on the Facebook site—and even when they’re not users of Facebook. Facebook knows where its users spend time on the web, and how much time they spend there. All of that allows Facebook to tailor a newsfeed to echo the interests of each user. There’s no magic to it, beyond endless surveillance.
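    As an illustration only (not Facebook’s actual system), the sketch below shows the general mechanism this paragraph describes: a crude interest profile built from tracked time spent on topics around the web, then used to rank candidate feed items. All topics, numbers and names are hypothetical.

    ```python
    from collections import defaultdict

    # Hypothetical tracking log: (topic, seconds_spent) pairs gathered across the web.
    browsing_log = [("cycling", 300), ("politics", 1200), ("cycling", 150), ("cooking", 60)]

    # Build a crude interest profile: share of total tracked time per topic.
    time_per_topic = defaultdict(float)
    for topic, seconds in browsing_log:
        time_per_topic[topic] += seconds
    total = sum(time_per_topic.values())
    profile = {topic: seconds / total for topic, seconds in time_per_topic.items()}

    # Candidate feed items tagged by topic, ranked by the user's inferred interest.
    candidates = [("New bike lanes announced", "cycling"),
                  ("Election results analysis", "politics"),
                  ("Quick pasta recipe", "cooking")]
    feed = sorted(candidates, key=lambda item: profile.get(item[1], 0.0), reverse=True)
    print(feed)  # politics first, cooking last: the feed echoes where the time went
    ```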

    What is clear is that Facebook has the power to sway the moods of billions of users. Feed people a steady diet of playful puppy videos and they’re likely to be in a happier mood than people fed images of war. Over the last two years, that capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.

    We all like to believe we’re in the right, and when we get some sign from the universe at large that we are correct, we feel better about ourselves. That’s how the curated newsfeed became wedded to the world of profitable propaganda.

    Adding a little art to brighten an otherwise dull wall seems like an unalloyed good, but only if one completely ignores bad actors. What if that blank canvas gets painted with hate speech? What if, perchance, the homes of ‘undesirables’ are singled out with graffiti that only bad actors can see? What happens when every gathering place for any oppressed community gets invisibly ‘tagged’? In short, what happens when bad actors use Facebook’s augmented reality to amplify their own capacity to act badly?

    But that’s Zuckerberg: he seems to believe his creations will only be used to bring out the best in people. He seems to believe his gigantic sharing network would never be used to incite mob violence. He likewise seems to claim that Facebook’s capacity to collect and profile the moods of its users should never be monetised, yet, given the presentation unearthed by The Australian, Facebook tells a different story to advertisers.

    Regulating Facebook enshrines its position as the data-gathering and profile-building organisation, while keeping it plugged into and responsive to the needs of national powers. Before anyone takes steps that would cement Facebook in our social lives for the foreseeable future, it may be better to consider how this situation arose, and whether—given what we now know—there might be an opportunity to do things differently.
    https://meanjin.com.au/essays/the-last-days-of-reality
    Voting 0
  4. There are 1.5 billion YouTube users in the world, which is more than the number of households that own televisions. What they watch is shaped by this algorithm, which skims and ranks billions of videos to identify 20 “up next” clips that are both relevant to a previous video and most likely, statistically speaking, to keep a person hooked on their screen.

    Company insiders tell me the algorithm is the single most important engine of YouTube’s growth. In one of the few public explanations of how the formula works – an academic paper that sketches the algorithm’s deep neural networks, crunching a vast pool of data about videos and the people who watch them – YouTube engineers describe it as one of the “largest scale and most sophisticated industrial recommendation systems in existence”.
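    The paper the engineers are quoted from describes a two-stage design: a candidate-generation model narrows the huge corpus to a few hundred videos, and a ranking model orders those by predicted engagement to fill the 20 “up next” slots. The sketch below mimics only that overall shape; the scoring functions are trivial stand-ins rather than the real neural networks, and all data is invented.

    ```python
    # Schematic two-stage recommender: stage 1 narrows a large corpus to a few
    # hundred candidates, stage 2 ranks them by a proxy for expected watch time
    # and keeps the top 20 "up next" slots.

    def candidate_generation(corpus, watch_history, k=200):
        # Stand-in for the first model: keep videos sharing a topic with the history.
        watched_topics = {video["topic"] for video in watch_history}
        return [video for video in corpus if video["topic"] in watched_topics][:k]

    def rank(candidates, n=20):
        # Stand-in for the second model: sort by a crude engagement proxy.
        return sorted(candidates, key=lambda v: v["avg_watch_seconds"], reverse=True)[:n]

    corpus = [{"id": i, "topic": "gaming" if i % 2 else "news",
               "avg_watch_seconds": 30 + i} for i in range(1000)]
    history = [{"id": -1, "topic": "news"}]
    up_next = rank(candidate_generation(corpus, history))
    print([video["id"] for video in up_next])
    ```

    Note that nothing in either stage of this sketch asks whether a video is truthful or balanced; the only signal is a proxy for keeping the viewer watching, which is the point the quoted engineer goes on to make.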

    The algorithm does not appear to be optimising for what is truthful, or balanced, or healthy for democracy
    Guillaume Chaslot, an ex-Google engineer

    Lately, it has also become one of the most controversial. The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children’s character Peppa Pig eats her father or drinks bleach.

    Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from making advertising revenues from his strange videos – which featured his children receiving flu shots, removing earwax, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm. “That’s what got us out there and popular,” he said. “We learned to fuel it and do whatever it took to please the algorithm.”
    https://www.theguardian.com/technolog...how-youtubes-algorithm-distorts-truth
    Voting 0
  5. CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

    The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

    The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because Mosseri says “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting ... and seeing ... what content gets more comments, more likes, more reshares.”
    https://techcrunch.com/2018/01/11/facebook-time-well-spent
    Voting 0
  6. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:

    Of course this doesn’t just apply to Facebook.
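    A toy simulation (nothing like Facebook’s real system in scale or detail) makes the first half of that feedback loop concrete: if posts are shown in proportion to accumulated engagement, and being shown makes further engagement with that source more likely, most of the feed’s weight quickly collects around a small in-group. All parameters here are arbitrary.

    ```python
    import random

    random.seed(0)
    friends = [f"friend_{i}" for i in range(20)]
    engagement = {f: 1.0 for f in friends}          # start with equal weight for everyone

    for day in range(200):
        total = sum(engagement.values())
        weights = [engagement[f] / total for f in friends]
        shown = random.choices(friends, weights=weights, k=10)   # today's feed
        for friend in shown:
            if random.random() < 0.3:                # user engages with ~30% of what is shown...
                engagement[friend] += 1.0            # ...which boosts that friend's future share

    top5 = sorted(engagement, key=engagement.get, reverse=True)[:5]
    share = sum(engagement[f] for f in top5) / sum(engagement.values())
    print(f"Top 5 of 20 friends now account for {share:.0%} of the feed weight")
    ```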
    https://techcrunch.com/2017/06/04/whe...m_source=tctwreshare&sr_share=twitter
    Voting 0
  7. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, Zuckerberg argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html
    Voting 0
  8. With increased political polarization, amplified by homophily — our preference to connect to people like us — and algorithmic recommender systems, we’re effectively constructing our own realities.

    Two years ago I wrote about how social networks helped Israelis and Palestinians build a form of personalized propaganda during the last Israel-Gaza war. The shape of conversations and responses to events typically looked something like the graph below, where one frame of the story tends to stay on only one side of the graph, while a completely different take spreads on the other.
    [Figure: typical polarized, socially networked space for information spreading about the Israeli-Palestinian conflict.]

    In the cases that I was investigating, neither side of the graph’s frame was false per se. Rather, each carefully crafted story on either side omitted important detail and context. When this happens constantly, on a daily basis, it systematically and deeply affects people’s perception of what is real.
    https://points.datasociety.net/fake-n...ot-the-problem-f00ec8cdfcb#.ha2sk7ijs
    Voting 0
    I don't want to drag this out, but back then, just as today, I had no control at all over the personal data and information gathered, voluntarily or forcibly, at my every movement; what somehow saved me in a troubled adolescence (though not always, in truth) was my control over the social situation and the context.

    I had no control over data and information with the village butcher, and I cannot expect to have it today on the web with Google, Facebook and, above all, the thousand state agencies afflicted, for various and sometimes commendable reasons, by informational bulimia. But back then I understood, and to some extent governed, the simple technical rules (the village streets, the bus timetables) and the social rules of proximity of my own territory.

    Today I can no longer do that. And it is not only because of the quantity of data captured and stored at every step, but because of the total opacity of the context and of the technical and social rules that govern our digital life.

    Unknown algorithms, unfathomable even to their own creators, reconstruct our image, produce scores and judge relevance and suitability entirely without our knowledge. Banks, insurers, companies of every kind and shape (soon the Internet of Things will astonish us), but above all the State, with its thousand verification and control agencies, access every piece of information while stripping it of its context, creating relationships and correlations of which we are unaware but whose consequences we suffer every day.

    We cannot prevent all of this; big data and open data will save the world, fine. But we can and must demand to know the who, the how and the when. We need to know what the context is and what the rules are; only then can we find strategies, not to commit crimes or evade the law (as part of the judiciary claims), but to exercise the fundamental rights of the person.

    In the physical world we know when the State has the right to enter our home, or under what conditions it may limit our personal freedoms of movement and expression; in the digital world we do not know, and do not even ask, who may take possession of our data, of our devices through hidden software, of our lives, when, and under what conditions. We supinely accept an intolerable opacity.

    I have had something to hide for as long as I can remember: degrees of privacy that vary with the interlocutor, the time, the place and the context. And I do not want, for myself or my children, a society stupidly disciplined by constant surveillance and lobotomised by algorithms. I would like a society in which the asymmetry of information is the exact opposite of today's, where unfortunately the citizen is completely transparent while the State and its rules are opaque and uncertain.
    Carlo Blengino

    A criminal defence lawyer, he litigates the law of new technologies, copyright and data protection issues in the courtroom. He is a fellow of the NEXA Center for Internet & Society at the Politecnico di Torino. @CBlengio on Twitter
    http://www.ilpost.it/carloblengino/2016/11/02/ho-qualcosa-da-nascondere
    Voting 0
  10. Yours might be one of angst and despair, or celebrations and "I told you so's." It depends on the people you're friends with and the online community you've created with your clicks, likes and shares.

    Facebook's algorithm knows what you like based on the videos you watch, people you talk to, and content you interact with. It then shows you more of the same. This creates something called "filter bubbles." You begin to see only the content you like and agree with, while Facebook hides dissenting points of view.

    This means news on Facebook comes with confirmation bias -- it reinforces what you already think is true -- and people are increasingly frustrated.
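    As a rough illustration of that bubble-forming step, the sketch below ranks items purely by agreement with a user's prior stance, so dissenting items never make the visible feed. Stances, titles and the cut-off are invented for the example; real ranking signals are far richer.

    ```python
    # Stance runs from -1 to +1; the user leans strongly positive on some issue.
    user_stance = 0.8

    items = [
        {"title": "Story A", "stance": 0.9},
        {"title": "Story B", "stance": 0.7},
        {"title": "Story C", "stance": -0.6},
        {"title": "Story D", "stance": -0.8},
        {"title": "Story E", "stance": 0.5},
    ]

    def agreement(item):
        # Higher when the item's stance is close to the user's own.
        return 1.0 - abs(item["stance"] - user_stance) / 2.0

    feed = sorted(items, key=agreement, reverse=True)[:3]        # only the top 3 are shown
    dissenting = [i["title"] for i in feed if i["stance"] * user_stance < 0]
    print("feed:", [i["title"] for i in feed])
    print("dissenting items shown:", dissenting or "none")
    ```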

    Facebook denies it's a media company, yet almost half of U.S. adults get news from Facebook.

    When Facebook fired its human curators and began to rely on algorithms to surface popular stories earlier this year, fake news proliferated.

    Viral memes and propaganda spread among people with similar beliefs and interests. It's cheaper and easier to create and spread ideological disinformation than deeply researched and reported news. And it comes from all over -- teens in Macedonia are responsible for a large portion of fake pro-Trump news, according to a BuzzFeed analysis.


    Filter bubbles became especially problematic during the presidential election.

    Hyperpartisan news sites and fake websites distributed false stories about voter fraud, election conspiracies, and the candidates' pasts that spread like wildfire on Facebook. It was more prevalent on right-leaning Facebook pages. As CNNMoney's Brian Stelter said in response to the growing number of false viral stories, people should have a "triple check before you share" rule.

    Today, many people are shocked by Trump's victory. Words of fear and sorrow fill their Facebook feeds, and even those with thousands of friends are probably only seeing posts that echo their feelings.

    But if you voted for Trump, chances are your feed reflects the opposite. You might see a cascade of #MakeAmericaGreatAgain hashtags and friends celebrating.
    http://money.cnn.com/2016/11/09/techn...2Fedition_us+%28RSS%3A+CNNi+-+U.S.%29
    Voting 0


