mfioretti: echo chamber*

Bookmarks on this page are managed by an admin user.

8 bookmarks

  1. ‘Whatever the causes of political polarisation today, it is not social media or the internet.

    ‘If anything, most people use the internet to broaden their media horizons. We found evidence that people actively look to confirm the information that they read online, in a multitude of ways. They mainly do this by using a search engine to find offline media and validate political information. In the process they often encounter opinions that differ from their own and as a result whether they stumbled across the content passively or use their own initiative to search for answers while double checking their “facts”, some changed their own opinion on certain issues.’

    The research shows that respondents used an average of four different media sources, and had accounts on three different social media platforms. The more media outlets people used, the more they tended to avoid echo chambers.

    While neither age, income, ethnicity nor gender was found to significantly influence the likelihood of being in an echo chamber, political interest did. Those with a keen political interest were the most likely to be opinion leaders whom others turn to for political information. Compared with the less politically inclined, these people were media junkies who consumed political content wherever they could find it, and because of that diversity they were less likely to be in an echo chamber.

    Dr Elizabeth Dubois, co-author and Assistant Professor at the University of Ottawa, said: ‘Our results show that most people are not in a political echo chamber. The people at risk are those who depend on only a single medium for political news and who are not politically interested: about 8% of the population. However, because of their lack of political engagement, their opinions are less formative and their influence on others is likely to be comparatively small.’
    http://www.ox.ac.uk/news/2018-02-21-s...net-not-cause-political-polarisation#
  2. CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

    The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

    The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because Mosseri says “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting … and seeing … what content gets more comments, more likes, more reshares.”
    https://techcrunch.com/2018/01/11/facebook-time-well-spent
  3. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:

    Of course this doesn’t just apply to Facebook.
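    The engagement feedback loop described above can be sketched as a toy simulation. This is a hypothetical illustration, not Facebook’s actual ranking code; all names and parameters (`boost`, the 50% engagement chance) are invented:

    ```python
    import random

    def run_feed_simulation(friends, rounds=200, boost=0.5, seed=1):
        """Toy model of an engagement-driven feed: each interaction
        raises a friend's ranking weight, so their posts appear more
        often, which earns them more interactions, and so on."""
        rng = random.Random(seed)
        weights = {f: 1.0 for f in friends}
        for _ in range(rounds):
            # Show the post of a friend chosen in proportion to weight.
            shown = rng.choices(friends, weights=[weights[f] for f in friends])[0]
            # Seeing a post sometimes earns engagement, which boosts weight.
            if rng.random() < 0.5:
                weights[shown] += boost
        return weights

    weights = run_feed_simulation(["ana", "bo", "cy", "dee"])
    # After many rounds the distribution tends to skew: a small
    # "in-group" of friends dominates the feed while the rest fade.
    ```

    The loop starts from equal weights, yet small random differences compound, which is the in-group/out-group drift the excerpt describes.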
    https://techcrunch.com/2017/06/04/whe...m_source=tctwreshare&sr_share=twitter
  4. The attention to how we feel, the reason for which we now understand well, is nothing new: in June 2014, Facebook published the results of an experiment that had exposed almost 700,000 users to mostly positive or mostly negative content. The purpose? To study their reactions. The next step? To resell them, it seems.
    http://www.corriere.it/tecnologia/soc...f16-2e64-11e7-8176-4e0249fd95d5.shtml
  5. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all, and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, he argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html
  6. They forgot that the world doesn’t run on information. People don’t make decisions based on truth or facts. They don’t spend their money based on data.

    The world runs on feelings.

    And when you give the average person an infinite reservoir of human wisdom, they will not Google for the higher truth that contradicts their own convictions. They will not Google for what is true yet unpleasant. Instead, most of us will Google for what is pleasant but untrue.

    Having an errant racist thought? Well, there’s a whole forum of racists two clicks away with a lot of convincing-sounding arguments as to why you shouldn’t be so ashamed to have racist leanings.

    Ex-wife leaves you and you start thinking women are inherently selfish and evil? Doesn’t take a creative Google search to find more than you would ever need to believe that women are biologically inferior.

    Think Muslims are going to stalk from school to school murdering your children? I’m sure there’s a conspiracy theory somewhere out there that’s already confirming that.

    The internet, in the end, was not designed to give people the information they need. It gives people the information they want.

    And sadly, there’s a huge difference.
    [Echo chamber cartoon by David Byrne]

    For instance, I badly want to believe that the Trump administration is floundering and is on the brink of collapse all but a month into its tenure. And without asking, Facebook dutifully shows me articles validating this desire every single day.

    Yet, when I force myself to visit conservative websites, to look at polling data, to dig into primary sources and look at historical analogs, I see that this probably isn’t true. That we’re not in a clown car careening off a cliff. And if we are, Trump probably isn’t the one driving it, he’s just the hood ornament.

    But the fact that I’m most easily given the information that confirms my fears and quells my insecurities—this is the problem. This same network of systems designed to make me feel good every time I open my laptop is the same network of systems that is disconnecting me—disconnecting us—from the rest of our country and often from reality itself.

    Economics 101 teaches us that when there’s an oversupply of something, people value it less. If we wake up tomorrow and there are suddenly 3 billion extra lawnmowers in the US, the price of lawnmowers will plummet. If suddenly everyone had a Louis Vuitton bag, nobody would care about Louis Vuitton anymore. People would throw them out, forget them, spill wine on them, and give them away to charities.

    What if the same is true for information? What if increasing the supply of information to the point where it’s limitless has made us value any particular piece of information less?

    The problem is, as far as I can tell, the internet and its technologies don’t deliver us from tribalism. They don’t deliver us from our baser instincts. They do the opposite. They mainline tribalism into our eyeballs. And what we’re seeing is the beginning of that terrifying impact.

    This is despite the fact that war, violent crime, and authoritarianism are at their lowest points in world history, and education, life expectancy, and income are at their highest in world history.

    It doesn’t matter, everyone thinks the world is going to hell in a handbasket anyway.

    And if everyone is feeling this way at once, despite the realities, it can’t be because the radical left is winning or the radical right is winning or the patriarchy or communists or Muslims or anarcho-fascist-ballerinas are winning.

    It can only be because our information is losing.
    https://markmanson.net/everything-is-fucked
  7. Wikipedia (though not well accepted in academic spheres) gives a clear definition of what an echo chamber is in media terms: “the situation in which information, ideas, or beliefs are amplified or reinforced by transmission and repetition inside an “enclosed” system, where different or competing views are censored, disallowed or otherwise underrepresented”. It is the media effect caused by audiences seeking out news sources online that are compatible with their world view or way of thinking. Psychologists have shown that people “generally prefer and are better at understanding information that accords with their existing schemas”, limiting their chance of encountering new or dramatically different ideas. This division of the public into echo chambers seriously jeopardizes democratic citizenship. In an agonistic society where fragmented audiences hold totally different ideas and thoughts, it is vital that they share them with each other, discuss, and debate, even if they don’t arrive at a conclusive solution or consensus. It is part of democratic life to be able to embrace different ideas and accept the criticism that others make of ours.

    This “echo chamber” effect is produced not only by consuming certain news sources instead of others, but also because we usually share our ideas and make friends with people who think like us and hold the same values. In this way, our own ideas are replicated and even magnified by our closest social groups, reinforcing them and making them seem truer and more convincing to us than before.

    On the other hand, the “filter bubble” is another effect, produced mainly by new digital media, that compounds the “echo chamber” problem and the lack of plurality in online news. As Robin Foster explains in his report News Plurality in a Digital World, “through the filtering of stories via friends, or via the personalization of search, digital media encourages people to remain within their own comfort zone”. The term “filter bubble” describes the phenomenon by which social networks and search engines “use algorithms and personal data to select only content which matches existing tastes and preferences”. This makes it ever harder for citizens to encounter opposing ideas that might trigger a debate or change their way of thinking. It also harms democratic participation, closing off the development of the publics and exchanges that are vital to democracy.
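    The selection step Foster describes (“select only content which matches existing tastes and preferences”) can be illustrated with a minimal sketch. All names and data here are invented; real systems use far richer signals than topic tags:

    ```python
    def personalize(feed, user_interests, threshold=1):
        """Keep only items sharing at least `threshold` topic tags
        with what the user already likes: the filter-bubble step."""
        return [item for item in feed
                if len(set(item["tags"]) & user_interests) >= threshold]

    feed = [
        {"title": "Party A rally recap",   "tags": {"party_a", "politics"}},
        {"title": "Party B policy primer", "tags": {"party_b", "politics"}},
        {"title": "Gardening tips",        "tags": {"hobbies"}},
    ]
    # A user whose recorded interests only ever include Party A...
    bubble = personalize(feed, {"party_a"})
    # ...is never shown Party B's side at all.
    ```

    The point of the sketch is that the opposing item is not ranked lower, it is filtered out entirely, which is why the excerpt ties this mechanism to the loss of debate.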
    https://talkingpoliticsjomc.wordpress...lter-bubbles-in-the-new-digital-world
    by M. Fioretti (2017-02-24)
  8. We’re losing trust in numbers, especially statistics. Their sheer volume and variety can be overwhelming. In Politico’s recent roundup of Trump’s popularity figures, for example, the approval numbers among nine polls ranged from 36 percent to 54 percent. Add the hangover that many still suffer from the misleading presidential election predictions, and it's not surprising that people are starting to tune out data altogether, or simply interpret them in ways that support their beliefs.

    I don’t know whether this will lead to a full-blown crisis of democracy, but I think it’s already fair to place at least some of the blame on big data. Algorithms developed by companies such as Google parent Alphabet Inc. and Facebook Inc. enable partisan confirmation bias. They tailor our online environments not to the truth, but to the specific information we search for or click on. This can undermine our understanding of, and trust in, objective scientific and historical facts.

    Here’s an extreme example: Dylann Roof claimed in his manifesto that it was a Google search for “black on white crime” that led him to massacre nine people in a Charleston, South Carolina church in 2015. Think about that search term. What kinds of texts will perfectly match “black on white crime,” as opposed to, say, “statistics on crime rates by race”? Naturally, Roof got links to racist websites with their own alternative facts, just as a search for “who really killed JFK” will, more often than not, lead to conspiracy sites.

    When I typed the phrase “Was the Hol” into Google, the search engine auto-completed to “Was the Holocaust real?” Of the top six search results, four were Holocaust-denying sites.
    https://www.bloomberg.com/view/articl...t-big-data-try-googling-the-holocaust


About - Propulsed by SemanticScuttle