mfioretti: facebook study*

Bookmarks on this page are managed by an admin user.

10 bookmark(s), sorted by date

  1. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?
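The engagement feedback loop described above can be sketched as a toy ranking rule. This is only an illustrative model (all names such as `ToyFeed` and `record_engagement` are hypothetical); Facebook's real ranking is vastly more complex.

```python
from collections import defaultdict

class ToyFeed:
    """Toy engagement-weighted feed: the more you engage with an
    author, the higher that author's future posts rank."""

    def __init__(self):
        self.affinity = defaultdict(float)  # author -> engagement score

    def record_engagement(self, author, weight=1.0):
        # Each like/comment/share raises the author's affinity,
        # making their future posts rank higher: the feedback loop.
        self.affinity[author] += weight

    def rank(self, posts):
        # posts: list of (author, post_id); sort by learned affinity
        return sorted(posts, key=lambda p: self.affinity[p[0]], reverse=True)

feed = ToyFeed()
feed.record_engagement("alice")
feed.record_engagement("alice")
feed.record_engagement("bob")
ranked = feed.rank([("bob", 1), ("alice", 2), ("carol", 3)])
# alice's post is shown first; carol, never engaged with, sinks last.
```

Run forward, the loop is self-reinforcing: a high rank invites more engagement, which raises the rank further, which is exactly the in-group clustering the excerpt describes.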

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:

    Of course this doesn’t just apply to Facebook.
  2. Earlier this month, The Australian uncovered something that felt like a breach in the social contract: a leaked confidential document prepared by Facebook that revealed the company had offered advertisers the opportunity to target 6.4 million younger users, some only 14 years old, during moments of psychological vulnerability, such as when they felt “worthless,” “insecure,” “stressed,” “defeated,” “anxious,” and like a “failure.”

    The 23-page document had been prepared for a potential advertiser and highlighted Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.” According to The Australian’s report, Facebook had been monitoring posts, photos, interactions, and internet activity in real time to track these emotional lows. (Facebook confirmed the existence of the report, but declined to respond to questions from WIRED about which types of posts were used to discern emotion.)

    The day the story broke, Facebook quickly issued a public statement arguing that the premise of the article was “misleading.”
  3. The attention paid to how we feel, whose reason we now understand well, is nothing new: in June 2014, Facebook published the results of an experiment that had exposed almost 700,000 users to mostly positive or mostly negative content. The purpose? To study the reactions. The next step? To sell them, it seems.
  4. On Facebook, people were told the world was either a disaster, or seeing monumental progress. On Facebook, a Trump victory was likely, or a Clinton win was all but assured. On Facebook, the thoughts in your head turned into news articles you liked, turned into things you could share. On Facebook, everyone and no one could hear you scream.

    And the louder we screamed, the more our time on the site increased. As did Facebook’s revenue.

    It pretends to be a neutral platform where people can share whatever they like, within reason. Its teams of moderators police the site for content like pornography or the illegal sales of firearms or drugs, and other generally prohibited things. But beyond that, it turns a blind eye to the nature of the content within its walls.

    Meanwhile, an increasing number of fake news sites with completely fabricated content have filled the network even as Facebook abdicated responsibility for the disinformation it lets virally spread.

    It even went so far as to fire news editors who managed the Trends section, leaving the matter up to an impartial, but entirely fallible, algorithm. This wholesale elimination of human judgement from the site’s news machinery could not have come at a worse time for the election.

    The algorithm later trended a number of stories that were “profoundly inaccurate,” according to a report that tracked the occurrences of fake news in this high-profile section of Facebook’s platform.
  5. In light of this power to steer opinions and, in the future, emotions as well, the Christianophobia displayed by Facebook in these very days appears all the more worrying. This time the report came from Giuseppe Marino in il Giornale of July 7, “The ‘do-gooder’ Facebook that censors everything except blasphemy”:

    Michelangelo’s David offends Facebook’s morals; blasphemy does not. Children miming the crucifixion in an anti-abuse campaign are to be “banned,” while a page inviting people to rape the Madonna is sacred freedom of expression.

    Trying to find a logic in these choices, already (rightly) at the center of controversy, is maddening.

    The puzzle grew even more complicated yesterday, after the staff of Fratelli d’Italia leader Giorgia Meloni reported to the social network a page named after a blasphemy against God. Facebook’s reply: “Thank you for the report, but the page complies with the community standards,” as it lacks “hate speech or symbols that incite hatred.”

    And in those very same days a similar report I made about the group “Gli anticlericalisti,” where, alongside blasphemies, images and phrases are posted that insult the faith and gravely injure the dignity of Christians, received the same reply:

    Blasphemies and insults against Christians do not violate the “community standards.” This is Facebook’s position.

    In light of what studies on the mechanisms of mass opinion-shaping have revealed, it is no exaggeration to say that Facebook is promoting discrimination against Christians and laying the groundwork for making them the only group it will be permissible to attack verbally, because this will be part of the social “standards.”

    Facebook is thus turning into a tool for forming and steering consensus and, even more, as Guénon would have put it, a source of an “état d’esprit.”

    It is an anti-democratic tool that places in the hands of a single person a power that undermines the formation of citizens’ judgment and consequently endangers any authentic democratic dynamic, replacing it with a semblance of one emptied of meaning; in practice, the social network is becoming the instrument of a subtle totalitarianism.

    Christians are the first, and for now only, victims of Facebook’s mood-steering policies; time will tell what the fallout will be of a choice for which Facebook will have to be held responsible.
  6. Younger Internet users like to joke about how Facebook “is the new TV,” but in the case of political news consumption that appears to be literally true, according to a new study from the Pew Research Center for Media and Journalism. More than 60% of millennials who were surveyed said that during the previous week they got their political news from Facebook, compared with 37% who got it from TV.

    Facebook came under fire recently for a study that it funded, done by a number of in-house scientists, which looked at whether news-feed users were subjected to differing political points of view. Although the study said that the decisions of users themselves determined how much they were exposed to different points of view, a number of experts took issue with that explanation.

    These experts pointed out that Facebook’s own data confirmed that for one test group, the algorithmically filtered news-feed did affect the amount of alternative political commentary and news they were exposed to. But even more important than that, Facebook’s study pretended that a user’s experience on the site could be looked at separately from the functioning of the algorithm, when the two are so closely linked that it’s almost impossible to separate them.

    For older members of the “Baby Boom” generation, meanwhile, those figures were almost exactly reversed:
  7. Mitchell’s three replies are not adequate— for us or for Facebook.

    Q. What are you optimizing for, along with user interest? A. “It’s not that we control NewsFeed, you control NewsFeed.” No, sorry. As I wrote before: It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation. The assertion melts on contact.

    Q. How do you see your role in the news ecosystem where you are more and more the dominant player? A. Facebook should not be anyone’s primary news source or news experience. No, sorry. On mobile, especially, “primary” is exactly what’s happening. And everyone who pays attention knows how strenuously Facebook tries to keep users engaged with Facebook. So “we don’t want to be primary” is… I’m trying to be nice here… a little insulting.

    Q. In news you have a lot of power now. How do you intend to use that power? A. We just want to create a great experience for users. No, sorry, that’s not an answer because you just said the users have the power, not Facebook, so what you’re really saying is: power? us? whatever do you mean?

    Facebook’s smart, capable and caring-about-news people should be disappointed that this is as far as the company has gotten in being real with itself and with us.

    Facebook is not, and knows quite well it is not, a neutral machine passing on news. Its algorithm chooses what people see, it has ‘community standards’ that material must meet and it has to operate within the laws of many countries.

    The claim that Facebook doesn’t think about journalism has to be false. And, at least in the long run, it won’t work; in the end these issues have to be faced. Facebook is a private company which has grown and made billions by very successfully keeping more people on its site for longer and longer. I can imagine that any suggestion that there are responsibilities which distract from that mission must seem like a nuisance.

    Google once claimed something similar. Its executives would sit in newspaper offices and claim, with perfectly straight faces, that Google was not a media company. As this stance gradually looked more and more absurd, Google grew up and began to discuss its own power in the media.

    I would put it differently: Facebook has to start recognizing that our questions are real— not error messages. We are not suggesting that it “edits” NewsFeed in the same way that a newspaper editor once edited the front page. It’s a very different way. That’s why we’re asking about it! We are not suggesting that algorithms work in the same way that elites deciding what’s news once operated. It’s a different way. That’s why we’re asking about it!

    No one is being simple-minded here and demanding that Facebook describe editorial criteria it clearly does not have— like reaching for a nice mix of foreign and domestic news. We get it. You want not to be making those decisions. You want user interest to drive those decisions.
  8. Several academics have pointed to limitations of the study, such as the fact that the only people involved had indicated their political affiliation on their Facebook page. Critics point out that those users might behave in a different way from everyone else. But beyond that, a few academics have noted a potential tension between Facebook’s desire to explore the scientific value of its data and its own corporate interests.

    Zeynep Tufekci, an assistant professor at the University of North Carolina, says the study is fascinating but adds that she would like to see more transparency about the way research is conducted at Facebook. “The study is interesting; I’m thrilled they’re publishing this stuff,” says Tufekci. “But who knows what else they found?”

    Tufekci suggests that besides the new paper showing the “filter bubble” phenomenon to be less pronounced than some had thought, several other Facebook papers have painted the network in a positive light.

    Facebook has published several important social-science studies in recent years. The enormous amount of data it collects is extremely valuable as an academic resource
  9. Much of the paper is written as if it is about adult U.S. Facebook users in general, but that is not the case. Those included in the study are just those who self-identify their politics on the site. This is a rare behavior, something only 9% of users do. This 9% number is not in the report but in a separate supporting-materials appendix, yet it is crucial for interpreting the results. The population number given in the report is 10.1 million people, which, yes, is a very big number, but don’t fall for the Big-N trick: we don’t know how this 9% differs from Facebook users in general. We cannot treat this as a sample of “Facebook users” or even “Facebook liberals and conservatives,” as the authors do in various parts of the report, but only as a sample of the rare people who explicitly state their political orientation on their Facebook profile.* Descriptive statistics comparing the few who explicitly self-identify, and therefore enter into the study, with those who do not are not provided. Who they are, how they differ from the rest of us, and why they are important to study are all obvious questions the report doesn’t discuss. We might infer that people who self-identify are more politically engaged, but anecdotally, nearly all my most politically engaged Facebook friends don’t explicitly list their political orientation on the site. Facebook’s report talks about “Facebook users,” which isn’t accurate. All the findings should be understood to be about Facebook users who also put their political orientation on their profiles, who may or may not be like the rest of Facebook’s users in lots of interesting and research-confounding ways. The researchers had an obligation to make this limitation much clearer, even if it tempered their grand conclusions.

    So, AMONG THOSE RARE USERS WHO EXPLICITLY SELF-IDENTIFY THEIR POLITICAL ORIENTATION ON THEIR FACEBOOK PROFILES, the study looks at the flow of news stories that are more liberal versus more conservative as they are shared on Facebook: how those stories are seen and clicked on as they are shared by liberals to other liberals and by conservatives to other conservatives, and, most important for this study, the information that is politically cross-cutting, that is, shared by someone on the right and then seen by someone on the left, and vice versa. The measure of conservative or liberal news stories is a simple and, in my opinion, effective one: the degree to which a web domain is shared by people on the right is the degree to which content on that domain is treated as conservative (and the same goes for politically liberal content). The researchers also differentiated between soft (entertainment) and hard (news) content, including only the latter in this study. The important work is seeing whether Facebook, as a platform, is creating a filter bubble where people only see what they already agree with, as opposed to more diverse and challenging “cross-cutting” information.
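The domain-alignment measure described above can be sketched as a toy calculation. This is my own illustrative version, not the paper's actual code: I assume self-reported ideology is coded -1 for liberal and +1 for conservative, and score a domain by the mean ideology of the users who share it (the example domains are made up).

```python
def domain_alignment(shares):
    """Toy alignment score: a domain shared mostly by conservatives
    scores toward +1; one shared mostly by liberals toward -1."""
    scores = {}
    for domain, sharer_ideologies in shares.items():
        scores[domain] = sum(sharer_ideologies) / len(sharer_ideologies)
    return scores

# Hypothetical data: each list holds the coded ideology of one sharer.
shares = {
    "example-right.com": [1, 1, 1, -1],    # mostly conservative sharers
    "example-left.com":  [-1, -1, 1, -1],  # mostly liberal sharers
}
scores = domain_alignment(shares)
# scores["example-right.com"] -> 0.5; scores["example-left.com"] -> -0.5
```

A story from a domain scoring near +1 is then treated as conservative content, so "cross-cutting" exposure for a liberal user is simply seeing stories from positively scored domains.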

    The Facebook researchers looked at how much, specifically, the newsfeed algorithm promotes the filter bubble, that is, showing users what they will already agree with over and above a non-algorithmically-sorted newsfeed. The newsfeed algorithm provided 8% less conservative content for liberals versus a non-algorithmically sorted feed, and 5% less liberal content for conservatives. This is an outcome directly attributable to the structure of Facebook itself.

    Facebook published this finding (that the newsfeed algorithm encourages users to see what they would already agree with, more than if the algorithm weren’t there) ultimately because Facebook wants to make the case that its algorithm isn’t as big a factor in this political confirmation bias as people’s individual choices, stating that “individual choice has a larger role in limiting exposure to ideologically cross cutting content.” The researchers estimate that conservatives click on 17% fewer ideologically opposed news stories, and liberals 6% fewer, than would be expected if users clicked on random links in their feed.

    The report concludes that “we conclusively establish that on average in the context of Facebook, individual choices matter more than algorithms”. Nooo, this just simply isn’t the case.
  10. Critics say the social network does this by shaping our perception of the world with its algorithmically-filtered newsfeed. Facebook, however, has come out with a study that it says proves this isn’t true—if there is a filter bubble, the company says, it exists because users choose to see certain things, not because of Facebook’s algorithmic filters.

    But even that’s not the biggest problem, Jurgenson and others say. The biggest issue is that the Facebook study pretends that individuals choosing to limit their exposure to different topics is a completely separate thing from the Facebook algorithm doing so. The study makes it seem like the two are disconnected and can be compared to each other on some kind of equal basis. But in reality, says Jurgenson, the latter exaggerates the former, because personal choices are what the algorithmic filtering is ultimately based on.

    But is this really what the study proves? There’s considerable debate about that among social scientists knowledgeable in the field, who note that the conclusions Facebook wants us to draw—by saying, for example, that the study “establishes that … individual choices matter more than algorithms”—aren’t necessarily supported by the evidence actually provided in the paper.

    For one thing, these researchers point out that the study only looked at a tiny fraction of the total Facebook user population: less than 4% of the overall user base, in fact (a number which doesn’t appear in the study itself but is only mentioned in an appendix). That’s because the study group was selected only from those users who specifically mention their political affiliation. Needless to say, extrapolating from that to the entire 1.2 billion-user Facebook universe is a huge leap.

    Sociologist Nathan Jurgenson points out that while the study claims it conclusively proves individual choices have more effect on what users see than algorithms, it doesn’t actually back this up. In fact, while that appears to be the case for conservative users, in the case of users who identified themselves as liberals, Facebook’s own data shows that exposure to different ideological views is reduced more by the algorithm (8%) than it is by a user’s personal choice.

Online Bookmarks of M. Fioretti: Tags: facebook study

About - Propulsed by SemanticScuttle