mfioretti: facebook study*

Bookmarks on this page are managed by an admin user.

11 bookmark(s)

  1. As problematic as Facebook has become, it represents only one component of a much broader shift into a new human connectivity that is both omnipresent (consider the smartphone) and hypermediated—passing through and massaged by layer upon layer of machinery carefully hidden from view. The upshot is that it’s becoming increasingly difficult to determine what in our interactions is simply human and what is machine-generated. It is becoming difficult to know what is real.

    Before the agents of this new unreality finish this first phase of their work and disappear completely from view, we have a brief opportunity to identify and catalogue the processes shaping our drift into a new world in which reality is both relative and carefully constructed by others, for their ends. Any catalogue must include at least these four items:

    the monetisation of propaganda as ‘fake news’;
    the use of machine learning to develop user profiles accurately measuring and modelling our emotional states;
    the rise of neuromarketing, which targets us with highly tailored messages that nudge us to act in ways serving the ends of others;
    a new technology, ‘augmented reality’, which will push us to sever all links with the evidence of our senses.



    The fake news stories floated past as jetsam on Facebook’s ‘newsfeed’, that continuous stream of shared content drawn from a user’s Facebook contacts, a stream generated by everything everyone else posts or shares. A decade ago that newsfeed had a raw, unfiltered quality, a sense that you were seeing everything everyone was doing, but as Facebook has matured it has deployed increasingly opaque ‘algorithms’ to curate (or censor) the newsfeed, producing something that feels much more comfortable and familiar.

    This seems like a useful feature to have, but the taming of the newsfeed comes with a consequence: Facebook’s billions of users compose their world view from what flows through their feeds. Consider the number of people on public transport—or any public place—staring into their smartphones, reviewing their feeds, marvelling at the doings of their friends, reading articles posted by family members, sharing video clips or the latest celebrity outrages. It’s an activity now so routine we ignore its omnipresence.

    Curating that newsfeed shapes what Facebook’s users learn about the world. Some of that content is controlled by the user’s ‘likes’, but a larger part is derived from Facebook’s deep analysis of a user’s behaviour. Facebook uses ‘cookies’ (invisible bits of data hidden within a user’s web browser) to track the behaviour of its users even when they’re not on the Facebook site—and even when they’re not users of Facebook. Facebook knows where its users spend time on the web, and how much time they spend there. All of that allows Facebook to tailor a newsfeed to echo the interests of each user. There’s no magic to it, beyond endless surveillance.
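
    The tracking-and-tailoring mechanism described above can be made concrete with a toy simulation. This is a minimal sketch under stated assumptions, not Facebook’s code: the browser’s cookie store is reduced to a dictionary, the third-party embed to a function call, and ‘interest’ inference to keyword counting; every name here (cookie_jar, visit_log, inferred_interests) is invented for illustration.

        # Toy model of cross-site tracking via a third-party embed and a cookie.
        # All names and the keyword-counting heuristic are illustrative assumptions.
        from collections import Counter, defaultdict
        import uuid

        cookie_jar = {}                 # browser side: browser -> tracker cookie value
        visit_log = defaultdict(list)   # tracker side: cookie value -> pages seen

        def load_page(browser_id, page_url, embeds_tracker=True):
            """Simulate a page view; the embedded widget reads or sets the tracker cookie."""
            if not embeds_tracker:
                return                  # pages without the embed are invisible to the tracker
            token = cookie_jar.setdefault(browser_id, str(uuid.uuid4()))
            visit_log[token].append(page_url)   # the tracker learns the URL via the embed

        def inferred_interests(browser_id, top_n=3):
            """Crude interest profile: most frequent keywords in the visited URLs."""
            token = cookie_jar.get(browser_id)
            words = Counter()
            for url in visit_log.get(token, []):
                words.update(url.rstrip("/").split("/")[-1].split("-"))
            return [w for w, _ in words.most_common(top_n)]

        # One browser visiting unrelated sites that all carry the same embed:
        for url in ["https://news.example/cycling-race-results",
                    "https://blog.example/cycling-gear-review",
                    "https://shop.example/running-shoes"]:
            load_page("alice-browser", url)

        print(inferred_interests("alice-browser"))   # e.g. ['cycling', 'race', 'results']

    A feed tailored ‘to echo the interests of each user’ is then just a ranking keyed on a profile like this one; the surveillance, not any magic, does the work.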

    What is clear is that Facebook has the power to sway the moods of billions of users. Feed people a steady diet of playful puppy videos and they’re likely to be in a happier mood than people fed images of war. Over the last two years, that capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.

    We all like to believe we’re in the right, and when we get some sign from the universe at large that we are correct, we feel better about ourselves. That’s how the curated newsfeed became wedded to the world of profitable propaganda.

    Adding a little art to brighten an otherwise dull wall seems like an unalloyed good, but only if one completely ignores bad actors. What if that blank canvas gets painted with hate speech? What if, perchance, the homes of ‘undesirables’ are singled out with graffiti that only bad actors can see? What happens when every gathering place for any oppressed community gets invisibly ‘tagged’? In short, what happens when bad actors use Facebook’s augmented reality to amplify their own capacity to act badly?

    But that’s Zuckerberg: he seems to believe his creations will only be used to bring out the best in people. He seems to believe his gigantic sharing network would never be used to incite mob violence. Just as he seems to claim that Facebook’s capacity to collect and profile the moods of its users should never be monetised—but, given the presentation unearthed by The Australian, Facebook tells a different story to advertisers.

    Regulating Facebook enshrines its position as the data-gathering and profile-building organisation, while keeping it plugged into and responsive to the needs of national powers. Before anyone takes steps that would cement Facebook in our social lives for the foreseeable future, it may be better to consider how this situation arose, and whether—given what we now know—there might be an opportunity to do things differently.
    https://meanjin.com.au/essays/the-last-days-of-reality
    Voting 0
  2. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:

    Of course this doesn’t just apply to Facebook.
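
    The feedback loop described in this excerpt can be sketched in a few lines. This is a toy model, not Facebook’s ranking code: the multiplicative affinity update, its 0.2 step, and the ToyFeed/Post names are all assumptions made to show why engagement-weighted ranking narrows a feed over time.

        # Toy engagement-weighted feed: every interaction raises an author's weight,
        # so their next post ranks higher, which invites more interaction (the spiral).
        # The scoring rule and constants are invented for illustration.
        from dataclasses import dataclass
        from collections import defaultdict

        @dataclass
        class Post:
            author: str
            text: str

        class ToyFeed:
            def __init__(self):
                self.affinity = defaultdict(lambda: 1.0)   # per-author weight, starts equal

            def rank(self, posts):
                # Higher affinity -> earlier in the feed.
                return sorted(posts, key=lambda p: self.affinity[p.author], reverse=True)

            def record_engagement(self, post, strength=1.0):
                # A like, comment, or share raises that author's weight multiplicatively.
                self.affinity[post.author] *= 1.0 + 0.2 * strength

        feed = ToyFeed()
        posts = [Post("close_friend", "outrage bait"), Post("acquaintance", "quiet update")]
        for _ in range(5):
            top_post = feed.rank(posts)[0]   # the user mostly sees, and engages with, the top post
            feed.record_engagement(top_post)

        print({author: round(w, 2) for author, w in feed.affinity.items()})
        # {'close_friend': 2.49, 'acquaintance': 1.0} -- one author's weight compounds,
        # the in-group forms, and the rest of the friend list fades from view.

    The second-order effect the excerpt goes on to describe, selecting for the shocking and outrageous, is the same loop with ‘engagement’ as the only objective.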
    https://techcrunch.com/2017/06/04/whe...m_source=tctwreshare&sr_share=twitter
    Voting 0
  3. earlier this month, The Australian uncovered something that felt like a breach in the social contract: a leaked confidential document prepared by Facebook that revealed the company had offered advertisers the opportunity to target 6.4 million younger users, some only 14 years old, during moments of psychological vulnerability, such as when they felt “worthless,” “insecure,” “stressed,” “defeated,” “anxious,” and like a “failure.”

    The 23-page document had been prepared for a potential advertiser and highlighted Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.” According to The Australian’s report, Facebook had been monitoring posts, photos, interactions, and internet activity in real time to track these emotional lows. (Facebook confirmed the existence of the report, but declined to respond to questions from WIRED about which types of posts were used to discern emotion.)

    The day the story broke, Facebook quickly issued a public statement arguing that the premise of the article was “misleading”.
    https://www.wired.com/2017/05/welcome-next-phase-facebook-backlash
    Voting 0
    The attention to how we feel, the reason for which we now understand well, is nothing new: in June 2014 Facebook published the results of an experiment that had exposed almost 700,000 users to predominantly positive or predominantly negative content. The purpose? To study the reactions. The next step? To sell them on, it seems.
    http://www.corriere.it/tecnologia/soc...f16-2e64-11e7-8176-4e0249fd95d5.shtml
    Voting 0
  5. On Facebook, people were told the world was either a disaster, or seeing monumental progress. On Facebook, a Trump victory was likely, or a Clinton win was all but assured. On Facebook, the thoughts in your head turned into news articles you liked, turned into things you could share. On Facebook, everyone and no one could hear you scream.

    And the louder we screamed, the more our time on the site increased. As did Facebook’s revenue.

    It pretends to be a neutral platform where people can share whatever they like, within reason. Its teams of moderators police the site for content like pornography or the illegal sales of firearms or drugs, and other generally prohibited things. But beyond that, it turns a blind eye to the nature of the content within its walls.

    Meanwhile, an increasing number of fake news sites with completely fabricated content have filled the network even as Facebook abdicated responsibility for the disinformation it lets spread virally.

    It even went so far as to fire news editors who managed the Trends section, leaving the matter up to an impartial, but entirely fallible, algorithm. This wholesale elimination of human judgement from the site’s news machinery could not have come at a worse time for the election.

    The algorithm later trended a number of stories that were “profoundly inaccurate,” according to a report that tracked the occurrences of fake news in this high-profile section of Facebook’s platform.
    https://techcrunch.com/2016/11/09/rigged
    Voting 0
    In light of this power to steer opinions and, in future, emotions as well, the Christianophobia displayed by Facebook in these very same days appears all the more worrying. This time the report came from Giuseppe Marino in il Giornale of 7 July, “Il Facebook ‘buonista’ che censura tutto tranne le bestemmie” (“The do-gooder Facebook that censors everything except blasphemy”):

    Michelangelo’s David offends Facebook’s morals; blasphemy does not. Children miming the crucifixion in an anti-abuse campaign are to be “banned”, while a page inviting people to rape the Madonna is sacred freedom of expression.

    It is maddening to look for a logic in these choices, which are already (rightly) at the centre of controversy.

    The puzzle grew even more complicated yesterday, after the staff of Fratelli d’Italia leader Giorgia Meloni reported to the social network a page named after a blasphemy against God. Facebook’s reply: “Thank you for the report, but the page respects the community standards”, since it contains no “hate speech or symbols of incitement to hatred”.

    And in those very same days an analogous report I made about the group “Gli anticlericalisti”, on which, alongside blasphemies, images and statements are posted that insult the faith and gravely offend the dignity of Christians, received the same reply:


    [screenshot: Facebook’s reply to the report]



    Blasphemies and insults against Christians do not violate the “community standards”. That is Facebook’s position.

    Given what has emerged from studies on the mechanisms for steering the masses, it is not an exaggeration to say that Facebook is promoting discrimination against Christians and laying the groundwork for making them the only group it will be permissible to attack verbally, because this will be part of the social “standards”.

    Facebook is thus turning into an instrument for forming and steering consensus and, even more, as Guenon would have put it, a source of an “état d’esprit”.

    An anti-democratic instrument that places in the hands of a single person a power that undermines citizens’ formation of judgement and consequently puts a genuine democratic dynamic at risk, replacing it with a semblance of the same emptied of meaning; in practice the social network is becoming the instrument of a subtle totalitarianism.

    The first, and for now only, victims of Facebook’s mood-steering policies are Christians; time will tell what the fallout will be of a choice for which they will have to hold themselves responsible.
    http://www.enzopennetta.it/2015/07/fa...-e-la-promozione-della-cristianofobia
    Voting 0
  7. Younger Internet users like to joke about how Facebook “is the new TV,” but in the case of political news consumption that appears to be literally true, according to a new study from the Pew Research Center for Media and Journalism. More than 60% of millennials who were surveyed said that during the previous week they got their political news from Facebook, compared with 37% who got it from TV.

    Facebook came under fire recently for a study that it funded, done by a number of in-house scientists, which looked at whether news-feed users were subjected to differing political points of view. Although the study said that the decisions of users themselves determined how much they were exposed to different points of view, a number of experts took issue with that explanation.

    These experts pointed out that Facebook’s own data confirmed that for one test group, the algorithmically filtered news-feed did affect the amount of alternative political commentary and news they were exposed to. But even more important than that, Facebook’s study pretended that a user’s experience on the site could be looked at separately from the functioning of the algorithm, when the two are so closely linked that it’s almost impossible to separate them.

    For older members of the “Baby Boom” generation, meanwhile, those figures were almost exactly reversed.
    https://fortune.com/2015/06/01/facebook-algorithm-news-millennials
    Voting 0
    Mitchell’s three replies are not adequate—for us or for Facebook.

    Q. What are you optimizing for, along with user interest? A. “It’s not that we control NewsFeed, you control NewsFeed.” No, sorry. As I wrote before: It simply isn’t true that an algorithmic filter can be designed to remove the designers from the equation. The assertion melts on contact.

    Q. How do you see your role in the news ecosystem where you are more and more the dominant player? A. Facebook should not be anyone’s primary news source or news experience. No, sorry. On mobile, especially, “primary” is exactly what’s happening. And everyone who pays attention knows how strenuously Facebook tries to keep users engaged with Facebook. So “we don’t want to be primary” is… I’m trying to be nice here… a little insulting.

    Q. In news you have a lot of power now. How do you intend to use that power? A. We just want to create a great experience for users. No, sorry, that’s not an answer because you just said the users have the power, not Facebook, so what you’re really saying is: power? us? whatever do you mean?

    Facebook’s smart, capable and caring-about-news people should be disappointed that this is as far as the company has gotten in being real with itself and with us.

    Facebook is not, and knows quite well it is not, a neutral machine passing on news. Its algorithm chooses what people see; it has ‘community standards’ that material must meet; and it has to operate within the laws of many countries.

    The claim that Facebook doesn’t think about journalism has to be false. And, at least in the long run, it won’t work; in the end these issues have to be faced. Facebook is a private company which has grown and made billions by very successfully keeping more people on its site for longer and longer. I can imagine that any suggestion that there are responsibilities which distract from that mission must seem like a nuisance.

    Google once claimed something similar. Its executives would sit in newspaper offices and claim, with perfectly straight faces, that Google was not a media company. As this stance gradually looked more and more absurd, Google grew up and began to discuss its own power in the media.

    I would put it differently: Facebook has to start recognizing that our questions are real—not error messages. We are not suggesting that it “edits” NewsFeed in the same way that a newspaper editor once edited the front page. It’s a very different way. That’s why we’re asking about it! We are not suggesting that algorithms work in the same way that elites deciding what’s news once operated. It’s a different way. That’s why we’re asking about it!

    No one is being simple-minded here and demanding that Facebook describe editorial criteria it clearly does not have— like reaching for a nice mix of foreign and domestic news. We get it. You want not to be making those decisions. You want user interest to drive those decisions.
    http://pressthink.org/2015/04/its-not...wsfeed-facebook-please-stop-with-this
    Voting 0
  9. Several academics have pointed to limitations of the study, such as the fact that the only people involved had indicated their political affiliation on their Facebook page. Critics point out that those users might behave in a different way from everyone else. But beyond that, a few academics have noted a potential tension between Facebook’s desire to explore the scientific value of its data and its own corporate interests.

    Zeynep Tufekci, an assistant professor at the University of North Carolina, says the study is fascinating but adds that she would like to see more transparency about the way research is conducted at Facebook. “The study is interesting; I’m thrilled they’re publishing this stuff,” says Tufekci. “But who knows what else they found?”

    Tufekci suggests that besides the new paper showing the “filter bubble” phenomenon to be less pronounced than some had thought, several other Facebook papers have painted the network in a positive light.

    Facebook has published several important social-science studies in recent years. The enormous amount of data it collects is extremely valuable as an academic resource.
    http://www.technologyreview.com/news/...y-raises-questions-about-transparency
    Voting 0
    Much of the paper is written as if it is about adult U.S. Facebook users in general, but that is not the case. Those included in the study are just those who self-identify their politics on the site. This is a rare behavior, something only 9% of users do. This 9% number is not in the report itself but in a separate supporting-materials appendix, yet it is crucial for interpreting the results. The population number given in the report is 10.1 million people, which yea omg is a very big number, but don’t fall for the Big-N trick: we don’t know how this 9% is different from Facebook in general. We cannot treat this as a sample of “Facebook users” or even “Facebook liberals and conservatives”, as the authors do in various parts of the report, but instead as about the rare people who explicitly state their political orientation on their Facebook profile.* Descriptive statistics comparing the few who explicitly self-identify and therefore enter into the study versus those who do not are not provided. Who are they, how are they different from the rest of us, why are they important to study, are all obvious things to discuss that the report doesn’t address. We might infer that people who self-identify are more politically engaged, but anecdotally, nearly all my super politically engaged Facebook friends don’t explicitly list their political orientation on the site. Facebook’s report talks about Facebook users, which isn’t accurate. All the findings should be understood to be about Facebook users who also put their political orientation on their profiles, who may or may not be like the rest of Facebook users in lots of interesting and research-confounding ways. The researchers had an obligation to make this limitation much more clear, even if it tempered their grand conclusions.

    So, AMONG THOSE RARE USERS WHO EXPLICITLY SELF-IDENTIFY THEIR POLITICAL ORIENTATION ON THEIR FACEBOOK PROFILES, the study looks at the flow of news stories that are more liberal versus conservative as they are shared on Facebook, how those stories are seen and clicked on as they are shared by liberals to other liberals, conservatives to other conservatives, and, most important for this study, the information that is politically cross-cutting, that is, shared by someone on the right and then seen by someone on the left and vice versa. The measure of conservative or liberal news stories is a simple and, in my opinion, effective one: the degree to which a web domain is shared by people on the right is the degree to which content on that domain is treated as conservative (and the same goes for politically liberal content). And they differentiated between soft (entertainment) and hard (news) content, only including the latter in this study. The important work is seeing if Facebook, as a platform, is creating a filter bubble where people only see what they’d already agree with as opposed to more diverse and challenging “cross-cutting” information.

    The Facebook researchers looked at how much, specifically, the newsfeed algorithm promotes the filter bubble, that is, showing users what they will already agree with over and above a non-algorithmically-sorted newsfeed. The newsfeed algorithm provided 8% less conservative content for liberals versus a non-algorithmically sorted feed, and 5% less liberal content for conservatives. This is an outcome directly attributable to the structure of Facebook itself.

    Facebook published this finding, that the newsfeed algorithm encourages users seeing what they already would agree with more than if the algorithm wasn’t there, ultimately because Facebook wants to make the case that their algorithm isn’t as big a factor in this political confirmation bias as people’s individual choices, stating, “individual choice has a larger role in limiting exposure to ideologically cross cutting content.” The researchers estimate that conservatives click on 17% less ideologically opposed news stories and liberals click on 6% less than what would be expected if users clicked on random links in their feed.
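
    A back-of-the-envelope sketch may help make the excerpt’s two measurements concrete. The numbers and function names below are invented for illustration, and the paper’s actual estimation is more involved: part (1) mirrors the alignment measure described above (a domain counts as conservative to the degree conservatives share it), and part (2) shows what a figure like “8% less” means as a ratio between a chronological feed and the algorithmically ranked one.

        # (1) Domain alignment from share counts: +1.0 = shared only by conservatives,
        #     -1.0 = shared only by liberals. The counts below are invented.
        def domain_alignment(shares_by_conservatives, shares_by_liberals):
            total = shares_by_conservatives + shares_by_liberals
            return (shares_by_conservatives - shares_by_liberals) / total

        print(domain_alignment(shares_by_conservatives=900, shares_by_liberals=100))  # 0.8

        # (2) Relative drop in cross-cutting (opposite-side) stories caused by ranking.
        def cross_cutting_reduction(chronological_feed, ranked_feed, reader_side):
            def frac_opposed(feed):
                return sum(1 for story_side in feed if story_side != reader_side) / len(feed)
            before = frac_opposed(chronological_feed)
            after = frac_opposed(ranked_feed)
            return 1.0 - after / before

        # Invented feeds for a liberal reader: 'C' = conservative story, 'L' = liberal story.
        chronological = ["C"] * 25 + ["L"] * 75   # 25% cross-cutting before ranking
        ranked        = ["C"] * 23 + ["L"] * 77   # 23% cross-cutting after ranking
        print(round(cross_cutting_reduction(chronological, ranked, reader_side="L"), 2))  # 0.08, i.e. "8% less"

    The excerpt’s 8%/5% and 17%/6% figures are both reductions of this kind, measured against different baselines (the unranked feed for the algorithm, random clicking for individual choice).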

    The report concludes that, “we conclusively establish that on average in the context of Facebook, individual choices matter more than algorithms”. Nooo this just simply isn’t the case.
    http://thesocietypages.org/cyborgolog...2015/05/07/facebook-fair-and-balanced
    Voting 0


Page 1 of 2
