mfioretti: algorithms* + facebook*

Bookmarks on this page are managed by an admin user.

25 bookmark(s) - page 1 of 3

  1. Which brings us back to Facebook, which to this day seems at best to dimly understand how the news business works, as is evident in its longstanding insistence that it's not a media company. Wired was even inspired to publish a sarcastic self-help quiz for Facebook execs on "How to tell if you're a media company." It included such questions as "Are you the country's largest source of news?"

    The answer is a resounding yes. An astonishing 45 percent of Americans get their news from this single source. Add Google, and more than 70 percent of Americans get their news from a pair of outlets. The two firms also ate up about 89 percent of digital-advertising growth last year, underscoring their monopolistic power in this industry.

    Facebook's cluelessness on this front makes the ease with which it took over the press that much more bizarre to contemplate. Of course, the entire history of Facebook is pretty weird, even by Silicon Valley standards, beginning with the fact that the firm thinks of itself as a movement and not a giant money-sucking machine.


    That Facebook saw a meteoric rise without ever experiencing a big dip in users might have something to do with the fact that the site was consciously designed to be addictive, as founding president Sean Parker recently noted at a conference in Philadelphia.

    Facebook is full of features such as "likes" that dot your surfing experience with neuro-rushes of micro-approval – a "little dopamine hit," as Parker put it. The hits might come with getting a like when you post a picture of yourself thumbs-upping the world's third-largest cheese wheel, or flashing the "Live Long and Prosper" sign on International Star Trek day, or whatever the hell it is you do in your cyber-time. "It's a social-validation feedback loop," Parker explained. "Exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."
    https://www.rollingstone.com/politics...e-be-saved-social-media-giant-w518655
  2. As problematic as Facebook has become, it represents only one component of a much broader shift into a new human connectivity that is both omnipresent (consider the smartphone) and hypermediated—passing through and massaged by layer upon layer of machinery carefully hidden from view. The upshot is that it’s becoming increasingly difficult to determine what in our interactions is simply human and what is machine-generated. It is becoming difficult to know what is real.

    Before the agents of this new unreality finish this first phase of their work and disappear completely from view to complete the rest, we have a brief opportunity to identify and catalogue the processes shaping our drift into a new world in which reality is both relative and carefully constructed by others, for their own ends. Any catalogue must include at least these four items:

    the monetisation of propaganda as ‘fake news’;
    the use of machine learning to develop user profiles accurately measuring and modelling our emotional states;
    the rise of neuromarketing, delivering highly tailored messages that nudge us to act in ways serving the ends of others;
    a new technology, ‘augmented reality’, which will push us to sever all links with the evidence of our senses.



    The fake news stories floated past as jetsam on Facebook’s ‘newsfeed’, that continuous stream of shared content drawn from a user’s Facebook contacts, a stream generated by everything everyone else posts or shares. A decade ago that newsfeed had a raw, unfiltered quality, a sense that you saw everything everyone was doing; but as Facebook has matured it has engaged increasingly opaque ‘algorithms’ to curate (or censor) the newsfeed, producing something that feels much more comfortable and familiar.

    This seems like a useful feature to have, but the taming of the newsfeed comes with a consequence: Facebook’s billions of users compose their world view from what flows through their feeds. Consider the number of people on public transport—or any public place—staring into their smartphones, reviewing their feeds, marvelling at the doings of their friends, reading articles posted by family members, sharing video clips or the latest celebrity outrages. It’s an activity now so routine we ignore its omnipresence.

    Curating that newsfeed shapes what Facebook’s users learn about the world. Some of that content is controlled by the user’s ‘likes’, but a larger part is derived from Facebook’s deep analysis of a user’s behaviour. Facebook uses ‘cookies’ (invisible bits of data hidden within a user’s web browser) to track the behaviour of its users even when they’re not on the Facebook site—and even when they’re not users of Facebook. Facebook knows where its users spend time on the web, and how much time they spend there. All of that allows Facebook to tailor a newsfeed to echo the interests of each user. There’s no magic to it, beyond endless surveillance.
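
    There is no sorcery in the mechanics, either: a third-party tracking pixel fits in a few lines. The sketch below is a hypothetical Flask endpoint, with an invented domain and cookie name, meant only to illustrate the principle; it is not Facebook’s actual code.

    ```python
    # Minimal sketch of a third-party tracking pixel (hypothetical; not
    # Facebook's actual code). Any page embedding
    #   <img src="https://tracker.example/pixel.gif">
    # makes the visitor's browser call this endpoint, sending the
    # tracker's cookie plus a Referer header naming the page being read.
    import uuid

    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/pixel.gif")
    def pixel():
        # Reuse the visitor's existing ID, or mint one on first sight.
        visitor_id = request.cookies.get("uid", str(uuid.uuid4()))
        # Log which page this browser is reading right now.
        print(visitor_id, request.headers.get("Referer"))
        # Return a tiny GIF so the embed renders as an invisible dot.
        resp = make_response(b"GIF89a", 200)
        resp.headers["Content-Type"] = "image/gif"
        # Re-set the cookie so the same ID follows this browser across
        # every site that embeds the pixel, Facebook user or not.
        resp.set_cookie("uid", visitor_id, max_age=365 * 24 * 3600)
        return resp
    ```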

    What is clear is that Facebook has the power to sway the moods of billions of users. Feed people a steady diet of playful puppy videos and they’re likely to be in a happier mood than people fed images of war. Over the last two years, that capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.

    We all like to believe we’re in the right, and when we get some sign from the universe at large that we are correct, we feel better about ourselves. That’s how the curated newsfeed became wedded to the world of profitable propaganda.

    Adding a little art to brighten an otherwise dull wall seems like an unalloyed good, but only if one completely ignores bad actors. What if that blank canvas gets painted with hate speech? What if, perchance, the homes of ‘undesirables’ are singled out with graffiti that only bad actors can see? What happens when every gathering place for any oppressed community gets invisibly ‘tagged’? In short, what happens when bad actors use Facebook’s augmented reality to amplify their own capacity to act badly?

    But that’s Zuckerberg: he seems to believe his creations will only be used to bring out the best in people. He seems to believe his gigantic sharing network would never be used to incite mob violence. Just as he seems to claim that Facebook’s capacity to collect and profile the moods of its users should never be monetised—but, given the presentation unearthed by The Australian, Facebook tells a different story to advertisers.

    Regulating Facebook enshrines its position as the data-gathering and profile-building organisation, while keeping it plugged into and responsive to the needs of national powers. Before anyone takes steps that would cement Facebook in our social lives for the foreseeable future, it may be better to consider how this situation arose, and whether—given what we now know—there might be an opportunity to do things differently.
    https://meanjin.com.au/essays/the-last-days-of-reality
  3. CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

    The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

    The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because, as Mosseri says, “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but he didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting . . . and seeing . . . what content gets more comments, more likes, more reshares.”
    https://techcrunch.com/2018/01/11/facebook-time-well-spent
    This article analyses Google’s two main advertising systems, AdWords and AdSense, and proposes that these financial models have significant effects upon online discourse. In discussing AdWords, this article details some of the tensions between the local and the global that develop when tracing flows of information and capital, specifically highlighting Google’s impact on the decline of online language diversity. In outlining AdSense, this article demonstrates how Google’s hegemonic control prescribes which parts of the web can be monetised and which remain unprofitable. In particular, drawing from existing studies, evidence is provided that Google’s AdSense programme, along with Google’s relationship with Facebook, incentivised the rise of fake news in the 2016 US presidential election. This work builds on existing scholarship to demonstrate that Google’s economic influence has varied and far-reaching effects in a number of contexts and is relevant to scholars in a range of disciplines. As such, this article is intended as a discursive introduction to the topic and does not require specific disciplinary background knowledge. This article does not attempt to provide the final word on Google’s relationship to digital capitalism, but rather to demonstrate the profitability of a Post-Fordist perspective, in order to enable a wider engagement with the issues identified.
    https://www.nature.com/articles/s41599-017-0021-4
    by M. Fioretti (2018-01-02)
  5. no serious scholar of modern geopolitics disputes that we are now at war — a new kind of information-based war, but war, nevertheless — with Russia in particular, but in all honesty, with a multitude of nation states and stateless actors bent on destroying western democratic capitalism. They are using our most sophisticated and complex technology platforms to wage this war — and so far, we’re losing. Badly.

    Why? According to sources I’ve talked to both at the big tech companies and in government, each side feels the other is ignorant, arrogant, misguided, and incapable of understanding the other side’s point of view. There’s almost no data sharing, trust, or cooperation between them. We’re stuck in an old model of lobbying, soft power, and the occasional confrontational hearing.

    Not exactly the kind of public-private partnership we need to win a war, much less a peace.

    Am I arguing that the government should take over Google, Amazon, Facebook, and Apple so as to beat back Russian info-ops? No, of course not. But our current response to Russian aggression illustrates the lack of partnership and co-ordination between government and our most valuable private sector companies. And I am hoping to raise an alarm: when the private sector has markedly better information, processing power, and personnel than the public sector, the former will only strengthen, while the latter weakens. We’re seeing it play out in our current politics, and if you believe in the American idea, you should be extremely concerned.
    https://shift.newco.co/data-power-and-war-465933dcb372
  6. Earlier this month, writer James Bridle published an in-depth look at the underbelly of creepy, violent content targeted at kids on YouTube – from knock-off Peppa Pig cartoons, such as one where a trip to the dentist morphs into a graphic torture scene, to live-action “gross-out” videos, which show real kids vomiting and in pain.

    These videos are being produced and added to YouTube by the thousand, then tagged with what Bridle calls “keyword salad” – long lists of popular search terms packed into their titles. These keywords are designed to game or manipulate the algorithm that sorts, ranks and selects content for users to see. And thanks to a business model aimed at maximising views (and therefore ad revenue), these videos are being auto-played and promoted to kids based on their “similarity” – at least in terms of keywords used – to content that the kids have already seen. That means a child might start out watching a normal Peppa Pig episode on the official channel, finish it, then be automatically immersed in a dark, violent and unauthorised episode – without their parent realising it.
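
    Bridle can’t inspect YouTube’s actual ranking code, but a toy model shows why keyword salad works against any recommender that leans on title and tag overlap. In the sketch below, every title and the similarity measure are invented for illustration; padding a knock-off’s title with popular terms makes it look maximally ‘similar’ to whatever a child just watched.

    ```python
    # Toy illustration of why "keyword salad" games similarity-based
    # recommendation. YouTube's real ranking is proprietary; this only
    # shows the principle, with invented titles and keywords.

    def similarity(just_watched: set[str], candidate: set[str]) -> float:
        """Fraction of the watched video's keywords found in the candidate."""
        return len(just_watched & candidate) / len(just_watched)

    official = {"peppa", "pig", "dentist", "episode"}
    knockoff = {"peppa", "pig", "dentist", "episode", "doctor", "scary",
                "kids", "learn", "colours", "finger", "family", "surprise"}
    unrelated = {"woodworking", "bandsaw", "jig"}

    # Padding a title with every popular term guarantees high overlap with
    # whatever was just watched, so the knock-off wins the "up next" slot.
    print(similarity(official, knockoff))   # 1.0
    print(similarity(official, unrelated))  # 0.0
    ```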

    YouTube’s response to the problem has been to hand responsibility to its users, asking them to flag videos as inappropriate. From there, the videos go to a review team that YouTube says comprises thousands of people working 24 hours a day to review content. If the content is found to be inappropriate for children, it will be age-restricted and will not appear in the YouTube Kids app. It will still appear on YouTube proper, however, where users must officially be at least 13 years old, but which in reality countless kids use (just think about how often antsy kids are handed a phone or tablet to keep them occupied in a public space).

    Like Facebook’s scheme, this approach has several flaws: since it’s trying to ferret out inappropriate videos from kids’ content, it’s likely that most of the people who will encounter these videos are kids themselves. I don’t expect a lot of six-year-olds to become aggressive content moderators any time soon. And if the content is flagged, it still needs to be reviewed by humans, which, as YouTube has already acknowledged, takes “round the clock” monitoring.

    When we talk about this kind of challenge, the tech companies’ response is often that it’s simply the inevitability of scale – there’s no way to serve billions of users endless streams of engaging content without getting it wrong or allowing abuse to slip by some of the time. But of course, these companies don’t have to do any of this. Auto-playing an endless stream of algorithmically selected videos to kids isn’t some sort of mandate. The internet didn’t have to become a smorgasbord of “suggested content”. It’s a choice that YouTube made, because ad views are ad views. You’ve got to break a few eggs to make an omelette, and you’ve got to traumatise a few kids to build a global behemoth worth $600bn.

    And that’s the issue: in their unblinking pursuit of growth over the past decade, these companies have built their platforms around features that aren’t just vulnerable to abuse, but literally optimised for it. Take a system that’s easy to game, profitable to misuse, intertwined with our vulnerable people and our most intimate moments, and operating at a scale that’s impossible to control or even monitor, and this is what you get.

    The question now is, when will we force tech companies to reckon with what they’ve wrought? We’ve long decided that we won’t let companies sell cigarettes to children or put asbestos into their building materials. If we want, we can decide that there are limits to what tech can do to “engage” us, too, rather than watching these platforms spin further and further away from the utopian dreams they were sold to us on.
    https://www.theguardian.com/technolog...ra-wachter-boettcher?CMP=share_btn_tw
    Similarly, GOOG in 2014 started reorganizing itself to focus on artificial intelligence only. In January 2014, GOOG bought DeepMind, and in September it shut down Orkut (one of its few social products to have had momentary success in some countries) forever. The Alphabet Inc restructuring was announced in August 2015, but it likely took many months of meetings and bureaucracy. The restructuring was important to focus the web-oriented departments at GOOG on a simple mission. GOOG sees no future in the simple Search market, announcing a migration “From Search to Suggest” (in Eric Schmidt’s own words) and a goal of being an “AI first company” (in Sundar Pichai’s own words). GOOG is currently slightly behind FB in terms of how fast it is growing its dominance of the web, but due to its technical expertise, vast budget, influence and vision, in the long run its AI assets will play a massive role on the internet. They know what they are doing.

    These are no longer the same companies as four years ago. GOOG is no longer an internet company; it’s the knowledge internet company. FB is no longer an internet company; it’s the social internet company. They used to attempt to compete, and this competition kept the internet market diverse. Today, however, they seem mostly satisfied with their orthogonal dominance of parts of the Web, and we are losing diversity of choices. Which leads us to another part of the internet: e-commerce and AMZN.

    AMZN does not focus on making profit.
    https://staltz.com/the-web-began-dying-in-2014-heres-how.html
  8. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.
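
    The dynamic is easy to reproduce. The sketch below is a toy simulation of that loop, not Facebook’s code: exposure is proportional to a per-friend weight, each engagement multiplies that weight, and after enough iterations a handful of friends dominates the feed.

    ```python
    # Toy simulation of the engagement feedback loop (not Facebook's code).
    # Exposure is proportional to a friend's weight; each engagement
    # multiplies that weight, so early favourites snowball into an in-group.
    import random

    random.seed(1)
    weights = {f"friend_{i:02d}": 1.0 for i in range(20)}

    for _ in range(10_000):
        friends = list(weights)
        shown = random.choices(friends, weights=[weights[f] for f in friends])[0]
        if random.random() < 0.3:      # the user engages now and then...
            weights[shown] *= 1.05     # ...and the ranker rewards it

    total = sum(weights.values())
    top3 = sum(sorted(weights.values(), reverse=True)[:3])
    print(f"top 3 of 20 friends now receive {top3 / total:.0%} of all exposure")
    ```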

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly:

    Of course this doesn’t just apply to Facebook.
    https://techcrunch.com/2017/06/04/whe...m_source=tctwreshare&sr_share=twitter
  9. "All of us, when we are uploading something, when we are tagging people, when we are commenting, we are basically working for Facebook," he says.

    The data our interactions provide feeds the complex algorithms that power the social media site, where, as Mr Joler puts it, our behaviour is transformed into a product.

    Trying to untangle that largely hidden process proved to be a mammoth task.

    "We tried to map all the inputs, the fields in which we interact with Facebook, and the outcome," he says.

    "We mapped likes, shares, search, update status, adding photos, friends, names, everything our devices are saying about us, all the permissions we are giving to Facebook via apps, such as phone status, wifi connection and the ability to record audio."

    All of this research provided only a fraction of the full picture. So the team looked into Facebook's acquisitions, and scoured its myriad patent filings.

    The results were astonishing: visually arresting flow charts that take hours to absorb fully, but which show how the data we give Facebook is used to calculate our ethnic affinity (Facebook's term), sexual orientation, political affiliation, social class, travel schedule and much more.
    [Image: Share Lab presents its information in minutely detailed tables and flow charts]

    One map shows how everything - from the links we post on Facebook, to the pages we like, to our online behaviour in many other corners of cyber-space that are owned or interact with the company (Instagram, WhatsApp or sites that merely use your Facebook log-in) - could all be entering a giant algorithmic process.

    And that process allows Facebook to target users with terrifying accuracy, with the ability to determine whether they like Korean food, the length of their commute to work, or their baby's age.
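
    Share Lab's charts map the inputs; the inference step itself can be as mundane as a weighted sum. The sketch below is purely illustrative, with invented features and weights, and is in no way Facebook's model; it only shows how ordinary interaction signals combine into a confident guess about something a user never disclosed.

    ```python
    # Purely illustrative: how mundane interaction signals add up to an
    # inference the user never stated. Features and weights are invented;
    # this is not Facebook's model.
    import math

    # Per-signal weights a trained model might assign to the hypothetical
    # inference "household includes a baby".
    WEIGHTS = {
        "liked:nappy_brand_page":        2.1,
        "liked:baby_monitor_review":     1.8,
        "shared:sleep_training_article": 1.3,
        "permission:precise_location":   0.1,  # near-universal, so barely informative
    }
    BIAS = -3.0  # baseline: most users have no baby at home

    def probability(events: list[str]) -> float:
        """Logistic model over observed events: sigmoid(bias + summed weights)."""
        z = BIAS + sum(WEIGHTS.get(e, 0.0) for e in events)
        return 1 / (1 + math.exp(-z))

    user = ["liked:nappy_brand_page", "shared:sleep_training_article",
            "permission:precise_location"]
    print(f"P(baby at home) = {probability(user):.2f}")  # 0.62
    ```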

    Another map details the permissions many of us willingly give Facebook via its many smartphone apps, including the ability to read all text messages, download files without permission, and access our precise location.

    Individually, these are powerful tools; combined they amount to a data collection engine that, Mr Joler argues, is ripe for exploitation.

    "If you think just about cookies, just about mobile phone permissions, or just about the retention of metadata - each of those things, from the perspective of data analysis, are really intrusive."
    http://www.bbc.com/news/business-39947942
  10. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, Zuckerberg argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html



Propulsed by SemanticScuttle