mfioretti: algorithms* + facebook*

Bookmarks on this page are managed by an admin user.

23 bookmarks, sorted by date (descending)

  1. CEO Mark Zuckerberg wrote on Facebook today, “I’m changing the goal I give our product teams from focusing on helping you find relevant content to helping you have more meaningful social interactions.” VP of News Feed Adam Mosseri tells TechCrunch “I expect that the amount of distribution for publishers will go down because a lot of publisher content is just passively consumed and not talked about. Overall time on Facebook will decrease, but we think this is the right thing to do.”

    The winners in this change will be users and their sense of community, since they should find Facebook more rewarding and less of a black hole of wasted time viewing mindless video clips and guilty-pleasure articles. And long-term, it should preserve Facebook’s business and ensure it still has a platform to provide referral traffic for news publishers and marketers, albeit less than before.

    The biggest losers will be publishers who’ve shifted resources to invest in eye-catching pre-recorded social videos, because Mosseri says “video is such a passive experience”. He admits that he expects publishers to react with “a certain amount of scrutiny and anxiety”, but he didn’t have many concrete answers about how publishers should scramble to react beyond “experimenting ... and seeing ... what content gets more comments, more likes, more reshares.”
    https://techcrunch.com/2018/01/11/facebook-time-well-spent
  2. This article analyses Google’s two main advertising systems, AdWords and AdSense, and proposes that these financial models have significant effects upon online discourse. In discussing AdWords, the article details some of the tensions between the local and the global that develop when tracing flows of information and capital, specifically highlighting Google’s impact on the decline of online language diversity. In outlining AdSense, it demonstrates how Google’s hegemonic control prescribes which parts of the web can be monetised and which remain unprofitable. In particular, drawing on existing studies, evidence is provided that Google’s AdSense programme, along with Google’s relationship with Facebook, incentivised the rise of fake news in the 2016 US presidential election. This work builds on existing scholarship to demonstrate that Google’s economic influence has varied and far-reaching effects in a number of contexts and is relevant to scholars in a range of disciplines. As such, the article is intended as a discursive introduction to the topic and does not require specific disciplinary background knowledge. It does not attempt to provide the final word on Google’s relationship to digital capitalism, but rather to demonstrate the profitability of a post-Fordist perspective, in order to enable wider engagement with the issues identified.
    https://www.nature.com/articles/s41599-017-0021-4
    by M. Fioretti (2018-01-02)
  3. No serious scholar of modern geopolitics disputes that we are now at war — a new kind of information-based war, but war, nevertheless — with Russia in particular, but in all honesty, with a multitude of nation states and stateless actors bent on destroying western democratic capitalism. They are using our most sophisticated and complex technology platforms to wage this war — and so far, we’re losing. Badly.

    Why? According to sources I’ve talked to both at the big tech companies and in government, each side feels the other is ignorant, arrogant, misguided, and incapable of understanding the other side’s point of view. There’s almost no data sharing, trust, or cooperation between them. We’re stuck in an old model of lobbying, soft power, and the occasional confrontational hearing.

    Not exactly the kind of public-private partnership we need to win a war, much less a peace.

    Am I arguing that the government should take over Google, Amazon, Facebook, and Apple so as to beat back Russian info-ops? No, of course not. But our current response to Russian aggression illustrates the lack of partnership and co-ordination between government and our most valuable private sector companies. And I am hoping to raise an alarm: when the private sector has markedly better information, processing power, and personnel than the public sector, the former will only strengthen while the latter weakens. We’re seeing it play out in our current politics, and if you believe in the American idea, you should be extremely concerned.
    https://shift.newco.co/data-power-and-war-465933dcb372
  4. Earlier this month, writer James Bridle published an in-depth look at the underbelly of creepy, violent content targeted at kids on YouTube – from knock-off Peppa Pig cartoons, such as one where a trip to the dentist morphs into a graphic torture scene, to live-action “gross-out” videos, which show real kids vomiting and in pain.

    These videos are being produced and added to YouTube by the thousand, then tagged with what Bridle calls “keyword salad” – long lists of popular search terms packed into their titles. These keywords are designed to game or manipulate the algorithm that sorts, ranks and selects content for users to see. And thanks to a business model aimed at maximising views (and therefore ad revenue), these videos are being auto-played and promoted to kids based on their “similarity” – at least in terms of keywords used – to content that the kids have already seen. That means a child might start out watching a normal Peppa Pig episode on the official channel, finish it, then be automatically immersed in a dark, violent and unauthorised episode – without their parent realising it.
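
    To make the mechanism concrete, here is a minimal Python sketch of a recommender that picks the next video purely by keyword overlap with the last one watched. It is an illustration of the gaming technique described above, not YouTube’s actual algorithm; the titles and scoring rule are invented.

        # Minimal sketch of a keyword-overlap recommender, illustrating how
        # "keyword salad" titles can game similarity-based suggestions.
        # Not YouTube's actual ranking; titles and scoring are invented.

        def keywords(title: str) -> set:
            """Naive keyword extraction: lowercase and split on whitespace."""
            return set(title.lower().split())

        def similarity(a: str, b: str) -> float:
            """Jaccard overlap between the keyword sets of two titles."""
            ka, kb = keywords(a), keywords(b)
            return len(ka & kb) / len(ka | kb) if (ka | kb) else 0.0

        def suggest_next(just_watched: str, candidates: list) -> str:
            """Auto-play the candidate whose title overlaps most with the last video."""
            return max(candidates, key=lambda c: similarity(just_watched, c))

        watched = "Peppa Pig Official Episode Dentist Visit"
        candidates = [
            "Learn Colors with Play Doh",
            # A keyword-stuffed title wins the similarity contest even though
            # the content behind it may be nothing like the official episode:
            "Peppa Pig Dentist Visit Official Episode Scary Funny Kids Surprise",
        ]
        print(suggest_next(watched, candidates))  # picks the keyword-stuffed video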

    YouTube’s response to the problem has been to hand responsibility to its users, asking them to flag videos as inappropriate. From there, the videos go to a review team that YouTube says comprises thousands of people working 24 hours a day to review content. If the content is found to be inappropriate for children, it will be age-restricted and will not appear in the YouTube Kids app. It will still appear on YouTube proper, however, where users must officially be at least 13 years old but which, in reality, countless kids still use (just think about how often antsy kids are handed a phone or tablet to keep them occupied in a public space).

    Like Facebook’s scheme, this approach has several flaws: since it’s trying to ferret out inappropriate videos from kids’ content, it’s likely that most of the people who will encounter these videos are kids themselves. I don’t expect a lot of six-year-olds to become aggressive content moderators any time soon. And if the content is flagged, it still needs to be reviewed by humans, which, as YouTube has already acknowledged, takes “round the clock” monitoring.

    When we talk about this kind of challenge, the tech companies’ response is often that it’s simply the inevitability of scale – there’s no way to serve billions of users endless streams of engaging content without getting it wrong or allowing abuse to slip by some of the time. But of course, these companies don’t have to do any of this. Auto-playing an endless stream of algorithmically selected videos to kids isn’t some sort of mandate. The internet didn’t have to become a smorgasbord of “suggested content”. It’s a choice that YouTube made, because ad views are ad views. You’ve got to break a few eggs to make an omelette, and you’ve got to traumatise a few kids to build a global behemoth worth $600bn.

    And that’s the issue: in their unblinking pursuit of growth over the past decade, these companies have built their platforms around features that aren’t just vulnerable to abuse, but literally optimised for it. Take a system that’s easy to game, profitable to misuse, intertwined with our vulnerable people and our most intimate moments, and operating at a scale that’s impossible to control or even monitor, and this is what you get.

    The question now is, when will we force tech companies to reckon with what they’ve wrought? We long ago decided that we won’t let companies sell cigarettes to children or put asbestos into their building materials. If we want, we can decide that there are limits to what tech can do to “engage” us, too, rather than watching these platforms spin further and further away from the utopian dreams on which they were sold to us.
    https://www.theguardian.com/technolog...ra-wachter-boettcher?CMP=share_btn_tw
  5. Similarly, GOOG in 2014 started reorganizing itself to focus on artificial intelligence only. In January 2014, GOOG bought DeepMind, and in September it shut down Orkut (one of its few social products, which had momentary success in some countries) forever. The Alphabet Inc restructuring was announced in August 2015, but it likely took many months of meetings and bureaucracy. The restructuring was important to focus the web-oriented departments at GOOG on a simple mission. GOOG sees no future in the simple Search market, and has announced that it is migrating “From Search to Suggest” (in Eric Schmidt’s own words) and becoming an “AI first company” (in Sundar Pichai’s own words). GOOG is currently slightly behind FB in terms of how fast it is growing its dominance of the web, but due to its technical expertise, vast budget, influence and vision, in the long run its AI assets will play a massive role on the internet. They know what they are doing.

    These are no longer the same companies as 4 years ago. GOOG is no longer an internet company; it’s the knowledge internet company. FB is not an internet company; it’s the social internet company. They used to attempt to compete, and this competition kept the internet market diverse. Today, however, they seem mostly satisfied with their orthogonal dominance of parts of the Web, and we are losing diversity of choices. Which leads us to another part of the internet: e-commerce and AMZN.

    AMZN does not focus on making profit.
    https://staltz.com/the-web-began-dying-in-2014-heres-how.html
  6. The problem is this: Facebook has become a feedback loop which can and does, despite its best intentions, become a vicious spiral. At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it. Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.

    The way Facebook’s News Feed works is that the more you “engage” with posts from a particular user, the more often their posts are shown to you. The more you engage with a particular kind of post, the more you will see its ilk. So far so good! It’s just showing you what you’ve demonstrated you’re interested in. What’s wrong with that?

    The answer is twofold. First, this eventually constructs a small “in-group” cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.

    Second, and substantially worse, because “engagement” is the metric, Facebook inevitably selects for the shocking and the outrageous. Ev Williams summed up the results brilliantly.

    Of course this doesn’t just apply to Facebook.
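
    To make the loop concrete, here is a toy Python model of an engagement-weighted feed. It is a sketch of the dynamic described above, not Facebook’s actual News Feed code; the per-friend affinity weights and the update rule are invented for illustration.

        # Toy model of the engagement feedback loop: engaging with a friend's
        # posts boosts their ranking weight, so you see (and engage with) them
        # even more. Invented update rule; not Facebook's actual algorithm.
        from collections import defaultdict

        affinity = defaultdict(lambda: 1.0)  # per-friend weight, grown by engagement

        def rank_feed(posts):
            """Order posts by how much you have engaged with each author."""
            return sorted(posts, key=lambda p: affinity[p["author"]], reverse=True)

        def engage(post):
            """Each like or comment boosts that author's weight."""
            affinity[post["author"]] *= 1.2

        posts = [{"author": a} for a in ("alice", "bob", "carol")]
        for _ in range(10):      # ten feed refreshes
            feed = rank_feed(posts)
            engage(feed[0])      # you mostly engage with whatever is on top
        print({a: round(w, 2) for a, w in affinity.items()})
        # alice compounds to about 6.19 while bob and carol stay at 1.0:
        # the in-group crowds out the out-group, exactly the spiral described.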
    https://techcrunch.com/2017/06/04/whe...m_source=tctwreshare&sr_share=twitter
  7. "All of us, when we are uploading something, when we are tagging people, when we are commenting, we are basically working for Facebook," he says.

    The data our interactions provide feeds the complex algorithms that power the social media site, where, as Mr Joler puts it, our behaviour is transformed into a product.

    Trying to untangle that largely hidden process proved to be a mammoth task.

    "We tried to map all the inputs, the fields in which we interact with Facebook, and the outcome," he says.

    "We mapped likes, shares, search, update status, adding photos, friends, names, everything our devices are saying about us, all the permissions we are giving to Facebook via apps, such as phone status, wifi connection and the ability to record audio."

    All of this research provided only a fraction of the full picture. So the team looked into Facebook's acquisitions, and scoured its myriad patent filings.

    The results were astonishing: visually arresting flow charts that take hours to absorb fully, but which show how the data we give Facebook is used to calculate our ethnic affinity (Facebook's term), sexual orientation, political affiliation, social class, travel schedule and much more.
    [Image: Share Lab presents its information in minutely detailed tables and flow charts]

    One map shows how everything - from the links we post on Facebook, to the pages we like, to our online behaviour in many other corners of cyberspace that are owned by or interact with the company (Instagram, WhatsApp, or sites that merely use your Facebook log-in) - could all be entering a giant algorithmic process.

    And that process allows Facebook to target users with terrifying accuracy, with the ability to determine whether they like Korean food, the length of their commute to work, or their baby's age.
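
    As an illustration of how such disparate signals can combine, here is a small, hypothetical Python sketch. The profile data, the keyword rule and the commute calculation are all invented and vastly simpler than anything Facebook runs; the point is only that mundane traces (page likes, location pings) compose into exactly the kinds of inferences mentioned above.

        # Hypothetical sketch: combining separate data traces into inferred
        # attributes. Invented rules and data; real systems are far richer.
        from datetime import datetime

        profile = {
            "page_likes": ["Seoul Kitchen", "K-BBQ House", "Tech News Daily"],
            "location_pings": [              # (timestamp, labelled place)
                (datetime(2017, 5, 15, 8, 10), "home"),
                (datetime(2017, 5, 15, 8, 55), "office"),
            ],
        }

        inferences = {}

        # Interest inference from page likes (naive keyword rule).
        if any("K-" in p or "Seoul" in p for p in profile["page_likes"]):
            inferences["likes_korean_food"] = True

        # Commute length from consecutive home -> office pings.
        pings = profile["location_pings"]
        for (t1, p1), (t2, p2) in zip(pings, pings[1:]):
            if (p1, p2) == ("home", "office"):
                inferences["commute_minutes"] = int((t2 - t1).total_seconds() // 60)

        print(inferences)  # {'likes_korean_food': True, 'commute_minutes': 45}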

    Another map details the permissions many of us willingly give Facebook via its many smartphone apps, including the ability to read all text messages, download files without permission, and access our precise location.

    Individually, these are powerful tools; combined they amount to a data collection engine that, Mr Joler argues, is ripe for exploitation.

    "If you think just about cookies, just about mobile phone permissions, or just about the retention of metadata - each of those things, from the perspective of data analysis, are really intrusive."
    http://www.bbc.com/news/business-39947942
  8. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, Zuckerberg argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html
  9. In the digital age, algorithms are a source not only of wealth but of power. The decoupling of the generation of news (by journalists) from the means of its distribution (social networks) is a major shift in information power. What a Facebook user sees is an interplay between strategies set by humans, the algorithms those humans design, and the information the software accumulates about an individual user’s preferences. So the distribution of power in this new relationship is subtle, but it is still power. Networks have power which other institutions and organisations don’t. The giants of Silicon Valley have, simply by virtue of their scale, informational power which is unprecedented. Digital technology re-routes information at a speed and on a scale previously unknown. We are only just getting a handle on where and how power has been shifted by those opportunities. Private companies rarely admit their power, but to write – as Zuckerberg does – as if Facebook were a benign, self-governing collective with no decisions to make about how to use its power is to deny a reality that is plain to see.
    http://georgebrock.net/mr-zuckerbergs-education
  10. Your Facebook feed might be one of angst and despair, or of celebrations and "I told you so"s. It depends on the people you're friends with and the online community you've created with your clicks, likes and shares.

    Facebook's algorithm knows what you like based on the videos you watch, people you talk to, and content you interact with. It then shows you more of the same. This creates something called "filter bubbles." You begin to see only the content you like and agree with, while Facebook hides dissenting points of view.

    This means news on Facebook comes with confirmation bias -- it reinforces what you already think is true -- and people are increasingly frustrated.
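
    Here is a toy Python sketch of that narrowing dynamic, with invented weights (not Facebook's actual News Feed logic): each click on one viewpoint makes the feed more likely to show it again, until dissenting stories all but vanish.

        # Toy filter-bubble model: clicking one viewpoint raises its weight,
        # so it is sampled more often next time. Invented numbers; not
        # Facebook's actual algorithm.
        import random

        weights = {"left": 1.0, "right": 1.0, "neutral": 1.0}

        def pick_story():
            """Sample a story's viewpoint in proportion to learned weights."""
            views = list(weights)
            return random.choices(views, weights=[weights[v] for v in views])[0]

        random.seed(1)
        for _ in range(200):             # two hundred feed impressions
            shown = pick_story()
            if shown == "left":          # this user only clicks one viewpoint
                weights[shown] += 0.5    # each click reinforces it

        total = sum(weights.values())
        print({v: round(w / total, 2) for v, w in weights.items()})
        # The feed ends up overwhelmingly one viewpoint; the other two
        # still exist but are almost never shown: a filter bubble.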

    Facebook denies it's a media company, yet almost half of U.S. adults get news from Facebook.

    When Facebook fired its human curators earlier this year and began to rely on algorithms to surface popular stories, fake news proliferated.

    Viral memes and propaganda spread among people with similar beliefs and interests. It's cheaper and easier to create and spread ideological disinformation than deeply researched and reported news. And it comes from all over -- teens in Macedonia are responsible for a large portion of fake pro-Trump news, according to a BuzzFeed analysis.


    Filter bubbles became especially problematic during the presidential election.

    Hyperpartisan news sites and fake websites distributed false stories about voter fraud, election conspiracies, and the candidates' pasts that spread like wildfire on Facebook. It was more prevalent on right-leaning Facebook pages. As CNNMoney's Brian Stelter said in response to the growing number of false viral stories, people should have a "triple check before you share" rule.

    Today, many people are shocked by Trump's victory. Words of fear and sorrow fill their Facebook feeds, and even those with thousands of friends are probably only seeing posts that echo their feelings.

    But if you voted for Trump, chances are your feed reflects the opposite. You might see a cascade of #MakeAmericaGreatAgain hashtags and friends celebrating.
    http://money.cnn.com/2016/11/09/techn...2Fedition_us+%28RSS%3A+CNNi+-+U.S.%29

Page 1 of 3 - Online Bookmarks of M. Fioretti: Tags: algorithms + facebook

About - Propulsed by SemanticScuttle