mfioretti: facebook* + google*

Bookmarks on this page are managed by an admin user.

50 bookmark(s), sorted by date (descending)

  1. This article analyses Google’s two main advertising systems, AdWords and AdSense, and proposes that these financial models have significant effects upon online discourse. In discussing AdWords, this article details some of the tensions between the local and the global that develop when tracing flows of information and capital, specifically highlighting Google’s impact on the decline of online language diversity. In outlining AdSense, this article demonstrates how Google’s hegemonic control prescribes which parts of the web can be monetised and which remain unprofitable. In particular, drawing from existing studies, evidence is provided that Google’s AdSense programme, along with Google’s relationship with Facebook, incentivised the rise of fake news in the 2016 US presidential election. This work builds on existing scholarship to demonstrate that Google’s economic influence has varied and far-reaching effects in a number of contexts and is relevant to scholars in a range of disciplines. As such, this article is intended as a discursive introduction to the topic and does not require specific disciplinary background knowledge. It does not attempt to provide the final word on Google’s relationship to digital capitalism, but rather to demonstrate the profitability of a Post-Fordist perspective, in order to enable a wider engagement with the issues identified.
    https://www.nature.com/articles/s41599-017-0021-4
    by M. Fioretti (2018-01-02)
    Voting 0
  2. Earlier this month, writer James Bridle published an in-depth look at the underbelly of creepy, violent content targeted at kids on YouTube – from knock-off Peppa Pig cartoons, such as one where a trip to the dentist morphs into a graphic torture scene, to live-action “gross-out” videos, which show real kids vomiting and in pain.

    These videos are being produced and added to YouTube by the thousand, then tagged with what Bridle calls “keyword salad” – long lists of popular search terms packed into their titles. These keywords are designed to game or manipulate the algorithm that sorts, ranks and selects content for users to see. And thanks to a business model aimed at maximising views (and therefore ad revenue), these videos are being auto-played and promoted to kids based on their “similarity” – at least in terms of keywords used – to content that the kids have already seen. That means a child might start out watching a normal Peppa Pig episode on the official channel, finish it, then be automatically immersed in a dark, violent and unauthorised episode – without their parent realising it.
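    To make the mechanism concrete, here is a minimal sketch (in Python) of the kind of keyword-overlap ranking the passage alludes to; the function names, example titles and scoring rule are illustrative assumptions, not YouTube’s actual recommendation system.

        # Toy sketch only: rank candidate videos by how many title keywords
        # they share with the video just watched. Real recommenders are far
        # more complex, but "keyword salad" games even this naive rule.
        def keyword_overlap(just_watched: str, candidate: str) -> int:
            watched = set(just_watched.lower().split())
            return len(watched & set(candidate.lower().split()))

        def rank_suggestions(just_watched, candidates):
            return sorted(candidates,
                          key=lambda c: keyword_overlap(just_watched, c),
                          reverse=True)

        official = "Peppa Pig Official Episode Dentist Visit"
        candidates = [
            "Learn Colours With Trucks",
            "Peppa Pig Dentist Doctor Episode Official Kids Funny Toys",
        ]
        # The keyword-stuffed knock-off outranks the unrelated video, so it
        # lands next to the official episode in the "up next" queue.
        print(rank_suggestions(official, candidates))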

    YouTube’s response to the problem has been to hand responsibility to its users, asking them to flag videos as inappropriate. From there, the videos go to a review team that YouTube says comprises thousands of people working 24 hours a day to review content. If the content is found to be inappropriate for children, it will be age-restricted and will not appear in the YouTube Kids app. It will still appear on YouTube proper, however, where users must officially be at least 13 years old but which, in reality, countless kids still use (just think about how often antsy kids are handed a phone or tablet to keep them occupied in a public space).

    Like Facebook’s scheme, this approach has several flaws: since it’s trying to ferret out inappropriate videos from kids’ content, it’s likely that most of the people who will encounter these videos are kids themselves. I don’t expect a lot of six-year-olds to become aggressive content moderators any time soon. And if the content is flagged, it still needs to be reviewed by humans, which, as YouTube has already acknowledged, takes “round the clock” monitoring.

    When we talk about this kind of challenge, the tech companies’ response is often that it’s simply the inevitability of scale – there’s no way to serve billions of users endless streams of engaging content without getting it wrong or allowing abuse to slip by some of the time. But of course, these companies don’t have to do any of this. Auto-playing an endless stream of algorithmically selected videos to kids isn’t some sort of mandate. The internet didn’t have to become a smorgasbord of “suggested content”. It’s a choice that YouTube made, because ad views are ad views. You’ve got to break a few eggs to make an omelette, and you’ve got to traumatise a few kids to build a global behemoth worth $600bn.

    And that’s the issue: in their unblinking pursuit of growth over the past decade, these companies have built their platforms around features that aren’t just vulnerable to abuse, but literally optimised for it. Take a system that’s easy to game, profitable to misuse, intertwined with our vulnerable people and our most intimate moments, and operating at a scale that’s impossible to control or even monitor, and this is what you get.

    The question now is, when will we force tech companies to reckon with what they’ve wrought? We’ve long decided that we won’t let companies sell cigarettes to children or put asbestos into their building materials. If we want, we can decide that there are limits to what tech can do to “engage” us, too, rather than watching these platforms spin further and further away from the utopian dreams they were sold to us on.
    https://www.theguardian.com/technolog...ra-wachter-boettcher?CMP=share_btn_tw
    Voting 0
  3. Similarly, GOOG in 2014 started reorganizing itself to focus on artificial intelligence only. In January 2014, GOOG bought DeepMind, and in September it shut down Orkut (one of its few social products to enjoy momentary success in some countries) forever. The Alphabet Inc restructuring was announced in August 2015, but it likely took many months of meetings and bureaucracy. The restructuring was important to focus the web-oriented departments at GOOG on a simple mission. GOOG sees no future in the simple Search market, and has announced that it is migrating “From Search to Suggest” (in Eric Schmidt’s own words) and becoming an “AI first company” (in Sundar Pichai’s own words). GOOG is currently slightly behind FB in terms of how fast it is growing its dominance of the web, but given its technical expertise, vast budget, influence and vision, in the long run its AI assets will play a massive role on the internet. They know what they are doing.

    These are no longer the same companies as four years ago. GOOG is no longer an internet company; it’s the knowledge internet company. FB is not an internet company; it’s the social internet company. They used to attempt to compete, and this competition kept the internet market diverse. Today, however, they seem mostly satisfied with their orthogonal dominance over parts of the Web, and we are losing diversity of choices. Which leads us to another part of the internet: e-commerce and AMZN.

    AMZN does not focus on making profit.
    https://staltz.com/the-web-began-dying-in-2014-heres-how.html
    Voting 0
  4. When Facebook declares a billion users, the world will never be the same again: every company has to be there, attracted by the idea of being able to send messages to its fans for free. Very soon it is no longer free, but paid for, to the delight of the investors.

    The boundary between content and advertising now seems like a memory of the past.
    The future is television

    Google and Facebook continue their run, seemingly unstoppable. In the United States, the duopoly takes home 3 out of every 4 dollars of internet “advertising” (so to speak: it is really direct marketing), and as much as 99% of new investment on the web.

    The problem is that this golden vein (so to speak) has by now been exhausted.

    Google and Facebook have a price-to-earnings ratio that is double that of other American media companies, but they no longer have open prairies ahead of them to conquer, nor the easy and predictable future earnings that could justify such a high P/E ratio.
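    As a reminder of what that multiple means, here is a back-of-the-envelope illustration with made-up numbers (not figures from the article):

        # Hypothetical figures for illustration only.
        earnings_per_share = 2.5                    # dollars of profit per share
        peer_pe = 20                                # assumed P/E of other media companies
        our_pe = 2 * peer_pe                        # "double that of other American media companies"
        peer_price = peer_pe * earnings_per_share   # 50 dollars
        our_price = our_pe * earnings_per_share     # 100 dollars
        # A price twice as high on the same earnings is only sustainable if
        # earnings are expected to grow much faster; otherwise the share
        # price has to fall back towards the peer valuation.
        print(our_price, peer_price)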

    To defend their stock price, they have to go after brand advertising.
    And brand advertising does not go into banners, social networks or Mentos videos; it goes to television, to programming such as TV series, films and sport.

    Google or Facebook will have to reinvent themselves as producers of quality content, as Netflix has already begun to do. But what competitive advantage can Google or Facebook claim over Disney (ABC), Comcast (NBC), Viacom (CBS) or Time Warner (HBO)?
    https://www.dotcoma.it/2017/10/24/il-declino-di-google-e-facebook.html
    Voting 0
  5. When it comes to human beings — what motivates them, how they interact socially, to what end they organize politically — figures like Page and Zuckerberg know very little. Almost nothing, in fact. And that ignorance has enormous consequences for us all.
    http://theweek.com/articles/731764/genius-stupidity-silicon-valley
    Voting 0
  6. The FT reports on Wednesday that “Facebook and Google have announced they will restrict advertising on online platforms with fake news, after a furore over the role of such stories in last week’s US presidential election.”

    The following is a personal view and thus not representative of the wider views of the FT, so no doubt biased to whatever cultural norms impacted my formative years — among them being of Polish descent, being brought up Catholic, having staunchly anti-communist parents, experiencing a youthful rebellion against that framework and later moderating to a middle ground. With that out of the way…

    Surely having Facebook and Google restrict advertising on subjective grounds is the worst possible outcome of this entire affair?

    The idea that all-powerful platforms like Google and Facebook should be charged with the responsibility of strategically filtering and determining what constitutes fake news is not just questionable but frightening, in the Orwellian Newspeak sense of the word.


    Habermas’ most profound observation is that the formation of the public news arena is intimately connected to the rise of the coffee houses and stock exchanges. This is because it is only on the stock exchange that the full range of conflicting views collide to forge a clearing price. Repression or manipulation of information flow, meanwhile, only ensures that the clearing price will be off to someone’s advantage and to someone else’s disadvantage.

    Interestingly, back in the 90s and noughties, when the internet was first becoming a thing, media academics would often ponder whether this new form of information exchange represented the reconstitution of a public sphere in digital form (especially in light of Herman/Chomsky’s Manufacturing Consent critique, which argued the advertising funding model had skewed the public debate and turned the industry into a corporate propaganda outlet). Mostly, they erred towards the notion that it did not, precisely because it captured only a small slice of the population and had a tendency to compartmentalise discussion rather than broaden it.

    Based on all that, if Facebook and Google move to filter “fake news”, it will only exacerbate the problem, because these institutions will always be governed by commercial interest, not public duty. That, as a whole, makes them ill-equipped to judge which news is fit for publication and which is not. What it does do in the long run is open the door to an even more sinister advertising propaganda model than the one that inspired Herman/Chomsky’s Manufacturing Consent.

    In that light, here’s some commentary from Habermas about what aspects of salon and coffee-house culture constituted a public sphere (and which I’d argue are lacking today):

    However exclusive the public might be in any given instance, it could never close itself off entirely and become consolidated as a clique; for it always understood and found itself immersed within a more inclusive public of all private people, persons who, insofar as they were propertied and educated, as readers, listeners, and spectators could avail themselves via the market of the objects that were subject to discussion. The issues discussed became “general” not merely in their significance, but also in their accessibility; everyone had to be able to participate.

    What of the uneducated and unpropertied or too poor to engage in the market for objects, you ask? According to Habermas, they were brought into the public sphere by way of festival gatherings, theatre performances and the music halls, all of which spurred public debate.

    In a highly atomised and compartmentalised culture, however — where even workplace gatherings don’t bring people together because everyone is being encouraged to “work for himself” in the gig economy or from home — there seem to be ever fewer occurrences where we, the public, have no choice but to interact with those who disagree with us.

    This in turn encourages the cultivation of safe spaces, which in turn twists our perception of reality into something it simply is not.
    https://ftalphaville.ft.com/2016/11/1...cebook-and-the-manufacture-of-consent
    Voting 0
  7. These investments, which give these companies dedicated capacity on these undersea cables, represent a big shift in how these cables are built and managed. Earlier this year, Jonathan Hjembo, a senior analyst at Telegeography, told us that private networks now account for about 60 percent of the capacity of trans-Atlantic traffic.
    https://www.wired.com/2016/10/faceboo...ble-la-hong-kong/?mbid=social_twitter
    Voting 0
  8. While the prospect of a Donald Trump presidency is a terrifying one, perhaps this is scarier: Facebook could use its unprecedented powers to tilt the 2016 presidential election away from him – and the social network’s employees have apparently openly discussed whether they should do so.

    As Gizmodo reported on Friday, “Last month, some Facebook employees used a company poll to ask Facebook founder Mark Zuckerberg whether the company should try ‘to help prevent President Trump in 2017’.”

    Facebook employees are probably just expressing the fear that millions of Americans have of the Republican demagogue. But while there’s no evidence that the company plans on taking anti-Trump action, the extraordinary ability that the social network has to manipulate millions of people with just a tweak to its algorithm is a serious cause for concern.

    The fact that an internet giant like Facebook or Google could turn an election based on hidden changes to its code has been a hypothetical scenario for years (and it’s even a plot point in this season’s House of Cards). Harvard Law professor Jonathan Zittrain explained in 2010 how “Facebook could decide an election without anyone ever finding out”, after the tech giant secretly conducted a test in which it was allegedly able to increase voter turnout by 340,000 votes around the country on election day simply by showing users a photo of someone they knew saying “I voted”.
    http://www.theguardian.com/commentisf...facebook-election-manipulate-behavior
    Voting 0
  9. It's undeniable that companies like Google and Facebook have made the web much easier to use and helped bring billions online. They've provided a forum for people to connect and share information, and they've had a huge impact on human rights and civil liberties. These are many things for which we should applaud them.

    But their scale is also concerning. For example, the Chinese messaging service WeChat (which is somewhat like Twitter) recently used its popularity to limit market choice. The company banned access to Uber to drive more business to its own ride-hailing service. Meanwhile, Facebook engineered limited web access in developing economies with its Free Basics service. Touted in India and other emerging markets as a solution to help underserved citizens come online, Free Basics allows users access to only a handful of pre-approved websites (including, of course, Facebook). India recently banned Free Basics and similar services, claiming that these restricted web offerings violated the essential rules of net neutrality.
    Algorithmic oversight

    Beyond market control, the algorithms powering these platforms can wade into murky waters. According to a recent study from the American Institute for Behavioral Research and Technology, information displayed by Google could shift the voting preferences of undecided voters by 20 percent or more, all without their knowledge. Considering how narrow the margins in many elections can be, this effect is significant. In many ways, Google controls what information people see, and any bias, intentional or not, has a potential impact on society.
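    To see why a shift of that size matters, here is a back-of-the-envelope calculation with assumed numbers (the electorate split is invented; only the 20 percent figure comes from the passage above):

        # Toy electorate: 48% for candidate A, 47% for B, 5% undecided.
        undecided = 0.05
        shift = 0.20                  # 20% of undecided voters swayed
        swing = undecided * shift     # 1 percentage point of the total vote
        margin = 0.48 - 0.47          # a 1-point race
        print(swing >= margin)        # True: the shift alone can close the gap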

    In the future, data and algorithms will power even more grave decisions. For example, code will decide whether a self-driving car stops for an oncoming bus or runs into pedestrians.

    It's possible that we're reaching the point where we need oversight for consumer-facing algorithms.
    http://buytaert.net/can-we-save-the-o...ource=twitter.com&utm_campaign=buffer
    Voting 0
  10. Why are some of the world’s most powerful technologists so focused on providing internet access by hook, crook, drones, balloon or satellite?

    Silicon Valley techno-utopians also dream of rising above the planet’s problems.

    Above the Facebook flag at Facebook HQ flies another, bearing the symbol of Facebook’s non-profit organisation, Internet.org. The internet-dispersing drones under development are designed to bring about the objectives of Internet.org – connecting up the next three billion people yet to join the internet. But it isn’t the “internet” as we know it today; instead, Internet.org allows users to access only Facebook and select other sites, not the entire internet. In an open letter to Facebook CEO Mark Zuckerberg, 65 organisations from 31 countries criticised the project, claiming it violated the principle of network neutrality, that no site should be favoured over others. Security, privacy, censorship, and freedom of expression were among the other concerns voiced over Facebook’s growing control.

    Can we really put our faith in Facebook’s drones? It is possible to overthrow a government and depose a dictator, but it is nearly impossible to revolt against corporate drones and extraterritorial CEOs.
    https://theconversation.com/who-reall...onversationedu+%28The+Conversation%29
    Voting 0


Page 1 of 5 - Online Bookmarks of M. Fioretti: Tags: facebook + google

Propulsed by SemanticScuttle