mfioretti: google* + percloud*

Bookmarks on this page are managed by an admin user.

17 bookmark(s)

  1. Earlier this month, writer James Bridle published an in-depth look at the underbelly of creepy, violent content targeted at kids on YouTube – from knock-off Peppa Pig cartoons, such as one where a trip to the dentist morphs into a graphic torture scene, to live-action “gross-out” videos, which show real kids vomiting and in pain.

    These videos are being produced and added to YouTube by the thousand, then tagged with what Bridle calls “keyword salad” – long lists of popular search terms packed into their titles. These keywords are designed to game or manipulate the algorithm that sorts, ranks and selects content for users to see. And thanks to a business model aimed at maximising views (and therefore ad revenue), these videos are being auto-played and promoted to kids based on their “similarity” – at least in terms of keywords used – to content that the kids have already seen. That means a child might start out watching a normal Peppa Pig episode on the official channel, finish it, then be automatically immersed in a dark, violent and unauthorised episode – without their parent realising it.
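    Purely as an illustration of the mechanism Bridle describes — not YouTube's actual algorithm, which is unpublished — a naive recommender that ranks candidate videos by keyword overlap with the last video watched shows why "keyword salad" titles get surfaced:

```python
# Illustrative sketch only: a naive recommender that ranks videos by
# keyword overlap with the video just watched. This is NOT YouTube's
# algorithm; it just shows why titles stuffed with popular search
# terms ("keyword salad") win a similarity-by-keywords ranking.

def keywords(title):
    """Lowercase a title and split it into a set of words."""
    return set(title.lower().split())

def rank_next(just_watched, candidates):
    """Order candidate titles by how many keywords they share."""
    seen = keywords(just_watched)
    return sorted(candidates,
                  key=lambda t: len(seen & keywords(t)),
                  reverse=True)

watched = "Peppa Pig Official Episode Dentist Visit"
candidates = [
    "Cooking With Grandma",
    # A "keyword salad" title packs in every popular search term:
    "Peppa Pig Dentist Visit Episode Official Scary Learn Colors Surprise",
    "Peppa Pig Compilation",
]
print(rank_next(watched, candidates)[0])
# the stuffed title ranks first because it shares the most keywords
```

    The knock-off episode need not resemble the official one at all; sharing title keywords is enough to be queued as "similar".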

    YouTube’s response to the problem has been to hand responsibility to its users, asking them to flag videos as inappropriate. From there, the videos go to a review team that YouTube says comprises thousands of people working 24 hours a day to review content. If the content is found to be inappropriate for children, it will be age-restricted and will not appear in the YouTube Kids app. It will still appear on YouTube proper, however, where users must officially be at least 13 years old but which, in reality, countless kids still use (just think how often antsy kids are handed a phone or tablet to keep them occupied in a public space).

    Like Facebook’s scheme, this approach has several flaws: since it’s trying to ferret out inappropriate videos from kids’ content, it’s likely that most of the people who will encounter these videos are kids themselves. I don’t expect a lot of six-year-olds to become aggressive content moderators any time soon. And if the content is flagged, it still needs to be reviewed by humans, which, as YouTube has already acknowledged, takes “round the clock” monitoring.

    When we talk about this kind of challenge, the tech companies’ response is often that it’s simply the inevitability of scale – there’s no way to serve billions of users endless streams of engaging content without getting it wrong or allowing abuse to slip by some of the time. But of course, these companies don’t have to do any of this. Auto-playing an endless stream of algorithmically selected videos to kids isn’t some sort of mandate. The internet didn’t have to become a smorgasbord of “suggested content”. It’s a choice that YouTube made, because ad views are ad views. You’ve got to break a few eggs to make an omelette, and you’ve got to traumatise a few kids to build a global behemoth worth $600bn.

    And that’s the issue: in their unblinking pursuit of growth over the past decade, these companies have built their platforms around features that aren’t just vulnerable to abuse, but literally optimised for it. Take a system that’s easy to game, profitable to misuse, intertwined with our vulnerable people and our most intimate moments, and operating at a scale that’s impossible to control or even monitor, and this is what you get.

    The question now is, when will we force tech companies to reckon with what they’ve wrought? We’ve long decided that we won’t let companies sell cigarettes to children or put asbestos into their building materials. If we want, we can decide that there are limits to what tech can do to “engage” us, too, rather than watching these platforms spin further and further away from the utopian dreams they were sold to us on.
    https://www.theguardian.com/technolog...ra-wachter-boettcher?CMP=share_btn_tw
    Voting 0
  2. In 2007 Google acquired DoubleClick, a company that collected web-browsing data, promising that it would never cross-reference those results with the personal information it held through the use of its own services. Nearly ten years later, however, it has updated its Google account terms to say that it may now perform exactly that cross-referencing. The document now reads: "Depending on your account settings, your activity on other sites and apps may be associated with your personal information in order to improve Google's services and the ads delivered by Google." The change in settings must be approved, and indeed Google specifically asks you, once you sign in to your account via a web browser, to accept the new terms. Users may keep their current settings and continue using Google's services as before, while for new accounts the new options are enabled by default. Under the new terms, if accepted, Google may combine the browsing data gathered through its analytics and tracking services with the information it already holds in the user's profile. All of this will allow the Mountain View company to assemble a complete portrait of each user, built from their personal data, what they write in their emails, the websites they visit and the searches they run, definitively demolishing the principle of anonymity in web tracking.
    http://www.saggiamente.com/2016/10/ad...ource=twitter.com&utm_campaign=buffer
    Voting 0
  3. It's undeniable that companies like Google and Facebook have made the web much easier to use and helped bring billions online. They've provided a forum for people to connect and share information, and they've had a huge impact on human rights and civil liberties. There are many things for which we should applaud them.

    But their scale is also concerning. For example, the Chinese messaging service WeChat (which is somewhat like Twitter) recently used its popularity to limit market choice. The company banned access to Uber to drive more business to its own ride-hailing service. Meanwhile, Facebook engineered limited web access in developing economies with its Free Basics service. Touted in India and other emerging markets as a solution to help underserved citizens come online, Free Basics allows viewers access to only a handful of pre-approved websites (including, of course, Facebook). India recently banned Free Basics and similar services, claiming that these restricted web offerings violated the essential rules of net neutrality.
    Algorithmic oversight

    Beyond market control, the algorithms powering these platforms can wade into murky waters. According to a recent study from the American Institute for Behavioral Research and Technology, the information displayed by Google could shift the voting preferences of undecided voters by 20 percent or more -- all without their knowledge. Considering how narrow the results of many elections can be, this margin is significant. In many ways, Google controls what information people see, and any bias, intentional or not, has a potential impact on society.

    In the future, data and algorithms will power even more grave decisions. For example, code will decide whether a self-driving car stops for an oncoming bus or runs into pedestrians.

    It's possible that we're reaching the point where we need oversight for consumer-facing
    http://buytaert.net/can-we-save-the-o...ource=twitter.com&utm_campaign=buffer
    Voting 0
  4. When I switched from using a BlackBerry to an Android phone a few years ago, it really irked me that the only way to keep my contacts info on the phone was to also let Google sync them into its cloud. This may not be true universally (I think some Samsung phones will let you store contacts on the SD card), but it was true for the phone I was using then and is true on the Nexus 4 I'm using now. It took a lot of painful digging through the Android source and googling, but I eventually wrote a bunch of code to get around this.

    I've been meaning to put up the code and post this for a while, but kept procrastinating because the code wasn't generic/pretty enough to publish. It still isn't but it's better to post it anyway in case somebody finds it useful, so that's what I'm doing.

    In a nutshell, what I wrote is an Android app that includes (a) an account authenticator, (b) a contacts sync adapter and (c) a calendar sync adapter. On a stock Android phone this will allow you to create an "account" on the device and add contacts/calendar entries to it.

    Note that I wrote this to interface with the way I already have my data stored, so the account creation process actually tries to validate the entered credentials against a webhost, and the contacts sync adapter is a working one-way sync adapter that downloads contact info from a remote server in vCard format and updates the local database. The calendar sync adapter, though, is just a dummy. You're encouraged to rip out the parts that you don't want and use the rest as you see fit. It's mostly meant to be a working example of how this can be accomplished.

    The net effect is that you can store contacts and calendar entries on the device so they don't get synced to Google, but you can still use the built-in contacts and calendar apps to manipulate them. This benefits from much better integration with the rest of the OS than if you were to use a third-party contacts or calendar app.
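    The author's actual code is an Android/Java sync adapter, but the core of the one-way sync it describes — fetch vCard text from a server you control and upsert each contact into a local store — can be sketched in a few lines. This is an illustrative simplification (the field handling is minimal and the in-memory store is hypothetical); a real Android sync adapter would write through the ContactsContract provider instead:

```python
# Minimal sketch of a one-way contacts sync in the spirit described
# above: parse server-supplied vCard text and upsert each contact
# into a local store keyed by UID. The server always wins; nothing
# is pushed back. Simplified — real vCard parsing needs a library.

def parse_vcards(text):
    """Yield one {FIELD: value} dict per BEGIN:VCARD/END:VCARD block."""
    card = None
    for line in text.splitlines():
        line = line.strip()
        if line == "BEGIN:VCARD":
            card = {}
        elif line == "END:VCARD":
            yield card
            card = None
        elif card is not None and ":" in line:
            key, _, value = line.partition(":")
            card[key] = value

def sync(local_store, vcard_text):
    """One-way sync: new UIDs are added, existing UIDs are overwritten."""
    for card in parse_vcards(vcard_text):
        local_store[card["UID"]] = card
    return local_store

feed = """BEGIN:VCARD
UID:42
FN:Ada Lovelace
TEL:+44 20 7946 0000
END:VCARD"""
store = sync({}, feed)
print(store["42"]["FN"])   # Ada Lovelace
```

    Running the same sync again is idempotent, which is what makes "server wins" a safe policy for a download-only adapter like the one described.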
    https://staktrace.com/spout/entry.php?id=827
    Voting 0
  5. Google routinely uses software to scan the contents of e-mails, including images, to feed its advertising and to identify malware. But many may not have been aware that the company is also scanning users' accounts looking for illegal activity -- namely, matching images in e-mails against its known database of illegal and pornographic images of children.

    That bit of Google policy came to light last week, when a Houston man was arrested on charges of having and promoting child pornography after Google told the National Center for Missing and Exploited Children that he had the images in his Gmail account. The tipoff, according to a report from Houston television channel KHOU, led to the man's arrest.

    While it's hard to argue with the outcome of this particular case, the news did raise some alarm bells among researchers at the security firm Sophos, who questioned whether Google was stepping outside its place as a company and into the role of a pseudo law enforcement agency.

    Chester Wisniewski, a senior security researcher at Sophos, said that Google's "proactive" decision to tip off law enforcement makes "some of us wonder if they're crossing the line."
    http://www.washingtonpost.com/blogs/t...e-really-reading-your-e-mail/?hpid=z4
    Voting 0
  6. Fish gotta fly. Birds gotta swim. Google has to buy smaller companies and change their products’ privacy policies so they can now share data with Google. It’s just the natural order of things, and now it’s happening with Nest.

    Nest co-founder Matt Rogers posted to Nest’s official blog late Monday night to announce the company’s Nest Developer Program, a set of developer tools that will allow other products to securely integrate with Nest. Your Nest could trigger actions, like turning your lights on when you get home, and other products can also control the Nest. Your Jawbone UP24, for example, can alert your Nest thermostat to start heating your house when it detects that you’ve woken up.
    http://www.techhive.com/article/23669...ly-sharing-your-data-with-google.html
    Voting 0
  7. Psychologist Robert Epstein has been researching how much influence search engines have on voting behavior, and says he is alarmed at what he has discovered. His most recent experiment, whose findings were released Monday, found that search engines have the potential to profoundly influence voters without them noticing the impact ... Epstein, former editor-in-chief of Psychology Today and a vocal critic of Google, has not produced evidence that this or any other search engine has intentionally deployed this power. But the new experiment builds on his earlier work by measuring SEME (the Search Engine Manipulation Effect) in the concrete setting of India's national election, whose voting concludes Monday.
    http://www.washingtonpost.com/blogs/t...rch-results-can-influence-an-election
    Voting 0
  8. Mr. Schmidt’s open letter to Europe shows evidence of such absolutism. Democratic oversight is characterized as “heavy-handed regulation.” The “Internet”, the “Web”, and “Google” are referenced interchangeably, as if Google’s interests stood for the entire Web and Internet. That’s a magician’s sleight of hand intended to distract from the real issue. Google’s absolutist pursuit of its interests is now regarded by many as responsible for the Web’s fading prospects as an open information platform in which participants can agree on rules, rights, and choice.

    Schmidt warns that were the E.U. to oppose Google’s practices, Europe risks becoming “an innovation desert.” Just the opposite is more likely true. Thanks in part to Google’s exquisite genius in the science of surveillance, the audacity with which it has expropriated users’ rights to privacy, and the aggressive tactics of the NSA, people are losing trust in the entire digital medium. It is this loss of trust that stands to kill innovation. To make some sense of our predicament, let’s take a fresh look at how we got here, the nature of the threats we face, and the stakes for the future.
    http://www.faz.net/aktuell/feuilleton...-12916679.html?printPagedArticle=true
    Voting 0
  9. Search systems have four distinct components: Crawler, Index, Search and Rank, and GUI. We could and should build a public infrastructure in which the first two components are shared and, on top of the indexed Web, open interfaces to various Search-and-Rank algorithms and user interfaces are provided (Rieder 2008). There are different ways this could be done. One is through the existing grid systems used in academia; such a system is already distributed, staffed with highly skilled people and, like the rest of the Web, mostly built with Free Software. Another option is to internationalize Google. A worldwide public organization could demand that the USA break the Google search system away from the rest of the company, release all knowledge about how it operates (its technical documentation) into the commons, and turn it into a separate, globally owned company. Democratic ownership would also ensure accountability in the handling of user data, something Google arrogantly refuses to provide. The form of such global ownership, the model for this new management of the commons, remains an issue to solve. Google uses Free Software to exploit the commons (the Web) as its core profit stream. Yet neither belongs to any single nation.
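    The proposed separation — shared Crawler and Index, pluggable Search-and-Rank — can be made concrete with a toy sketch (illustrative only; all function names here are hypothetical, not from any real search system): a common inverted index exposes one open interface that competing rank functions plug into.

```python
# Toy sketch of the four-component split described above: a shared
# inverted index (the crawl is faked as a dict of pages) plus an
# open interface accepting any pluggable rank function.

from collections import defaultdict

def build_index(pages):
    """Shared component: map each word to the set of page ids containing it."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word].add(page_id)
    return index

def search(index, query, rank_fn):
    """Open interface: any rank_fn(page_ids, query) can be plugged in."""
    hits = set.intersection(*(index.get(w, set()) for w in query.lower().split()))
    return rank_fn(hits, query)

# Two interchangeable Search-and-Rank implementations.
alphabetical = lambda hits, q: sorted(hits)
reverse_rank = lambda hits, q: sorted(hits, reverse=True)

pages = {"a": "open web commons", "b": "web search engine", "c": "open search"}
idx = build_index(pages)
print(search(idx, "open", alphabetical))   # ['a', 'c']
print(search(idx, "open", reverse_rank))   # ['c', 'a']
```

    The point of the sketch is that the expensive, shared parts (crawl and index) are built once, while ranking — the politically contested part — stays open to competing, auditable implementations.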

    Hence, the solution for how to manage it should not belong to any single nation’s economic and legal system – regardless of where the Google corporation, or any other entity exploiting the commons for profit, is legally based. Indeed, in the discussion of the patenting of biological material, the question of disclosing the origin of the material in a patent application is one of the key political issues (Howard 2008). When a seed of Brazilian or Indian origin is to be patented, mandating disclosure of its origin in the application can be used to prevent bio-piracy, by more developed economies, of biological material originating in less developed countries. In a similar way, who gives Google the right to exploit what is common to the world, the Web, for private profit and without global accountability?

    Why would we allow Google to be subject to the laws of any single state? The French state’s attempt to control what Google does within its web-territory makes the tension between the commons, for-profit organizations and the state visible.
    http://p2pfoundation.net/Google#Discu..._What.27s_Wrong_with_Google_Search.3F
    Voting 0
  10. Google is in hot water for scanning millions of students' email messages and allegedly building "surreptitious" profiles to target advertising at them.

    According to Education Week, a "potentially explosive" lawsuit is wending its way through US federal court, now being heard in the US District Court for the Northern District of California.

    In court filings, plaintiffs charge that Google data-mines Gmail users - a group that includes students who use the company's Apps for Education tool suite.

    Of course it's not news that Google reads the emails in its users' inboxes. The consumer Gmail product is free and pays its way with targeted advertising. The targeting is done by building up profiles of users' interests based on the content of their email.

    The situation is a little muddier when it comes to Apps for Education.

    Apps for Education is used by K-12 schools and institutions of higher education throughout the world for free online applications such as email, calendar, word processing, spreadsheet and collaborative document sharing.

    Google admitted to Education Week that it automatically "scans and indexes" the email of Apps for Education users even though ads are off by default.
    http://nakedsecurity.sophos.com/2014/...e-sued-for-data-mining-students-email
    Voting 0


Online Bookmarks of M. Fioretti: Tags: google + percloud

About - Propulsed by SemanticScuttle