mfioretti: facebook* + privacy*

Bookmarks on this page are managed by an admin user.

147 bookmark(s)

  1. Where is Facebook located? Well, if you're the taxman, Facebook's global HQ is a tiny shed somewhere in Ireland, where Facebook can escape virtually all taxation; but on the other hand, if you're the EU, Facebook is headquartered in America, where the General Data Protection Regulation doesn't apply.

    It's a remarkably ballsy bit of legal fictioneering: Facebook has spent a decade solemnly swearing that it is a European company, able to take advantage of Ireland's lawless tax-havens. But European companies have to comply with the most stringent privacy rules in the world, so Facebook is claiming to straddle multiple jurisdictions, being European for tax purposes and American for privacy purposes.

    To accomplish this fiction, Facebook is making 1.5 billion users click through a new EULA that says, "By clicking I Agree, I acknowledge that I am a user of Facebook, USA's services, and have no connection with those filthy, privacy-respecting Europeans." And voila, with the click of a mouse, the solemn, decade-long arrangement by which Facebook has claimed that its users were inextricably, utterly connected to Ireland is severed like the frayed thread it always was.

    This is the third leg of the Big Tech/jurisdiction question. Telegram can outmaneuver Russia by hiding behind giant US cloud companies; the US bases of the cloud giants mean that the US exports all its worst laws to the whole world -- and Facebook's ridiculous games with territorial fictions show us how fragile the whole thing is, because companies are not US-based or EU-based under late-stage capitalism: they are free-floating entities that exist everywhere and nowhere at once, depending on which fiction suits them best.

    Don't miss this kicker: "Facebook said the latest change does not have tax implications."
    https://boingboing.net/2018/04/19/schroedingers-zuck.html
    Voting 0
  2. “I believe it’s important to tell people exactly how the information that they share on Facebook is going to be used.

    “That’s why, every single time you go to share something on Facebook, whether it’s a photo in Facebook, or a message, every single time, there’s a control right there about who you’re going to be sharing it with ... and you can change that and control that in line.

    “To your broader point about the privacy policy ... long privacy policies are very confusing. And if you make it long and spell out all the detail, then you’re probably going to reduce the per cent of people who read it and make it accessible to them.”
    https://www.theguardian.com/technolog...testimony-to-congress-the-key-moments
    Voting 0
  3. Should there be regulation?
    Yes. On privacy disclosure, and prohibiting the most draconian uses of user data. It should not be possible for users to give those rights up in exchange for use of a social system like Facebook. The idea is similar to the law in California that says that most non-competes are not enforceable. The benefit you receive has to be roughly equivalent to the data you give up.
    What about Google, Apple, Amazon?
    This is the really important stuff.
    This affair should get users, government and the press to look at other tech companies whose business models are based on getting users to disclose ever-more-intimate information. Here are some examples.
    Google, through Android, knows every place you go. They use that data. Do they sell it? I don't know, but I'm pretty sure you can use it to target ads. Apple, through the iPhone, also knows where you go.
    Apps on Android or iPhones can be told where you go. Many of them are only useful if you let them have that info. Apps can also have all your pictures and contacts. Face recognition makes it possible to construct a social graph without any access to the Facebook API.
    Google and Apple can listen to all your phone calls.
    Google, through their Chrome browser, knows everywhere you go on the web, and everything you type into the browser.
    Amazon Echo and Google Home are always listening. Imagine a leak based on conversations at home, phone calls, personal habits, arguments you have with your spouse or kids, any illegal activities that might be going on in your home.
    If you have a Gmail account, Google reads your mail, and targets ads at you based on what you're writing about. They also read the email that people send to you, people who may not themselves be Gmail users. Some examples of how creepy this can be: they seem to know what my investments are, which I assume they figured out through email. Recently they told me when a friend's flight to NYC was arriving. I don't know how they made this connection; I assume it was through email.
    Amazon, of course, knows everything you buy through Amazon.
    Google knows everything you search for.
    And on and on. We've reconstructed our whole society around companies having all the data about us that they want. It's kind of funny that we're all freaking out about Cambridge Analytica and Facebook. The problem is so much bigger.
    Summary
    It seems like a non-event to me. The press knew all about the API going back to 2012. That they didn't foresee the problem then is a result of the press accepting the hype of big tech companies on their terms, and not trying to find out what the implications of technology are from non-partisan experts. This was a story that could have and should have been written in 2010, warning users of a hidden cost of Facebook.
    Today's scandal, the equivalent of the one in 2010, is that Google is attempting to turn the web into a corporate platform. Once they control the web as Facebook controls the social graph, we'll have another impossibly huge problem to deal with. Better to head this one off with regulation, now, when it can do some good.
    http://scripting.com/2018/04/11/140429.html
    Voting 0
  4. These users have invested time and money in building their social networks on Facebook, yet they have no means to port the connectivity elsewhere. Whenever a serious competitor to Facebook has arisen, the company has quickly copied it (Snapchat) or purchased it (WhatsApp, Instagram), often at a mind-boggling price that only a behemoth with massive cash reserves could afford. Nor do people have any means to completely stop being tracked by Facebook. The surveillance follows them not just on the platform, but elsewhere on the internet—some of them apparently can’t even text their friends without Facebook trying to snoop in on the conversation. Facebook doesn’t just collect data itself; it has purchased external data from data brokers; it creates “shadow profiles” of nonusers and is now attempting to match offline data to its online profiles.

    Again, this isn’t a community; this is a regime of one-sided, highly profitable surveillance, carried out on a scale that has made Facebook one of the largest companies in the world by market capitalization.

    There is no other way to interpret Facebook’s privacy-invading moves over the years—even if it’s time to simplify! finally!—as anything other than decisions driven by a combination of self-serving impulses: namely, profit motives, the structural incentives inherent to the company’s business model, and the one-sided ideology of its founders and some executives. All these are forces over which the users themselves have little input, aside from the regular opportunity to grouse through repeated scandals.

    And even the ideology—a vague philosophy that purports to prize openness and connectivity with little to say about privacy and other values—is one that does not seem to apply to people who run Facebook or work for it. Zuckerberg buys houses surrounding his and tapes over his computer’s camera to preserve his own privacy, and company employees were up in arms when a controversial internal memo that made an argument for growth at all costs was recently leaked to the press—a nonconsensual, surprising, and uncomfortable disclosure of the kind that Facebook has routinely imposed upon its billions of users over the years.

    This isn’t to say Facebook doesn’t provide real value to its users, even as it locks them in through network effects and by crushing, buying, and copying its competition. I wrote a whole book in which I document, among other things, how useful Facebook has been to anticensorship efforts around the world. It doesn’t even mean that Facebook executives ...
    https://www.wired.com/story/why-zucke...nt-fixed-facebook?mbid=social_twitter
    Voting 0
  5. Again: so where, exactly, is the scandal of these past few days? The scandal lies in the evidence of a fundamental error in the conception of human interaction, the conception that Mark Zuckerberg has imposed since 2007 (by his own admission, in his long-awaited post-Cambridge Analytica statement): the idea of building a "web that is social by default," where sharing is the norm. That principle is structurally opposed to the protection of individual privacy, which rests on confidentiality of one's personal data as the norm.

    Zuckerberg explains it very well in his most recent statement, correctly identifying that philosophical and anthropological error as the root of the storm he is now forced to navigate: "In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends' birthdays, your maps should show where your friends live, your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them."

    This is what led Kogan, in 2013, to obtain access to the data of millions of people. And yes, that data has immense scientific value, and it is right that research should be able to access it, provided it is conducted with full respect for the informed consent of the users who become experimental subjects. But for academic purposes only. And even then, the famous experiment conducted by Facebook itself in 2014, manipulating the emotions of hundreds of thousands of users who were deliberately shown more positive or more negative content, had already demonstrated that even when no commercial aims are involved the question is ambiguous and complex. And that no, accepting convoluted terms of use that nobody reads is not enough to claim that every user, by the mere fact of having agreed to be on Facebook, has consented to becoming, indiscriminately, a lab rat enrolled in experiments they know nothing about.

    And yet it was the platform itself that realized, in that very same year, that things could not go on like this: that in this way Facebook was losing control over which third parties had access to its users' data. So the policy changed, and from then on "friends" had to consent to the processing of their data by an app. The new philosophy, Albright recalls, was "people first." But it was too late. And the inability to truly regain possession of that mass of information, demonstrated by the Cambridge Analytica case (how can Facebook have to learn from the newspapers that the firm had not deleted the data it claimed to have deleted, or have to conduct a serious audit now to verify it, showing it has no idea whether the data is gone or not?), makes it clear that the problem goes far beyond this single case: it is systemic.

    To put it more plainly: as Albright writes, the first version of the Facebook Graph API (v1.0), that is, what application developers could obtain from the social network between 2010, when it launched, and 2014, when the policy changed, allowed an app to collect the following data not just about the people who installed it, but about their unwitting friends: "about, actions, activities, birthday, check-ins, education, events, games, groups, hometown, interests, likes, location, notes, statuses, tags, photos, questions, relationships, religion/politics, subscriptions, websites, work history" (a rough sketch of what such a request looked like follows at the end of this excerpt). Could anyone really believe it was possible to control where all that data ended up, for millions upon millions of people?

    And does Facebook really discover this only today? Back in 2011, the US Federal Trade Commission had already flagged the issue as problematic. It taught them nothing.
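
    The sketch below is not from the article; it only illustrates, under assumptions, roughly what a pre-2014 Graph API v1.0 friends query of the kind described above could have looked like. The access token, exact field list, and permissions are hypothetical; this style of friends' data access was withdrawn with Graph API v2.0 in 2014.

        # Illustrative sketch only: a Graph API v1.0-era call reading data about the
        # friends of a single consenting app user. Token and field list are hypothetical.
        import requests

        ACCESS_TOKEN = "USER_TOKEN_GRANTED_TO_ONE_APP_USER"  # hypothetical token

        # Fields an app of that era could request about the user's friends,
        # who themselves never saw a consent screen.
        FRIEND_FIELDS = ("id,name,birthday,hometown,education,interests,likes,"
                         "location,relationship_status,religion,work")

        resp = requests.get(
            "https://graph.facebook.com/v1.0/me/friends",
            params={"fields": FRIEND_FIELDS, "access_token": ACCESS_TOKEN},
            timeout=10,
        )
        for friend in resp.json().get("data", []):
            print(friend.get("name"), friend.get("birthday"))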
    https://www.valigiablu.it/facebook-cambridge-analytica-scandalo
    Voting 0
  6. Mark Zuckerberg also launched Facebook with a disdain for intrusive advertising, but it wasn’t long before the social network giant became Google’s biggest competitor for ad dollars. After going public with 845 million users in 2012, Facebook became a multibillion-dollar company and Zuckerberg one of the richest men on Earth, but with only a promise that the company would figure out how to monetize its platform.

    Facebook ultimately sold companies on its platform by promising “brand awareness” and the best possible data on what consumers actually liked. Brands could start their own Facebook pages, which people would actually “like” and interact with. This provided unparalleled information about which companies each individual person most wanted to interact with. By engaging with companies on Facebook, people gave corporate marketing departments more information than they could have ever dreamed of buying, but here it was offered up for free.

    This was the “grand bargain,” as Columbia University law professor Tim Wu called it in his book, The Attention Merchants, that users struck with corporations. Wu wrote that Facebook’s “billions of users worldwide were simply handing over a treasure trove of detailed demographic data and exposing themselves to highly targeted advertising in return for what, exactly?”

    In other words: We will give you every detail of our lives and you will get rich by selling that information to advertisers.

    European regulators are now saying that bargain was a bad deal. The big question that remains is whether their counterparts in the U.S. will follow their lead.
    https://www.huffingtonpost.com/entry/...antitrust_us_5a625023e4b0dc592a088f6c
    Voting 0
  7. Facebook's ability to figure out the "people we might know" is sometimes eerie. Many a Facebook user has been creeped out when a one-time Tinder date or an ex-boss from 10 years ago suddenly pops up as a friend recommendation. How does the big blue giant know?

    While some of these incredibly accurate friend suggestions are amusing, others are alarming, such as this story from Lisa*, a psychiatrist who is an infrequent Facebook user, mostly signing in to RSVP for events. Last summer, she noticed that the social network had started recommending her patients as friends—and she had no idea why.

    "I haven't shared my email or phone contacts with Facebook," she told me over the phone.

    The next week, things got weirder.

    Most of her patients are senior citizens or people with serious health or developmental issues, but she has one outlier: a 30-something snowboarder. Usually, Facebook would recommend he friend people his own age, who snowboard and jump out of planes. But Lisa told me that he had started seeing older and infirm people, such as a 70-year-old
    https://splinternews.com/facebook-rec...s-psychiatrists-patients-f-1793861472
    by M. Fioretti (2018-01-28)
    Voting 0
  8. Europe has pushed ahead of the United States when it comes to constraining the abuses of Big Tech. In June, the European Union fined Google $2.7 billion for steering web users to its shopping site, and investigations remain active over similar treatment on Android phones. European regulators fined Facebook for lying about whether it could match user profiles with phone numbers on its messaging acquisition WhatsApp. They demanded Apple repay $15.3 billion in back taxes in Ireland. And they forced Amazon to change its e-book contracts, which they claimed inappropriately squeezed publishers.
    (Photo caption: Trust-Busted: In 2002, Microsoft Chairman Bill Gates had to testify at federal court in his company's antitrust case. The public trial led Microsoft to soften its aggressive strategy against rivals. AP Photo/Rick Bowmer)

    Unfortunately, these actions were treated mainly as the cost of doing business. The Facebook fine totaled not even 1 percent of the $22 billion purchase price for WhatsApp, and it allowed the two companies to remain partnered. Government policy, in effect, has “told these companies that the smart thing to do is to lie to us and break the law,” said Scott Galloway in his presentation. Google’s remedy in the shopping case still forces rivals to bid for placement at the top of the page, with Google Shopping spun off as a stand-alone competitor. This does weaken Google’s power and solves the “equal treatment” problem, but it doesn’t protect consumers, who will ultimately pay for those costly bids. “The EU got a $2.7 billion fine to hold a party and bail out Greek banks,” said Gary Reback, an antitrust lawyer and critic of the EU’s actions. “No amount of money will make a difference.”

    However, one thing might: Europe’s increasing move toward data privacy. The General Data Protection Regulation (GDPR), scheduled for implementation in May 2018, empowers European web users to affirmatively opt out of having their data collected, with high penalties for non-compliance. Consumers will be able to obtain their personal data and learn how it is used. They can request that their data be erased completely (known as the “right to be forgotten”) as well as prohibited from sale to third parties. Platforms could not condition use of their products on data collection. A separate, not-yet-finalized regulation called ePrivacy would forbid platforms from tracking users across separate apps, websites, and devices.
    http://prospect.org/article/big-tech-new-predatory-capitalism
    Voting 0
  9. Since the earliest days of Facebook, social scientists have sent up warnings saying that the ability to maintain separate "contexts" (where you reveal different aspects of yourself to different people) was key to creating and maintaining meaningful relationships, but Mark Zuckerberg ignored this advice, insisting that everyone be identified only by their real names and present a single identity to everyone in their lives, because anything else was "two-faced."

    Zuck was following in the footsteps of other social network entrepreneurs who attempted to impose their own theories of social interaction on mass audiences -- danah boyd has written and presented extensively on the user rebellions of Friendster from people who wanted to form interest-based affinity groups and use pseudonymous identities for different activities, which Friendster rejected out of a mix of commercial concerns (it wanted users to arrange their social affairs to make it easier to monetize them) and fringe theories of social interaction.

    But while all the other social networks collapsed, Facebook thrived, and imposed the Zuckerberg model of "one identity, one context" on billions of users, who, research consistently finds, are made unhappy and angry by their use of the service, but are nevertheless psychologically compelled to continue using it, creating a vicious feedback loop that even Zuck has acknowledged as a risk to his business.

    In 2008, I found myself speaking with the big boss himself, Facebook CEO Mark Zuckerberg. I was in the second year of my Ph.D. research on Facebook at Curtin University. And I had questions.

    Why did Facebook make everyone be the same for all of their contacts? Was Facebook going to add features that would make managing this easier?

    To my surprise, Zuckerberg told me that he had designed the site to be that way on purpose. And, he added, it was "lying" to behave differently in different social situations.

    Up until this point, I had assumed Facebook's socially awkward design was unintentional. It was simply the result of computer nerds designing for the rest of humanity, without realising it was not how people actually want to interact.

    The realisation that Facebook's context collapse was intentional not only changed the whole direction of my research but provides the key to understanding why Facebook may not be so great for your mental health.

    "The secret history of Facebook depression", by Dr Kate Raynes-Goldie (Phys.org)
    https://phys.org/news/2018-01-secret-history-facebook-depression.html
    Voting 0
  10. "Continueremo a lavorare con le autorità francesi per garantire che gli utenti comprendano quali informazioni vengono raccolte e come vengono utilizzate", ha affermato WhatsApp in una dichiarazione inviata per posta elettronica. "Ci impegniamo a risolvere le diverse e talvolta contraddittorie preoccupazioni che hanno sollevato le autorità per la protezione dei dati, con un approccio comune a livello europeo prima che nuove norme sulla protezione dei dati a livello di blocco entrino in vigore nel maggio 2018".

    Data transfers from WhatsApp to Facebook take place in part without the user's consent, the French authority reiterated, also rejecting WhatsApp's argument that the company is subject only to United States law. The French move is "a formal notice, not a sanction," but the messaging giant could face fines at a later stage.
    http://www.repubblica.it/tecnologia/2...ncia_il_garante_fb_whatsapp-184580045
    Voting 0
