mfioretti: data ownership* + facebook*

Bookmarks on this page are managed by an admin user.

68 bookmarks, sorted by date (descending).

  1. "We will continue working with the French authorities to ensure that users understand what information is collected and how it is used," WhatsApp said in an emailed statement. "We are committed to resolving the different, and sometimes contradictory, concerns raised by the data protection authorities, through a common European approach before the new EU-wide data protection rules take effect in May 2018."

    Data transfers from WhatsApp to Facebook take place partly without user consent, the French authority reiterated, also rejecting WhatsApp's argument that the company is subject only to US law. The French warning is "a formal notice, not a sanction", but the messaging giant could face fines at a later stage.
    http://www.repubblica.it/tecnologia/2...ncia_il_garante_fb_whatsapp-184580045
    Voting 0
  2. How is it possible to generate 18 billion in revenue with 20,000 employees?

    For example, a photo provides the place and time of both the upload and the shot, as well as the make of the smartphone. A gold mine of data that Facebook – acting as an advertising agency – resells to its advertiser clients. "If I upload a photo of my cat, I will most likely see ads for pet food. But if I upload the photo at 4 a.m., I might be placed in the night-owl segment and receive ads for insomnia products," explains Casilli.
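    Casilli's point can be made concrete with a toy sketch: photo metadata (upload time, tags) is mapped to ad segments by simple rules. The rules and segment names below are invented for illustration; they are not Facebook's actual ones.

```python
from datetime import datetime

def ad_segments(upload_time: datetime, tags: list) -> list:
    """Toy mapping from photo metadata to hypothetical ad segments."""
    segments = []
    if "cat" in tags:
        segments.append("pet-food")      # cat photo -> pet-food ads
    if upload_time.hour < 5:
        segments.append("night-owls")    # 4 a.m. upload -> insomnia-product ads
    return segments

# A cat photo uploaded at 4 a.m. lands in both segments:
print(ad_segments(datetime(2017, 1, 1, 4, 0), ["cat"]))
```

    The real pipeline would of course extract such signals automatically (e.g. from EXIF data) and combine thousands of them, but the principle – metadata in, audience segment out – is the same.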

    In the fourth quarter of 2016, Mark Zuckerberg's company earned $4.83 per user. In 2015 it posted annual revenue of 17.93 billion dollars with about 20,000 permanent employees. How is that possible? Thanks to 1.86 billion invisible workers – that is, all of us who knowingly upload content every day and, without realising it, generate advertising metrics.
    https://www.terrelibere.org/facebook-pagami
    Voting 0
  3. On 28 June 2016 Hill had published an article arguing that Facebook was using users' geolocation to improve its suggestions. Menlo Park initially confirmed this (twice, according to Hill), while specifying that it was only one of the factors determining who appears in the "People You May Know" section, only to backtrack and say that no, absolutely not, geolocation is not used. Which, however, does not explain how the father of a suicidal teenager, after attending a support meeting, found another of the parents present among his suggestions the following day.

    Kashmir Hill, who for some months now has been keeping an eye on the "People You May Know" section, downloading and printing the list of suggestions proposed to her each time (about 160 names a day), has repeatedly tried to get clarification on how the algorithm works. After she reasonably hypothesised that these batches of names came from blocks of information sold by data brokers, a spokesperson replied rather curtly that "Facebook does not use information from data providers for People You May Know". In the meantime she has been collecting accounts from people whose interactions with that section were not entirely happy ones. Hill tells the story of a prostitute, Leia (a pseudonym, of course), who saw some of her own clients appear among the "people she might know". This despite having taken the utmost care on the social network, signing up with an academic email address and never mentioning her profession. Then there is the case of a psychiatrist who saw many of her own patients suggested as possible friends, two of whom in turn appeared in each other's suggestion lists. Or that of a lawyer who says he deleted his account after finding, among the suggested friendships, the name of opposing counsel in one of his cases. The two had corresponded only via their work email addresses, which Facebook could not have obtained through voluntary disclosure, since the lawyer had not consented to sharing his contacts.

    All this is possible because, beyond the enormous amount of data we provide voluntarily, Facebook stores a whole series of other information about us and our contacts, which makes up the so-called "shadow profile". Picture an iceberg whose tip is the data we hand over of our own free will: a gigantic hidden portion remains, which the social network uses not only to decide who the next person we send a friend request to will be, but also to enrich our package of preferences and personal characteristics – a package that will in turn improve ad targeting.

    How is this information collected? According to Hill, the moment you decide to use the "Find Friends" feature you agree to transmit all your contacts to Facebook. Which means: phone numbers (if you use the mobile app), email addresses (on desktop or mobile), and the names and any nicknames of their owners. You have just handed your entire address book to a third party. These contacts are then compared and cross-referenced. If even a single email address or phone number among those you have just handed over appears in the address book of any other user – who must in turn have agreed to share their contacts – you will be suggested to each other as friends. And if your name is saved in the address book of a user who consents to uploading it, your contact details enter the system under your first and last name, whether or not you ever decided to upload your data to Facebook yourself.
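    The matching described above can be sketched in a few lines: each uploaded address book contributes (name, identifier) pairs, identifiers are cross-referenced, and anyone whose book contains a given identifier becomes a suggestion candidate for its owner. This is a minimal illustrative model built only from the article's description, not Facebook's actual implementation.

```python
from collections import defaultdict

def build_shadow_profiles(uploads):
    """uploads: {uploader: [(name, identifier), ...]}, where an identifier
    is a phone number or email address. Each identifier accumulates every
    name it was saved under and every user who uploaded it."""
    profiles = defaultdict(lambda: {"names": set(), "seen_in": set()})
    for uploader, contacts in uploads.items():
        for name, ident in contacts:
            profiles[ident]["names"].add(name)
            profiles[ident]["seen_in"].add(uploader)
    return dict(profiles)

def suggestions_for(ident, profiles):
    """Everyone whose address book contains `ident` becomes a candidate
    'person you may know' for the owner of that identifier."""
    entry = profiles.get(ident)
    return set() if entry is None else entry["seen_in"]
```

    Note that "bob@x.com" gets a profile – with every name he was ever saved under – even if Bob never uploaded anything himself, which is exactly the shadow-profile effect the article describes.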

    Returning to the lawyer mentioned above: how could he have been suggested as a friend to another lawyer with whom he had corresponded only via his work email? The lawyer in question does not even need to share his own contacts. It is enough for any other user to have that email address saved under the lawyer's name and to share it with Facebook. At that point the site, having his first name, last name and email address, can connect him with any other user in possession of that same email. In this case, opposing counsel.

    Shadow profiles are nothing new. The Europe v. Facebook initiative had filed a formal complaint as early as 2011 (listing seven instances in which Facebook, headquartered in Ireland, allegedly violated Irish data protection law), but the issue exploded in June 2013, when a bug hit the social network, exposing the private email addresses and phone numbers of 6 million users. Many of them had not voluntarily shared that information. By downloading their personal file from Facebook, users could see not only the "official" data on their friends list, but also their friends' shadow profiles. The bug, in short, proved the existence of a shadow profile whose existence had been hypothesised for a couple of years.

    Think of all the numbers saved in your address book. Think of all the people you met once and whose contact details you saved, for one reason or another. If you or those people upload your respective info to Facebook, your profiles will be updated. And so the suggestions will become ever more accurate, eventually including any colourful individual who for some reason shares even a single contact with you. That profile will end up forming a sort of history of your address book, with all the information attached to it.

    Abetted by a platform that does not always show all the transparency it preaches, we are convinced that everything Facebook knows about us is the product of direct interactions and our conscious choices. In many cases, unfortunately, that turns out to be a spectacular mistake.

    This is one of those cases. The list of "people you may know" is, in reality, just one possible demonstration of what Facebook may know about you.
    http://thevision.com/scienza/facebook-sa
    Voting 0
  4. The point is that Facebook has a strong, paternalistic view on what’s best for you, and it’s trying to transport you there. “To get people to this point where there’s more openness – that’s a big challenge. But I think we’ll do it,” Zuckerberg has said. He has reason to believe that he will achieve that goal. With its size, Facebook has amassed outsized powers. “In a lot of ways Facebook is more like a government than a traditional company,” Zuckerberg has said. “We have this large community of people, and more than other technology companies we’re really setting policies.”

    Without knowing it, Zuckerberg is the heir to a long political tradition. Over the last 200 years, the west has been unable to shake an abiding fantasy, a dream sequence in which we throw out the bum politicians and replace them with engineers – rule by slide rule. The French were the first to entertain this notion in the bloody, world-churning aftermath of their revolution. A coterie of the country’s most influential philosophers (notably, Henri de Saint-Simon and Auguste Comte) were genuinely torn about the course of the country. They hated all the old ancient bastions of parasitic power – the feudal lords, the priests and the warriors – but they also feared the chaos of the mob. To split the difference, they proposed a form of technocracy – engineers and assorted technicians would rule with beneficent disinterestedness. Engineers would strip the old order of its power, while governing in the spirit of science. They would impose rationality and order.

    This dream has captivated intellectuals ever since, especially Americans. The great sociologist Thorstein Veblen was obsessed with installing engineers in power and, in 1921, wrote a book making his case. His vision briefly became a reality. In the aftermath of the first world war, American elites were aghast at all the irrational impulses unleashed by that conflict – the xenophobia, the racism, the urge to lynch and riot. And when the realities of economic life had grown so complicated, how could politicians possibly manage them? Americans of all persuasions began yearning for the salvific ascendance of the most famous engineer of his time: Herbert Hoover. In 1920, Franklin D Roosevelt – who would, of course, go on to replace him in 1932 – organised a movement to draft Hoover for the presidency.

    The Hoover experiment, in the end, hardly realised the happy fantasies about the Engineer King. A very different version of this dream, however, has come to fruition, in the form of the CEOs of the big tech companies. We’re not ruled by engineers, not yet, but they have become the dominant force in American life – the highest, most influential tier of our elite.

    There’s another way to describe this historical progression. Automation has come in waves. During the industrial revolution, machinery replaced manual workers. At first, machines required human operators. Over time, machines came to function with hardly any human intervention. For centuries, engineers automated physical labour; our new engineering elite has automated thought. They have perfected technologies that take over intellectual processes, that render the brain redundant. Or, as the former Google and Yahoo executive Marissa Mayer once argued, “You have to make words less human and more a piece of the machine.” Indeed, we have begun to outsource our intellectual work to companies that suggest what we should learn, the topics we should consider, and the items we ought to buy. These companies can justify their incursions into our lives with the very arguments that Saint-Simon and Comte articulated: they are supplying us with efficiency; they are imposing order on human life.

    Nobody better articulates the modern faith in engineering’s power to transform society than Zuckerberg. He told a group of software developers, “You know, I’m an engineer, and I think a key part of the engineering mindset is this hope and this belief that you can take any system that’s out there and make it much, much better than it is today. Anything, whether it’s hardware or software, a company, a developer ecosystem – you can take anything and make it much, much better.” The world will improve, if only Zuckerberg’s reason can prevail – and it will.

    The precise source of Facebook’s power is algorithms. That’s a concept repeated dutifully in nearly every story about the tech giants, yet it remains fuzzy at best to users of those sites. From the moment of the algorithm’s invention, it was possible to see its power, its revolutionary potential. The algorithm was developed in order to automate thinking, to remove difficult decisions from the hands of humans, to settle contentious debates.

    The essence of the algorithm is entirely uncomplicated. Textbooks compare algorithms to recipes – a series of precise steps that can be followed mindlessly. This distinguishes them from equations, which have a single correct result. Algorithms merely capture the process for solving a problem and say nothing about where those steps ultimately lead.

    These recipes are the crucial building blocks of software. Programmers can’t simply order a computer to, say, search the internet. They must give the computer a set of specific instructions for accomplishing that task. These instructions must take the messy human activity of looking for information and transpose that into an orderly process that can be expressed in code. First do this … then do that. The process of translation, from concept to procedure to code, is inherently reductive. Complex processes must be subdivided into a series of binary choices. There’s no equation to suggest a dress to wear, but an algorithm could easily be written for that – it will work its way through a series of either/or questions (morning or night, winter or summer, sun or rain), with each choice pushing to the next.
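    The article's dress example – a chain of either/or questions, each choice pushing to the next – can be written out directly. The categories and outputs below are invented purely to illustrate the branching structure.

```python
def suggest_outfit(time_of_day, season, weather):
    """Each binary question narrows the suggestion one step further."""
    # Morning or night?
    if time_of_day == "evening":
        base = "evening dress"
    else:
        base = "day dress"
    # Winter or summer?
    if season == "winter":
        base += ", wool"
    else:
        base += ", cotton"
    # Sun or rain?
    if weather == "rain":
        base += ", with raincoat"
    return base

print(suggest_outfit("evening", "winter", "rain"))
```

    There is indeed no equation for choosing a dress, but this sequence of binary choices is a perfectly ordinary algorithm – which is the article's point about how messy human decisions get transposed into code.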

    Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered, without our even being aware of the hand guiding us, in a superior direction. That’s always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in the grand design.
    https://www.theguardian.com/technolog...r-on-free-will?CMP=Share_iOSApp_Other
    Voting 0
  5. The annual report of the Italian Data Protection Authority makes no reference, in such trite tones, to "alarms"; it instead highlights the major issues at stake, recounting the results achieved by working on rules and safeguards and through dialogue with the industry players, following the guidelines of the EU working groups. If the mainstream narrative must always be one of alarm and misleading headlines, there is no point complaining about disintermediation and the overwhelming power of social media platforms.
    http://www.webnews.it/2017/06/06/garante-privacy-relazione-annuale
    Voting 0
  6. "All of us, when we are uploading something, when we are tagging people, when we are commenting, we are basically working for Facebook," he says.

    The data our interactions provide feeds the complex algorithms that power the social media site, where, as Mr Joler puts it, our behaviour is transformed into a product.

    Trying to untangle that largely hidden process proved to be a mammoth task.

    "We tried to map all the inputs, the fields in which we interact with Facebook, and the outcome," he says.

    "We mapped likes, shares, search, update status, adding photos, friends, names, everything our devices are saying about us, all the permissions we are giving to Facebook via apps, such as phone status, wifi connection and the ability to record audio."

    All of this research provided only a fraction of the full picture. So the team looked into Facebook's acquisitions, and scoured its myriad patent filings.

    The results were astonishing.

    Visually arresting flow charts that take hours to absorb fully show how the data we give Facebook is used to calculate our ethnic affinity (Facebook's term), sexual orientation, political affiliation, social class, travel schedule and much more.
    Image: Share Lab presents its information in minutely detailed tables and flow charts.

    One map shows how everything - from the links we post on Facebook, to the pages we like, to our online behaviour in many other corners of cyber-space that are owned or interact with the company (Instagram, WhatsApp or sites that merely use your Facebook log-in) - could all be entering a giant algorithmic process.

    And that process allows Facebook to target users with terrifying accuracy, with the ability to determine whether they like Korean food, the length of their commute to work, or their baby's age.
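    The "giant algorithmic process" the maps describe amounts to aggregating disparate signals into inferred traits. As a rough sketch – with invented rules, standing in for a pipeline Share Lab needed whole flow charts to depict – inference over a bundle of signals might look like this:

```python
def infer_attributes(signals):
    """Toy sketch: combine a few signals into inferred traits.
    The rules and keys here are hypothetical illustrations."""
    inferred = {}
    likes = signals.get("page_likes", set())
    if "Korean food" in likes:
        inferred["cuisine_affinity"] = "korean"   # taste inferred from likes
    checkins = signals.get("location_history", [])
    if len(checkins) >= 2:
        # First and last regular locations stand in for home and work.
        inferred["commute"] = (checkins[0], checkins[-1])
    return inferred
```

    Each rule is trivial on its own; the article's point is that thousands of such rules, fed by every like, check-in and permission, compound into profiles of unsettling precision.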

    Another map details the permissions many of us willingly give Facebook via its many smartphone apps, including the ability to read all text messages, download files without permission, and access our precise location.

    Individually, these are powerful tools; combined they amount to a data collection engine that, Mr Joler argues, is ripe for exploitation.

    "If you think just about cookies, just about mobile phone permissions, or just about the retention of metadata - each of those things, from the perspective of data analysis, are really intrusive."
    http://www.bbc.com/news/business-39947942
    Voting 0
  7. earlier this month, The Australian uncovered something that felt like a breach in the social contract: a leaked confidential document prepared by Facebook that revealed the company had offered advertisers the opportunity to target 6.4 million younger users, some only 14 years old, during moments of psychological vulnerability, such as when they felt “worthless,” “insecure,” “stressed,” “defeated,” “anxious,” and like a “failure.”

    The 23-page document had been prepared for a potential advertiser and highlighted Facebook’s ability to micro-target ads down to “moments when young people need a confidence boost.” According to The Australian’s report, Facebook had been monitoring posts, photos, interactions, and internet activity in real time to track these emotional lows. (Facebook confirmed the existence of the report, but declined to respond to questions from WIRED about which types of posts were used to discern emotion.)

    The day the story broke, Facebook quickly issued a public statement arguing that the premise of the article was "misleading".
    https://www.wired.com/2017/05/welcome-next-phase-facebook-backlash
    Voting 0
  8. The High Court of Delhi ('the High Court') pronounced, on 23 September 2016, its decision in a public interest litigation case, Karmanya Singh Sareen and Anr v. Union of India and Ors ('the Decision'), concerning WhatsApp Inc.'s new privacy policy, which allows users' data to be shared with Facebook, Inc. for advertising and marketing purposes. The High Court ordered WhatsApp to delete completely from its servers the data of users who requested deletion of their WhatsApp account before 25 September 2016 – the date on which users were asked to agree to the new terms – and to refrain from sharing those users' data with Facebook. It also prohibited WhatsApp from sharing existing users' data dated before 25 September 2016.

    Parul Sharma, Analyst at the Centre for Communication Governance at National Law University of Delhi, told DataGuidance, "In the absence of a privacy law and strong data protection measures it is a strong judgement. The general implication of the case on mobile application providers and internet-based messaging services is dependent on how courts interpret this judgement in the future."

    Although the High Court ordered WhatsApp not to share with Facebook user data that had been collected before WhatsApp changed its privacy policy on 25 September 2016, it emphasised that WhatsApp users had voluntarily agreed to, and are bound by, the new terms of service offered. In addition, WhatsApp's 2012 privacy policy provides that in the event of a merger WhatsApp reserves the right to transfer or assign users' information.

    While the existence of this right was pending before the Supreme Court in K.S. Puttaswamy, multiple courts have since affirmed the constitutional right to privacy.

    Smitha Krishna Prasad and Abhishek Senthilnathan, Associates at Nishith Desai Associates, noted, "There is no statutory framework to govern the functioning of internet-based messaging services like WhatsApp in India. Therefore, the High Court has correctly taken the view that WhatsApp may choose to change the terms and conditions of service and users cannot compel WhatsApp to operate within specific parameters."

    It was argued in the case that the right to privacy guaranteed under Article 21 of the Constitution of India could be a valid ground to prevent WhatsApp sharing data with Facebook. However, the High Court rejected this argument on the basis that the existence of the fundamental right to privacy was yet to be decided in the then-pending case K.S. Puttaswamy and Anr. v. Union of India & Ors. (2015) 8 SCC 735.
    http://www.dataguidance.com/india-high-court-case-whatsapp-strong-judgment
    Voting 0
  9. Facebook, which now owns WhatsApp, is fighting a challenge to the new privacy policy it unveiled last year. Under the new privacy policy, WhatsApp can share some user data with Facebook, which the Mark Zuckerberg-led company can then use in various ways. Although WhatsApp says that it will (still) not share all the information that users generate through their chats, as India Today Tech noted earlier, Facebook only needs the phone number of a user to build a full WhatsApp profile for that user. The company most likely already has other details on users.

    The new WhatsApp privacy policy has been criticised worldwide. Just days ago, a court in Germany ordered Facebook to stop harvesting user information from WhatsApp. After the court order, Facebook said that it was pausing the sharing of WhatsApp user data with Facebook in the whole of Europe. The ruling came even as the European Union privacy watchdog continues to probe the new privacy policy.

    However, in India where privacy laws are non-existent, Facebook and WhatsApp have so far defended their new privacy policy. It is also important to note that India is one of the biggest markets for both Facebook and WhatsApp and that could also be one of the reasons why Facebook wants to enforce its new privacy policies here. Data from Indian users could be commercially very attractive for the company.
    http://indiatoday.intoday.in/technolo...hatsapp-facebook-lawyer/1/940551.html
    Voting 0
  10. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, he argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html
    Voting 0

Page 1 of 7 – Online Bookmarks of M. Fioretti: Tags: data ownership + facebook

About - Propulsed by SemanticScuttle