mfioretti: privacy* + percloud*

Bookmarks on this page are managed by an admin user.

68 bookmark(s), sorted by date (descending)

  1. The company’s financial performance is more of a reflection of Facebook’s unstoppability than its cause. Despite personal reservations about Facebook’s interwoven privacy, data, and advertising practices, the vast majority of people find that they can’t (and don’t want to) quit. Facebook has rewired people’s lives, routing them through its servers, and to disentangle would require major sacrifice. And even if one could get free of the service, the social pathways that existed before Facebook have shriveled up, like the towns along the roads that preceded the interstate highway system. Just look at how the very meaning of the telephone call has changed as we’ve expanded the number of ways we talk with each other. A method of communication that was universally seen as a great way of exchanging information has been transformed into a rarity reserved for close friends, special occasions, emergencies, and debt collectors.

    Most of the general pressures on the internet industry’s data practices, whether from Europe or anywhere else, don’t seem to scare Facebook. Its relative position will remain secure unless something radical changes. In the company’s conference call with analysts last week, Sheryl Sandberg summed it up.

    “The thing that won’t change is that advertisers are going to look at the highest return-on-investment opportunity,” Sandberg said. “And what’s most important in winning budgets is relative performance in the industry.”

    As long as dollars going into the Facebook ad machine sell products, dollars will keep going into the Facebook ad machine.

    As long as their friends are still on Instagram, Facebook, and WhatsApp, people will keep using Facebook products.
    https://www.theatlantic.com/technolog...18/05/facebook-the-unstoppable/559301
  2. Today’s Internet and digital platforms are becoming increasingly centralised, slowing innovation and challenging their potential to revolutionise society and the economy in a pluralistic manner.

    The DECODE project will develop practical alternatives, through the creation, evaluation and demonstration of a distributed and open architecture for managing online access and aggregation of private information to allow a citizen-friendly and privacy-aware governance of access entitlements.

    Strong ethical and digital rights principles are at the base of DECODE’s mission, moving towards the implementation of open standards for a technical architecture resting on the use of Attribute-Based Cryptography, distributed ledgers, a secure operating system, and a privacy-focused smart rules language.
    https://decodeproject.github.io/whitepaper/#pf6
  3. “I believe it’s important to tell people exactly how the information that they share on Facebook is going to be used.

    “That’s why, every single time you go to share something on Facebook, whether it’s a photo in Facebook, or a message, every single time, there’s a control right there about who you’re going to be sharing it with ... and you can change that and control that in line.

    “To your broader point about the privacy policy ... long privacy policies are very confusing. And if you make it long and spell out all the detail, then you’re probably going to reduce the per cent of people who read it and make it accessible to them.”
    https://www.theguardian.com/technolog...testimony-to-congress-the-key-moments
  4. Again: so where does the scandal of these past few days lie? The scandal lies in the evidence of a fundamental error in the way human interaction has been conceived, the conception that Mark Zuckerberg has imposed since 2007, by his own admission in his long-awaited post-Cambridge Analytica statement. Namely, the idea of building a “web that is social by default”, where sharing is the norm. A principle that is structurally opposed to the protection of individual privacy, which rests on confidentiality of one’s personal data as the norm.

    Zuckerberg explains it very well in his most recent statement, rightly identifying in that philosophical and anthropological error the root of the storm he is now forced to navigate: “In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.”

    This is what allowed Kogan, in 2013, to obtain access to the data of millions of people. And certainly, that data has immense scientific value, and it is right that research, when conducted in full respect of the informed consent of the users who become experimental subjects, should be able to access it. For academic purposes only, however. And even so, as early as 2014 the famous experiment run by Facebook itself on manipulating the emotions of hundreds of thousands of users, who were deliberately shown more positive or more negative content, had demonstrated that even when no commercial aims are involved, the question is ambiguous and complex. And that no, accepting convoluted terms of use that nobody reads is not enough to claim that every user, by the mere fact of having agreed to be on Facebook, has consented to become, indiscriminately, a lab rat enrolled in experiments they know nothing about.

    And yet it was the platform itself that realised, in that same year, that things could not go on this way; that Facebook was losing control over which third parties had access to its users’ data. The policy therefore changed, and from then on “friends” had to consent to the processing of their data by an app. The new philosophy, Albright recalls, is “people first”. But it was too late. The inability to truly regain possession of that mass of information, demonstrated by the Cambridge Analytica case (how is it possible that Facebook had to learn from the newspapers that the company had not deleted the data it claimed to have deleted, or that it must now conduct a serious audit to verify this, showing it has no idea whether the data was deleted or not?), makes it clear that the problem goes well beyond this single case: it is systemic.

    To put it more plainly: as Albright writes, the first version of the Facebook Graph API, v1.0 (that is, what application developers could obtain from the social network between 2010, when it launched, and 2014, when the policy changed), made it possible to collect the following data not about the people who signed up for a given app but about their unwitting friends: “about, actions, activities, birthday, check-ins, education, events, games, groups, hometown, interests, likes, location, notes, statuses, tags, photos, questions, relationships, religion/politics, subscriptions, websites, work history”. Could anyone really believe it was possible to control where all this data ended up, for millions upon millions of people?
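
    As a purely illustrative Python sketch, here is roughly what such a pre-2015 request might have looked like; the endpoint path, field names and token handling are recalled from memory for illustration only, not as an exact record of the historical API.

        # Illustrative sketch only: roughly what a third-party app could request
        # about a user's *friends* under the pre-2015 Graph API v1.0, based on the
        # field list quoted above. Endpoint and field names are approximate.
        import requests

        ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # token granted when the app's own user logs in

        # Data the article says was available about friends who never installed the app
        FRIEND_FIELDS = ",".join([
            "birthday", "education", "events", "groups", "hometown", "interests",
            "likes", "location", "photos", "relationship_status", "religion",
            "political", "work",
        ])

        resp = requests.get(
            "https://graph.facebook.com/v1.0/me/friends",
            params={"fields": FRIEND_FIELDS, "access_token": ACCESS_TOKEN},
            timeout=10,
        )
        print(resp.json())  # one record per friend, none of whom consented to the app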

    And is Facebook really only discovering this today? In 2011, the US Federal Trade Commission had already flagged the issue as problematic. It taught the company nothing.
    https://www.valigiablu.it/facebook-cambridge-analytica-scandalo
  5. Mark Zuckerberg also launched Facebook with a disdain for intrusive advertising, but it wasn’t long before the social network giant became Google’s biggest competitor for ad dollars. After going public with 845 million users in 2012, Facebook became a multibillion-dollar company and Zuckerberg one of the richest men on Earth, but with only a promise that the company would figure out how to monetize its platform.

    Facebook ultimately sold companies on its platform by promising “brand awareness” and the best possible data on what consumers actually liked. Brands could start their own Facebook pages, which people would actually “like” and interact with. This provided unparalleled information about which companies each individual person most wanted to interact with. By engaging with companies on Facebook, people gave corporate marketing departments more information than they could ever have dreamed of buying, and here it was offered up for free.

    This was the “grand bargain,” as Columbia University law professor Tim Wu called it in his book, The Attention Merchants, that users struck with corporations. Wu wrote that Facebook’s “billions of users worldwide were simply handing over a treasure trove of detailed demographic data and exposing themselves to highly targeted advertising in return for what, exactly?”

    In other words: We will give you every detail of our lives and you will get rich by selling that information to advertisers.

    European regulators are now saying that bargain was a bad deal. The big question that remains is whether their counterparts in the U.S. will follow their lead.
    https://www.huffingtonpost.com/entry/...antitrust_us_5a625023e4b0dc592a088f6c
  6. The point is that Facebook has a strong, paternalistic view on what’s best for you, and it’s trying to transport you there. “To get people to this point where there’s more openness – that’s a big challenge. But I think we’ll do it,” Zuckerberg has said. He has reason to believe that he will achieve that goal. With its size, Facebook has amassed outsized powers. “In a lot of ways Facebook is more like a government than a traditional company,” Zuckerberg has said. “We have this large community of people, and more than other technology companies we’re really setting policies.”

    Without knowing it, Zuckerberg is the heir to a long political tradition. Over the last 200 years, the west has been unable to shake an abiding fantasy, a dream sequence in which we throw out the bum politicians and replace them with engineers – rule by slide rule. The French were the first to entertain this notion in the bloody, world-churning aftermath of their revolution. A coterie of the country’s most influential philosophers (notably, Henri de Saint-Simon and Auguste Comte) were genuinely torn about the course of the country. They hated all the old ancient bastions of parasitic power – the feudal lords, the priests and the warriors – but they also feared the chaos of the mob. To split the difference, they proposed a form of technocracy – engineers and assorted technicians would rule with beneficent disinterestedness. Engineers would strip the old order of its power, while governing in the spirit of science. They would impose rationality and order.

    This dream has captivated intellectuals ever since, especially Americans. The great sociologist Thorstein Veblen was obsessed with installing engineers in power and, in 1921, wrote a book making his case. His vision briefly became a reality. In the aftermath of the first world war, American elites were aghast at all the irrational impulses unleashed by that conflict – the xenophobia, the racism, the urge to lynch and riot. And when the realities of economic life had grown so complicated, how could politicians possibly manage them? Americans of all persuasions began yearning for the salvific ascendance of the most famous engineer of his time: Herbert Hoover. In 1920, Franklin D Roosevelt – who would, of course, go on to replace him in 1932 – organised a movement to draft Hoover for the presidency.

    The Hoover experiment, in the end, hardly realised the happy fantasies about the Engineer King. A very different version of this dream, however, has come to fruition, in the form of the CEOs of the big tech companies. We’re not ruled by engineers, not yet, but they have become the dominant force in American life – the highest, most influential tier of our elite.

    There’s another way to describe this historical progression. Automation has come in waves. During the industrial revolution, machinery replaced manual workers. At first, machines required human operators. Over time, machines came to function with hardly any human intervention. For centuries, engineers automated physical labour; our new engineering elite has automated thought. They have perfected technologies that take over intellectual processes, that render the brain redundant. Or, as the former Google and Yahoo executive Marissa Mayer once argued, “You have to make words less human and more a piece of the machine.” Indeed, we have begun to outsource our intellectual work to companies that suggest what we should learn, the topics we should consider, and the items we ought to buy. These companies can justify their incursions into our lives with the very arguments that Saint-Simon and Comte articulated: they are supplying us with efficiency; they are imposing order on human life.

    Nobody better articulates the modern faith in engineering’s power to transform society than Zuckerberg. He told a group of software developers, “You know, I’m an engineer, and I think a key part of the engineering mindset is this hope and this belief that you can take any system that’s out there and make it much, much better than it is today. Anything, whether it’s hardware or software, a company, a developer ecosystem – you can take anything and make it much, much better.” The world will improve, if only Zuckerberg’s reason can prevail – and it will.

    The precise source of Facebook’s power is algorithms. That’s a concept repeated dutifully in nearly every story about the tech giants, yet it remains fuzzy at best to users of those sites. From the moment of the algorithm’s invention, it was possible to see its power, its revolutionary potential. The algorithm was developed in order to automate thinking, to remove difficult decisions from the hands of humans, to settle contentious debates.

    The essence of the algorithm is entirely uncomplicated. Textbooks compare algorithms to recipes – a series of precise steps that can be followed mindlessly. This is different from equations, which have one correct result. Algorithms merely capture the process for solving a problem and say nothing about where those steps ultimately lead.

    These recipes are the crucial building blocks of software. Programmers can’t simply order a computer to, say, search the internet. They must give the computer a set of specific instructions for accomplishing that task. These instructions must take the messy human activity of looking for information and transpose that into an orderly process that can be expressed in code. First do this … then do that. The process of translation, from concept to procedure to code, is inherently reductive. Complex processes must be subdivided into a series of binary choices. There’s no equation to suggest a dress to wear, but an algorithm could easily be written for that – it will work its way through a series of either/or questions (morning or night, winter or summer, sun or rain), with each choice pushing to the next.
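
    As a concrete toy version of that dress example, here is a short Python sketch (all rules and outfit names invented for illustration) that works through exactly such a chain of either/or questions:

        # Toy example: an "algorithm" as a recipe of binary choices, as described
        # above. All rules and outfit names are made up for illustration.
        def suggest_outfit(time_of_day: str, season: str, weather: str) -> str:
            """Walk a fixed series of either/or questions and return a suggestion."""
            if time_of_day == "night":               # morning or night?
                return "evening dress" if season == "summer" else "long-sleeved dress"
            if weather == "rain":                    # sun or rain?
                return "raincoat over a plain dress"
            if season == "winter":                   # winter or summer?
                return "wool dress and boots"
            return "light summer dress"

        print(suggest_outfit("morning", "summer", "sun"))  # -> light summer dress
        print(suggest_outfit("night", "winter", "rain"))   # -> long-sleeved dress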

    Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered, without our even being aware of the hand guiding us, in a superior direction. That’s always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in the grand design.
    https://www.theguardian.com/technolog...r-on-free-will?CMP=Share_iOSApp_Other
  7. "All of us, when we are uploading something, when we are tagging people, when we are commenting, we are basically working for Facebook," he says.

    The data our interactions provide feeds the complex algorithms that power the social media site, where, as Mr Joler puts it, our behaviour is transformed into a product.

    Trying to untangle that largely hidden process proved to be a mammoth task.

    "We tried to map all the inputs, the fields in which we interact with Facebook, and the outcome," he says.

    "We mapped likes, shares, search, update status, adding photos, friends, names, everything our devices are saying about us, all the permissions we are giving to Facebook via apps, such as phone status, wifi connection and the ability to record audio."

    All of this research provided only a fraction of the full picture. So the team looked into Facebook's acquisitions, and scoured its myriad patent filings.

    The results were astonishing.

    Visually arresting flow charts take hours to absorb fully, but they show how the data we give Facebook is used to calculate our ethnic affinity (Facebook’s term), sexual orientation, political affiliation, social class, travel schedule and much more.
    [Image, copyright Share Lab: Share Lab presents its information in minutely detailed tables and flow charts]

    One map shows how everything - from the links we post on Facebook, to the pages we like, to our online behaviour in many other corners of cyber-space that are owned or interact with the company (Instagram, WhatsApp or sites that merely use your Facebook log-in) - could all be entering a giant algorithmic process.

    And that process allows Facebook to target users with terrifying accuracy, with the ability to determine whether they like Korean food, the length of their commute to work, or their baby's age.

    Another map details the permissions many of us willingly give Facebook via its many smartphone apps, including the ability to read all text messages, download files without permission, and access our precise location.

    Individually, these are powerful tools; combined they amount to a data collection engine that, Mr Joler argues, is ripe for exploitation.

    "If you think just about cookies, just about mobile phone permissions, or just about the retention of metadata - each of those things, from the perspective of data analysis, are really intrusive."
    http://www.bbc.com/news/business-39947942
  8. Facebook, which now owns WhatsApp, is fighting a challenge to the new privacy policy that it unveiled last year. According to the new privacy policy, WhatsApp can share some user data with Facebook, which the Mark Zuckerberg-led company can then use in various ways. Although WhatsApp says that it will (still) not share all the information that users generate through their chats, as India Today Tech noted earlier, Facebook only needs the phone number of a user to build a full WhatsApp profile for that user. The company most likely already has other details on users.


    The new WhatsApp privacy policy has been criticised worldwide. Just days ago, a court in Germany asked Facebook to stop harvesting user information from WhatsApp. After the court order, Facebook said that it was pausing the sharing of WhatsApp user data with Facebook across the whole of Europe. The ruling came even as the European Union privacy watchdog continues to probe the new privacy policy.

    However, in India where privacy laws are non-existent, Facebook and WhatsApp have so far defended their new privacy policy. It is also important to note that India is one of the biggest markets for both Facebook and WhatsApp and that could also be one of the reasons why Facebook wants to enforce its new privacy policies here. Data from Indian users could be commercially very attractive for the company.
    http://indiatoday.intoday.in/technolo...hatsapp-facebook-lawyer/1/940551.html
  9. the experts are right about many things. OpenPGP is old and more recent tools with more modern designs have a lot going for them. But I still think they're mostly wrong.

    The experts, by and large, have yet to offer any credible replacements for PGP. And when they suggest abandoning PGP, what they're really saying is we should give up on secure e-mail and just use something else. That doesn't fly. Many people have to use e-mail. E-mail is everywhere. Not improving the security of e-mail and instead expecting people to just use other tools (or go without), is the security elite proclaiming from their ivory tower: "Let them eat cake!"

    Furthermore, if that "something else" also requires people use their phone number for everything... well, that's the messaging world's equivalent of the widely despised Facebook Real Name Policy. If you ever needed a clear example of why the lack of diversity (and empathy) in tech is a problem, there it is!

    Compartmentalization, presenting different identities in different contexts, is a fundamental, necessary part of human behaviour. It's one of the basics. If you think taking that away and offering fancy crypto, forward secrecy, deniability instead is a win... well, I think your threat models need some work! You have failed and people will just keep on using insecure e-mail for their accounting, their work, their hobbies, their doctor visits and their interaction with local government. Because people know their needs better than you do.

    But I digress.

    The ridiculous phone number thing aside, I also take issue with the fact that when our opinionated experts do suggest replacements, the things they recommend are proprietary, centralized and controlled by for-profit companies. Some of them (mostly the underdogs) may be open source, but even the best of those use a centralized design and are hostile to federation. In pursuit of security and convenience (and, let's be honest, control, power and money), openness has been hung out to dry.

    This is short-sighted at best.

    These cool new apps may be secure today. But what about tomorrow? Odds are, they will be compromised by government mandate, blocked or shut down.
    https://www.mailpile.is/blog/2016-12-13_Too_Cool_for_PGP.html
  10. In 2007 Google acquired DoubleClick, a company that collected web-browsing data, giving assurances that it would never cross-reference those results with the personal information it held through the use of its own services. Almost ten years later, however, it has updated its terms for the use of the Google account, announcing that it will now be able to perform exactly that cross-referencing. The document now reads: “Depending on your account settings, your activity on other sites and apps may be associated with your personal information for the purpose of improving Google’s services and the ads delivered by Google.” The change to the settings must be approved, and indeed Google specifically asks users, once they log into their account via a web browser, to accept the new terms. Users can keep their current settings and continue to use Google’s services as before, while for new accounts the new options are enabled by default. Under the new terms, if accepted, Google will be able to merge the browsing data acquired through its analytics and tracking services with the information already obtained from the user’s profile. All of this will allow the Mountain View company to assemble a complete portrait of its users, made up of their personal data, what they write in their emails, the websites they visit and the searches they run, definitively putting an end to the notion that web tracking is anonymous.
    http://www.saggiamente.com/2016/10/ad...ource=twitter.com&utm_campaign=buffer


About - Propulsed by SemanticScuttle