mfioretti: privacy*

Bookmarks on this page are managed by an admin user.

614 bookmark(s) - Sort by: Date ↓ / Title / Voting - Bookmarks from other users for this tag

  1. Speaking as a statistician, it is quite easy to identify people in anonymous datasets. There are only so many 5'4" Jews living in San Francisco with chronic back pain. Every bit of information we reveal about ourselves will be one more disease that we can track, and another life saved.

    If I want to know whether I will suffer a heart attack, I will have to release my data for public research. In the end, privacy will be an early death sentence.

    Already, health insurers are beginning to offer discounts for people who wear health trackers and let others analyze their personal movements. Many, if not most, consumers in the next generation will choose cash and a longer life in exchange for publicizing their most intimate details.

    What can we tell with basic health information, such as calories burned throughout the day? Pretty much everything.

    With a rudimentary step and calorie counter, I was able to distinguish whether I was having sex or at the gym, since the minute-by-minute calorie burn profile of sex is quite distinct (the image below from my health tracker shows lots of energy expended at the beginning and end, with few steps taken; few activities besides sex have this distinct shape).
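    The shape heuristic the author describes can be sketched in code. The thresholds, labels, and function name below are invented for illustration; the post does not publish its actual method:

```python
def classify_activity(calories, steps):
    """Toy heuristic over minute-by-minute tracker data (assumed
    thresholds): flag sessions that burn energy without walking."""
    high_burn = [c > 5 for c in calories]   # kcal/min threshold (assumption)
    low_steps = [s < 20 for s in steps]     # steps/min threshold (assumption)
    # Count minutes of high energy expenditure with almost no steps.
    stationary_burn = sum(h and l for h, l in zip(high_burn, low_steps))
    # Gym cardio pairs calorie burn with steps; a mostly step-free
    # burn profile looks like the "distinct shape" the post mentions.
    if stationary_burn > len(calories) // 2:
        return "stationary high-burn"
    return "ambulatory"
```

    The point is not the crude thresholds but that even two coarse signals, combined per minute, reveal activity that the raw totals would hide.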
    https://medium.com/the-ferenstein-wir...rs-of-history-in-50-images-614c26059e
    Voting 0
  2. The point is that Facebook has a strong, paternalistic view on what’s best for you, and it’s trying to transport you there. “To get people to this point where there’s more openness – that’s a big challenge. But I think we’ll do it,” Zuckerberg has said. He has reason to believe that he will achieve that goal. With its size, Facebook has amassed outsized powers. “In a lot of ways Facebook is more like a government than a traditional company,” Zuckerberg has said. “We have this large community of people, and more than other technology companies we’re really setting policies.”

    Without knowing it, Zuckerberg is the heir to a long political tradition. Over the last 200 years, the west has been unable to shake an abiding fantasy, a dream sequence in which we throw out the bum politicians and replace them with engineers – rule by slide rule. The French were the first to entertain this notion in the bloody, world-churning aftermath of their revolution. A coterie of the country’s most influential philosophers (notably, Henri de Saint-Simon and Auguste Comte) were genuinely torn about the course of the country. They hated all the ancient bastions of parasitic power – the feudal lords, the priests and the warriors – but they also feared the chaos of the mob. To split the difference, they proposed a form of technocracy – engineers and assorted technicians would rule with beneficent disinterestedness. Engineers would strip the old order of its power, while governing in the spirit of science. They would impose rationality and order.

    This dream has captivated intellectuals ever since, especially Americans. The great sociologist Thorstein Veblen was obsessed with installing engineers in power and, in 1921, wrote a book making his case. His vision briefly became a reality. In the aftermath of the first world war, American elites were aghast at all the irrational impulses unleashed by that conflict – the xenophobia, the racism, the urge to lynch and riot. And when the realities of economic life had grown so complicated, how could politicians possibly manage them? Americans of all persuasions began yearning for the salvific ascendance of the most famous engineer of his time: Herbert Hoover. In 1920, Franklin D Roosevelt – who would, of course, go on to replace him in 1932 – organised a movement to draft Hoover for the presidency.

    The Hoover experiment, in the end, hardly realised the happy fantasies about the Engineer King. A very different version of this dream, however, has come to fruition, in the form of the CEOs of the big tech companies. We’re not ruled by engineers, not yet, but they have become the dominant force in American life – the highest, most influential tier of our elite.

    There’s another way to describe this historical progression. Automation has come in waves. During the industrial revolution, machinery replaced manual workers. At first, machines required human operators. Over time, machines came to function with hardly any human intervention. For centuries, engineers automated physical labour; our new engineering elite has automated thought. They have perfected technologies that take over intellectual processes, that render the brain redundant. Or, as the former Google and Yahoo executive Marissa Mayer once argued, “You have to make words less human and more a piece of the machine.” Indeed, we have begun to outsource our intellectual work to companies that suggest what we should learn, the topics we should consider, and the items we ought to buy. These companies can justify their incursions into our lives with the very arguments that Saint-Simon and Comte articulated: they are supplying us with efficiency; they are imposing order on human life.

    Nobody better articulates the modern faith in engineering’s power to transform society than Zuckerberg. He told a group of software developers, “You know, I’m an engineer, and I think a key part of the engineering mindset is this hope and this belief that you can take any system that’s out there and make it much, much better than it is today. Anything, whether it’s hardware or software, a company, a developer ecosystem – you can take anything and make it much, much better.” The world will improve, if only Zuckerberg’s reason can prevail – and it will.

    The precise source of Facebook’s power is algorithms. That’s a concept repeated dutifully in nearly every story about the tech giants, yet it remains fuzzy at best to users of those sites. From the moment of the algorithm’s invention, it was possible to see its power, its revolutionary potential. The algorithm was developed in order to automate thinking, to remove difficult decisions from the hands of humans, to settle contentious debates.

    The essence of the algorithm is entirely uncomplicated. Textbooks compare algorithms to recipes – a series of precise steps that can be followed mindlessly. This is different from an equation, which has one correct result. Algorithms merely capture the process for solving a problem and say nothing about where those steps ultimately lead.

    These recipes are the crucial building blocks of software. Programmers can’t simply order a computer to, say, search the internet. They must give the computer a set of specific instructions for accomplishing that task. These instructions must take the messy human activity of looking for information and transpose that into an orderly process that can be expressed in code. First do this … then do that. The process of translation, from concept to procedure to code, is inherently reductive. Complex processes must be subdivided into a series of binary choices. There’s no equation to suggest a dress to wear, but an algorithm could easily be written for that – it will work its way through a series of either/or questions (morning or night, winter or summer, sun or rain), with each choice pushing to the next.
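    The dress chooser described above can be written out as exactly the kind of either/or chain the passage imagines. The categories and outfit names are invented for the example; no real recommendation system is this simple:

```python
def suggest_dress(time_of_day, season, weather):
    """A decision chain of either/or questions, as the article sketches:
    each binary choice pushes to the next until an outfit falls out.
    All categories and outfits are illustrative assumptions."""
    if time_of_day == "night":              # morning or night?
        return "evening dress"
    if season == "winter":                  # winter or summer?
        return "wool dress" if weather == "sun" else "coat dress"
    return "sundress" if weather == "sun" else "shirt dress with raincoat"
```

    Note how the translation is reductive, as the passage says: a messy human judgment ("what should I wear?") becomes a fixed tree of binary questions.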

    Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered, without our even being aware of the hand guiding us, in a superior direction. That’s always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in the grand design.
    https://www.theguardian.com/technolog...r-on-free-will?CMP=Share_iOSApp_Other
    Voting 0
  3. Prediction of human physical traits and demographic information from genomic data challenges privacy and data deidentification in personalized medicine. To explore the current capabilities of phenotype-based genomic identification, we applied whole-genome sequencing, detailed phenotyping, and statistical modeling to predict biometric traits in a cohort of 1,061 participants of diverse ancestry. Individually, for a large fraction of the traits, their predictive accuracy beyond ancestry and demographic information is limited. However, we have developed a maximum entropy algorithm that integrates multiple predictions to determine which genomic samples and phenotype measurements originate from the same person. Using this algorithm, we have reidentified an average of >8 of 10 held-out individuals in an ethnically mixed cohort and an average of 5 of either 10 African Americans or 10 Europeans. This work challenges current conceptions of personal privacy and may have far-reaching ethical and legal implications.
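    A minimal sketch of the linking step, assuming only a per-pair match score is available: the paper's maximum-entropy integration is far more sophisticated, so this brute-force assignment is just a stand-in showing how per-trait predictions that are weak individually can still combine to re-identify people:

```python
from itertools import permutations

def best_matching(score):
    """score[i][j]: how well genomic sample i's predicted traits fit
    phenotype record j (higher = better); assumed already combined
    across traits. Exhaustively search the one-to-one assignment
    maximizing total score -- feasible only for tiny cohorts."""
    n = len(score)
    best_total, best_perm = float("-inf"), None
    for perm in permutations(range(n)):
        total = sum(score[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_perm = total, perm
    # best_perm[i] is the phenotype record matched to genomic sample i.
    return best_perm
```

    Even when each individual trait prediction barely beats chance, the summed score over many traits can make the correct pairing stand out, which is the mechanism behind the >8-of-10 re-identification rate.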
    http://www.pnas.org/content/early/2017/08/29/1711125114.full.pdf
    Voting 0
  4. Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the authors wrote.

    The paper suggested that the findings provide “strong support” for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid.

    While the findings have clear limits when it comes to gender and sexuality – people of color were not included in the study, and there was no consideration of transgender or bisexual people – the implications for artificial intelligence (AI) are vast and alarming. With billions of facial images of people stored on social media sites and in government databases, the researchers suggested that public data could be used to detect people’s sexual orientation without their consent.

    It’s easy to imagine spouses using the technology on partners they suspect are closeted, or teenagers using the algorithm on themselves or their peers. More frighteningly, governments that continue to prosecute LGBT people could hypothetically use the technology to out and target populations. That means building this kind of software and publicizing it is itself controversial given concerns that it could encourage harmful applications.

    But the authors argued that the technology already exists, and its capabilities are important to expose so that governments and companies can proactively consider privacy risks and the need for safeguards and regulations.
    https://www.theguardian.com/technolog...m-a-photograph?CMP=Share_iOSApp_Other
    Voting 0
  5. Smart toys including My Friend Cayla, Hello Barbie, and CloudPets are designed to learn and grow with your kid. Cool, right? Unfortunately, many of these toys have privacy problems. As the 2015 data breach of Vtech's InnoTab Max uncovered, hackers specifically target kids because they offer clean credit histories and unused Social Security numbers that they can use for identity theft. These toys also collect a lot of information about your kid, and they aren't always clear about when they do it and how they use it.
    http://edition.cnn.com/2017/06/07/hea...2Fedition_us+%28RSS%3A+CNNi+-+U.S.%29
    Voting 0
  6. The same data-richness that interests police departments should also give us pause: it's never been the case that a cop busting a low-level, nonviolent offender would be allowed to probe that person's entire network of friends and relations; read all the correspondence between the arrestee and their doctors, lawyers, kids and spouse; get a neat list of all the places the person had visited; and be able to look at everything from bank balances to spending history.

    The major provider of mobile forensic tools is the Israeli firm Cellebrite, who made headlines when the FBI revealed that they'd used a Cellebrite tool to crack the San Bernardino shooters' phones, and then again when a hacker dumped 900GB worth of internal Cellebrite info, revealing that the company routinely repackaged hacking tools from the darkweb and sold them to police departments without first verifying that these weren't leaking data to third parties or otherwise creating risks for their users and their targets.
    https://boingboing.net/2017/06/07/uni...A+boingboing%2FiBag+%28Boing+Boing%29
    Voting 0
    The annual report of the Italian Data Protection Authority makes no reference, in those trite tones, to "alarms"; instead it highlights the major issues at stake, recounting the results achieved by working on the rules, the safeguards, and the dialogue with the players, following the guidelines of the EU working groups. If the mainstream must always be one of alarm and misleading headlines, there is no point complaining about disintermediation and the overwhelming power of social media platforms.
    http://www.webnews.it/2017/06/06/garante-privacy-relazione-annuale
    Voting 0
    With regard to the dissemination of data via the internet, moreover, in many cases the scope within which data can be known is not always managed carefully; in the school context, much personal information may legitimately circulate beyond the data subject and involve the school community (pupils, students, parents, teachers), but it may not be published on the web and made available to anyone outside that community. In this sense, the school website can certainly facilitate forms of systematic communication by making certain data available within a restricted-access area, while preserving the confidentiality of personal information that may be known only by the data subjects and their families.
    http://194.242.234.211/documents/10160/0/Relazione+annuale+2016+-+Il+testo
    Tags: , , by M. Fioretti (2017-06-07)
    Voting 0
  9. "All of us, when we are uploading something, when we are tagging people, when we are commenting, we are basically working for Facebook," he says.

    The data our interactions provide feeds the complex algorithms that power the social media site, where, as Mr Joler puts it, our behaviour is transformed into a product.

    Trying to untangle that largely hidden process proved to be a mammoth task.

    "We tried to map all the inputs, the fields in which we interact with Facebook, and the outcome," he says.

    "We mapped likes, shares, search, update status, adding photos, friends, names, everything our devices are saying about us, all the permissions we are giving to Facebook via apps, such as phone status, wifi connection and the ability to record audio."

    All of this research provided only a fraction of the full picture. So the team looked into Facebook's acquisitions, and scoured its myriad patent filings.

    The results were astonishing.

    Visually arresting flow charts take hours to absorb fully, but they show how the data we give Facebook is used to calculate our ethnic affinity (Facebook's term), sexual orientation, political affiliation, social class, travel schedule and much more.
    [Image: Share Lab presents its information in minutely detailed tables and flow charts]

    One map shows how everything - from the links we post on Facebook, to the pages we like, to our online behaviour in many other corners of cyber-space that are owned or interact with the company (Instagram, WhatsApp or sites that merely use your Facebook log-in) - could all be entering a giant algorithmic process.

    And that process allows Facebook to target users with terrifying accuracy, determining whether they like Korean food, the length of their commute to work, or their baby's age.

    Another map details the permissions many of us willingly give Facebook via its many smartphone apps, including the ability to read all text messages, download files without permission, and access our precise location.

    Individually, these are powerful tools; combined they amount to a data collection engine that, Mr Joler argues, is ripe for exploitation.

    "If you think just about cookies, just about mobile phone permissions, or just about the retention of metadata - each of those things, from the perspective of data analysis, are really intrusive."
    http://www.bbc.com/news/business-39947942
    Voting 0
  10. The High Court of Delhi (‘the High Court’) pronounced, on 23 September 2016, its decision in a public interest litigation case, Karmanya Singh Sareen and Anr v. Union of India and Ors (‘the Decision’), in relation to WhatsApp Inc.’s new privacy policy that allows for the sharing of users’ data with Facebook, Inc., for advertising and marketing purposes. The High Court ordered WhatsApp to delete users’ data completely from its servers and refrain from sharing users’ data with Facebook, provided that the users requested the deletion of their WhatsApp account before 25 September 2016, the date on which the users were asked to agree to the new terms, and also prohibited WhatsApp from sharing existing users’ data dated before 25 September 2016.

    Parul Sharma, Analyst at the Centre for Communication Governance at National Law University of Delhi, told DataGuidance, “In the absence of a privacy law and strong data protection measures it is a strong judgement. … The general implication of the case on mobile application providers and internet based messaging services is dependent on how courts interpret this judgement in the future.”

    Although the High Court ordered WhatsApp not to share with Facebook users’ data collected before WhatsApp changed its privacy policy on 25 September 2016, it emphasised that WhatsApp users voluntarily agreed to, and are bound by, the new terms of service offered. In addition, WhatsApp’s 2012 privacy policy provides that in the event of a merger WhatsApp reserves the right to transfer or assign users’ information.

    While the existence of this right is pending before the Supreme Court in K.S. Puttaswamy, multiple courts have affirmed the constitutional right to privacy since then.

    Smitha Krishna Prasad and Abhishek Senthilnathan, Associates at Nishith Desai Associates, noted, “There is no statutory framework to govern the functioning of internet based messaging services like WhatsApp in India. … Therefore, the High Court has correctly taken the view that WhatsApp may choose to change the terms and conditions of service and users cannot compel WhatsApp to operate within specific parameters.”

    It was argued in the case that the right to privacy guaranteed under Article 21 of the Constitution of India could be a valid ground to prevent WhatsApp sharing data with Facebook. However, the High Court rejected this argument on the basis that the existence of the fundamental right to privacy is yet to be decided in the pending case K.S. Puttaswamy and Anr. v. Union of India & Ors. (2015) 8 SCC 735.
    http://www.dataguidance.com/india-high-court-case-whatsapp-strong-judgment
    Voting 0

Top of the page

First / Previous / Next / Last / Page 2 of 62 Online Bookmarks of M. Fioretti: Tags: privacy

About - Propulsed by SemanticScuttle