mfioretti: privacy* + surveillance*

Bookmarks on this page are managed by an admin user.

227 bookmark(s), sorted by date (newest first)

  1. IoT will be able to take stock of your choices, moods, preferences and tastes, the same way Google Search does. With enough spreadsheets, many practical questions are rendered trivial. How hard will it be for the IoT — maybe through Alexa, maybe through your phone — to statistically study why, where and when you raise your voice at your child? If you can correlate people’s habits and physical attributes, it will be toddler-easy to correlate mood to environment. The digitally connected devices of tomorrow would be poor consumer products if they did not learn you well. Being a good and faithful servant means monitoring the master closely, and that is what IoT devices will do. They will analyze your feedback and automate their responses — and predict your needs. In the IoT, Big Data is weaponized, and can peer deeper into the seeds of your life than the government has ever dreamed.
    https://www.salon.com/2018/02/19/why-...signed-for-corporations-not-consumers
    Voting 0
  2. Mark Zuckerberg also launched Facebook with a disdain for intrusive advertising, but it wasn’t long before the social network giant became Google’s biggest competitor for ad dollars. After going public with 845 million users in 2012, Facebook became a multibillion-dollar company and Zuckerberg one of the richest men on Earth, but with only a promise that the company would figure out how to monetize its platform.

    Facebook ultimately sold companies on its platform by promising “brand awareness” and the best possible data on what consumers actually liked. Brands could start their own Facebook pages, which people would actually “like” and interact with. This provided unparalleled information about which companies each individual person wanted to interact with most. By engaging with companies on Facebook, people gave corporate marketing departments more information than they could have ever dreamed of buying, but here it was offered up for free.

    This was the “grand bargain,” as Columbia University law professor Tim Wu called it in his book, The Attention Merchants, that users struck with corporations. Wu wrote that Facebook’s “billions of users worldwide were simply handing over a treasure trove of detailed demographic data and exposing themselves to highly targeted advertising in return for what, exactly?”

    In other words: We will give you every detail of our lives and you will get rich by selling that information to advertisers.

    European regulators are now saying that bargain was a bad deal. The big question that remains is whether their counterparts in the U.S. will follow their lead.
    https://www.huffingtonpost.com/entry/...antitrust_us_5a625023e4b0dc592a088f6c
    Voting 0
  3. Facebook's ability to figure out the "people we might know" is sometimes eerie. Many a Facebook user has been creeped out when a one-time Tinder date or an ex-boss from 10 years ago suddenly pops up as a friend recommendation. How does the big blue giant know?

    While some of these incredibly accurate friend suggestions are amusing, others are alarming, such as this story from Lisa*, a psychiatrist who is an infrequent Facebook user, mostly signing in to RSVP for events. Last summer, she noticed that the social network had started recommending her patients as friends—and she had no idea why.

    "I haven't shared my email or phone contacts with Facebook," she told me over the phone.

    The next week, things got weirder.

    Most of her patients are senior citizens or people with serious health or developmental issues, but she has one outlier: a 30-something snowboarder. Usually, Facebook would recommend he friend people his own age, who snowboard and jump out of planes. But Lisa told me that he had started seeing suggestions for older and infirm people, such as a 70-year-old…
    https://splinternews.com/facebook-rec...s-psychiatrists-patients-f-1793861472
    by M. Fioretti (2018-01-28)
    Voting 0
  4. Europe has pulled ahead of the United States when it comes to constraining the abuses of Big Tech. In June, the European Union fined Google $2.7 billion for steering web users to its shopping site, and investigations remain active over similar treatment on Android phones. European regulators fined Facebook for lying about whether it could match user profiles with phone numbers on its messaging acquisition WhatsApp. They demanded Apple repay $15.3 billion in back taxes in Ireland. And they forced Amazon to change its e-book contracts, which they claimed inappropriately squeezed publishers.
    Photo caption (AP Photo/Rick Bowmer) — Trust-Busted: In 2002, Microsoft Chairman Bill Gates had to testify at federal court in his company's antitrust case. The public trial led Microsoft to soften its aggressive strategy against rivals.

    Unfortunately, these actions were treated mainly as the cost of doing business. The Facebook fine totaled not even 1 percent of the $22 billion purchase price for WhatsApp, and it allowed the two companies to remain partnered. Government policy, in effect, has “told these companies that the smart thing to do is to lie to us and break the law,” said Scott Galloway in his presentation. Google’s remedy in the shopping case still forces rivals to bid for placement at the top of the page, with Google Shopping spun off as a stand-alone competitor. This does weaken Google’s power and solves the “equal treatment” problem, but it doesn’t protect consumers, who will ultimately pay for those costly bids. “The EU got a $2.7 billion fine to hold a party and bail out Greek banks,” said Gary Reback, an antitrust lawyer and critic of the EU’s actions. “No amount of money will make a difference.”

    However, one thing might: Europe’s increasing move toward data privacy. The General Data Protection Regulation (GDPR), scheduled for implementation in May 2018, empowers European web users to affirmatively opt out of having their data collected, with high penalties for non-compliance. Consumers will be able to obtain their personal data and learn how it is used. They can request that their data be erased completely (known as the “right to be forgotten”) as well as prohibited from sale to third parties. Platforms could not condition use of their products on data collection. A separate, not-yet-finalized regulation called ePrivacy would forbid platforms from tracking users across separate apps, websites, and devices.
    http://prospect.org/article/big-tech-new-predatory-capitalism
    Voting 0
  5. Some entries are ambiguous. Take Microsoft, under the “operational services” category. PayPal apparently supplies the tech company with an image of a customer (a photo or video) or their image from an identity document for the purposes of “facial image comparison for fraud protection” and “research and testing as to appropriateness of new products.” The former sounds like some kind of facial recognition system that PayPal uses to look for fraud. But the latter is uncomfortably broad. What kind of research is Microsoft doing using pictures of PayPal users’ faces? PayPal did not comment on this specific question.
    https://www.fastcodesign.com/90157501...ource=twitter.com&utm_campaign=buffer
    Voting 0
  6. "We will continue working with the French authorities to ensure that users understand what information is collected and how it is used," WhatsApp said in an emailed statement. "We are committed to resolving the different, and sometimes contradictory, concerns raised by the data protection authorities, through a common European approach before the new bloc-wide data protection rules enter into force in May 2018."

    Data transfers from WhatsApp to Facebook take place in part without the user's consent, the French authority reiterated; it also rejected WhatsApp's argument that the company is subject only to United States law. The French warning is "a formal notice, not a sanction," but the messaging giant could face fines at a later stage.
    http://www.repubblica.it/tecnologia/2...ncia_il_garante_fb_whatsapp-184580045
    Voting 0
  7. Unlike the Passport Officer, the RTO or the Electoral Officer, the CEO of UIDAI does not take any legal liability to certify the number as proof of anyone’s identity, address or existence. Furthermore, no one has verified or audited the database to establish how many of the billion numbers, which are linked to data submitted by the outsourced parties, belong to real individuals.

    The resulting Aadhaar database is being used to “purify”, as Ajay Bhushan Pandey, the CEO of UIDAI, describes it, all the databases that are seeded with Aadhaar. The seeding of other databases with the Aadhaar number is also unlike any other identification document: it threatens to exclude the genuine and include the fake in those existing databases. The case of the over 13,000 fake employees at Satyam who drew salaries every month for years before being exposed is still fresh in India.

    As the government embarks on linking the entire Consolidated Fund of India’s receipts and expenditure to this database, is it not reasonable to require some CAG certification of the existence of every person in it?

    Mr. Nilekani has often highlighted the use of biometrics to authenticate who you are as the core strength of the Aadhaar database. What he fails to state is that even if biometrics could uniquely establish your identity throughout your life, which they cannot, their use for authentication is absurd.

    Once stolen, your biometrics can be used by the thief, in multiple ways differing in simplicity and ease, to perpetrate crimes that will be attributed to you and may be difficult, if not impossible, for you to deny.

    It is precisely these differences between the enrolment and use models of Aadhaar and those of any other ID that are a threat to you as well as to the nation.
    https://tech.economictimes.indiatimes...ts/how-does-aadhaar-threaten-you/2277
    Voting 0
  8. Speaking as a statistician, it is quite easy to identify people in anonymous datasets. There are only so many 5'4" Jews living in San Francisco with chronic back pain. Every bit of information we reveal about ourselves will be one more disease that we can track, and another life saved.
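    A minimal sketch of the kind of linkage described above. The records, attribute names and values are invented for illustration; the point is only that a handful of quasi-identifiers (height, city, medical condition) can narrow an "anonymous" table down to a single person.

    # Toy re-identification via quasi-identifiers; all data here is made up.
    anonymized_records = [
        {"id": "rec-001", "height_cm": 163, "city": "San Francisco", "condition": "chronic back pain"},
        {"id": "rec-002", "height_cm": 180, "city": "Oakland", "condition": "asthma"},
        {"id": "rec-003", "height_cm": 163, "city": "San Francisco", "condition": "asthma"},
    ]

    def matches(record, **known):
        """True if every attribute we already know about the target fits this record."""
        return all(record.get(key) == value for key, value in known.items())

    # Three facts gleaned from public sources are enough to isolate one record.
    candidates = [r for r in anonymized_records
                  if matches(r, height_cm=163, city="San Francisco", condition="chronic back pain")]
    print(candidates)  # -> only rec-001 survives; the "anonymous" row now points to a person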

    If I want to know whether I will suffer a heart attack, I will have to release my data for public research. In the end, privacy will be an early death sentence.

    Already, health insurers are beginning to offer discounts for people who wear health trackers and let others analyze their personal movements. Many, if not most, consumers in the next generation will choose cash and a longer life in exchange for publicizing their most intimate details.

    What can we tell with basic health information, such as calories burned throughout the day? Pretty much everything.

    With a rudimentary step and calorie counter, I was able to distinguish whether I was having sex or at the gym, since the minute-by-minute calorie burn profile of sex is quite distinct: my health tracker showed lots of energy expended at the beginning and end, with few steps taken. Few activities besides sex have this distinct shape.
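    A toy sketch of the sort of inference described above, not the author's actual analysis: the minute-by-minute numbers and the thresholds are invented, and the code merely flags minutes with high calorie burn but few steps, the pattern he says gives the activity away.

    # Flag minutes where energy expenditure is high but step count is low.
    # Sample data and thresholds are invented for illustration.
    minute_log = [(9, 4), (8, 2), (3, 1), (2, 0), (4, 3), (10, 5), (9, 2)]  # (calories, steps) per minute

    def unusual_minutes(log, min_calories=7, max_steps=10):
        """Indices of minutes with high calorie burn but few steps taken."""
        return [i for i, (calories, steps) in enumerate(log)
                if calories >= min_calories and steps <= max_steps]

    print(unusual_minutes(minute_log))  # -> [0, 1, 5, 6]: bursts at the start and end, little in between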
    https://medium.com/the-ferenstein-wir...rs-of-history-in-50-images-614c26059e
    Voting 0
  9. The point is that Facebook has a strong, paternalistic view on what’s best for you, and it’s trying to transport you there. “To get people to this point where there’s more openness – that’s a big challenge. But I think we’ll do it,” Zuckerberg has said. He has reason to believe that he will achieve that goal. With its size, Facebook has amassed outsized powers. “In a lot of ways Facebook is more like a government than a traditional company,” Zuckerberg has said. “We have this large community of people, and more than other technology companies we’re really setting policies.”

    Without knowing it, Zuckerberg is the heir to a long political tradition. Over the last 200 years, the west has been unable to shake an abiding fantasy, a dream sequence in which we throw out the bum politicians and replace them with engineers – rule by slide rule. The French were the first to entertain this notion in the bloody, world-churning aftermath of their revolution. A coterie of the country’s most influential philosophers (notably, Henri de Saint-Simon and Auguste Comte) were genuinely torn about the course of the country. They hated all the old ancient bastions of parasitic power – the feudal lords, the priests and the warriors – but they also feared the chaos of the mob. To split the difference, they proposed a form of technocracy – engineers and assorted technicians would rule with beneficent disinterestedness. Engineers would strip the old order of its power, while governing in the spirit of science. They would impose rationality and order.

    This dream has captivated intellectuals ever since, especially Americans. The great sociologist Thorstein Veblen was obsessed with installing engineers in power and, in 1921, wrote a book making his case. His vision briefly became a reality. In the aftermath of the first world war, American elites were aghast at all the irrational impulses unleashed by that conflict – the xenophobia, the racism, the urge to lynch and riot. And when the realities of economic life had grown so complicated, how could politicians possibly manage them? Americans of all persuasions began yearning for the salvific ascendance of the most famous engineer of his time: Herbert Hoover. In 1920, Franklin D Roosevelt – who would, of course, go on to replace him in 1932 – organised a movement to draft Hoover for the presidency.

    The Hoover experiment, in the end, hardly realised the happy fantasies about the Engineer King. A very different version of this dream, however, has come to fruition, in the form of the CEOs of the big tech companies. We’re not ruled by engineers, not yet, but they have become the dominant force in American life – the highest, most influential tier of our elite.

    There’s another way to describe this historical progression. Automation has come in waves. During the industrial revolution, machinery replaced manual workers. At first, machines required human operators. Over time, machines came to function with hardly any human intervention. For centuries, engineers automated physical labour; our new engineering elite has automated thought. They have perfected technologies that take over intellectual processes, that render the brain redundant. Or, as the former Google and Yahoo executive Marissa Mayer once argued, “You have to make words less human and more a piece of the machine.” Indeed, we have begun to outsource our intellectual work to companies that suggest what we should learn, the topics we should consider, and the items we ought to buy. These companies can justify their incursions into our lives with the very arguments that Saint-Simon and Comte articulated: they are supplying us with efficiency; they are imposing order on human life.

    Nobody better articulates the modern faith in engineering’s power to transform society than Zuckerberg. He told a group of software developers, “You know, I’m an engineer, and I think a key part of the engineering mindset is this hope and this belief that you can take any system that’s out there and make it much, much better than it is today. Anything, whether it’s hardware or software, a company, a developer ecosystem – you can take anything and make it much, much better.” The world will improve, if only Zuckerberg’s reason can prevail – and it will.

    The precise source of Facebook’s power is algorithms. That’s a concept repeated dutifully in nearly every story about the tech giants, yet it remains fuzzy at best to users of those sites. From the moment of the algorithm’s invention, it was possible to see its power, its revolutionary potential. The algorithm was developed in order to automate thinking, to remove difficult decisions from the hands of humans, to settle contentious debates.

    The essence of the algorithm is entirely uncomplicated. The textbooks compare algorithms to recipes – a series of precise steps that can be followed mindlessly. This is different from equations, which have one correct result. Algorithms merely capture the process for solving a problem and say nothing about where those steps ultimately lead.

    These recipes are the crucial building blocks of software. Programmers can’t simply order a computer to, say, search the internet. They must give the computer a set of specific instructions for accomplishing that task. These instructions must take the messy human activity of looking for information and transpose that into an orderly process that can be expressed in code. First do this … then do that. The process of translation, from concept to procedure to code, is inherently reductive. Complex processes must be subdivided into a series of binary choices. There’s no equation to suggest a dress to wear, but an algorithm could easily be written for that – it will work its way through a series of either/or questions (morning or night, winter or summer, sun or rain), with each choice pushing to the next.
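    As a rough illustration of that either/or chain, here is a hypothetical outfit-suggestion routine. The questions and the suggestions are invented for the example; they come from neither the article nor any real product.

    # A "suggest a dress" recipe built purely from either/or questions,
    # mirroring the decision chain described above. All branches are invented.
    def suggest_outfit(time_of_day: str, season: str, weather: str) -> str:
        if time_of_day == "night":
            if season == "winter":
                return "long-sleeved evening dress"
            return "light cocktail dress"
        # daytime branches
        if weather == "rain":
            return "casual dress with a raincoat"
        if season == "summer":
            return "sun dress"
        return "knit day dress"

    print(suggest_outfit("morning", "summer", "sun"))  # -> sun dress
    print(suggest_outfit("night", "winter", "rain"))   # -> long-sleeved evening dress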

    Facebook would never put it this way, but algorithms are meant to erode free will, to relieve humans of the burden of choosing, to nudge them in the right direction. Algorithms fuel a sense of omnipotence, the condescending belief that our behaviour can be altered, without our even being aware of the hand guiding us, in a superior direction. That’s always been a danger of the engineering mindset, as it moves beyond its roots in building inanimate stuff and begins to design a more perfect social world. We are the screws and rivets in the grand design.
    https://www.theguardian.com/technolog...r-on-free-will?CMP=Share_iOSApp_Other
    Voting 0
  10. Prediction of human physical traits and demographic information from genomic data challenges privacy and data deidentification in personalized medicine. To explore the current capabilities of phenotype-based genomic identification, we applied whole-genome sequencing, detailed phenotyping, and statistical modeling to predict biometric traits in a cohort of 1,061 participants of diverse ancestry. Individually, for a large fraction of the traits, their predictive accuracy beyond ancestry and demographic information is limited. However, we have developed a maximum entropy algorithm that integrates multiple predictions to determine which genomic samples and phenotype measurements originate from the same person. Using this algorithm, we have reidentified an average of >8 of 10 held-out individuals in an ethnically mixed cohort and an average of 5 of either 10 African Americans or 10 Europeans. This work challenges current conceptions of personal privacy and may have far-reaching ethical and legal implications.
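    As a loose illustration of the matching step the abstract describes, the sketch below pairs genome-derived trait predictions with measured phenotype records by minimum-cost assignment. The trait vectors are invented and the method is a plain nearest-match assignment (using NumPy and SciPy), not the paper's maximum entropy algorithm.

    # Pair each predicted trait vector with the measured phenotype record it most
    # plausibly came from, by minimising total Euclidean distance. Toy data only.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Rows: (height_cm, BMI, age) predicted from genomes / measured on participants.
    predicted = np.array([[170.0, 24.0, 35.0],
                          [155.0, 30.0, 62.0],
                          [183.0, 21.0, 28.0]])
    measured = np.array([[184.0, 22.0, 27.0],
                         [171.0, 23.5, 36.0],
                         [154.0, 29.0, 60.0]])

    cost = np.linalg.norm(predicted[:, None, :] - measured[None, :, :], axis=2)
    genome_idx, phenotype_idx = linear_sum_assignment(cost)
    print([(int(g), int(p)) for g, p in zip(genome_idx, phenotype_idx)])  # -> [(0, 1), (1, 2), (2, 0)]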
    http://www.pnas.org/content/early/2017/08/29/1711125114.full.pdf
    Voting 0


About - Propulsed by SemanticScuttle