mfioretti: algorithms* + privacy*

Bookmarks on this page are managed by an admin user.

13 bookmark(s)

  1. Stratumseind in Eindhoven is one of the busiest nightlife streets in the Netherlands. On a Saturday night, bars are packed, music blares through the street, laughter and drunken shouting bounce off the walls. As the night progresses, the ground becomes littered with empty shot bottles, energy drink cans, cigarette butts and broken glass.

    It’s no surprise that the place is also known for its frequent fights. To change that image, Stratumseind has become one of the “smartest” streets in the Netherlands. Lamp-posts have been fitted with wifi-trackers, cameras and 64 microphones that can detect aggressive behaviour and alert police officers to altercations. There has been a failed experiment to change light intensity to alter the mood. The next plan, starting this spring, is to diffuse the smell of oranges to calm people down. The aim? To make Stratumseind a safer place.

    All the while, data is being collected and stored. “Visitors do not realise they are entering a living laboratory,” says Maša Galic, a researcher on privacy in the public space for the Tilburg Institute of Law, Technology and Society. Since the data on Stratumseind is used to profile, nudge or actively target people, this “smart city” experiment is subject to privacy law. According to the Dutch Personal Data Protection Act, people should be notified in advance of data collection and the purpose should be specified – but in Stratumseind, as in many other “smart cities”, this is not the case.

    Peter van de Crommert is involved at Stratumseind as project manager with the Dutch Institute for Technology, Safety and Security. He says visitors do not have to worry about their privacy: the data is about crowds, not individuals. “We often get that comment – ‘Big brother is watching you’ – but I prefer to say, ‘Big brother is helping you’. We want safe nightlife, but not a soldier on every street corner.”
    Revellers in Eindhoven’s Stratumseind celebrate King’s Day. Photograph: Filippo Manaresi/Moment Editorial/Getty Images

    When we think of smart cities, we usually think of big projects: Songdo in South Korea, the IBM control centre in Rio de Janeiro or the hundreds of new smart cities in India. More recent developments include Toronto, where Google will build an entirely new smart neighbourhood, and Arizona, where Bill Gates plans to build his own smart city. But the reality of the smart city is that it has stretched into the everyday fabric of urban life – particularly so in the Netherlands.

    In the eastern city of Enschede, city traffic sensors pick up your phone’s wifi signal even if you are not connected to the wifi network. The trackers register your MAC address, the unique network card number in a smartphone. The city council wants to know how often people visit Enschede, and what their routes and preferred spots are. Dave Borghuis, an Enschede resident, was not impressed and filed an official complaint. “I don’t think it’s okay for the municipality to track its citizens in this way,” he said. “If you walk around the city, you have to be able to imagine yourself unwatched.”

    Enschede is enthusiastic about the advantages of the smart city. The municipality says it is saving €36m in infrastructure investments by launching a smart traffic app that rewards people for good behaviour like cycling, walking and using public transport. (Ironically, one of the rewards is a free day of private parking.) Only those who mine the small print will discover that the app creates “personal mobility profiles”, and that the collected personal data belongs to the company Mobidot.
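
    A minimal illustrative sketch, not Enschede’s or Mobidot’s actual code: hypothetical probe-request data showing how captured MAC addresses can be turned into visit counts. Even if the address is hashed, repeat visits by the same device stay linkable, which is why "crowd" counting of this kind can still become a personal mobility profile.

        # Hypothetical sketch: turning wifi probe-request logs into visit statistics.
        import hashlib
        from collections import defaultdict

        probe_log = [  # (timestamp, MAC address) pairs a street sensor might capture
            ("2018-03-03T22:01", "a4:5e:60:d1:22:9f"),
            ("2018-03-03T22:14", "a4:5e:60:d1:22:9f"),
            ("2018-03-03T22:15", "3c:2e:ff:01:aa:10"),
        ]

        def pseudonym(mac: str, salt: str = "daily-salt") -> str:
            # Hash the MAC so repeat visits are linkable without storing the raw address.
            return hashlib.sha256((salt + mac).encode()).hexdigest()[:12]

        visits = defaultdict(list)
        for ts, mac in probe_log:
            visits[pseudonym(mac)].append(ts)

        print(len(visits), "distinct devices seen")      # a crowd-level figure
        print({k: len(v) for k, v in visits.items()})    # per-device counts: already a profile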
    https://www.theguardian.com/cities/20...-privacy-eindhoven-utrecht?CMP=twt_gu
    Voting 0
  2. IoT will be able to take stock of your choices, moods, preferences and tastes, the same way Google Search does. With enough spreadsheets, many practical questions are rendered trivial. How hard will it be for the IoT — maybe through Alexa, maybe through your phone — to statistically study why, where and when you raise your voice at your child? If you can correlate people’s habits and physical attributes, it will be toddler-easy to correlate mood to environment. The digitally connected devices of tomorrow would be poor consumer products if they did not learn you well. Being a good and faithful servant means monitoring the master closely, and that is what IoT devices will do. They will analyze your feedback and automate their responses — and predict your needs. In the IoT, Big Data is weaponized, and can peer deeper into the seeds of your life than the government has ever dreamed.
    https://www.salon.com/2018/02/19/why-...signed-for-corporations-not-consumers
    Voting 0
  3. "All of us, when we are uploading something, when we are tagging people, when we are commenting, we are basically working for Facebook," he says.

    The data our interactions provide feeds the complex algorithms that power the social media site, where, as Mr Joler puts it, our behaviour is transformed into a product.

    Trying to untangle that largely hidden process proved to be a mammoth task.

    "We tried to map all the inputs, the fields in which we interact with Facebook, and the outcome," he says.

    "We mapped likes, shares, search, update status, adding photos, friends, names, everything our devices are saying about us, all the permissions we are giving to Facebook via apps, such as phone status, wifi connection and the ability to record audio."

    All of this research provided only a fraction of the full picture. So the team looked into Facebook's acquisitions, and scoured its myriad patent filings.

    The results were astonishing.

    The visually arresting flow charts take hours to absorb fully, but they show how the data we give Facebook is used to calculate our ethnic affinity (Facebook’s term), sexual orientation, political affiliation, social class, travel schedule and much more.
    Image caption: Share Lab presents its information in minutely detailed tables and flow charts (image copyright Share Lab)

    One map shows how everything - from the links we post on Facebook, to the pages we like, to our online behaviour in many other corners of cyber-space that are owned or interact with the company (Instagram, WhatsApp or sites that merely use your Facebook log-in) - could all be entering a giant algorithmic process.

    And that process allows Facebook to target users with terrifying accuracy, with the ability to determine whether they like Korean food, the length of their commute to work, or their baby's age.

    Another map details the permissions many of us willingly give Facebook via its many smartphone apps, including the ability to read all text messages, download files without permission, and access our precise location.

    Individually, these are powerful tools; combined they amount to a data collection engine that, Mr Joler argues, is ripe for exploitation.

    "If you think just about cookies, just about mobile phone permissions, or just about the retention of metadata - each of those things, from the perspective of data analysis, are really intrusive."
    http://www.bbc.com/news/business-39947942
    Voting 0
  4. Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

    What’s needed, he argues, is some global superstructure to advance humanity.

    This is not an especially controversial idea; Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook — and, considering that he commands absolute voting control of the company, he is positioning himself — as a critical enabler of the next generation of human society. A minor problem with his mission is that it drips with megalomania, albeit of a particularly sincere sort. With his wife, Priscilla Chan, Zuckerberg has pledged to give away nearly all of his wealth to a variety of charitable causes, including a long-term medical-research project to cure all disease. His desire to take on global social problems through digital connectivity, and specifically through Facebook, feels like part of the same impulse.

    Yet Zuckerberg is often blasé about the messiness of the transition between the world we’re in and the one he wants to create through software. Building new “social infrastructure” usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next.
    https://www.nytimes.com/2017/04/25/ma...n-facebook-fix-its-own-worst-bug.html
    Voting 0
  5. I don’t want to drag this out, but back then, as now, I had no control whatsoever over the personal data and information that was voluntarily or forcibly captured with my every movement; what somehow saved me during a troubled adolescence (though not always, in truth) was control over the social situation and the context.

    I had no control over data and information with the village butcher, and I cannot expect to have it today on the web with Google, Facebook and, above all, the thousand state agencies afflicted, for varied and sometimes commendable reasons, by informational bulimia. But back then I knew and, in some way, governed the basic technical rules (the village streets, the bus timetables) and the social rules of proximity of my own territory.

    Today I no longer can. And it is not only because of the quantity of data captured and stored at every step, but because of the total opacity of the context and of the technical and social rules that govern our digital lives.

    Unknown algorithms, unfathomable even to their own creators, reconstruct our image, generate scores and judge relevance and suitability entirely without our knowledge. Banks, insurers, companies of every stripe and shape (soon the internet of things will astonish us) and above all the State, with its thousand verification and control agencies, access every piece of information while stripping it of its context, creating relations and correlations we are unaware of, but whose consequences we suffer every day.

    We cannot prevent all of this; big data and open data will save the world, fine. But we can and must demand to know the who, the how and the when. We need to know what the context is and what the rules are; only then will we find strategies, not to commit crimes or evade the law (as part of the judiciary claims), but to exercise the fundamental rights of the person.

    In the physical world we know when the State has the right to enter our home, or under what conditions it may limit our personal freedoms of movement and expression; in the digital world we do not know, and do not even ask ourselves, who may take possession of our data, of our devices through hidden software, of our lives, when and under what conditions. We passively accept an intolerable opacity.

    I have had something to hide for as long as I can remember: degrees of confidentiality that vary with the interlocutor, the time, the place and the context. And I do not want, for myself or my children, a society stupidly disciplined by constant surveillance and lobotomized by algorithms. I would like a society in which the information asymmetry is the exact opposite of the current one, where unfortunately the citizen is totally transparent while the State and its rules are opaque and uncertain.
    Carlo Blengino

    A criminal defence lawyer, he deals in the courtroom with the law of new technologies and with copyright and data protection issues. He is a fellow of the NEXA Center for Internet & Society at the Politecnico di Torino. @CBlengio on Twitter
    http://www.ilpost.it/carloblengino/2016/11/02/ho-qualcosa-da-nascondere
    Voting 0
  6. Quit fracking our lives to extract data that’s none of your business and that your machines misinterpret. — New Clues, #58

    That’s the blunt advice David Weinberger and I give to marketers who still make it hard to talk, sixteen years after many of them started failing to get what we meant by Markets are Conversations.

    In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

    Even if our own intelligence is not yet artificialized, what’s feeding it surely is.

    In The Filter Bubble, after explaining Google’s and Facebook’s very different approaches to personalized “experience” filtration, and the assumptions behind both, Eli Pariser says both companies’ approximations are based on “a bad theory of you,” and come up with “pretty poor representations of who we are, in part because there is no one set of data that describes who we are.” He says the ideal of perfect personalization dumps us into what animators, puppetry and robotics engineers call the uncanny valley: a “place where something is lifelike but not convincingly alive, and it gives people the creeps.”

    Sanity requires that we line up many different personalities behind a single first person pronoun: I, me, mine. And also behind multiple identifiers. In my own case, I am Doc to most of those who know me, David to various government agencies (and most of the entities that bill me for stuff), Dave to many (but not all) family members, @dsearls to Twitter, and no name at all to the rest of the world, wherein I remain, like most of us, anonymous (literally, nameless), because that too is a civic grace. (And if you doubt that, ask any person who has lost their anonymity through the Faustian bargain called celebrity.)

    Third, advertising needs to return to what it does best: straightforward brand messaging that is targeted at populations, and doesn’t get personal. For help with that, start reading
    https://medium.com/@dsearls/on-market...bad-guesswork-88a84de937b0#.deu5ue16x
    Voting 0
  7. Aadhaar reflects and reproduces power imbalances and inequalities. Information asymmetries result in the data subject becoming a data object, to be manipulated, misrepresented and policed at will.

    Snowden: “Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”

    Snowden’s demolition of the argument doesn’t mean our work here is done. There are many other tropes that my (now renamed) Society for the Rejection of Culturally Relativist Excuses could tackle. Those that insist Indians are not private. That privacy is a western liberal construct that has no place whatsoever in Indian culture. That acknowledging privacy interests will stall development. This makes it particularly hard to advance claims of privacy, autonomy and liberty in the context of large e-governance and identity projects like Aadhaar: they earn one the labels of elitist, anti-progress, Luddite, paranoid and, my personal favourite, privacy fascist.
    http://scroll.in/article/748043/aadha...n-its-the-only-way-to-secure-equality
    Voting 0
  8. Although evidence-based algorithms consistently outperform human forecasters, people consistently fail to use them, especially after learning that they are imperfect. In this paper, we investigate how algorithm aversion might be overcome. In incentivized forecasting tasks, we find that people are considerably more likely to choose to use an algorithm, and thus perform better, when they can modify its forecasts. Importantly, this is true even when they are severely restricted in the modifications they can make. In fact, people’s decision to use an algorithm is insensitive to the magnitude of the modifications they are able to make. Additionally, we find that giving people the freedom to modify an algorithm makes people feel more satisfied with the forecasting process, more tolerant of errors, more likely to believe that the algorithm is superior, and more likely to choose to use an algorithm to make subsequent forecasts. This research suggests that one may be able to overcome algorithm aversion by giving people just a slight amount of control over the algorithm’s forecasts.
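
    A minimal sketch of the idea the abstract describes (hypothetical numbers, not the authors’ materials): the forecaster sees the algorithm’s output and may adjust it, but the adjustment is clamped to a narrow band, so the forecast actually used stays close to the model’s.

        # Hypothetical illustration: bounded human adjustment of an algorithmic forecast.
        def adjustable_forecast(model_forecast: float, human_adjustment: float,
                                max_shift: float = 5.0) -> float:
            # Clamp the human tweak so the final forecast never strays far from the model.
            shift = max(-max_shift, min(max_shift, human_adjustment))
            return model_forecast + shift

        print(adjustable_forecast(72.0, 3.0))    # 75.0 - small tweak accepted
        print(adjustable_forecast(72.0, -20.0))  # 67.0 - large tweak clamped to -5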
    http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2616787
    Voting 0
  9. A May 2014 White House report on “big data” notes that the ability to determine the demographic traits of individuals through algorithms and aggregation of online data has a potential downside beyond just privacy concerns: Systematic discrimination.

    There is a long history of denying access to bank credit and other financial services based on the communities from which applicants come — a practice called “redlining.” Likewise, the report warns, “Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants or recipients of credit.” (See materials from the report’s related research conference for scholars’ views on this and other issues.)

    One vexing problem, according to the report, is that potential digital discrimination is even less likely to be pinpointed, and therefore remedied.

    “Approached without care, data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision-makers, or simply reflect the widespread biases that persist in society. It can even have the perverse result of exacerbating existing inequalities by suggesting that historically disadvantaged groups actually deserve less favorable treatment.” The paper’s authors argue that the most likely legal basis for anti-discrimination enforcement, Title VII, is not currently adequate to stop many forms of discriminatory data mining, and “society does not have a ready answer for what to do about it.”

    A 2014 paper by Benjamin Edelman and Michael Luca, “Digital Discrimination: The Case of Airbnb.com”, examined listings from thousands of New York City landlords in mid-2012. Airbnb builds up a reputation system by allowing ratings from guests and hosts.

    The study’s findings include:

    “The raw data show that non-black and black hosts receive strikingly different rents: roughly $144 versus $107 per night, on average.” However, the researchers had to control for a variety of factors that might skew an accurate comparison, such as differences in geographical location.
    “Controlling for all of these factors, non-black hosts earn roughly 12% more for a similar apartment with similar ratings and photos relative to black hosts.”
    “Despite the potential of the Internet to reduce discrimination, our results suggest that social platforms such as Airbnb may have the opposite effect. Full of salient pictures and social profiles, these platforms make it easy to discriminate — as evidenced by the significant penalty faced by a black host trying to conduct business on Airbnb.”

    “Given Airbnb’s careful consideration of what information is available to guests and hosts,” Edelman and Luca note, “Airbnb might consider eliminating or reducing the prominence of host photos: It is not immediately obvious what beneficial information these photos provide, while they risk facilitating discrimination by guests. Particularly when a guest will be renting an entire property, the guest’s interaction with the host will be quite limited, and we see no real need for Airbnb to highlight the host’s picture.” (For its part, Airbnb responded to the study by saying that it prohibits discrimination in its terms of service, and that the data analyzed were both older and limited geographically.)
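
    A minimal sketch, on synthetic data, of the kind of adjusted comparison the excerpt describes: regress log nightly price on a host-race indicator plus listing controls, so the estimated coefficient plays the role of the study’s “roughly 12%” gap. The variables and numbers below are invented for illustration, not the paper’s data.

        # Synthetic-data sketch of a price regression with controls (not the study's data).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1000
        black_host = rng.integers(0, 2, n)          # indicator of interest
        bedrooms = rng.integers(1, 4, n)            # example controls
        rating = rng.uniform(3.0, 5.0, n)
        log_price = (4.3 + 0.25 * bedrooms + 0.10 * rating
                     - 0.12 * black_host + rng.normal(0, 0.3, n))  # built-in ~12% gap

        X = np.column_stack([np.ones(n), black_host, bedrooms, rating])
        beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
        print(f"adjusted gap for black hosts: {beta[1]:.3f} log points (about {100 * beta[1]:.0f}%)")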
    http://journalistsresource.org/studie...racial-discrimination-research-airbnb
    Voting 0
  10. There seems to be something wrong with personalization. We are continuously bumping into obtrusive, uninteresting ads. Our digital personal assistant isn’t all that personal. We’ve lost friends to the algorithmic abyss of the News Feed. The content we encounter online seems to repeat the same things again and again. There are five main reasons why personalization remains broken.

    Additionally, there lies a more general paradox at the very heart of personalization.

    Personalization promises to modify your digital experience based on your personal interests and preferences. Simultaneously, personalization is used to shape you, to influence you and to guide your everyday choices and actions. Inaccessible and incomprehensible algorithms make autonomous decisions on your behalf. They reduce the number of visible choices, thus restricting your personal agency.

    Because of the personalization gaps and internal paradox, personalization remains unfulfilling and incomplete. It leaves us with a feeling that it serves someone else’s interests better than our own.
    http://techcrunch.com/2015/06/25/the-...n=Feed%3A+Techcrunch+%28TechCrunch%29
    Voting 0

Page 1 of 2 - Online Bookmarks of M. Fioretti: Tags: algorithms + privacy

About - Propulsed by SemanticScuttle