mfioretti: big data*

Bookmarks on this page are managed by an admin user.

330 bookmarks, sorted by date (descending).

  1. Today’s Internet and digital platforms are becoming increasingly centralised, slowing innovation and challenging their potential to revolutionise society and the economy in a pluralistic manner.

    The DECODE project will develop practical alternatives, through the creation, evaluation and demonstration of a distributed and open architecture for managing online access and aggregation of private information to allow a citizen-friendly and privacy-aware governance of access entitlements.

    Strong ethical and digital rights principles are at the base of DECODE’s mission, moving towards the implementation of open standards for a technical architecture resting on the use of Attribute Based Cryptography, distributed ledgers, a secure operating system and a privacy-focused smart rules language.
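
    As a rough illustration of the attribute-based idea at the core of that architecture, the toy Python sketch below grants access based on a verified attribute rather than on identity. Real attribute-based credential schemes (Idemix and IRMA are well-known examples) prove this with zero-knowledge cryptography; every name here, and the "signature" itself, is an invented stand-in.

        from dataclasses import dataclass

        # Toy model of attribute-based access: the verifier learns that an
        # attribute holds, never who the holder is. The 'signature' is a
        # stand-in string, not real cryptography.
        @dataclass(frozen=True)
        class AttributeCredential:
            attribute: str          # the single disclosed attribute
            issuer_signature: str   # placeholder for a cryptographic proof

        def issuer_sign(attribute: str) -> AttributeCredential:
            return AttributeCredential(attribute, f"sig({attribute})")

        def verify(cred: AttributeCredential, required: str) -> bool:
            # Access is granted on the attribute alone; no identity involved.
            return cred.attribute == required and cred.issuer_signature == f"sig({required})"

        cred = issuer_sign("resident:amsterdam")
        print(verify(cred, "resident:amsterdam"))  # True, with no identity revealed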
    https://decodeproject.github.io/whitepaper/#pf6
    Voting 0
  2. After Barack Obama won reelection in 2012, voter targeting and other uses of Big Data in campaigns were all the rage. The following spring, at a conference titled Data-Crunched Democracy that Turow organized with Daniel Kreiss of the University of North Carolina, I listened as Ethan Roeder, the head of data analytics for Obama 2012, railed against critics. “Politicians exist to manipulate you,” he said, “and that is not going to change, regardless of how information is used.” He continued: “OK, maybe we have a new form of manipulation, we have micro-manipulation, but what are the real concerns? What is the real problem that we see with the way information is being used? Because if it’s manipulation, that ship has long since sailed.” To Roeder, the bottom line was clear: “Campaigns do not care about privacy. All campaigns care about is winning.”

    A few of us at the conference, led by the sociologist Zeynep Tufekci, argued that because individual voter data was being weaponized with behavioral-science insights in ways that could be finely tuned and also deployed outside of public view, the potential now existed to engineer the public toward outcomes that wealthy interests would pay dearly to control. No one listened. Until last year, you could not get a major US foundation to put a penny behind efforts to monitor and unmask these new forms of hidden persuasion.

    If there’s any good news in the last week of revelations about the data firm Cambridge Analytica’s 2014 acquisition (and now-notorious 2016 use) of the profile data of 50 million Facebook members, it’s this: Millions of people are now awake to just how naked and exposed they are in the public sphere. And clearly, people care a lot more about political uses of their personal data than they do about someone trying to sell them a pair of shoes. That’s why so many people are suddenly talking about deleting their Facebook accounts.
    http://www.other-news.info/2018/03/po...eeds-to-be-restored-to-internet-users
    Voting 0
  3. Then, if the economic value of personal data is so limited, why all this fuss about this economic dwarf? The answer is that this is not an economic matter but a question of power. Not the power to make people buy specific products, which is always in doubt, but power per se: the power to organize the environment in which each of us develops her vision of the world, power over thoughts and bodies. And among the big corporations of this dwarf universe, who cares if data power creates chaos, destruction and insanity? Faced with the disaster it brings about, they will only respond by trying to grab even more power, on the pretext of correcting their misdeeds. It is from below, from us, through groups who adopt and create their own knowledge tools, that the next world can emerge. It is already there in scraps, but to see its premises one needs to get rid of dogmas.
    http://paigrain.debatpublic.net/?p=9824&lang=en
    by M. Fioretti (2018-03-29)
    Voting 0
  4. Again: so where does the scandal of these past days actually lie? The scandal lies in the evidence of a fundamental error in the conception of human interactions, the conception that Mark Zuckerberg has, by his own admission in his long-awaited post-Cambridge Analytica statement, imposed since 2007: the idea of building a “web where you are social by default”, where sharing is the norm. A principle that is structurally opposed to the protection of individual privacy, which rests on confidentiality about one’s personal data as the norm.

    Zuckerberg explains it very well in his most recent statement, correctly identifying in that philosophical and anthropological error the root of the storm he is now forced to navigate: “In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.”

    This is what led Kogan, in 2013, to obtain access to the data of millions of people. And certainly, that data has immense scientific value, and it is right that research should be able to access it, provided it is conducted in full respect of the informed consent of the users who become experimental subjects. For academic purposes only, though. And even then, the famous experiment Facebook itself conducted in 2014 on manipulating the emotions of hundreds of thousands of users, who were deliberately shown more positive or more negative content, had already demonstrated that even when no commercial ends are involved, the question is ambiguous and complex. And that no, accepting convoluted terms of service that nobody reads is not enough to claim that every user, by the very fact of having agreed to be on Facebook, has agreed to become, indiscriminately, a lab rat enrolled in experiments of which they know nothing.

    And yet the platform itself realised, in that very same year, that things could not go on like that; that this way Facebook was losing control over which third parties had access to its users’ data. So the policy changed, and from then on “friends” had to consent to the processing of their data by an app. The new philosophy, Albright recalls, is “people first”. But it was too late. And the inability to truly regain possession of that mass of information, demonstrated by the Cambridge Analytica case (can it really be that Facebook had to learn from the newspapers that the company had not deleted the data it claimed to have deleted, and that it must now conduct a serious audit to verify this, showing it has no idea whether the data is gone or not?), makes it clear that the problem goes well beyond this single case: it is systemic.

    To put it more plainly: as Albright writes, the first version of the Facebook Graph API, v1.0 (that is, what application developers could obtain from the social network between 2010, when it launched, and 2014, when the policy changed) made it possible to obtain the following data not from those who signed up for a given app, but from their unaware friends: “about, actions, activities, birthday, check-ins, education, events, games, groups, hometown, interests, likes, location, notes, status, tags, photos, questions, relationships, religion/politics, subscriptions, sites, work history”. Could anyone really have believed it was possible to control where all this data ended up, for millions upon millions of people?
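
    To make concrete what this meant for a developer, here is a minimal Python sketch of the kind of request a third-party app could make in the Graph API v1.0 era. The endpoint shape follows the public Graph API conventions of the time, but the field list and token are illustrative, not a reproduction of Kogan’s app.

        import requests

        # Sketch of a Graph API v1.0-era request (2010-2014). With a single
        # user's access token, an app could read profile fields of that
        # user's friends, given the corresponding friends_* permissions.
        ACCESS_TOKEN = "token-of-one-consenting-user"  # illustrative value

        resp = requests.get(
            "https://graph.facebook.com/v1.0/me/friends",
            params={
                # Illustrative subset of the categories Albright lists.
                "fields": "birthday,education,events,interests,likes,location,"
                          "relationship_status,religion,work",
                "access_token": ACCESS_TOKEN,
            },
        )

        # One consenting user could thus expose data on hundreds of
        # unaware friends.
        for friend in resp.json().get("data", []):
            print(friend)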

    And is Facebook really discovering this only today? In 2011, the US Federal Trade Commission had already flagged the issue as problematic. It taught them nothing.
    https://www.valigiablu.it/facebook-cambridge-analytica-scandalo
    Voting 0
  5. Stratumseind in Eindhoven is one of the busiest nightlife streets in the Netherlands. On a Saturday night, bars are packed, music blares through the street, and laughter and drunken shouting bounce off the walls. As the night progresses, the ground becomes littered with empty shot bottles, energy drink cans, cigarette butts and broken glass.

    It’s no surprise that the place is also known for its frequent fights. To change that image, Stratumseind has become one of the “smartest” streets in the Netherlands. Lamp-posts have been fitted with wifi-trackers, cameras and 64 microphones that can detect aggressive behaviour and alert police officers to altercations. There has been a failed experiment to change light intensity to alter the mood. The next plan, starting this spring, is to diffuse the smell of oranges to calm people down. The aim? To make Stratumseind a safer place.

    All the while, data is being collected and stored. “Visitors do not realise they are entering a living laboratory,” says Maša Galic, a researcher on privacy in the public space for the Tilburg Institute of Law, Technology and Society. Since the data on Stratumseind is used to profile, nudge or actively target people, this “smart city” experiment is subject to privacy law. According to the Dutch Personal Data Protection Act, people should be notified in advance of data collection and the purpose should be specified – but in Stratumseind, as in many other “smart cities”, this is not the case.

    Peter van de Crommert is involved at Stratumseind as project manager with the Dutch Institute for Technology, Safety and Security. He says visitors do not have to worry about their privacy: the data is about crowds, not individuals. “We often get that comment – ‘Big brother is watching you’ – but I prefer to say, ‘Big brother is helping you’. We want safe nightlife, but not a soldier on every street corner.”
    Revellers in Eindhoven’s Stratumseind celebrate King’s Day. Photograph: Filippo Manaresi/Moment Editorial/Getty Images

    When we think of smart cities, we usually think of big projects: Songdo in South Korea, the IBM control centre in Rio de Janeiro or the hundreds of new smart cities in India. More recent developments include Toronto, where Google will build an entirely new smart neighbourhood, and Arizona, where Bill Gates plans to build his own smart city. But the reality of the smart city is that it has stretched into the everyday fabric of urban life – particularly so in the Netherlands.

    In the eastern city of Enschede, city traffic sensors pick up your phone’s wifi signal even if you are not connected to the wifi network. The trackers register your MAC address, the unique network card number in a smartphone. The city council wants to know how often people visit Enschede, and what their routes and preferred spots are. Dave Borghuis, an Enschede resident, was not impressed and filed an official complaint. “I don’t think it’s okay for the municipality to track its citizens in this way,” he said. “If you walk around the city, you have to be able to imagine yourself unwatched.”
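
    For readers wondering how such tracking works: a phone with wifi enabled periodically broadcasts “probe request” frames carrying its MAC address in the clear, so a passive listener near the street can log every passing device. Below is a minimal sketch in Python with scapy; the interface name is a placeholder, and the salted hashing shown is a common mitigation, not something the article attributes to Enschede.

        import hashlib

        # Passive wifi tracking sketch (Python + scapy). Requires a wireless
        # card in monitor mode; 'wlan0mon' is a placeholder interface name.
        from scapy.all import sniff, Dot11ProbeReq

        SALT = b"rotate-this-salt-daily"  # assumption: hashing is a mitigation, not Enschede's practice

        def on_probe(pkt):
            if pkt.haslayer(Dot11ProbeReq):
                mac = pkt.addr2  # sender MAC address, broadcast in the clear
                # Storing only a salted hash would let a city count visits
                # without keeping a directly re-identifiable device list.
                pseudonym = hashlib.sha256(SALT + mac.encode()).hexdigest()[:16]
                print(pseudonym)

        sniff(iface="wlan0mon", prn=on_probe, store=False)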

    Enschede is enthusiastic about the advantages of the smart city. The municipality says it is saving €36m in infrastructure investments by launching a smart traffic app that rewards people for good behaviour like cycling, walking and using public transport. (Ironically, one of the rewards is a free day of private parking.) Only those who mine the small print will discover that the app creates “personal mobility profiles”, and that the collected personal data belongs to the company Mobidot.
    https://www.theguardian.com/cities/20...-privacy-eindhoven-utrecht?CMP=twt_gu
    Voting 0
  6. Finally, there’s what the authors call “political security” – using AI to automate tasks involved in surveillance, persuasion (creating targeted propaganda) and deception (eg, manipulating videos). We can also expect new kinds of attack based on machine-learning’s capability to infer human behaviours, moods and beliefs from available data. This technology will obviously be welcomed by authoritarian states, but it will also further undermine the ability of democracies to sustain truthful public debates. The bots and fake Facebook accounts that currently pollute our public sphere will look awfully amateurish in a couple of years.

    The report is available as a free download and is worth reading in full. If it were about the dangers of future or speculative technologies, it might be reasonable to dismiss it as academic scare-mongering. The alarming thing is that most of the problematic capabilities its authors envisage are already available, and in many cases are embedded in the networked services we use every day. William Gibson was right: the future has already arrived.
    https://www.theguardian.com/commentis...ic-nightmare-real-threat-more-current
    Voting 0
  7. The “bad part of town” will be full of algorithms that shuffle you straight from high-school detention into the prison system. The rich part of town will get mirror-glassed limos that breeze through the smart red lights to seamlessly deliver the aristocracy from curb into penthouse.

    These aren’t the “best practices” beloved by software engineers; they’re just the standard urban practices, with software layered over. It’s urban design as the barbarian’s varnish on urbanism. People could have it otherwise, technically, if they really wanted it and had the political will, but they don’t. So they won’t get it.
    https://www.theatlantic.com/technolog.../stupid-cities/553052/?utm_source=twb
    Voting 0
  8. Here’s one example: In 2014, Maine Gov. Paul LePage released data to the public detailing over 3,000 transactions from welfare recipients using EBT cards in the state. (EBT cards are like state-issued debit cards, and can be used to disburse benefits like food stamps.)

    LePage created a list of every time this money had been used in a strip club, liquor store, or bar, and used it to push his political agenda of limiting access to state benefits. LePage’s list represents a tiny fraction of overall EBT withdrawals, but it effectively reinforced negative stereotypes and narratives about who relies on welfare benefits and why.

    I spoke with Eubanks recently about her new book, and why she believes automated technologies are being used to rig the welfare system against the people who need it the most.

    A lightly edited transcript of our conversation follows.
    Sean Illing

    What’s the thesis of your book?
    Virginia Eubanks

    There’s a collision of political forces and technical innovations that are devastating poor and working-class families in America.
    https://www.vox.com/2018/2/6/16874782/welfare-big-data-technology-poverty
    Voting 0
  9. Europe has propelled past the United States when it comes to constraining the abuses of Big Tech. In June, the European Union fined Google $2.7 billion for steering web users to its shopping site, and investigations remain active over similar treatment on Android phones. European regulators fined Facebook for lying about whether it could match user profiles with phone numbers on its messaging acquisition WhatsApp. They demanded Apple repay $15.3 billion in back taxes in Ireland. And they forced Amazon to change its e-book contracts, which they claimed inappropriately squeezed publishers.
    AP Photo/Rick Bowmer

    Trust-Busted: In 2002, Microsoft Chairman Bill Gates had to testify in federal court in his company's antitrust case. The public trial led Microsoft to soften its aggressive strategy against rivals.

    Unfortunately, these actions were treated mainly as the cost of doing business. The Facebook fine totaled not even 1 percent of the $22 billion purchase price for WhatsApp, and it allowed the two companies to remain partnered. Government policy, in effect, has “told these companies that the smart thing to do is to lie to us and break the law,” said Scott Galloway in his presentation. Google’s remedy in the shopping case still forces rivals to bid for placement at the top of the page, with Google Shopping spun off as a stand-alone competitor. This does weaken Google’s power and solves the “equal treatment” problem, but it doesn’t protect consumers, who will ultimately pay for those costly bids. “The EU got a $2.7 billion fine to hold a party and bail out Greek banks,” said Gary Reback, an antitrust lawyer and critic of the EU’s actions. “No amount of money will make a difference.”

    However, one thing might: Europe’s increasing move toward data privacy. The General Data Protection Regulation (GDPR), scheduled for implementation in May 2018, empowers European web users to affirmatively opt out of having their data collected, with high penalties for non-compliance. Consumers will be able to obtain their personal data and learn how it is used. They can request that their data be erased completely (known as the “right to be forgotten”) as well as prohibited from sale to third parties. Platforms may not condition use of their products on data collection. A separate, not-yet-finalized regulation called ePrivacy would forbid platforms from tracking users across separate apps, websites, and devices.
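
    To see what these obligations mean in engineering terms, here is a minimal sketch of the data-export and erasure endpoints a platform might expose under the GDPR. All names are hypothetical; real compliance also requires verifying the requester’s identity, purging backups, and notifying downstream processors.

        from flask import Flask, jsonify

        app = Flask(__name__)

        # Hypothetical in-memory stores standing in for a real user database
        # and a log of third parties each user's data was shared with.
        USERS = {"alice": {"email": "alice@example.org", "ad_profile": ["shoes", "travel"]}}
        SHARED_WITH = {"alice": ["ad-broker.example"]}

        @app.route("/gdpr/export/<user_id>")
        def export_data(user_id):
            # Right of access: a copy of the data and how it is used/shared.
            return jsonify(data=USERS.get(user_id),
                           shared_with=SHARED_WITH.get(user_id, []))

        @app.route("/gdpr/erase/<user_id>", methods=["POST"])
        def erase(user_id):
            # 'Right to be forgotten': delete the record and report which
            # downstream processors must also be told to erase their copies.
            USERS.pop(user_id, None)
            downstream = SHARED_WITH.pop(user_id, [])
            return jsonify(erased=True, notify=downstream)

        if __name__ == "__main__":
            app.run()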
    http://prospect.org/article/big-tech-new-predatory-capitalism
    Voting 0
  10. Some entries are ambiguous. Take Microsoft, under the “operational services” category. PayPal apparently supplies the tech company with an image of a customer (a photo or video) or their image from an identity document for the purposes of “facial image comparison for fraud protection” and “research and testing as to appropriateness of new products.” The former sounds like some kind of facial recognition system that PayPal uses to look for fraud. But the latter is uneasily broad. What kind of research is Microsoft doing using pictures of PayPal users’ faces? PayPal did not comment on this specific question.
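
    For a sense of what “facial image comparison” typically involves, here is a minimal sketch using the open-source face_recognition library. The library choice, file names and threshold are assumptions for illustration; the article does not say what PayPal or Microsoft actually use.

        import face_recognition  # open-source library, chosen for illustration

        # Compare the face on an identity document with a customer photo,
        # roughly what a fraud-protection comparison does. File names are
        # placeholders.
        id_image = face_recognition.load_image_file("id_document.jpg")
        customer = face_recognition.load_image_file("customer_photo.jpg")

        # Assumes exactly one face is found in each image.
        id_enc = face_recognition.face_encodings(id_image)[0]        # 128-d embedding
        customer_enc = face_recognition.face_encodings(customer)[0]

        # Euclidean distance between embeddings; 0.6 is the library's
        # conventional match threshold, used here purely as an example.
        distance = face_recognition.face_distance([id_enc], customer_enc)[0]
        print("likely same person" if distance < 0.6 else "likely different people")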
    https://www.fastcodesign.com/90157501...ource=twitter.com&utm_campaign=buffer
    Voting 0
