mfioretti: algorithms* + big data*

Bookmarks on this page are managed by an admin user.

16 bookmark(s)

  1. no serious scholar of modern geopolitics disputes that we are now at war — a new kind of information-based war, but war, nevertheless — with Russia in particular, but in all honesty, with a multitude of nation states and stateless actors bent on destroying western democratic capitalism. They are using our most sophisticated and complex technology platforms to wage this war — and so far, we’re losing. Badly.

    Why? According to sources I’ve talked to both at the big tech companies and in government, each side feels the other is ignorant, arrogant, misguided, and incapable of understanding the other side’s point of view. There’s almost no data sharing, trust, or cooperation between them. We’re stuck in an old model of lobbying, soft power, and the occasional confrontational hearing.

    Not exactly the kind of public-private partnership we need to win a war, much less a peace.

    Am I arguing that the government should take over Google, Amazon, Facebook, and Apple so as to beat back Russian info-ops? No, of course not. But our current response to Russian aggression illustrates the lack of partnership and coordination between government and our most valuable private sector companies. And I am hoping to raise an alarm: when the private sector has markedly better information, processing power, and personnel than the public sector, the former will only strengthen, while the latter will weaken. We’re seeing it play out in our current politics, and if you believe in the American idea, you should be extremely concerned.
    https://shift.newco.co/data-power-and-war-465933dcb372
    Voting 0
  2. Similarly, GOOG in 2014 started reorganizing itself to focus solely on artificial intelligence. In January 2014, GOOG bought DeepMind, and in September it shut down Orkut (one of its few social products, which had momentary success in some countries) forever. The Alphabet Inc restructuring was announced in August 2015, but it likely took many months of meetings and bureaucracy. The restructuring was important to focus the web-oriented departments at GOOG on a simple mission. GOOG sees no future in the simple Search market, and has announced that it is migrating “From Search to Suggest” (in Eric Schmidt’s own words) and becoming an “AI first company” (in Sundar Pichai’s own words). GOOG is currently slightly behind FB in terms of how fast it is growing its dominance of the web, but due to its technical expertise, vast budget, influence and vision, in the long run its AI assets will play a massive role on the internet. They know what they are doing.

    These are no longer the same companies as 4 years ago. GOOG is no longer an internet company; it’s the knowledge internet company. FB is no longer an internet company; it’s the social internet company. They used to attempt to compete, and this competition kept the internet market diverse. Today, however, they seem mostly satisfied with their orthogonal dominance of parts of the Web, and we are losing diversity of choices. Which leads us to another part of the internet: e-commerce and AMZN.

    AMZN does not focus on making profit.
    https://staltz.com/the-web-began-dying-in-2014-heres-how.html
    Voting 0
  3. Noticed? “And insurers.”

    What the Italian newspapers in March 2016 are not saying is that IBM’s landing in the Expo area – a black hole for which, for a couple of years now, a future has been hard to find – is conditional on handing over to IBM the health data of the inhabitants of Lombardy, one of the richest regions in Europe. These are the so-called “Protected Health Information”, which include “healthcare data”, “personal medical records”, and “named or anonymized tax information”. The agreement grants the American company the “rights of use for the storage and processing of such data for project purposes, as well as for the use of the anonymized data for purposes beyond those of the project”. In short: the Big Brother of Healthcare is on its way, with all of our health data at its disposal.

    In the “confidential” IBM document that Il Fatto Quotidiano was able to see, one reads: “As a precondition for carrying out the Program and making the investment, IBM (including parent, subsidiary, affiliated or associated companies, where necessary) expects to have access – in ways to be defined – to the processing of the health data of the roughly 61 million Italian citizens (meaning past, present and future health data) in anonymous and identified form, for specific project areas, including the right to the secondary use of said health data for purposes beyond the projects.”

    Italy’s entire public healthcare system, handed over into the expert hands of an American multinational. “By way of example but not exhaustively,” the confidential document continues, “it is considered crucial to have access to patient data, pharmacological data, cancer-registry data, genomic data, treatment data, regional or Agenas data, Aifa data on drugs and on active clinical trials, enrollment and demographic data, historical medical diagnoses, reimbursements and usage costs, medical conditions and procedures, outpatient prescriptions, drug treatments with their costs, emergency-room visits, hospital discharge records (SDO), appointment information, schedules and attendance, and other health data.” Every breath we take, every heartbeat, every germ, every payment we make for healthcare will enter IBM’s Watson computers to feed their learning capacity and develop their artificial intelligence. The results achieved along the way, and the algorithms fine-tuned thanks to our data, will remain private. IBM will be able to sell them to healthcare companies or to insurance firms.

    Now the ball is in the court of the Lombardy Region, the first in Italy to be involved. Since the constitutional reform that would have stripped the Regions of their powers was rejected, the Region will have to give its approval. One may doubt the wisdom of effectively giving away one of the great markets of the future, healthcare, to a private company, asking nothing in return, satisfied merely by the promise that it will open a center on the troubled Expo grounds.

    The data will be “anonymized”, the document promises here and there. But systems capable of making anonymous data “reversible”, re-identifying it, are steadily improving. Who knows whether the Privacy Guarantor, so sensitive to other battles, will find time to weigh in on this project too. In any case, even anonymized, Lombardy’s health data is an extremely precious asset: why hand it to a private firm, sidelining the public health system entirely? And if it really must go to private hands, why without a tender? Why to IBM-Watson and not, say, Google-DeepMind or Amazon? And why, finally, grant it for free? The planned $150 million investment in a private center is nothing compared to the value of the immense trove of health data promised, which on the deep web today sells for 10-15 dollars per record (50 times more than credit-card numbers).
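    The re-identification risk raised here can be illustrated with a minimal sketch. All names, dates and field names below are hypothetical: the point is only that joining an “anonymized” health record to a public roster on quasi-identifiers such as birth date and postal code is often enough to restore identities.

```python
# Minimal linkage-attack sketch: "anonymized" records can often be
# re-identified by joining on quasi-identifiers (all data hypothetical).

anonymized_health = [
    {"birth_date": "1961-03-02", "zip": "20121", "diagnosis": "diabetes"},
    {"birth_date": "1974-11-15", "zip": "20154", "diagnosis": "asthma"},
]

public_roster = [  # e.g. an electoral roll or a leaked customer database
    {"name": "M. Rossi", "birth_date": "1961-03-02", "zip": "20121"},
    {"name": "L. Bianchi", "birth_date": "1974-11-15", "zip": "20154"},
]

def reidentify(health, roster):
    """Join the two datasets on the (birth_date, zip) quasi-identifier pair."""
    index = {(p["birth_date"], p["zip"]): p["name"] for p in roster}
    return [
        {"name": index[(r["birth_date"], r["zip"])], **r}
        for r in health
        if (r["birth_date"], r["zip"]) in index
    ]

for row in reidentify(anonymized_health, public_roster):
    print(row["name"], "->", row["diagnosis"])
```

    This is why simply dropping the name column is not anonymization: any sufficiently distinctive combination of remaining attributes acts as a key.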
    http://www.giannibarbacetto.it/2017/0...cambio-della-nuova-sede-sullarea-expo
    Voting 0
  4. The reason I bring this up: first of all, it’s a great way of understanding how machine learning algorithms can give us stuff we absolutely don’t want, even though they fundamentally lack prior agendas. Happens all the time, in ways similar to the Donald.

    Second, some people actually think there will soon be algorithms that control us, operating “through sound decisions of pure rationality” and that we will no longer have use for politicians at all.

    And look, I can understand why people are sick of politicians, and would love them to be replaced with rational decision-making robots. But that scenario means one of three things:

    1. Controlling robots simply get trained by the people’s will and do whatever people want at the moment. Maybe that looks like people voting with their phones or via the chips in their heads. This is akin to direct democracy, and the problems are varied – I was in Occupy, after all – but in particular it means that people are constantly weighing in on things they don’t actually understand. That leaves them vulnerable to misinformation and propaganda.

    2. Controlling robots ignore people’s will and just follow their inner agendas. Then the question becomes, who sets that agenda? And how does it change as the world and as culture changes? Imagine if we were controlled by someone from 1000 years ago with the social mores from that time. Someone’s gonna be in charge of “fixing” things.

    3. Finally, it’s possible that the controlling robot would act within a political framework, somewhat but not completely influenced by a democratic process. Something like our current president. But then getting a robot in charge would be a lot like voting for a president. Some people would agree with it, some wouldn’t. Maybe every four years we’d have another vote, and the candidates would be both people and robots; sometimes a robot would win, sometimes a person. I’m not saying it’s impossible, but it’s not utopian. There’s no such thing as pure rationality in politics; it’s much more about picking sides and appealing to some people’s desires while ignoring others’.
    https://mathbabe.org/2016/08/11/donal...e-a-biased-machine-learning-algorithm
    Voting 0
  5. Quit fracking our lives to extract data that’s none of your business and that your machines misinterpret. — New Clues, #58

    That’s the blunt advice David Weinberger and I give to marketers who still make it hard to talk, sixteen years after many of them started failing to get what we meant by Markets are Conversations.

    In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

    Even if our own intelligence is not yet artificialized, what’s feeding it surely is.

    In The Filter Bubble, after explaining Google’s and Facebook’s very different approaches to personalized “experience” filtration, and the assumptions behind both, Eli Pariser says both companies’ approximations are based on “a bad theory of you,” and come up with “pretty poor representations of who we are, in part because there is no one set of data that describes who we are.” He says the ideal of perfect personalization dumps us into what animators, puppetry and robotics engineers call the uncanny valley: a “place where something is lifelike but not convincingly alive, and it gives people the creeps.”

    Sanity requires that we line up many different personalities behind a single first person pronoun: I, me, mine. And also behind multiple identifiers. In my own case, I am Doc to most of those who know me, David to various government agencies (and most of the entities that bill me for stuff), Dave to many (but not all) family members, @dsearls to Twitter, and no name at all to the rest of the world, wherein I remain, like most of us, anonymous (literally, nameless), because that too is a civic grace. (And if you doubt that, ask any person who has lost their anonymity through the Faustian bargain called celebrity.)

    Third, advertising needs to return to what it does best: straightforward brand messaging that is targeted at populations, and doesn’t get personal. For help with that, start reading
    https://medium.com/@dsearls/on-market...bad-guesswork-88a84de937b0#.deu5ue16x
    Voting 0
  6. Aadhaar reflects and reproduces power imbalances and inequalities. Information asymmetries result in the data subject becoming a data object, to be manipulated, misrepresented and policed at will.

    Snowden: “Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”

    Snowden’s demolition of the argument doesn’t mean our work here is done. There are many other tropes that my (now renamed) Society for the Rejection of Culturally Relativist Excuses could tackle. Those that insist Indians are not private. That privacy is a western liberal construct that has no place whatsoever in Indian culture. That acknowledging privacy interests will stall development. This makes it particularly hard to advance claims of privacy, autonomy and liberty in the context of large e-governance and identity projects like Aadhaar: they earn one the labels of elitist, anti-progress, Luddite, paranoid and, my personal favourite, privacy fascist.
    http://scroll.in/article/748043/aadha...n-its-the-only-way-to-secure-equality
    Voting 0
  7. A May 2014 White House report on “big data” notes that the ability to determine the demographic traits of individuals through algorithms and aggregation of online data has a potential downside beyond just privacy concerns: Systematic discrimination.

    There is a long history of denying access to bank credit and other financial services based on the communities from which applicants come — a practice called “redlining.” Likewise, the report warns, “Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants or recipients of credit.” (See materials from the report’s related research conference for scholars’ views on this and other issues.)

    One vexing problem, according to the report, is that potential digital discrimination is even less likely to be pinpointed, and therefore remedied.

    “Approached without care, data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision-makers, or simply reflect the widespread biases that persist in society. It can even have the perverse result of exacerbating existing inequalities by suggesting that historically disadvantaged groups actually deserve less favorable treatment.” The paper’s authors argue that the most likely legal basis for anti-discrimination enforcement, Title VII, is not currently adequate to stop many forms of discriminatory data mining, and “society does not have a ready answer for what to do about it.”
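    The proxy mechanism behind “digital redlining” can be shown in a few lines. This is a toy sketch with hypothetical data, not any real lender’s model: a scoring rule that never sees the protected attribute still reproduces the historical disparity, because a correlated feature (here, a ZIP code) carries the same signal.

```python
# Digital-redlining sketch: a rule trained on biased historical decisions
# discriminates through a correlated proxy, without ever seeing the
# protected attribute itself (all data hypothetical).

# Historical loan decisions, biased against applicants from ZIP "20100".
history = [
    {"zip": "20100", "income": 50, "approved": 0},
    {"zip": "20100", "income": 55, "approved": 0},
    {"zip": "30200", "income": 50, "approved": 1},
    {"zip": "30200", "income": 45, "approved": 1},
]

def train_zip_rates(rows):
    """Learn a per-ZIP approval rate from historical decisions."""
    totals, approvals = {}, {}
    for r in rows:
        totals[r["zip"]] = totals.get(r["zip"], 0) + 1
        approvals[r["zip"]] = approvals.get(r["zip"], 0) + r["approved"]
    return {z: approvals[z] / totals[z] for z in totals}

def score(applicant, zip_rates):
    """Approve when the learned per-ZIP rate clears a threshold."""
    return zip_rates.get(applicant["zip"], 0.5) >= 0.5

rates = train_zip_rates(history)
# Two identical applicants, differing only in ZIP code:
print(score({"zip": "20100", "income": 52}, rates))  # denied
print(score({"zip": "30200", "income": 52}, rates))  # approved
```

    Because the training labels encode the prior bias, removing the protected attribute changes nothing: the model simply routes the discrimination through the proxy.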

    Their 2014 paper “Digital Discrimination: The Case of Airbnb.com” examined listings for thousands of New York City landlords in mid-2012. Airbnb builds up a reputation system by allowing ratings from guests and hosts.

    The study’s findings include:

    “The raw data show that non-black and black hosts receive strikingly different rents: roughly $144 versus $107 per night, on average.” However, the researchers had to control for a variety of factors that might skew an accurate comparison, such as differences in geographical location.
    “Controlling for all of these factors, non-black hosts earn roughly 12% more for a similar apartment with similar ratings and photos relative to black hosts.”
    “Despite the potential of the Internet to reduce discrimination, our results suggest that social platforms such as Airbnb may have the opposite effect. Full of salient pictures and social profiles, these platforms make it easy to discriminate — as evidenced by the significant penalty faced by a black host trying to conduct business on Airbnb.”
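    The “controlling for factors” step the study describes is, at its core, an ordinary least-squares regression of log price on a host-group indicator plus covariates. The sketch below uses tiny hypothetical data and a single location covariate, nothing like the study’s actual dataset or full set of controls; it only shows the mechanics of reading an adjusted gap off a regression coefficient.

```python
# Sketch of a controlled comparison: regress log(price) on a host-group
# indicator plus a location covariate via ordinary least squares.
# Data are hypothetical; the study used many more controls.

import math

# row layout: (intercept, black_host, location_score, log_price)
rows = [
    (1.0, 0.0, 0.9, math.log(150)),
    (1.0, 0.0, 0.4, math.log(120)),
    (1.0, 1.0, 0.9, math.log(130)),
    (1.0, 1.0, 0.4, math.log(100)),
]

def ols(rows):
    """Solve the normal equations X'X b = X'y by Gaussian elimination."""
    k = len(rows[0]) - 1
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * r[k] for r in rows) for i in range(k)]
    for col in range(k):                      # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):            # back substitution
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, k))) / xtx[r][r]
    return beta

beta = ols(rows)
# beta[1] is the log-price gap attributable to the host-group indicator,
# holding location constant; exp(beta[1]) - 1 is the percentage gap.
print(f"adjusted gap: {math.exp(beta[1]) - 1:.1%}")  # prints: adjusted gap: -15.0%
```

    The raw averages in the study ($144 vs $107) answer a different question than the regression coefficient does: the coefficient isolates the part of the gap that survives after location and quality differences are held constant.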

    “Given Airbnb’s careful consideration of what information is available to guests and hosts,” Edelman and Luca note, “Airbnb might consider eliminating or reducing the prominence of host photos: It is not immediately obvious what beneficial information these photos provide, while they risk facilitating discrimination by guests. Particularly when a guest will be renting an entire property, the guest’s interaction with the host will be quite limited, and we see no real need for Airbnb to highlight the host’s picture.” (For its part, Airbnb responded to the study by saying that it prohibits discrimination in its terms of service, and that the data analyzed were both older and limited geographically.)
    http://journalistsresource.org/studie...racial-discrimination-research-airbnb
    Voting 0
  8. there seems to be something wrong with personalization. We are continuously bumping into obtrusive, uninteresting ads. Our digital personal assistant isn’t all that personal. We’ve lost friends to the algorithmic abyss of the News feed. The content we encounter online seems to repeat the same things again and again. There are five main reasons why personalization remains broken.

    Additionally, there lies a more general paradox at the very heart of personalization.

    Personalization promises to modify your digital experience based on your personal interests and preferences. Simultaneously, personalization is used to shape you, to influence you and guide your everyday choices and actions. Inaccessible and incomprehensible algorithms make autonomous decisions on your behalf. They reduce the amount of visible choices, thus restricting your personal agency.
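    The choice-narrowing feedback loop described here can be simulated in a few lines. This is a toy sketch, not any real recommender: a personalizer that always reinforces past clicks converges on a single topic, and the "visible choices" shrink to whatever the user clicked first.

```python
# Toy filter-bubble loop: a recommender that reinforces whatever the user
# clicked before gradually locks the feed onto one topic.

from collections import Counter

TOPICS = ["politics", "sports", "science", "music"]

def recommend(history, n=3):
    """Rank topics by past clicks; ties broken by catalog order."""
    counts = Counter(history)
    return sorted(TOPICS, key=lambda t: (-counts[t], TOPICS.index(t)))[:n]

def simulate(rounds=10):
    history = ["politics"]           # a single initial click
    for _ in range(rounds):
        shown = recommend(history)
        history.append(shown[0])     # the user clicks the top suggestion
    return recommend(history)

print(simulate())  # the top slot has converged on "politics"
```

    The loop is the paradox in miniature: the system is "personalized" to the user's history, yet it is the system, not the user, that decides which options remain visible at all.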

    Because of the personalization gaps and internal paradox, personalization remains unfulfilling and incomplete. It leaves us with a feeling that it serves someone else’s interests better than our own.
    http://techcrunch.com/2015/06/25/the-...n=Feed%3A+Techcrunch+%28TechCrunch%29
    Voting 0
  9. The outcome of key battles about developing smart cities will depend on who owns the data. There is no reason why it has to be private companies.

    Uber’s emergence as a useful data repository no urban planners want to miss is in line with the broader ideology of solutionism espoused by Silicon Valley. Technology companies, having grabbed one of the most precious contemporary resources – data – now have the leverage over cash-strapped and unimaginative governments, pitching themselves as inevitable, benevolent saviours to the dull bureaucrats inside city administrations.

    Cities that cosy up to Uber, however, risk becoming too dependent on its data streams. Why accept Uber’s role as a data intermediary? Instead of letting the company hoover up extensive details about who is going where and when, cities should find a way to get this data on their own. Only then should the likes of Uber be allowed to step in and build a service on top of them.

    At the moment, Uber is so effective because it controls all the key data points: our phones tell it all it needs to know about planning a trip. If, however, control over data were to pass to cities, Uber – a company with few assets – would hardly be worth the $40bn that it’s valued at today. Surely, an algorithm to match supply and demand cannot be that expensive?
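    The "algorithm to match supply and demand" that Morozov alludes to is indeed conceptually simple. A minimal greedy sketch (hypothetical coordinates, nothing like Uber's actual dispatch system, which must handle pricing, prediction and scale) assigns each rider to the nearest still-free driver:

```python
# Minimal supply-demand matcher: pair each rider with the nearest free
# driver by straight-line distance (a toy sketch, not a dispatch system).

import math

drivers = {"d1": (0.0, 0.0), "d2": (5.0, 5.0), "d3": (1.0, 4.0)}
riders = {"r1": (4.0, 4.0), "r2": (0.5, 0.5)}

def match(riders, drivers):
    """Greedily assign each rider to the closest unassigned driver."""
    free = dict(drivers)
    assignment = {}
    for rider, (rx, ry) in riders.items():
        best = min(free, key=lambda d: math.hypot(free[d][0] - rx,
                                                  free[d][1] - ry))
        assignment[rider] = best
        del free[best]
    return assignment

print(match(riders, drivers))  # {'r1': 'd2', 'r2': 'd1'}
```

    The valuable asset, as the article argues, is not this logic but the exclusive data streams feeding it: a city holding the same trip data could run the same matching itself.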

    The real challenge, however, is to make such city apps work with other forms of transport. Uber’s solutionist vision is now clear: you launch its app on your smartphone and a car pulls up to drive you where you want to go. To call this unimaginative would be an understatement; it is an approach that works fine in America, where walking is rarely an option and public transport mostly nonexistent.

    Why should this be a template for the rest of the world? Just because walking is unprofitable from Uber’s perspective does not mean that it’s a form of transport that should be written off.
    http://www.theguardian.com/commentisf...uber-trasnsport-choice-evgeny-morozov
    Voting 0
  10. The intimate secret meetings between senior Enron executives and high-level US government officials via the Pentagon Highlands Forum, from November 2000 to June 2001, played a central role in establishing and cementing the increasingly symbiotic link between Enron and Pentagon planning. The Forum’s role was, as O’Neill has always said, to function as an ideas lab to explore the mutual interests of industry and government.
    Enron and Pentagon war planning

    In February 2001, when Enron executives including Kenneth Lay began participating concertedly in the Cheney Energy Task Force, a classified National Security Council document instructed NSC staffers to work with the task force in “melding” previously separate issues: “operational policies towards rogue states” and “actions regarding the capture of new and existing oil and gas fields.”

    According to Bush’s treasury secretary Paul O’Neill, as quoted by Ron Suskind in The Price of Loyalty (2004), cabinet officials discussed an invasion of Iraq in their first NSC meeting, and had even prepared a map for a post-war occupation marking the carve-up of Iraq’s oil fields. The message at that time from President Bush was that officials must “find a way to do this.”


    In June 2001, the same month that Enron’s executive vice president Steve Kean attended the Pentagon Highlands Forum, the company’s hopes for the Dabhol project were dashed when the Trans-Afghan pipeline failed to materialize, and as a consequence, construction on the Dabhol power plant was shut down. The failure of the $3 billion project contributed to Enron’s bankruptcy in December. That month, Enron officials met with Bush’s commerce secretary, Donald Evans, about the plant, and Cheney lobbied India’s main opposition party about the Dabhol project. Ken Lay had also reportedly contacted the Bush administration around this time to inform officials about the firm’s financial troubles.

    By August, desperate to pull off the deal, US officials threatened Taliban representatives with war if they refused to accept American terms: namely, to cease fighting and join in a federal alliance with the opposition Northern Alliance; and to give up demands for local consumption of the gas. On the 15th of that month, Enron lobbyist Pat Shortridge told then White House economic advisor Robert McNally that Enron was heading for a financial meltdown that could cripple the country’s energy markets.

    So the Pentagon had:

    1. contracted Rendon, a propaganda firm;

    2. given Rendon access to the intelligence community’s most classified information including data from NSA surveillance;

    3. tasked Rendon with facilitating the DoD’s development of information operations strategy by running the Highlands Forum process;

    4. and further, tasked Rendon with overseeing the concrete execution of this strategy developed through the Highlands Forum process, in actual information operations around the world in Iraq, Afghanistan and beyond.

    The Pentagon Highlands Forum’s intimate link, via Rendon, to the propaganda operations pursued under Bush and Obama in support of the ‘Long War’ demonstrates the integral role of mass surveillance in both irregular warfare and ‘strategic communications.’

    Arquilla went on to advocate that western intelligence services should use the British case as a model for creating new “pseudo gang” terrorist groups, as a way of undermining “real” terror networks:

    “What worked in Kenya a half-century ago has a wonderful chance of undermining trust and recruitment among today’s terror networks. Forming new pseudo gangs should not be difficult.”

    Essentially, Arquilla’s argument was that as only networks can fight networks, the only way to defeat enemies conducting irregular warfare is to use techniques of irregular warfare against them.

    It is this sort of closed-door networking that has rendered the American vote pointless. Far from protecting the public interest or helping to combat terrorism, the comprehensive monitoring of electronic communications has been systematically abused to empower vested interests in the energy, defense, and IT industries.
    https://medium.com/@NafeezAhmed/why-google-made-the-nsa-2a80584c9c1
    Voting 0


Page 1 of 2 - Online Bookmarks of M. Fioretti: Tags: algorithms + big data

About - Propulsed by SemanticScuttle