mfioretti: usa 2016*

Bookmarks on this page are managed by an admin user.

94 bookmark(s)

  1. Once again: where, then, is the scandal of these past few days? The scandal lies in the evidence of a fundamental error in the conception of human interactions, the conception that Mark Zuckerberg has imposed since 2007, by his own admission in his much-awaited post-Cambridge Analytica statement. That is, the idea of building a "web where you are social by default", where sharing is the norm. This principle is structurally opposed to the protection of individual privacy, which rests on confidentiality as the norm where one's personal data are concerned.

    Zuckerberg explains it very well in his most recent statement, correctly identifying in that philosophical and anthropological error the root of the storm he is now forced to navigate: "In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends' birthdays, your maps should show where your friends live, your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them."

    This is what allowed Kogan, in 2013, to obtain access to the data of millions of people. And yes, those data have immense scientific value, and it is right that research should be able to access them, provided it is conducted in full respect of the informed consent of the users who become experimental subjects. For strictly academic purposes, however. And even then, Facebook's own famous 2014 experiment on manipulating the emotions of hundreds of thousands of users, who were deliberately shown more positive or more negative content, had already demonstrated that even when no commercial ends are involved, the question is ambiguous and complex. And that no, accepting convoluted terms of service that nobody reads is not enough to claim that every user, by the very fact of having agreed to be on Facebook, has also agreed to become, indiscriminately, a lab rat enrolled in experiments of which they know nothing.

    And yet it was the platform itself that realised, in that same year, that things could not go on that way; that Facebook was losing control over which third parties had access to its users' data. So the policy changed, and since then "friends" must consent before an app can process their data. The new philosophy, Albright recalls, is "people first". But it was too late. And the inability to truly regain possession of that mass of information, demonstrated by the Cambridge Analytica case (how can Facebook have to learn from the newspapers that the firm had not deleted the data it claimed to have deleted, or need to run a serious audit now to verify it, showing it has no idea whether they were deleted or not?), makes clear that the problem goes well beyond this single case: it is systemic.

    To be clearer: as Albright writes, version 1.0 of the Facebook Graph API, that is, what app developers could obtain from the social network between 2010, when it launched, and 2014, when the policy changed, allowed an app to obtain the following data not about the person who signed up for it, but about that person's unaware friends: "about, actions, activities, birthday, check-ins, education, events, games, groups, hometown, interests, likes, location, notes, status, tags, photos, questions, relationships, religion/politics, subscriptions, websites, work history". Could anyone really think they could control where all those data ended up, for millions and millions of people?
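
    For concreteness, here is a minimal sketch (in Python, using the requests library) of the kind of call that era of the API permitted; the endpoint shape, the token value and the field list are illustrative assumptions, not a verbatim reproduction of any real app's code:

    # Hypothetical sketch of a v1.0-era Graph API request for friends'
    # data. The token and field list below are placeholders.
    import requests

    ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # granted by ONE consenting app user

    # With the friends_* extended permissions that API v1.0 offered, a
    # single user's token could pull profile fields for each of that
    # user's friends, none of whom ever saw a consent dialog.
    resp = requests.get(
        "https://graph.facebook.com/v1.0/me/friends",
        params={
            "access_token": ACCESS_TOKEN,
            "fields": "name,birthday,hometown,likes,religion,work",
        },
        timeout=10,
    )
    for friend in resp.json().get("data", []):
        print(friend.get("name"), friend.get("birthday"))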

    And is Facebook really discovering this only now? In 2011 the American Federal Trade Commission had already flagged the issue as problematic. It taught them nothing.
    https://www.valigiablu.it/facebook-cambridge-analytica-scandalo
    Voting 0
  2. Second question: Cambridge Analytica cites its consulting role in Brexit and in the 2016 US presidential election among its success stories. Does this mean that data-driven computational propaganda works mainly on populist movements? Here we enter political science fiction, but it is possible to try to reason about the question. If it were true that populism is more receptive to simple, targeted communication, it would mean that the mind of a conservative voter differs from the mind of a liberal voter. The one who raised this question is the linguist George Lakoff, who in his book "Moral Politics" hypothesized that conservatives hold a strict family model, in which values are founded on self-discipline and hard work, while liberals hold a participatory family model, in which values are based on caring for one another.
    https://www.agendadigitale.eu/cultura...ioni-con-i-social-che-dice-la-scienza
    Voting 0
  3. you know, for somebody who actually has read the indictment in its entirety, and, actually, the Russian reporting that is almost entirely repeated in the indictment, it’s really hard to square that with the way that it’s been portrayed as, you know, a sophisticated, bold effort. I think H.R. McMaster is correct in saying, yes, there’s “incontrovertible” evidence of Russian meddling, but to call it bold, to call it sophisticated and to imply that we now know that it actually had an influence on the outcome of the election is absurd. It was not bold. It was not sophisticated. And it—we don’t know, and probably never will know, whether it had any impact.

    creating a cacophony, creating confusion, creating the sense that nothing means anything anymore is definitely important, right? But that is different from saying that their goal was to sway the outcome of the election and that we can say with any amount of certainty that that worked and that’s how we got Trump.

    AMY GOODMAN: And it’s also served another purpose, for example, when it comes to these large megacorporations, like Facebook and Twitter. They’ve been hauled before Congress, before the British Parliament, and they’re saying, “How could you have allowed this to appear?” And in the end, they’re being pressured, basically, these corporations, to censor what is out there.

    MASHA GESSEN: Yeah, I mean, I don’t think that the agenda of holding Facebook accountable publicly is such a bad agenda. You know, I think that a conversation about what Facebook is—is it a public resource, even though it’s a privately owned corporation? Is it a media company? It is certainly not just a platform, as Facebook has claimed repeatedly. I think that is a really important question. I just think it’s been asked in the wrong way, right? It’s been asked—you know, when we saw Senator Al Franken badgering the Facebook lawyer and screaming, you know, “They were Russians! You know, how could you not see that these ads were bought for rubles?” Well, why are we starting at a place where we assume that selling advertising for rubles, that there’s something necessarily sinister and horrible about it? Right? And that is—I don’t think that moves forward a conversation about how something that has de facto become a public resource, but is privately owned, functions in society.
    https://www.democracynow.org/2018/2/23/masha_gessen_did_a_russian_troll
    by M. Fioretti (2018-02-24)
    Voting 0
  4. As problematic as Facebook has become, it represents only one component of a much broader shift into a new human connectivity that is both omnipresent (consider the smartphone) and hypermediated—passing through and massaged by layer upon layer of machinery carefully hidden from view. The upshot is that it’s becoming increasingly difficult to determine what in our interactions is simply human and what is machine-generated. It is becoming difficult to know what is real.

    Before the agents of this new unreality finish this first phase of their work and then disappear completely from view to complete it, we have a brief opportunity to identify and catalogue the processes shaping our drift to a new world in which reality is both relative and carefully constructed by others, for their ends. Any catalogue must include at least these four items:

    the monetisation of propaganda as ‘fake news’;
    the use of machine learning to develop user profiles accurately measuring and modelling our emotional states;
    the rise of neuromarketing, which delivers highly tailored messages that nudge us to act in ways serving the ends of others;
    a new technology, ‘augmented reality’, which will push us to sever all links with the evidence of our senses.

    The fake news stories floated past as jetsam on Facebook's 'newsfeed', that continuous stream of shared content drawn from a user's Facebook contacts, a stream generated by everything everyone else posts or shares. A decade ago that newsfeed had a raw, unfiltered quality, the sense that you were seeing everything everyone was doing, but as Facebook has matured it has engaged increasingly opaque 'algorithms' to curate (or censor) the newsfeed, producing something that feels much more comfortable and familiar.

    This seems like a useful feature to have, but the taming of the newsfeed comes with a consequence: Facebook’s billions of users compose their world view from what flows through their feeds. Consider the number of people on public transport—or any public place—staring into their smartphones, reviewing their feeds, marvelling at the doings of their friends, reading articles posted by family members, sharing video clips or the latest celebrity outrages. It’s an activity now so routine we ignore its omnipresence.

    Curating that newsfeed shapes what Facebook's users learn about the world. Some of that content is controlled by the user's 'likes', but a larger part is derived from Facebook's deep analysis of a user's behaviour. Facebook uses 'cookies' (invisible bits of data hidden within a user's web browser) to track the behaviour of its users even when they're not on the Facebook site, and even to track people who are not Facebook users at all. Facebook knows where its users spend time on the web, and how much time they spend there. All of that allows Facebook to tailor a newsfeed to echo the interests of each user. There's no magic to it, beyond endless surveillance.
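
    To make that mechanism concrete, here is a minimal sketch (Python, standard library only) of how a third-party 'tracking pixel' can recognise the same browser across unrelated sites; the port, cookie name and logging below are illustrative assumptions, not Facebook's actual implementation:

    # Hypothetical tracking-pixel server: sites embed an invisible 1x1
    # image pointing at this host, and each page view phones home.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import uuid

    class PixelHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The Cookie header identifies a browser this tracker has seen
            # before; the Referer header reveals which page embedded the pixel.
            visitor = self.headers.get("Cookie") or f"uid={uuid.uuid4()}"
            page = self.headers.get("Referer", "unknown page")
            print(f"{visitor} was just on {page}")  # the surveillance log

            self.send_response(200)
            # Re-set the cookie so the same browser is recognised next time,
            # on whatever site embeds this pixel.
            self.send_header("Set-Cookie", f"{visitor}; Max-Age=31536000")
            self.send_header("Content-Type", "image/gif")
            self.end_headers()
            # A 1x1 transparent GIF: the smallest possible 'image'.
            self.wfile.write(bytes.fromhex(
                "47494638396101000100800000000000ffffff"
                "21f90401000000002c000000000100010000"
                "02024401003b"))

    HTTPServer(("", 8080), PixelHandler).serve_forever()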

    What is clear is that Facebook has the power to sway the moods of billions of users. Feed people a steady diet of playful puppy videos and they're likely to be in a happier mood than people fed images of war. Over the last two years, that capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.

    We all like to believe we’re in the right, and when we get some sign from the universe at large that we are correct, we feel better about ourselves. That’s how the curated newsfeed became wedded to the world of profitable propaganda.

    Adding a little art to brighten an otherwise dull wall seems like an unalloyed good, but only if one completely ignores bad actors. What if that blank canvas gets painted with hate speech? What if, perchance, the homes of 'undesirables' are singled out with graffiti that only bad actors can see? What happens when every gathering place for any oppressed community gets invisibly 'tagged'? In short, what happens when bad actors use Facebook's augmented reality to amplify their own capacity to act badly?

    But that’s Zuckerberg: he seems to believe his creations will only be used to bring out the best in people. He seems to believe his gigantic sharing network would never be used to incite mob violence. Just as he seems to claim that Facebook’s capacity to collect and profile the moods of its users should never be monetised—but, given that presentation unearthed by the Australian, Facebook tells a different story to advertisers.

    Regulating Facebook enshrines its position as the data-gathering and profile-building organisation, while keeping it plugged into and responsive to the needs of national powers. Before anyone takes steps that would cement Facebook in our social lives for the foreseeable future, it may be better to consider how this situation arose, and whether—given what we now know—there might be an opportunity to do things differently.
    https://meanjin.com.au/essays/the-last-days-of-reality
    Voting 0
  5. Some observers have speculated that Trump's volatile week, which has included retweeting anti-Muslim videos from a British hate group and making a racially disparaging remark during an appearance with Native American war heroes, may have reflected a mind scrambled by signs that Flynn was about to enter a plea deal.
    The impact on Trump's temperament and mood, at a time of a dangerous nuclear crisis involving North Korea, and the potential of the latest Russia revelations to further distract him, will be even more closely watched now.
    The sheer magnitude of Friday's events left Trump's defenders within his party with yet another infuriating distraction in their relationship with the President.
    Senate Intelligence Chairman Richard Burr of North Carolina, who is leading his own probe into the Russia issue, refused multiple requests by CNN's Manu Raju to comment on Friday's bombshell developments.
    It is a measure of the shocking significance of the Flynn news that it completely obliterated two other massive political developments: the apparently imminent passage of the most sweeping tax reform bill in 30 years, a hugely significant victory for Trump, and the stunning public humiliation of Secretary of State Rex Tillerson by the White House.
    In many ways that's the story of the Trump presidency itself -- everything has been overshadowed by Russia.
    http://edition.cnn.com/2017/12/01/pol...s/russia-trump-white-house/index.html
    by M. Fioretti (2017-12-02)
    Voting 0
  6. I do believe that this time is different, the beginning of a massive shift, and I believe it’s the fault of these social networks.

    One of the problems is that these platforms act, in many ways, like drugs. Facebook, and every other social-media outlet, knows that all too well. Your phone vibrates a dozen times an hour with alerts about likes and comments and retweets and faves. The combined effect is one of just trying to suck you back in, so their numbers look better for their next quarterly earnings report. Sean Parker, one of Facebook's earliest investors and the company's first president, came right out and said what we all know: the whole intention of Facebook is to act like a drug, by "giving you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever." That, Parker said, was by design. These companies are "exploiting a vulnerability in human psychology." Former Facebook executive Chamath Palihapitiya has echoed this, too. "Do I feel guilty?" he asked rhetorically on CNN about the role Facebook is playing in society. "Absolutely I feel guilt."

    And then, there’s the biggest reason why people are abandoning the platforms: the promise of connection has turned out to be a reality of division. We’ve all watched the way Donald J. Trump used social media to drive a wedge between us all, the way he tweets his sad and pathetic insecurities out to the world, without a care for how calling an equally insecure rogue leader a childish name might put us all on the brink of nuclear war. There’s a point that watching it all happen in real time makes you question what you’re doing with your life. As for conversing with our fellow Americans, we’ve all tried, unsuccessfully, to have a conversation on these platforms, which has so quickly devolved into a shouting match, or pile-on from perfect strangers because your belief isn’t the same as theirs. Years ago, a Facebook executive told me that the biggest reason people unfriend each other is because they disagree on an issue. The executive jokingly said, “Who knows, if this keeps up, maybe we’ll end up with people only having a few friends on Facebook.” Perhaps, worse of all, we’ve all watched as Russia has taken these platforms and used them against us in ways no one could have comprehended a decade ago.
    https://www.vanityfair.com/news/2017/...-social-era-twitter-facebook-snapchat
    Voting 0
  7. I kept asking the party lawyers and the DNC staff to show me the agreements that the party had made for sharing the money they raised, but there was a lot of shuffling of feet and looking the other way.

    When I got back from a vacation in Martha’s Vineyard, I at last found the document that described it all: the Joint Fund-Raising Agreement between the DNC, the Hillary Victory Fund, and Hillary for America.

    The agreement—signed by Amy Dacey, the former CEO of the DNC, and Robby Mook with a copy to Marc Elias—specified that in exchange for raising money and investing in the DNC, Hillary would control the party’s finances, strategy, and all the money raised. Her campaign had the right of refusal of who would be the party communications director, and it would make final decisions on all the other staff. The DNC also was required to consult with the campaign about all other staffing, budgeting, data, analytics, and mailings.

    I had been wondering why it was that I couldn’t write a press release without passing it by Brooklyn. Well, here was the answer.
    https://www.politico.com/magazine/sto.../02/clinton-brazile-hacks-2016-215774
    Voting 0
  8. In a largely automated platform like Facebook, what matters most is not the political beliefs of the employees but the structures, algorithms and incentives they set up, as well as what oversight, if any, they employ to guard against deception, misinformation and illegitimate meddling. And the unfortunate truth is that by design, business model and algorithm, Facebook has made it easy for it to be weaponized to spread misinformation and fraudulent content. Sadly, this business model is also lucrative, especially during elections. Sheryl Sandberg, Facebook’s chief operating officer, called the 2016 election “a big deal in terms of ad spend” for the company, and it was. No wonder there has been increasing scrutiny of the platform.
    https://www.nytimes.com/2017/09/29/opinion/mark-zuckerberg-facebook.html
    Voting 0
  9. Pressed by investigators in Congress, Facebook said Wednesday that it has found evidence that a pro-Kremlin Russian “troll farm” bought $100,000 worth of ads targeted at U.S. voters between 2015 and 2017. The finding was first reported by the Washington Post, and Facebook published its own statement Wednesday afternoon.

    A few of the roughly 3,000 ads that Facebook traced to the Russian company mentioned presidential candidates Donald Trump or Hillary Clinton directly, according to the Post’s sources. The majority focused on stoking people’s emotions around divisive issues such as “gun rights and immigration fears, as well as gay rights and racial discrimination.”

    Facebook wouldn’t disclose the ads in question, nor exactly how the scheme worked.
    http://www.slate.com/blogs/future_ten...olitical_ads_on_facebook_is_such.html
    Voting 0
  10. Relations between Democrats and religious progressives have been more difficult since 1980, when evangelicals deserted Jimmy Carter — one of their own, whom they had supported in 1976 — for Ronald Reagan.

    As Republicans cemented the Christian right as a cornerstone of the party’s base, Democrats moved in the opposite direction, so intent on separating church and state that they recoiled from courting religious blocs of voters, recalled Gary Hart, the former senator, who grew up in the Church of the Nazarene and graduated from divinity school.

    During his ill-fated 1988 presidential campaign, Mr. Hart said, he was often asked, “‘Why don’t you talk about your religious background more?’ And the answer was, ‘I don’t want to be seen as pandering for votes.’”

    Issues on which the religious left is at odds with Democratic doctrine include military spending and the death penalty, though the most polarizing is abortion — the main barrier, for many liberal evangelicals and Catholics, to voting as Democrats — as could be seen when the party split recently over whether to endorse an anti-abortion Democrat running for mayor of Omaha.

    Setting abortion aside, political appeals based on religious beliefs continue to carry risk for Democrats, given the growing numbers of Americans who claim no religion: Secular voters overwhelmingly vote Democratic, and younger voters are far more secular than older voters.

    Still, Hillary Clinton’s snub of even moderate evangelicals in the 2016 presidential race squandered many opportunities to cut into Mr. Trump’s support. Where Barack Obama had worked hard in 2008 to show he would at least listen to evangelicals, Mrs. Clinton rebuffed interview requests from evangelical media outlets and signaled leftward moves on abortion rights that helped many conservative voters overcome their doubts about Mr. Trump.

    “The fact that one party has strategically used and abused religion, while the other has had a habitually allergic and negative response to religion per se, puts our side in a more difficult position in regard to political influence,” said the Rev. Jim Wallis, the evangelical social justice advocate who founded the Sojourners community and magazine in 1971.

    “Most progressive religious leaders I talk to, almost all of them, feel dissed by the left,” he said. “The left is really controlled by a lot of secular fundamentalists.”
    https://mobile.nytimes.com/2017/06/10....html?referer=https://t.co/RCF6eYd1hV
    Voting 0

Page 1 of 10 - Online Bookmarks of M. Fioretti: Tags: usa 2016

About - Propulsed by SemanticScuttle