Tags: facebook*

465 bookmark(s)

  1. Parker described how in the early days of Facebook people would tell him they weren’t on social media because they valued their real-life interactions.

    “And I would say, ‘OK. You know, you will be,’” he said.

    “I don’t know if I really understood the consequences of what I was saying,” he added, pointing to “unintended consequences” that arise when a network grows to have more than 2 billion users.

    “It literally changes your relationship with society, with each other. It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains,” he said.

    He explained that when Facebook was being developed the objective was: “How do we consume as much of your time and conscious attention as possible?” It was this mindset that led to the creation of features such as the “like” button that would give users “a little dopamine hit” to encourage them to upload more content.

    “It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.”
  2. A former Facebook executive has said he feels “tremendous guilt” over his work on “tools that are ripping apart the social fabric of how society works”, joining a growing chorus of critics of the social media giant.

    Chamath Palihapitiya, who was vice-president for user growth at Facebook before he left the company in 2011, said: “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, misinformation, mistruth.”

    The remarks, made at a Stanford Business School event in November, only surfaced on Monday via the tech website The Verge.

    “This is not about Russian ads,” he added. “This is a global problem. It is eroding the core foundations of how people behave by and between each other.”

    Palihapitiya’s comments last month were made a day after Facebook’s founding president, Sean Parker, criticized the way that the company “exploits a vulnerability in human psychology” by creating a “social-validation feedback loop” during an interview at an Axios event.

    Parker had said that he was “something of a conscientious objector” to using social media, a stance echoed by Palihapitiya who said that he was now hoping to use the money he made at Facebook to do good in the world.

    “I can’t control them,” Palihapitiya said of his former employer. “I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”

    He also called on his audience to “soul-search” about their own relationship to social media. “Your behaviors, you don’t realize it, but you are being programmed,” he said. “It was unintentional, but now you gotta decide how much you’re going to give up, how much of your intellectual independence.”
    by M. Fioretti (2017-12-16)
  3. no serious scholar of modern geopolitics disputes that we are now at war — a new kind of information-based war, but war, nevertheless — with Russia in particular, but in all honesty, with a multitude of nation states and stateless actors bent on destroying western democratic capitalism. They are using our most sophisticated and complex technology platforms to wage this war — and so far, we’re losing. Badly.

    Why? According to sources I’ve talked to both at the big tech companies and in government, each side feels the other is ignorant, arrogant, misguided, and incapable of understanding the other side’s point of view. There’s almost no data sharing, trust, or cooperation between them. We’re stuck in an old model of lobbying, soft power, and the occasional confrontational hearing.

    Not exactly the kind of public-private partnership we need to win a war, much less a peace.

    Am I arguing that the government should take over Google, Amazon, Facebook, and Apple so as to beat back Russian info-ops? No, of course not. But our current response to Russian aggression illustrates the lack of partnership and co-ordination between government and our most valuable private sector companies. And I am hoping to raise an alarm: When the private sector has markedly better information, processing power, and personnel than the public sector, the former will only strengthen, while the latter will weaken. We’re seeing it play out in our current politics, and if you believe in the American idea, you should be extremely concerned.
  4. Facebook’s goal is to “push down the age” of when it’s acceptable for kids to be on social media, says Josh Golin, executive director of Campaign for a Commercial Free Childhood. Golin says 11-to-12-year-olds who already have a Facebook account, probably because they lied about their age, might find the animated emojis and GIFs of Messenger Kids “too babyish,” and are unlikely to convert to the new app.

    Facebook launched Messenger Kids for 6-to-12-year-olds in the US on Monday, saying it took extraordinary care and precautions. The company said its 100-person team building apps for teens and kids consulted with parent groups, advocates, and childhood-development experts during the 18-month development process, and that the app reflects their concerns. Parents download Messenger Kids on their child’s device, after verifying their identity by logging into Facebook. Since kids cannot be found in search, parents must initiate and respond to friend requests.

    Facebook says Messenger Kids will not display ads, nor collect data on kids for advertising purposes. Kids’ accounts will not automatically be rolled into Facebook accounts once they turn 13.

    Nonetheless, advocates focused on marketing to children expressed concerns. The company will collect the content of children’s messages, photos they send, what features they use on the app, and information about the device they use. Facebook says it will use this information to improve the app and will share the information “within the family of companies that are part of Facebook,” and outside companies that provide customer support, analysis, and technical infrastructure.
    What about the actual functioning of the application: What tweets are displayed to whom, in what order? Every major social-networking service uses opaque algorithms to shape what data people see. Why does Facebook show you this story and not that one? No one knows, possibly not even the company’s engineers. Outsiders know basically nothing about the specific choices these algorithms make. Journalists and scholars have built up some inferences about the general features of these systems, but our understanding is severely limited. So even if the LOC has the database of tweets, it still wouldn’t have Twitter.

    In a new paper, “Stewardship in the ‘Age of Algorithms,’” Clifford Lynch, the director of the Coalition for Networked Information, argues that the paradigm for preserving digital artifacts is not up to the challenge of preserving what happens on social networks.

    Over the last 40 years, archivists have begun to gather more digital objects: web pages, PDFs, databases, various kinds of software. There is more data about more people than ever before. Yet the cultural institutions dedicated to preserving the memory of what it was to be alive in our time, including our hours on the internet, may actually be capturing less usable information than in previous eras.

    “We always used to think for historians working 100 years from now: We need to preserve the bits (the files) and emulate the computing environment to show what people saw a hundred years ago,” said Dan Cohen, a professor at Northeastern University and the former head of the Digital Public Library of America. “Save the HTML and save what a browser was and what Windows 98 was and what an Intel chip was. That was the model for preservation for a decade or more.”

    Which makes sense: If you want to understand how WordPerfect, an old word processor, functioned, then you just need that software and some way of running it.

    But if you want to document the experience of using Facebook five years ago or even two weeks ago ... how do you do it?

    The truth is, right now, you can’t. No one (outside Facebook, at least) has preserved the functioning of the application. And worse, there is no thing that can be squirreled away for future historians to figure out. “The existing models and conceptual frameworks of preserving some kind of ‘canonical’ digital artifacts are increasingly inapplicable in a world of pervasive, unique, personalized, non-repeatable performances,” Lynch writes.

    Nick Seaver of Tufts University, a researcher in the emerging field of “algorithm studies,” wrote a broader summary of the issues with trying to figure out what is happening on the internet. He ticks off the problems of trying to pin down—or in our case, archive—how these web services work. One, they’re always testing out new versions. So there isn’t one Google or one Bing, but “10 million different permutations of Bing.” Two, as a result of that testing and their own internal decision-making, “You can’t log into the same Facebook twice.” It’s constantly changing in big and small ways. Three, the number of inputs and complex interactions between them simply makes these large-scale systems very difficult to understand, even if we have access to outputs and some knowledge of inputs.

    “What we recognize or ‘discover’ when critically approaching algorithms from the outside is often partial, temporary, and contingent,” Seaver concludes.

    The world as we experience it seems to be growing more opaque. More of life now takes place on digital platforms that are different for everyone, closed to inspection, and massively technically complex. What we don’t know about our current experience now will resound through time: historians of the future will know less, too. Maybe this era will be a new dark age, as resistant to analysis then as it is now.

    If we do want our era to be legible to future generations, our “memory organizations,” as Lynch calls them, must take radical steps to probe and document social networks like Facebook. Lynch suggests creating persistent, socially embedded bots that exist to capture a realistic and demographically broad set of experiences on these platforms. Alternatively, archivists could recruit actual humans to opt in to having their experiences recorded, as ProPublica has done with political advertising on Facebook.
  6. for us, changes like this can be disastrous. Attracting viewers to a story relies, above all, on making the process as simple as possible. Even one extra click can make a world of difference. This is an existential threat, not only to my organization and others like it but also to the ability of citizens in all of the countries subject to Facebook’s experimentation to discover the truth about their societies and their leaders.

    Serbia is a perfect example of why the political context of Facebook’s experimentation matters. Serbia escaped the dictatorship of Slobodan Milosevic in 2000, but it hasn’t developed into a fully functioning democracy. One party, led by President Aleksandar Vucic, controls not only the Parliament but also the whole political system. Our country has no tradition of checks and balances. Mr. Vucic now presents himself as progressive and pro-European, but as minister of information in the Milosevic government, he was responsible for censoring news coverage.

    Today, censorship in Serbia takes a softer form. Pliant outlets loyal to the government receive preferential treatment and better funding from local and central budgets. Those that stray out of line find themselves receiving unexpected visits from the tax inspectors.

    This isn’t an easy place to be an independent journalist. Since 2015, my investigative nonprofit, KRIK, has covered stories the mainstream media won’t touch. In return, we have been spied on and threatened, and have had lurid fabrications about our private lives splashed on the front page of national tabloids.

    Last year, KRIK published an investigation showing that when he was a young surgeon, Zlatibor Loncar, who is now minister of health, had been contracted by a gang to kill one of its enemies, according to court testimony by protected witnesses. You’d think the story of a future minister administering poison through an IV would make a splash — but the mainstream outlets ignored it.

    Going to KRIK’s website is the only way Serbian citizens can learn the truth about that story and many others like it. And until last month, most of our readers came to our site via Facebook.

    Facebook allowed us to bypass mainstream channels and bring our stories to hundreds of thousands of readers. But now, even as the social network claims to be cracking down on “fake news,” it is on the verge of ruining us.

    That’s why Mark Zuckerberg’s arbitrary experiments are so dangerous. The major TV channels, mainstream newspapers and organized-crime-run outlets will have no trouble buying Facebook ads or finding other ways to reach their audiences. It’s small, alternative organizations like mine that will suffer.

    We journalists bear some responsibility for this, too. Using Facebook to reach our readers has always been convenient, so we invested time and effort in building our presence there, helping it become the monster it is today.

    But what’s done is done — a private company, accountable to no one, has taken over the world’s media ecosystem. It is now responsible for what happens there. By picking small countries with shaky democratic institutions to be experimental subjects, it is showing a cynical lack of concern for how its decisions affect the most vulnerable.
  7. I do believe that this time is different, the beginning of a massive shift, and I believe it’s the fault of these social networks.

    One of the problems is that these platforms act, in many ways, like drugs. Facebook, and every other social-media outlet, knows that all too well. Your phone vibrates a dozen times an hour with alerts about likes and comments and retweets and faves. The combined effect is one of just trying to suck you back in, so their numbers look better for their next quarterly earnings report. Sean Parker, one of Facebook’s earliest investors and the company’s first president, came right out and said what we all know: the whole intention of Facebook is to act like a drug, by “giving you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever.” That, Parker said, was by design. These companies are “exploiting a vulnerability in human psychology.” Former Facebook executive Chamath Palihapitiya has echoed this, too. “Do I feel guilty?” he asked rhetorically on CNN about the role Facebook is playing in society. “Absolutely I feel guilt.”

    And then there’s the biggest reason people are abandoning the platforms: the promise of connection has turned out to be a reality of division. We’ve all watched the way Donald J. Trump used social media to drive a wedge between us all, the way he tweets his sad and pathetic insecurities out to the world, without a care for how calling an equally insecure rogue leader a childish name might put us all on the brink of nuclear war. At some point, watching it all happen in real time makes you question what you’re doing with your life. As for conversing with our fellow Americans, we’ve all tried, unsuccessfully, to have a conversation on these platforms, only to see it devolve into a shouting match or a pile-on from perfect strangers because your beliefs aren’t the same as theirs. Years ago, a Facebook executive told me that the biggest reason people unfriend each other is that they disagree on an issue. The executive jokingly said, “Who knows, if this keeps up, maybe we’ll end up with people only having a few friends on Facebook.” Perhaps worst of all, we’ve all watched as Russia has taken these platforms and used them against us in ways no one could have comprehended a decade ago.
  8. Facebook representatives have told Italian officials that they are planning to dispatch an Italian “task force” of fact-checkers to address the fake-news problem here before the elections, according to a government official who was present during the negotiations but was not authorized to speak on the record.
    On June 28, 2016, Hill had published an article arguing that Facebook was using users’ geolocation data to improve its friend suggestions. Menlo Park initially confirmed this (twice, according to Hill), while specifying that it was only one of the factors determining who appears in the “People You May Know” section, only to backtrack and say that no, absolutely not, geolocation is not used. Which still does not explain how the father of a suicidal teenager, after attending a support-group meeting, found another of the parents who were there among his suggestions the next day.

    Kashmir Hill, who has been keeping an eye on the “People You May Know” section for several months now, downloading and printing the list of suggestions she is shown each time (about 160 names a day), has repeatedly tried to get clarification on how the algorithm works. After she reasonably hypothesized that these series of names came from blocks of information sold by data brokers, a spokesperson replied rather curtly that “Facebook does not use information from data providers for the ‘People You May Know’ section.” In the meantime, she has been collecting accounts from people whose interactions with that section were less than happy. Hill tells the story of a sex worker, Leia (a pseudonym, of course), who saw some of her own clients appear among the “people she might know,” despite having taken the utmost care on the social network, signing up with an academic email address and never mentioning her profession. Then there is the psychiatrist who saw many of her own patients suggested as possible friends, two of whom in turn showed up in each other’s suggestion lists. Or the lawyer who says he deleted his account after finding, among his suggested friends, the name of opposing counsel in one of his cases. The two had corresponded only through his work email address, which Facebook could not have obtained from any voluntary disclosure on his part, since he had never consented to sharing his contacts.

    All of this is possible because, beyond the enormous amount of data we hand over voluntarily, Facebook stores a whole range of other information about us and our contacts, which makes up the so-called “shadow profile.” Picture a hypothetical iceberg whose tip is the data we give up of our own free will: there remains a gigantic hidden portion, which the social network uses not only to decide who the next person we send a friend request to will be, but also to enrich our package of preferences and personal traits, a package that will in turn improve the targeting of ads.

    How is this information collected? According to Hill, the moment you decide to use the “Find Friends” feature, you agree to transmit all of your contacts to Facebook. That means: phone numbers (if you use the mobile app), email addresses (if you use it on desktop or mobile), and the names and any nicknames of their owners. You have just handed your entire address book over to third parties. These contacts are then compared and cross-referenced. If even a single email address or phone number among those you have just handed over appears in the address book of any other user (who must in turn have agreed to share their contacts), then you will be suggested to each other as friends. And if your name is saved in the address book of a user who consents to uploading it, your contact information enters the system under your first and last name, whether or not you ever chose to upload any data to Facebook yourself.

    Returning to the lawyer mentioned above: how could he be shown, as a suggested friend, another lawyer with whom he had corresponded only through his work email? The lawyer in question doesn’t even need to share his own contacts. It is enough for any other user to have that email address saved under the lawyer’s name and to share it with Facebook. At that point the site, having his first name, last name, and email address, can connect him with any other user in possession of that same address. In this case, opposing counsel.
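    The cross-referencing described above can be sketched as a toy model. This is purely illustrative code under stated assumptions, not Facebook’s actual implementation: the `ContactMatcher` class, its matching rules, and every name and address in it are hypothetical.

```python
# Toy model of contact cross-referencing and shadow profiles.
# All identifiers and the matching rules are hypothetical illustrations.
from collections import defaultdict

class ContactMatcher:
    def __init__(self):
        self.owner = {}                 # identifier -> account that owns it
        self.books = defaultdict(set)   # uploader -> identifiers in their uploaded book
        self.shadow = defaultdict(set)  # identifier -> names it was saved under

    def register(self, user, *identifiers):
        # Emails/phone numbers attached to the user's own account.
        for ident in identifiers:
            self.owner[ident] = user

    def upload_book(self, uploader, contacts):
        # contacts: (saved_name, identifier) pairs the uploader agreed to share.
        # Saved names are retained even for people who never signed up:
        # this is the "shadow profile" data.
        for name, ident in contacts:
            self.books[uploader].add(ident)
            self.shadow[ident].add(name)

    def suggestions(self, user):
        mine = {i for i, o in self.owner.items() if o == user}
        out = set()
        for uploader, idents in self.books.items():
            if uploader == user:
                continue
            owners = {self.owner[i] for i in idents if i in self.owner}
            if idents & mine:
                out.add(uploader)       # someone holds me in their book...
                out |= owners - {user}  # ...and everyone else in that same book
        # Symmetric rule: people whose identifiers sit in my own uploaded book.
        out |= {self.owner[i] for i in self.books.get(user, set())
                if i in self.owner and self.owner[i] != user}
        return out

# The lawyer case from the article: neither lawyer uploads anything, but a
# mutual acquaintance's address book holds both work emails and links them.
m = ContactMatcher()
m.register("lawyer_a", "a@firm-one.example")
m.register("lawyer_b", "b@firm-two.example")
m.upload_book("acquaintance", [("Lawyer A", "a@firm-one.example"),
                               ("Lawyer B", "b@firm-two.example")])
```

    In this sketch, as in the lawyer’s case, a single third party’s uploaded address book containing both identifiers is enough to make the two account holders appear in each other’s suggestions.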

    Shadow profiles are nothing new. The Europe v. Facebook initiative filed a formal complaint as early as 2011 (it lists seven instances in which Facebook, headquartered in Ireland, allegedly violated Irish data-protection law), but the issue exploded in June 2013, when a bug hit the social network and exposed the private email addresses and phone numbers of 6 million users. Many of them had never voluntarily shared that information. By downloading their personal file from Facebook, users could see not only the “official” data on their friends list, but also their friends’ shadow profiles. The bug, in short, proved the existence of a shadow profile that had already been hypothesized for a couple of years.

    Think of all the numbers you have saved in your address book. Think of all the people you met once and whose contact details you saved for one reason or another. If you or those people upload your respective info to Facebook, your profiles will be updated accordingly. And so the suggestions will become ever more accurate, eventually including any colorful individual who, for whatever reason, shares even a single contact with you. That profile will end up forming a sort of history of your address book, with all the information attached to it.

    Aided by a platform that does not always show the transparency it preaches, we are convinced that everything Facebook knows about us is the product of direct interactions and our own conscious choices. In many cases, unfortunately, that turns out to be a spectacular mistake.

    This is one of those cases. The list of “people you may know” is, in reality, nothing more than one possible demonstration of what Facebook might know about you.
  10. Look: no. Skedaddle is not going to eliminate Yelp or Facebook or tipping. It's not going to be "the first cryptocurrency for real world use." But at some level they're not wrong! One day 20 years from now we'll wake up and all of our interactions and performance will be tracked on the blockchain and will directly determine our income and socioeconomic status, and on the one hand we'll get pretty good customer service, but on the other hand we'll be terrified all the time. It is the logical endpoint of the "gig economy."

    The thing is that this omniscient blockchain of terror will be run by Facebook, not Skedaddle. If you just come out and say that your mission is to build a dystopia of economic precarity and constant surveillance, then you do not have the soft skills to actually carry out that mission. (Never mind if you say that your mission is "to completely take down Yelp and Facebook reviews, while completely eliminating tipping.") If you say that your mission is "to make the world more open and connected," then you have the ruthlessness, and the facility with euphemism, to actually do it.

    Elsewhere in dystopian blockchain fiction, here is a story about doomsday preppers who are hoarding bitcoins against the apocalypse. Doomsday prepping and bitcoin enthusiasm go well together psychologically: Both involve distrust of modern social systems, and both tap into deep libertarian and self-sufficiency themes. But they don't go at all well together logically: If modern society is wiped out in some massive catastrophe, it seems unlikely that the electric grid and global internet infrastructure will survive to run an energy-hungry blockchain for a currency with no physical form that even now basically can't be used to buy anything. But the bitcoin/apocalypse enthusiasts are undeterred:


Page 2 of 47: Online Bookmarks of M. Fioretti, tagged with "facebook"

Propulsed by SemanticScuttle