mfioretti: facebook* + surveillance*

Bookmarks on this page are managed by an admin user.

119 bookmark(s)

  1. “I believe it’s important to tell people exactly how the information that they share on Facebook is going to be used.

    “That’s why, every single time you go to share something on Facebook, whether it’s a photo in Facebook, or a message, every single time, there’s a control right there about who you’re going to be sharing it with ... and you can change that and control that in line.

    “To your broader point about the privacy policy ... long privacy policies are very confusing. And if you make it long and spell out all the detail, then you’re probably going to reduce the per cent of people who read it and make it accessible to them.”
    https://www.theguardian.com/technolog...testimony-to-congress-the-key-moments
  2. Should there be regulation?
    Yes. On privacy disclosure, and on prohibiting the most draconian uses of user data. It should not be possible for users to give those rights up in exchange for use of a social system like Facebook. The idea is similar to the law in California that says most non-competes are not enforceable. The benefit you receive has to be somewhat equivalent to the data you give up.
    What about Google, Apple, Amazon?
    This is the really important stuff.
    This affair should get users, government and the press to look at other tech companies whose business models are based on getting users to disclose ever-more-intimate information. Here are some examples.
    Google, through Android, knows every place you go. They use that data. Do they sell it? I don't know, but I'm pretty sure you can use it to target ads. Apple, through the iPhone, also knows where you go.
    Apps on Android or iPhones can be told where you go. Many of them are only useful if you let them have the info. Apps can also have all your pictures and contacts. Face recognition makes it possible to construct a social graph without any access to the Facebook API.
    Google and Apple can listen to all your phone calls.
    Google, through their Chrome browser, knows everywhere you go on the web, and everything you type into the browser.
    Amazon Echo and Google Home are always listening. Imagine a leak based on conversations at home, phone calls, personal habits, arguments you have with your spouse or kids, any illegal activities that might be going on in your home.
    If you have a Gmail account, Google reads your mail, and targets ads at you based on what you're writing about. They also read the email that people send to you, people who may not themselves be Gmail users. Some examples of how creepy this can be: they seem to know what my investments are, which I assume they figured out through email. Recently they told me when a friend's flight to NYC was arriving. I don't know how they made this connection. I assume it was through email.
    Amazon, of course, knows everything you buy through Amazon.
    Google knows everything you search for.
    And on and on. We've reconstructed our whole society around companies having all the data about us that they want. It's kind of funny that we're all freaking out about Cambridge Analytica and Facebook. The problem is so much bigger.
    Summary
    It seems like a non-event to me. The press knew all about the API going back to 2012. That they didn't foresee the problem then is a result of the press accepting the hype of big tech companies on their terms, and not trying to find out what the implications of technology are from non-partisan experts. This was a story that could have and should have been written in 2010, warning users of a hidden cost to Facebook.
    Today's scandal, the equivalent of the one in 2010, is that Google is attempting to turn the web into a corporate platform. Once they control the web as Facebook controls the Social Graph, we'll have another impossibly huge problem to deal with. Better to head this one off with regulation, now, when it can do some good.
    http://scripting.com/2018/04/11/140429.html
  3. After Barack Obama won reelection in 2012, voter targeting and other uses of Big Data in campaigns were all the rage. The following spring, at a conference titled Data-Crunched Democracy that Turow organized with Daniel Kreiss of the University of North Carolina, I listened as Ethan Roeder, the head of data analytics for Obama 2012, railed against critics. “Politicians exist to manipulate you,” he said, “and that is not going to change, regardless of how information is used.” He continued: “OK, maybe we have a new form of manipulation, we have micro-manipulation, but what are the real concerns? What is the real problem that we see with the way information is being used? Because if it’s manipulation, that ship has long since sailed.” To Roeder, the bottom line was clear: “Campaigns do not care about privacy. All campaigns care about is winning.”

    A few of us at the conference, led by the sociologist Zeynep Tufekci, argued that because individual voter data was being weaponized with behavioral-science insights in ways that could be finely tuned and also deployed outside of public view, the potential now existed to engineer the public toward outcomes that wealthy interests would pay dearly to control. No one listened. Until last year, you could not get a major US foundation to put a penny behind efforts to monitor and unmask these new forms of hidden persuasion.

    If there’s any good news in the last week of revelations about the data firm Cambridge Analytica’s 2014 acquisition (and now-notorious 2016 use) of the profile data of 50 million Facebook members, it’s this: Millions of people are now awake to just how naked and exposed they are in the public sphere. And clearly, people care a lot more about political uses of their personal data than they do about someone trying to sell them a pair of shoes. That’s why so many people are suddenly talking about deleting their Facebook accounts.
    http://www.other-news.info/2018/03/po...eeds-to-be-restored-to-internet-users
  4. None of this will be legal under the #GDPR. (See one reason why at https://t.co/HXOQ5gb4dL). Publishers and brands need to take care to stop using personal data in the RTB system. Data connections to sites (and apps) have to be carefully controlled by publishers.

    So far, #adtech’s trade body has been content to cover over this wholesale personal data leakage with meaningless gestures that purport to address the #GDPR (see my note on @IABEurope current actions here: https://t.co/FDKBjVxqBs). It is time for a more practical position.

    And advertisers, who pay for all of this, must start to demand that safe, non-personal data take over in online RTB targeting. RTB works without personal data. Brands need to demand this to protect themselves – and all Internet users too. @dwheld @stephan_lo @BobLiodice

    Websites need to control
    1. which data they release into the RTB system
    2. whether ads render directly in visitors’ browsers (where DSPs’ JavaScript can drop trackers)
    3. what 3rd parties get to be on their page
    @jason_kint @epc_angela @vincentpeyregne @earljwilkinson 11/12

    Let’s work together to fix this. 12/12
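    One concrete mechanism behind the third control above — deciding which third parties get to run on the page — is a Content-Security-Policy response header, which tells the browser to refuse scripts and requests from any origin the publisher has not explicitly allowed. A minimal sketch (the domains are hypothetical placeholders, not a recommendation):

    ```http
    Content-Security-Policy: script-src 'self' https://cdn.trusted-ads.example; img-src 'self' https:; connect-src 'self'
    ```

    With a policy like this, a DSP script injected by an ad creative from an unlisted origin simply will not execute, so it cannot drop trackers in the visitor's browser.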

    Those last three recommendations are all good, but they also assume that websites, advertisers and their third-party agents are the ones with the power to do something. Not readers.

    But there’s lots readers will be able to do. More about that shortly. Meanwhile, publishers can get right with readers by dropping #adtech and going back to publishing the kind of high-value brand advertising they’ve run since forever in the physical world.

    That advertising, as Bob Hoffman (@adcontrarian) and Don Marti (@dmarti) have been making clear for years, is actually worth a helluva lot more than adtech, because it delivers clear creative and economic signals and comes with no cognitive overhead (for example, wondering where the hell an ad comes from and what it’s doing right now).

    As I explain here, “Real advertising wants to be in a publication because it values the publication’s journalism and readership” while “adtech wants to push ads at readers anywhere it can find them.”

    Going back to real advertising is the easiest fix in the world, but so far it’s nearly unthinkable because we’ve been defaulted for more than twenty years to an asymmetric power relationship between people and publishers called client-server. I’ve been told that client-server was chosen as the name for this relationship because “slave-master” didn’t sound so good; but I think the best way to visualize it is calf-cow:

    As I put it at that link (way back in 2012), Client-server, by design, subordinates visitors to websites. It does this by putting nearly all responsibility on the server side, so visitors are just users or consumers, rather than participants with equal power and shared responsibility in a truly two-way relationship between equals.
    http://blogs.harvard.edu/doc/2018/03/23/nothing
  5. As problematic as Facebook has become, it represents only one component of a much broader shift into a new human connectivity that is both omnipresent (consider the smartphone) and hypermediated—passing through and massaged by layer upon layer of machinery carefully hidden from view. The upshot is that it’s becoming increasingly difficult to determine what in our interactions is simply human and what is machine-generated. It is becoming difficult to know what is real.

    Before the agents of this new unreality finish this first phase of their work and then disappear completely from view to complete it, we have a brief opportunity to identify and catalogue the processes shaping our drift to a new world in which reality is both relative and carefully constructed by others, for their ends. Any catalogue must include at least these four items:

    the monetisation of propaganda as ‘fake news’;
    the use of machine learning to develop user profiles accurately measuring and modelling our emotional states;
    the rise of neuromarketing, targeting highly tailored messages that nudge us to act in ways serving the ends of others;
    a new technology, ‘augmented reality’, which will push us to sever all links with the evidence of our senses.



    The fake news stories floated past as jetsam on Facebook’s ‘newsfeed’, that continuous stream of shared content drawn from a user’s Facebook contacts, a stream generated by everything everyone else posts or shares. A decade ago that newsfeed had a raw, unfiltered quality, the notion that everyone was doing everything, but as Facebook has matured it has engaged increasingly opaque ‘algorithms’ to curate (or censor) the newsfeed, producing something that feels much more comfortable and familiar.

    This seems like a useful feature to have, but the taming of the newsfeed comes with a consequence: Facebook’s billions of users compose their world view from what flows through their feeds. Consider the number of people on public transport—or any public place—staring into their smartphones, reviewing their feeds, marvelling at the doings of their friends, reading articles posted by family members, sharing video clips or the latest celebrity outrages. It’s an activity now so routine we ignore its omnipresence.

    Curating that newsfeed shapes what Facebook’s users learn about the world. Some of that content is controlled by the user’s ‘likes’, but a larger part is derived from Facebook’s deep analysis of a user’s behaviour. Facebook uses ‘cookies’ (invisible bits of data hidden within a user’s web browser) to track the behaviour of its users even when they’re not on the Facebook site—and even when they’re not users of Facebook. Facebook knows where its users spend time on the web, and how much time they spend there. All of that allows Facebook to tailor a newsfeed to echo the interests of each user. There’s no magic to it, beyond endless surveillance.
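    The mechanics of that cross-site surveillance are mundane. A hypothetical sketch of the server side of a third-party tracking pixel (all names invented, and deliberately simplified — not Facebook's actual implementation): because the browser sends the same tracker cookie with every request to the pixel's domain, one server can link a single visitor's browsing across every unrelated site that embeds it.

    ```python
    import uuid
    from http import cookies
    from urllib.parse import urlparse

    # visitor_id -> list of sites where the pixel fired
    visit_log = {}

    def handle_pixel_request(cookie_header, referer):
        """Handle one hit on the tracking pixel.

        Returns (visitor_id, set_cookie_header_or_None)."""
        jar = cookies.SimpleCookie(cookie_header or "")
        if "tracker_id" in jar:
            visitor_id = jar["tracker_id"].value
            set_cookie = None  # returning visitor: reuse the existing ID
        else:
            visitor_id = uuid.uuid4().hex  # first sighting: mint a new ID
            set_cookie = f"tracker_id={visitor_id}; Max-Age=31536000"
        site = urlparse(referer).netloc  # which page embedded the pixel
        visit_log.setdefault(visitor_id, []).append(site)
        return visitor_id, set_cookie

    # Simulate the same browser visiting two unrelated sites:
    vid, header = handle_pixel_request(None, "https://news.example/story")
    vid2, _ = handle_pixel_request(f"tracker_id={vid}", "https://shop.example/cart")
    assert vid == vid2  # same visitor recognised on both sites
    print(visit_log[vid])  # ['news.example', 'shop.example']
    ```

    The point of the sketch: the tracker never needs the visitor's name, only a stable identifier, and the embedding sites never see the profile being assembled about their own readers.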

    What is clear is that Facebook has the power to sway the moods of billions of users. Feed people a steady diet of playful puppy videos and they’re likely to be in a happier mood than people fed images of war. Over the last two years, that capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.

    We all like to believe we’re in the right, and when we get some sign from the universe at large that we are correct, we feel better about ourselves. That’s how the curated newsfeed became wedded to the world of profitable propaganda.

    Adding a little art to brighten an otherwise dull wall seems like an unalloyed good, but only if one completely ignores bad actors. What if that blank canvas gets painted with hate speech? What if, perchance, the homes of ‘undesirables’ are singled out with graffiti that only bad actors can see? What happens when every gathering place for any oppressed community gets invisibly ‘tagged’? In short, what happens when bad actors use Facebook’s augmented reality to amplify their own capacity to act badly?

    But that’s Zuckerberg: he seems to believe his creations will only be used to bring out the best in people. He seems to believe his gigantic sharing network would never be used to incite mob violence. Just as he seems to claim that Facebook’s capacity to collect and profile the moods of its users should never be monetised—but, given that presentation unearthed by the Australian, Facebook tells a different story to advertisers.

    Regulating Facebook enshrines its position as the data-gathering and profile-building organisation, while keeping it plugged into and responsive to the needs of national powers. Before anyone takes steps that would cement Facebook in our social lives for the foreseeable future, it may be better to consider how this situation arose, and whether—given what we now know—there might be an opportunity to do things differently.
    https://meanjin.com.au/essays/the-last-days-of-reality
  6. Mark Zuckerberg also launched Facebook with a disdain for intrusive advertising, but it wasn’t long before the social network giant became Google’s biggest competitor for ad dollars. After going public with 845 million users in 2012, Facebook became a multibillion-dollar company and Zuckerberg one of the richest men on Earth, but with only a promise that the company would figure out how to monetize its platform.

    Facebook ultimately sold companies on its platform by promising “brand awareness” and the best possible data on what consumers actually liked. Brands could start their own Facebook pages, which people would actually “like” and interact with. This provided unparalleled information about what company each individual person wanted to interact with the most. By engaging with companies on Facebook, people gave corporate marketing departments more information than they could have ever dreamed of buying, but here it was offered up free.

    This was the “grand bargain,” as Columbia University law professor Tim Wu called it in his book, The Attention Merchants, that users struck with corporations. Wu wrote that Facebook’s “billions of users worldwide were simply handing over a treasure trove of detailed demographic data and exposing themselves to highly targeted advertising in return for what, exactly?”

    In other words: We will give you every detail of our lives and you will get rich by selling that information to advertisers.

    European regulators are now saying that bargain was a bad deal. The big question that remains is whether their counterparts in the U.S. will follow their lead.
    https://www.huffingtonpost.com/entry/...antitrust_us_5a625023e4b0dc592a088f6c
  7. Facebook's ability to figure out the "people we might know" is sometimes eerie. Many a Facebook user has been creeped out when a one-time Tinder date or an ex-boss from 10 years ago suddenly pops up as a friend recommendation. How does the big blue giant know?

    While some of these incredibly accurate friend suggestions are amusing, others are alarming, such as this story from Lisa*, a psychiatrist who is an infrequent Facebook user, mostly signing in to RSVP for events. Last summer, she noticed that the social network had started recommending her patients as friends—and she had no idea why.

    "I haven't shared my email or phone contacts with Facebook," she told me over the phone.

    The next week, things got weirder.

    Most of her patients are senior citizens or people with serious health or developmental issues, but she has one outlier: a 30-something snowboarder. Usually, Facebook would recommend he friend people his own age, who snowboard and jump out of planes. But Lisa told me that he had started seeing older and infirm people, such as a 70-year-old
    https://splinternews.com/facebook-rec...s-psychiatrists-patients-f-1793861472
    by M. Fioretti (2018-01-28)
  8. Europe has pulled ahead of the United States when it comes to constraining the abuses of Big Tech. In June, the European Union fined Google $2.7 billion for steering web users to its shopping site, and investigations remain active over similar treatment on Android phones. European regulators fined Facebook for lying about whether it could match user profiles with phone numbers on its messaging acquisition WhatsApp. They demanded Apple repay $15.3 billion in back taxes in Ireland. And they forced Amazon to change its e-book contracts, which they claimed inappropriately squeezed publishers.
    Trust-Busted: In 2002, Microsoft Chairman Bill Gates had to testify at federal court in his company's antitrust case. The public trial led Microsoft to soften its aggressive strategy against rivals.

    Unfortunately, these actions were treated mainly as the cost of doing business. The Facebook fine totaled not even 1 percent of the $22 billion purchase price for WhatsApp, and it allowed the two companies to remain partnered. Government policy, in effect, has “told these companies that the smart thing to do is to lie to us and break the law,” said Scott Galloway in his presentation. Google’s remedy in the shopping case still forces rivals to bid for placement at the top of the page, with Google Shopping spun off as a stand-alone competitor. This does weaken Google’s power and solves the “equal treatment” problem, but it doesn’t protect consumers, who will ultimately pay for those costly bids. “The EU got a $2.7 billion fine to hold a party and bail out Greek banks,” said Gary Reback, an antitrust lawyer and critic of the EU’s actions. “No amount of money will make a difference.”

    However, one thing might: Europe’s increasing move toward data privacy. The General Data Protection Regulation (GDPR), scheduled for implementation in May 2018, empowers European web users to affirmatively opt out of having their data collected, with high penalties for non-compliance. Consumers will be able to obtain their personal data and learn how it is used. They can request that their data be erased completely (known as the “right to be forgotten”) as well as prohibited from sale to third parties. Platforms could not condition use of their products on data collection. A separate, not-yet-finalized regulation called ePrivacy would forbid platforms from tracking users across separate apps, websites, and devices.
    http://prospect.org/article/big-tech-new-predatory-capitalism
  9. When Facebook first came to Cambodia, many hoped it would help to usher in a new period of free speech, amplifying voices that countered the narrative of the government-friendly traditional press. Instead, the opposite has happened. Prime Minister Hun Sen is now using the platform to promote his message while jailing his critics, and his staff is doing its best to exploit Facebook’s own rules to shut down criticism — all through a direct relationship with the company’s staff.

    In Cambodia, Prime Minister Hun Sen has held power since 1998, a reign characterized by systematic looting, political patronage and violent suppression of human rights; when opposition parties used Facebook to organize a strong showing in the 2013 elections, Hun Sen turned to the tool to consolidate his slipping hold on power.

    In this he was greatly aided by Fresh News, a Facebook-based political tabloid analogous to far-right partisan US news sources like Breitbart, which acted as a literal stenographer for Hun Sen, transcribing his remarks in "scoops" that vilify opposition figures and dissidents without evidence. Hun Sen and Fresh News successfully forced an opposition leader into exile in France, and mined Facebook for the identities of political opponents, who were targeted for raids and arrests.

    The Cambodian government has cultivated a deep expertise in Facebook's baroque acceptable conduct rules, and they use this expertise to paint opposition speech as in violation of Facebook's policies, using the company's anti-abuse systems to purge their rivals from the platform.

    Offline, the government has targeted the independent press with raids and arrests, shutting down most of the media it does not control, making Facebook -- where the government is able to silence people with its rules-lawyering -- the only place for independent analysis and criticism of the state.

    Then, last October, Facebook used Cambodia in an experiment to de-emphasize news sources in people's feeds -- a change it will now roll out worldwide -- and hid those remaining independent reporters from the nation's view.

    Opposition figures have worked with independent researchers to show that the government is buying Facebook likes from clickfarms in the Philippines and India, racking up thousands of likes for Khmer-language posts in territories where Khmer isn't spoken. They reported these abuses to Facebook, hoping to get government posts downranked, but Facebook executives gave them the runaround or refused to talk to them. No action was taken on these violations of Facebook's rules.

    Among other things, the situation in Cambodia is a cautionary tale on the risks of "anti-abuse" policies, which are often disproportionately useful to trolls who devote long hours and careful study to staying on the right side of the lines that companies draw up, and scour systems for people they taunt into violations of these rules, getting the platforms to terminate them.

    When ordinary Facebook users find a post objectionable, they click a link on the post to report it. Then a Facebook employee judges whether it violates the platform’s rules and should be taken down. In practice, it’s a clunky process that involves no direct communication or chance for appeal, and the decisions made by Facebook can seem mysterious and arbitrary.

    But for the Cambodian government, that process has been streamlined by Facebook.

    Duong said every couple of months, his team would email an employee they work with at Facebook to request a set of accounts be taken down, either based on language they used or because their accounts did not appear to be registered to their real names, a practice Facebook’s rules forbid. Facebook often complies, he said.

    Clare Wareing, a spokesperson for Facebook, said the company removes “credible threats, hate speech, and impersonation profiles when we’re made aware of them.” Facebook says it only takes down material that violates its policies.
    https://www.buzzfeed.com/meghara/face...racy?utm_term=.or3XYz3wNX#.wpJgoz5Lvg
  10. dismissing Facebook’s change as a mere strategy credit is perhaps to give short shrift to Zuckerberg’s genuine desire to leverage Facebook’s power to make the world a better place. Zuckerberg argued in his 2017 manifesto Building Global Community:

    Progress now requires humanity coming together not just as cities or nations, but also as a global community. This is especially important right now. Facebook stands for bringing us closer together and building a global community. When we began, this idea was not controversial. Every year, the world got more connected and this was seen as a positive trend. Yet now, across the world there are people left behind by globalization, and movements for withdrawing from global connection. There are questions about whether we can make a global community that works for everyone, and whether the path ahead is to connect more or reverse course.

    Our job at Facebook is to help people make the greatest positive impact while mitigating areas where technology and social media can contribute to divisiveness and isolation. Facebook is a work in progress, and we are dedicated to learning and improving. We take our responsibility seriously.

    That, though, leaves the question I raised in response to that manifesto:

    Even if Zuckerberg is right, is there anyone who believes that a private company run by an unaccountable all-powerful person that tracks your every move for the purpose of selling advertising is the best possible form said global governance should take?

    My deep-rooted suspicion of Zuckerberg’s manifesto has nothing to do with Facebook or Zuckerberg; I suspect that we agree on more political goals than not. Rather, my discomfort arises from my strong belief that centralized power is both inefficient and dangerous: no one person, or company, can figure out optimal solutions for everyone on their own, and history is riddled with examples of central planners ostensibly acting with the best of intentions — at least in their own minds — resulting in the most horrific of consequences; those consequences sometimes take the form of overt costs, both economic and humanitarian, and sometimes those costs are foregone opportunities and innovations. Usually it’s both.

    Facebook’s stated reasoning for this change only heightens these contradictions: if indeed Facebook as-is harms some users, fixing that is a good thing. And yet the same criticism becomes even more urgent: should the personal welfare of 2 billion people be Mark Zuckerberg’s personal responsibility?
    https://stratechery.com/2018/facebooks-motivations



About - Propulsed by SemanticScuttle