mfioretti: control*

Bookmarks on this page are managed by an admin user.

430 bookmark(s), sorted by date (newest first)

  1. Stratumseind in Eindhoven is one of the busiest nightlife streets in the Netherlands. On a Saturday night, bars are packed, music blares through the street, and laughter and drunken shouting bounce off the walls. As the night progresses, the ground becomes littered with empty shot bottles, energy drink cans, cigarette butts and broken glass.

    It’s no surprise that the place is also known for its frequent fights. To change that image, Stratumseind has become one of the “smartest” streets in the Netherlands. Lamp-posts have been fitted with wifi trackers, cameras and 64 microphones that can detect aggressive behaviour and alert police officers to altercations. An earlier experiment that changed the light intensity to alter people’s mood failed. The next plan, starting this spring, is to diffuse the smell of oranges to calm people down. The aim? To make Stratumseind a safer place.


    All the while, data is being collected and stored. “Visitors do not realise they are entering a living laboratory,” says Maša Galic, a researcher on privacy in public space at the Tilburg Institute for Law, Technology, and Society. Since the data on Stratumseind is used to profile, nudge or actively target people, this “smart city” experiment is subject to privacy law. According to the Dutch Personal Data Protection Act, people should be notified in advance of data collection and the purpose should be specified – but in Stratumseind, as in many other “smart cities”, this is not the case.

    Peter van de Crommert is involved at Stratumseind as project manager with the Dutch Institute for Technology, Safety and Security. He says visitors do not have to worry about their privacy: the data is about crowds, not individuals. “We often get that comment – ‘Big brother is watching you’ – but I prefer to say, ‘Big brother is helping you’. We want safe nightlife, but not a soldier on every street corner.”
    Revellers in Eindhoven’s Stratumseind celebrate King’s Day. Photograph: Filippo Manaresi/Moment Editorial/Getty Images

    When we think of smart cities, we usually think of big projects: Songdo in South Korea, the IBM control centre in Rio de Janeiro or the hundreds of new smart cities in India. More recent developments include Toronto, where Google will build an entirely new smart neighbourhood, and Arizona, where Bill Gates plans to build his own smart city. But the reality of the smart city is that it has stretched into the everyday fabric of urban life – particularly so in the Netherlands.

    In the eastern city of Enschede, city traffic sensors pick up your phone’s wifi signal even if you are not connected to the wifi network. The trackers register your MAC address, the unique identifier of the network card in a smartphone (a sketch of how this kind of passive tracking works appears at the end of this excerpt). The city council wants to know how often people visit Enschede, and what their routes and preferred spots are. Dave Borghuis, an Enschede resident, was not impressed and filed an official complaint. “I don’t think it’s okay for the municipality to track its citizens in this way,” he said. “If you walk around the city, you have to be able to imagine yourself unwatched.”

    Enschede is enthusiastic about the advantages of the smart city. The municipality says it is saving €36m in infrastructure investments by launching a smart traffic app that rewards people for good behaviour like cycling, walking and using public transport. (Ironically, one of the rewards is a free day of private parking.) Only those who mine the small print will discover that the app creates “personal mobility profiles”, and that the collected personal data belongs to the company Mobidot.
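
    A minimal sketch of the kind of passive wifi tracking described above, assuming a wireless card already in monitor mode (the interface name wlan0mon and the use of the scapy library are illustrative assumptions; this is not the actual Enschede or Stratumseind system). Phones periodically broadcast wifi probe requests even when they are not connected to a network, and anything listening can record the transmitting MAC address:

      from scapy.all import sniff
      from scapy.layers.dot11 import Dot11ProbeReq

      seen = set()  # unique MAC addresses observed so far

      def handle(pkt):
          # Probe requests are broadcast by phones scanning for known networks.
          if pkt.haslayer(Dot11ProbeReq):
              mac = pkt.addr2  # transmitter address of the probing device
              if mac and mac not in seen:
                  seen.add(mac)
                  print(f"{len(seen)} unique devices seen (latest: {mac})")

      # Note: modern phones randomize their MAC address while scanning,
      # which partly blunts this kind of tracking.
      sniff(iface="wlan0mon", prn=handle, store=False)

    Counting unique addresses yields crowd-level figures; storing the raw addresses is what turns the same setup into tracking of individuals.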
  2. Finally, there’s what the authors call “political security” – using AI to automate tasks involved in surveillance, persuasion (creating targeted propaganda) and deception (eg, manipulating videos). We can also expect new kinds of attack based on machine-learning’s capability to infer human behaviours, moods and beliefs from available data. This technology will obviously be welcomed by authoritarian states, but it will also further undermine the ability of democracies to sustain truthful public debates. The bots and fake Facebook accounts that currently pollute our public sphere will look awfully amateurish in a couple of years.

    The report is available as a free download and is worth reading in full. If it were about the dangers of future or speculative technologies, it might be reasonable to dismiss it as academic scare-mongering. The alarming thing is that most of the problematic capabilities its authors envisage are already available, and in many cases already embedded in the networked services we use every day. William Gibson was right: the future has already arrived.
  3. Google tracks you on more than just their search engine. You may realize they also track you on YouTube, Gmail, Chrome, Android, Google Maps, and all the other services they run. For those, we recommend private alternatives, such as DuckDuckGo for search. Yes, you can live Google-free. I’ve been doing it for many years.

    What you may not realize, though, is that Google trackers are actually lurking behind the scenes on 75% of the top million websites. To give you a sense of how large that is, Facebook is the next closest with 25%. It’s a good bet that any random site you land on across the Internet will have a Google tracker hiding on it. Between the two of them, they are truly dominating online advertising, by some measures literally making up 74%+ of all its growth. A key component of how they have managed to do that is through all these hidden trackers.

    Google Analytics is installed on most sites, tracking you behind the scenes, letting website owners know who is visiting their sites, but also feeding that information back to Google (a rough way to spot such hidden trackers is sketched just below). The same goes for the ads themselves, with Google running three of the largest non-search ad networks, installed on millions of sites and apps: AdSense, AdMob, and DoubleClick.
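
    As an illustration of what “hidden trackers” means in practice (my own rough sketch in Python, not the crawl methodology behind the 75% figure above), a page can be checked for references to well-known Google tracking domains in its HTML. It only catches trackers referenced directly in the initial page, not ones injected later by other scripts:

      import re
      import urllib.request

      # Well-known Google tracking/advertising domains.
      GOOGLE_TRACKER_DOMAINS = [
          "google-analytics.com",
          "googletagmanager.com",
          "doubleclick.net",
          "googlesyndication.com",
          "googleadservices.com",
      ]

      def google_trackers_on(url: str) -> list:
          """Return the Google tracker domains referenced in a page's HTML."""
          html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
          return [d for d in GOOGLE_TRACKER_DOMAINS if re.search(re.escape(d), html)]

      print(google_trackers_on("https://example.com/"))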

    You know those ads that creepily follow you around everywhere? Most of those are actually run through these Google ad networks, which let advertisers target you against your search history, browsing history, location history and other personal information they collect. Even less well known is that they also enable advertisers, such as airlines, to charge you different prices based on your personal information.

    These ads are not only annoying — through targeting, they are designed to manipulate you into buying more things, and merely showing them to you means Google is profiting from your personal information.

    At DuckDuckGo, we’ve expanded beyond our roots in search to protect you no matter where you go on the Internet. Our DuckDuckGo browser extension and mobile app are available for all major browsers and devices, and block these Google trackers, along with the ones from Facebook and countless other data brokers. They do even more to protect you as well, such as providing smarter encryption.

    #3 — Get unbiased results, outside the Filter Bubble.

    When you search, you expect unbiased results, but that’s not what you get on Google. On Google, you get results tailored to what they think you’re likely to click on, based on the data profile they’ve built on you over time from all that tracking I described above.

    That may appear at first blush to be a good thing, but when most people say they want personalization in a search context, they actually want localization. They want local weather and restaurants, which can be provided without tracking, as we do at DuckDuckGo. That’s because approximate location info is automatically embedded by your computer in the search request; we can use it to serve you local results and then immediately throw it away, without tracking you (see the sketch at the end of this excerpt).

    Beyond localization, personalized results are dangerous because, to show you results they think you’ll click on, they must filter out the results they think you’ll skip. That’s why it’s called the Filter Bubble.

    So if you have political leanings one way or another, you’re more likely to get results you already agree with, and less likely to ever see opposing viewpoints. In the aggregate, this reinforces echo chambers that significantly contribute to our increasingly polarized society.

    This Filter Bubble is especially pernicious in a search context because you have the expectation that you’re seeing what others are seeing, that you’re seeing the “results.” We’ve done studies over the years in which we had people search for the same topics on Google at the same time and in “Incognito” mode, and found that the results they got were significantly tailored.
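
    A minimal sketch of the request-scoped localization described above, assuming a MaxMind GeoLite2 city database on disk and a stand-in local_results() ranking helper (DuckDuckGo's actual implementation is not public): the client's IP address is resolved to a coarse location, used for this one response, and never stored.

      import geoip2.database
      from geoip2.errors import AddressNotFoundError
      from typing import Optional

      reader = geoip2.database.Reader("GeoLite2-City.mmdb")  # assumed local DB file

      def local_results(query: str, city: Optional[str]) -> list:
          # Hypothetical stand-in for a real ranking function that boosts
          # nearby weather, restaurants, etc. when a city is known.
          return [f"{query} near {city}" if city else query]

      def localized_search(query: str, client_ip: str) -> list:
          """Serve local results for one request without keeping any record of it."""
          try:
              city = reader.city(client_ip).city.name  # coarse, city-level only
          except AddressNotFoundError:
              city = None
          results = local_results(query, city)
          # Neither the IP nor the derived city is logged or attached to a profile:
          # both go out of scope when this function returns.
          return results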
  4. IoT will be able to take stock of your choices, moods, preferences and tastes, the same way Google Search does. With enough spreadsheets, many practical questions are rendered trivial. How hard will it be for the IoT — maybe through Alexa, maybe through your phone — to statistically study why, where and when you raise your voice at your child? If you can correlate people’s habits and physical attributes, it will be toddler-easy to correlate mood to environment. The digitally connected devices of tomorrow would be poor consumer products if they did not learn you well. Being a good and faithful servant means monitoring the master closely, and that is what IoT devices will do. They will analyze your feedback and automate their responses — and predict your needs. In the IoT, Big Data is weaponized, and can peer deeper into the seeds of your life than the government has ever dreamed.
  5. Mark Zuckerberg also launched Facebook with a disdain for intrusive advertising, but it wasn’t long before the social network giant became Google’s biggest competitor for ad dollars. After going public with 845 million users in 2012, Facebook became a multibillion-dollar company and Zuckerberg one of the richest men on Earth, but with only a promise that the company would figure out how to monetize its platform.

    Facebook ultimately sold companies on its platform by promising “brand awareness” and the best possible data on what consumers actually liked. Brands could start their own Facebook pages, which people would actually “like” and interact with. This provided unparalleled information about what company each individual person wanted to interact with the most. By engaging with companies on Facebook, people gave corporate marketing departments more information than they could have ever dreamed of buying, but here it was offered up free.

    This was the “grand bargain,” as Columbia University law professor Tim Wu called it in his book, The Attention Merchants, that users struck with corporations. Wu wrote that Facebook’s “billions of users worldwide were simply handing over a treasure trove of detailed demographic data and exposing themselves to highly targeted advertising in return for what, exactly?”

    In other words: We will give you every detail of our lives and you will get rich by selling that information to advertisers.

    European regulators are now saying that bargain was a bad deal. The big question that remains is whether their counterparts in the U.S. will follow their lead.
  6. Facebook's ability to figure out the "people we might know" is sometimes eerie. Many a Facebook user has been creeped out when a one-time Tinder date or an ex-boss from 10 years ago suddenly pops up as a friend recommendation. How does the big blue giant know?

    While some of these incredibly accurate friend suggestions are amusing, others are alarming, such as this story from Lisa*, a psychiatrist who is an infrequent Facebook user, mostly signing in to RSVP for events. Last summer, she noticed that the social network had started recommending her patients as friends—and she had no idea why.

    "I haven't shared my email or phone contacts with Facebook," she told me over the phone.

    The next week, things got weirder.

    Most of her patients are senior citizens or people with serious health or developmental issues, but she has one outlier: a 30-something snowboarder. Usually, Facebook would recommend that he friend people his own age, who snowboard and jump out of planes. But Lisa told me that he had started seeing older and infirm people, such as a 70-year-old […]
    by M. Fioretti (2018-01-28)
  7. Europe has moved far ahead of the United States when it comes to constraining the abuses of Big Tech. In June, the European Union fined Google $2.7 billion for steering web users to its shopping site, and investigations remain active over similar treatment on Android phones. European regulators fined Facebook for lying about whether it could match user profiles with phone numbers on its messaging acquisition WhatsApp. They demanded Apple repay $15.3 billion in back taxes in Ireland. And they forced Amazon to change its e-book contracts, which they claimed inappropriately squeezed publishers.
    Trust-Busted: In 2002, Microsoft Chairman Bill Gates had to testify in federal court in his company's antitrust case. The public trial led Microsoft to soften its aggressive strategy against rivals. (AP Photo/Rick Bowmer)

    Unfortunately, these actions were treated mainly as the cost of doing business. The Facebook fine totaled not even 1 percent of the $22 billion purchase price for WhatsApp, and it allowed the two companies to remain partnered. Government policy, in effect, has “told these companies that the smart thing to do is to lie to us and break the law,” said Scott Galloway in his presentation. Google’s remedy in the shopping case still forces rivals to bid for placement at the top of the page, with Google Shopping spun off as a stand-alone competitor. This does weaken Google’s power and solves the “equal treatment” problem, but it doesn’t protect consumers, who will ultimately pay for those costly bids. “The EU got a $2.7 billion fine to hold a party and bail out Greek banks,” said Gary Reback, an antitrust lawyer and critic of the EU’s actions. “No amount of money will make a difference.”

    However, one thing might: Europe’s increasing move toward data privacy. The General Data Protection Regulation (GDPR), scheduled for implementation in May 2018, empowers European web users to affirmatively opt out of having their data collected, with high penalties for non-compliance. Consumers will be able to obtain their personal data and learn how it is used. They can request that their data be erased completely (known as the “right to be forgotten”) as well as prohibited from sale to third parties. Platforms could not condition use of their products on data collection. A separate, not-yet-finalized regulation called ePrivacy would forbid platforms from tracking users across separate apps, websites, and devices.
  8. When Facebook first came to Cambodia, many hoped it would help to usher in a new period of free speech, amplifying voices that countered the narrative of the government-friendly traditional press. Instead, the opposite has happened. Prime Minister Hun Sen is now using the platform to promote his message while jailing his critics, and his staff is doing its best to exploit Facebook’s own rules to shut down criticism — all through a direct relationship with the company’s staff.

    In Cambodia, Prime Minister Hun Sen has held power since 1998, a reign characterized by systematic looting, political patronage and violent suppression of human rights; when opposition parties used Facebook to organize a strong showing in the 2013 elections, Hun Sen turned to the tool to consolidate his slipping hold on power.

    In this he was greatly aided by Fresh News, a Facebook-based political tabloid analogous to far-right partisan US news sources like Breitbart, which acted as a literal stenographer for Sen, transcribing his remarks in "scoops" that vilified opposition figures and dissidents without evidence. Sen and Fresh News successfully forced an opposition leader into exile in France, and mined Facebook for the identities of political opponents, who were targeted for raids and arrests.

    The Cambodian government has cultivated a deep expertise in Facebook's baroque acceptable conduct rules, and they use this expertise to paint opposition speech as in violation of Facebook's policies, using the company's anti-abuse systems to purge their rivals from the platform.

    Offline, the government has targeted the independent press with raids and arrests, shutting down most of the media it does not control, making Facebook – where the government is able to silence people with its rules-lawyering – the only place for independent analysis and criticism of the state.

    Then, last October, Facebook used Cambodia in an experiment to de-emphasize news sources in people's feeds – a change it will now roll out worldwide – and hid the remaining independent reporters from the nation's view.

    Opposition figures have worked with independent researchers to show that the government is buying Facebook likes from clickfarms in the Philippines and India, racking up thousands of likes for Khmer-language posts in territories where Khmer isn't spoken. They reported these abuses to Facebook, hoping to get government posts downranked, but Facebook executives gave them the runaround or refused to talk to them. No action was taken on these violations of Facebook's rules.

    Among other things, the situation in Cambodia is a cautionary tale about the risks of "anti-abuse" policies, which are often disproportionately useful to trolls: people who devote long hours and careful study to staying on the right side of the lines that companies draw up, who scour these systems for targets they can taunt into violating the rules, and who then get the platforms to terminate them.

    When ordinary Facebook users find a post objectionable, they click a link on the post to report it. Then a Facebook employee judges whether it violates the platform’s rules and should be taken down. In practice, it’s a clunky process that involves no direct communication or chance for appeal, and the decisions made by Facebook can seem mysterious and arbitrary.

    But for the Cambodian government, that process has been streamlined by Facebook.

    Duong said that every couple of months his team would email an employee they work with at Facebook to request that a set of accounts be taken down, either based on the language they used or because the accounts did not appear to be registered under their owners’ real names, which Facebook’s rules forbid. Facebook often complies, he said.

    Clare Wareing, a spokesperson for Facebook, said the company removes “credible threats, hate speech, and impersonation profiles when we’re made aware of them.” Facebook says it only takes down material that violates its policies.
  9. Some entries are ambiguous. Take Microsoft, under the “operational services” category. PayPal apparently supplies the tech company with an image of a customer (a photo or video, or their image from an identity document) for the purposes of “facial image comparison for fraud protection” and “research and testing as to appropriateness of new products.” The former sounds like some kind of facial recognition system that PayPal uses to look for fraud. But the latter is uncomfortably broad. What kind of research is Microsoft doing using pictures of PayPal users’ faces? PayPal did not comment on this specific question.
  10. Dismissing Facebook’s change as a mere strategy credit is perhaps to give short shrift to Zuckerberg’s genuine desire to leverage Facebook’s power to make the world a better place. Zuckerberg argued in his 2017 manifesto, Building Global Community:

    Progress now requires humanity coming together not just as cities or nations, but also as a global community. This is especially important right now. Facebook stands for bringing us closer together and building a global community. When we began, this idea was not controversial. Every year, the world got more connected and this was seen as a positive trend. Yet now, across the world there are people left behind by globalization, and movements for withdrawing from global connection. There are questions about whether we can make a global community that works for everyone, and whether the path ahead is to connect more or reverse course.

    Our job at Facebook is to help people make the greatest positive impact while mitigating areas where technology and social media can contribute to divisiveness and isolation. Facebook is a work in progress, and we are dedicated to learning and improving. We take our responsibility seriously.

    That, though, leaves the question I raised in response to that manifesto:

    Even if Zuckerberg is right, is there anyone who believes that a private company run by an unaccountable all-powerful person that tracks your every move for the purpose of selling advertising is the best possible form said global governance should take?

    My deep-rooted suspicion of Zuckerberg’s manifesto has nothing to do with Facebook or Zuckerberg; I suspect that we agree on more political goals than not. Rather, my discomfort arises from my strong belief that centralized power is both inefficient and dangerous: no one person, or company, can figure out optimal solutions for everyone on their own, and history is riddled with examples of central planners ostensibly acting with the best of intentions — at least in their own minds — resulting in the most horrific of consequences; those consequences sometimes take the form of overt costs, both economic and humanitarian, and sometimes those costs are foregone opportunities and innovations. Usually it’s both.

    Facebook’s stated reasoning for this change only heightens these contradictions: if indeed Facebook as-is harms some users, fixing that is a good thing. And yet the same criticism becomes even more urgent: should the personal welfare of 2 billion people be Mark Zuckerberg’s personal responsibility?
