mfioretti: data ownership* + big data*

Bookmarks on this page are managed by an admin user.

30 bookmark(s)

  1. So, in essence, we have given the designers of time and thought saving applications a grave responsibility. We have implicitly allowed them to choose for us what presumably will be in our best interests. I, for one, find the natural progression of that prospect extremely scary and amazingly it is all self-imposed!

    Google, Amazon and Facebook do this all the time by mining our data, targeting us with custom advertising and even creating profiles that in practice could rival ones that our intelligence agencies keep on criminals and terrorists.

    And here is the thing: We are all complicit in allowing this to happen. So the next time you turn on your GPS or phone, which, by the way, pinpoints your location to within twenty feet at any time of the day or night, remember the power you are ceding to that vast network in the sky. Do the powers that operate that network really have our best interests in mind, or will they one day decide to direct us all to drive off the proverbial cliff? I, for one, will be dusting off my old Rand McNally road maps.
    http://www.huffingtonpost.com/ralph-a...-crippling-effect-of-m_b_8285818.html
  2. A May 2014 White House report on “big data” notes that the ability to determine the demographic traits of individuals through algorithms and aggregation of online data has a potential downside beyond just privacy concerns: Systematic discrimination.

    There is a long history of denying access to bank credit and other financial services based on the communities from which applicants come — a practice called “redlining.” Likewise, the report warns, “Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants or recipients of credit.” (See materials from the report’s related research conference for scholars’ views on this and other issues.)

    One vexing problem, according to the report, is that digital discrimination is even less likely than traditional discrimination to be pinpointed, and therefore remedied.

    “Approached without care, data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision-makers, or simply reflect the widespread biases that persist in society. It can even have the perverse result of exacerbating existing inequalities by suggesting that historically disadvantaged groups actually deserve less favorable treatment.” The paper’s authors argue that the most likely legal basis for anti-discrimination enforcement, Title VII, is not currently adequate to stop many forms of discriminatory data mining, and that “society does not have a ready answer for what to do about it.”

    The 2014 paper “Digital Discrimination: The Case of Airbnb.com,” by Benjamin Edelman and Michael Luca, examined listings for thousands of New York City landlords in mid-2012. Airbnb builds its reputation system on ratings from guests and hosts.

    The study’s findings include:

    “The raw data show that non-black and black hosts receive strikingly different rents: roughly $144 versus $107 per night, on average.” However, the researchers had to control for a variety of factors that might skew an accurate comparison, such as differences in geographical location.
    “Controlling for all of these factors, non-black hosts earn roughly 12% more for a similar apartment with similar ratings and photos relative to black hosts.”
    “Despite the potential of the Internet to reduce discrimination, our results suggest that social platforms such as Airbnb may have the opposite effect. Full of salient pictures and social profiles, these platforms make it easy to discriminate — as evidenced by the significant penalty faced by a black host trying to conduct business on Airbnb.”

    “Given Airbnb’s careful consideration of what information is available to guests and hosts,” Edelman and Luca note, “Airbnb might consider eliminating or reducing the prominence of host photos: It is not immediately obvious what beneficial information these photos provide, while they risk facilitating discrimination by guests. Particularly when a guest will be renting an entire property, the guest’s interaction with the host will be quite limited, and we see no real need for Airbnb to highlight the host’s picture.” (For its part, Airbnb responded to the study by saying that it prohibits discrimination in its terms of service, and that the data analyzed were both older and limited geographically.)
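    The "controlling for all of these factors" step is essentially a regression of (log) nightly price on a host-race indicator plus controls. A minimal sketch of that idea, using synthetic, hypothetical numbers (none of these values or variable names come from the study's actual dataset):

```python
import numpy as np

# Hypothetical listings: black-host indicator and a single control
# (review score). Synthetic data, not Edelman & Luca's.
black  = np.array([0, 0, 0, 0, 1, 1, 1, 1])
rating = np.array([3.5, 4.0, 4.5, 5.0, 3.5, 4.0, 4.5, 5.0])

beta_true = np.array([4.50, -0.12, 0.10])   # intercept, race gap, rating effect
X = np.column_stack([np.ones_like(rating), black, rating])
log_price = X @ beta_true                   # noise-free, for a clean example

# Ordinary least squares recovers the planted coefficients exactly here.
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
race_gap = np.exp(beta[1]) - 1              # about -11%: "roughly 12% less"
```

    With noise-free synthetic data the regression recovers the planted -0.12 coefficient exactly; on real listings the gap would be estimated with uncertainty and with many more controls (location, photos, amenities).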
    http://journalistsresource.org/studie...racial-discrimination-research-airbnb
  3. There seems to be something wrong with personalization. We are continuously bumping into obtrusive, uninteresting ads. Our digital personal assistant isn’t all that personal. We’ve lost friends to the algorithmic abyss of the News Feed. The content we encounter online seems to repeat the same things again and again. There are five main reasons why personalization remains broken.

    Additionally, there lies a more general paradox at the very heart of personalization.

    Personalization promises to modify your digital experience based on your personal interests and preferences. Simultaneously, personalization is used to shape you, to influence you and to guide your everyday choices and actions. Inaccessible and incomprehensible algorithms make autonomous decisions on your behalf. They reduce the number of visible choices, thus restricting your personal agency.

    Because of the personalization gaps and internal paradox, personalization remains unfulfilling and incomplete. It leaves us with a feeling that it serves someone else’s interests better than our own.
    http://techcrunch.com/2015/06/25/the-...n=Feed%3A+Techcrunch+%28TechCrunch%29
  4. Connected cars have the potential to make information such as the location and speed of vehicles readily available to advertisers, insurance and communication companies, who could then capitalize on the data for commercial purposes.

    According to Audi Chief Executive Rupert Stadler, his company "takes that very seriously."

    "A car is one’s second living room today," Stadler said at a business event in Berlin on Tuesday.

    "That’s private. The only person who needs access to the data onboard is the customer."

    "The customer wants to be at the focus, and does not want to be exploited," Stadler added. "They want to be in control of their data and not subject to monitoring."
    http://sputniknews.com/europe/2015061...ntent=sPW&utm_campaign=URL_shortening
  5. We know for a fact that companies like Google are giving corporate advertisers access to users based on the personal data they control -- and many of those advertisers are targeting individuals with the express intent to rip them off, sell them deadly products, and financially impoverish them.

    Some advertisers are just trying to help customers find a product they might like, but the dark version of online marketing is that it can facilitate what economists call "price discrimination," selling the same exact good at a variety of prices in ways unknown to the buyers. Researchers Rosa-Branca Esteves and Joana Resende highlight how, with the low costs of online advertising, such online price discrimination systematically shifts wealth from consumers to corporate profits. One implication of their models is that "average prices with mass advertising [i.e. without the discrimination allowed by targeting individual users online] are below those with targeted advertising," which follows the idea that firms will target certain consumers with promotions while enjoying higher prices paid by consumers kept ignorant of lower prices offered to others.
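    The intuition behind that result can be shown with a toy two-segment pricing model. This is my own simplified illustration, not the Esteves-Resende model, and every number in it is hypothetical:

```python
# Two equal-sized consumer segments for the same good; all numbers are
# hypothetical and chosen only to illustrate the mechanism.
n = 100                                   # consumers per segment
wtp_high, wtp_low = 10.0, 6.0             # willingness to pay per segment

# Mass advertising: one posted price for everyone. Pricing low sells to
# both segments; pricing high sells only to the high-valuation segment.
rev_low_price  = wtp_low * 2 * n          # sell to everyone
rev_high_price = wtp_high * n             # sell to one segment
uniform_price  = wtp_low if rev_low_price >= rev_high_price else wtp_high

# Targeted advertising: each segment is quoted its own valuation, e.g. via
# a "promotion" shown only to the price-sensitive segment.
targeted_revenue   = (wtp_high + wtp_low) * n
avg_targeted_price = (wtp_high + wtp_low) / 2
```

    Under mass advertising the seller's best single price is the low one, so everyone pays $6; with targeting, the high-valuation segment pays $10, lifting both the average price paid and the seller's revenue.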

    Early Internet visionary Jaron Lanier, who pioneered ideas like "virtual reality" two decades ago, has noted that such access to behavioral targeting has even more appeal to the "tawdry" kinds of firms than the "dignified side of capitalism", since "ambulance chasers and snake oil salesmen" among the capitalist class thrive on such targeted access to their victims.

    Google isn't usually identified as a big player in the subprime mortgage debacle and its aftermath, but a significant portion of Google's profits in the mid-2000s were coming straight from subprime mortgage lenders advertising on its site. As Jeff Chester of the Center for Digital Democracy said back in 2007, "Many online companies depend for a disproportionate amount of their income on financial services advertising, with subprime in some cases accounting for a large part of it."

    Companies enticed customers with unrealistic "teaser rates" -- heavily advertised online -- that burdened borrowers with toxic terms and unmanageable obligations that exploded in later years. The racial and exploitative aspect of the mortgage meltdown was endemic, with what some scholars described as reverse redlining: "the practice of targeting borrowers of color for loans on unfavorable terms." This offering of differential rates based on the characteristics of the borrower constitutes the most damaging price discrimination inflicting consumer harm in American history, and Google, as an advertising intermediary earning billions of dollars a year, played an integral (and profitable) role in it.
    http://www.huffingtonpost.com/nathan-...ource=twitter.com&utm_campaign=buffer
  6. America’s three obsessions – technology, fitness and finance – have finally converged in FitCoin, a new app that allows users to monetise their visits to the gym. The mechanism is simple: by integrating with popular activity trackers and wearables, the app converts our heartbeats into a digital currency. FitCoin’s founders hope that, much like its older sibling, bitcoin, this currency can be used to buy exclusive goods from partners such as Adidas and lower your insurance payments.

    FitCoin might fail but the principle behind it is indicative of the broader transformation of social life under conditions of permanent connectivity and instant commodification: what was previously done for pleasure or merely to conform to social norms is now firmly guided by the logic of the market. The other logics don’t disappear but they become secondary to the monetary incentive.

    The ability to measure all our activities remotely is opening up new avenues for speculation, as anyone – from corporations to insurance firms to governments – can now design sly compensatory schemes to elicit desired behaviour from consumers chasing a quick buck. As a result, even the most mundane of daily activities can be linked to global financial markets. Eventually, we’ll all be trading in derivatives that link our entitlement to receive specific medical services to our physical behaviour.


    The transformations happening in all these mundane venues – the gym, the parking lot, the restaurant – reveal that, once an informational layer is added on top of them, they might lose other layers, especially those of non-utilitarian, purely aesthetic enjoyment, solidarity and fairness. It could be that the worst excesses of capitalism were manageable, at least on a psychic level, precisely because we could occasionally shelter ourselves in various hermetic zones that did not bend to the logic of supply and demand. These zones, impervious to the rhythms of globalisation, reassured us that personal autonomy outside the market bubble was a feasible objective.

    Thus we could always find solace in art, sport, food, urbanism: those domains, we would tell ourselves, were either driven by aesthetic, artisanal considerations or they featured enough cooperation and solidarity to make up for the occasional brutality of the market relations that they couldn’t escape. After all, there was something uplifting and reassuring about the fact that a hedge fund manager had to spend as much time as a janitor looking for a parking spot. Ten years ago, this presumed equality between the two was a fact of life that seemed unalterable; today, it’s merely a technological imperfection that could be easily corrected with a smartphone.

    Our lives have been made livable by such imperfections; many of our institutions thrived on them.

    The reason the tales told to us by American hi-tech entrepreneurs sound so sweet is that they always present knowledge as something apolitical, existing outside contemporary struggles between citizens and governments or citizens and corporations. In the dream world of Silicon Valley, ordinary citizens wield as much power as insurance companies: thus, the reasoning goes, information about our activity will surely be equally empowering to both?

    From this perspective, the efforts to link up everyone and everything into an internet of things (“Next frontier for ‘internet of things’: babies” reads a recent headline on the business site, CNBC) could only mean that the spaces of imperfection that have temporarily allowed us to delay the triumph of market logic in all other domains of social life would shrink even further. And, if permanent connectivity is essential for that logic to exercise control over our lives, then the only autonomy worth fighting for – both for individuals and institutions – would be an autonomy that thrives on opacity, ignorance and disconnection. A right to connect is important – so is the right to disconnect.
    http://www.theguardian.com/commentisf...lives-evgeny-morozov?CMP=share_btn_tw
  7. Once big data becomes fully consumerized, it will be possible for anyone to identify anyone based on anything from religious affiliation, sexual preference, political association, even something as trivial as rival sport team fanhood, which can then be used by individuals to discriminate against entire groups of people.

    And here we thought price discrimination and redlining by companies were all that society had to worry about in terms of rampant discriminatory behavior.

    Of course, as Mitt Romney and others, including the Supreme Court, claim, "corporations are people," so, yes, discrimination of this type could conceivably happen on a very large scale and practically everywhere. Not because corporations and businesses are really people (in fact they are merely legal and tax designations), but because people do indeed run businesses, and some of them will, on occasion, act as the flawed beings they are.
    http://www.fiercebigdata.com/story/in...e-big-data-consumerization/2015-04-06
  8. Here's a look at what we know—and what we don't—about the consumer data industry.

    How much do these companies know about individual people?

    They start with the basics, like names, addresses and contact information, and add on demographics, like age, race, occupation and "education level," according to consumer data firm Acxiom's overview of its various categories.

    But that's just the beginning: The companies collect lists of people experiencing "life-event triggers" like getting married, buying a home, sending a kid to college—or even getting divorced.

    Credit reporting giant Experian has a separate marketing services division, which sells lists of "names of expectant parents and families with newborns" that are "updated weekly."

    The companies also collect data about your hobbies and many of the purchases you make. Want to buy a list of people who read romance novels? Epsilon can sell you that, as well as a list of people who donate to international aid charities.

    A subsidiary of credit reporting company Equifax even collects detailed salary and pay stub information for roughly 38 percent of employed Americans, as NBC News reported. As part of handling employee verification requests, the company gets the information directly from employers.

    Equifax said in a statement that the information is only sold to customers "who have been verified through a detailed credentialing process." It added that if a mortgage company or other lender wants to access information about your salary, they must obtain your permission to do so.
    http://www.propublica.org/article/eve...bout-what-data-brokers-know-about-you
  9. Maybe the other way round: if users were paid for their data, business models driven by personal data would be less attractive, or would at least look different. Additionally, it heavily depends on which data is being sold: according to an OECD report, bankruptcy info is worth $25/record, employment history about $14/record and educational history about $12/record. Background check or employment screening packages are sometimes worth $100-300/query. If companies really start selling, at large scale, aggregated data and scores based on digital behaviour, on body and health data and on various sensors at home and in workplaces, this will be much more valuable than today's profits, which are mainly based on advertising. Anyway, it's a lousy idea and it doesn't solve any of the fundamental problems of corporate surveillance. Users love being tracked, even when they get (nearly) nothing for it. They have been using loyalty cards for ages; small (pseudo) incentives are sufficient to make them participate in nearly anything.
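    Taking the quoted per-record figures at face value, a back-of-the-envelope comparison (my own arithmetic, not the OECD report's) shows why aggregated, scored packages are the more valuable product:

```python
# Per-record valuations quoted above from the OECD report (USD).
record_value = {
    "bankruptcy": 25,
    "employment_history": 14,
    "educational_history": 12,
}

# A naive combined profile is worth the sum of its parts...
profile_value = sum(record_value.values())    # 51

# ...yet a background-check/screening package is quoted at $100-300/query.
package_low, package_high = 100, 300
markup_low  = package_low / profile_value     # roughly 2x the raw records
markup_high = package_high / profile_value    # roughly 6x the raw records
```

    The packaged product fetches roughly two to six times the summed value of its raw inputs, which is the point about aggregated data and scores being worth more than the records themselves.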
    https://www.mail-archive.com/nettime-l@mail.kein.org/msg02722.html
  10. The last panel of the day focussed on policy. Christine O’Keefe (CSIRO), Keith Spicer (ONS), Tanvi Desai (ADS) and our own Jeni Tennison (ODI) discussed data access mechanisms and policy implications. There is a spectrum of access methods, and a more granular approach to who needs access and what they want to access will put better safeguards in place for data sharing.

    Statistical disclosure control in the future may involve specialist hackers, and for data that is not open, records of and accountability for who has access are crucial to engender trust.

    Anonymisation remains an important tool for anyone publishing data. While we should have sophisticated discussions on the future of personal data in our society, the crucial step for an individual is to consider data in its context.
    http://theodi.org/blog/paradise-or-in...s-from-the-uk-anonymisation-symposium


Page 1 of 3 - Online Bookmarks of M. Fioretti: Tags: data ownership + big data

About - Propulsed by SemanticScuttle