mfioretti: discrimination* + surveillance*

Bookmarks on this page are managed by an admin user.

3 bookmarks, sorted by date.

  1. A new ProPublica investigation uncovered a disturbing fact: Facebook allows advertisers to exclude Black, Hispanic and other so-called “ethnic affinity” groups from seeing ads.

    Let us say it again: Facebook is allowing advertisers to discriminate based on race.

    This is more than the normal yuckiness of targeted advertising. Facebook's system allowed ProPublica reporters to purchase a housing ad that excluded people of color in what prominent civil rights lawyers describe as a clear violation of the federal Fair Housing Act.
    http://act.freepress.net/sign/interne...5848.10606804.gXtK4u?source=fptwitter
    Voting 0
  2. A May 2014 White House report on “big data” notes that the ability to determine the demographic traits of individuals through algorithms and aggregation of online data has a potential downside beyond just privacy concerns: Systematic discrimination.

    There is a long history of denying access to bank credit and other financial services based on the communities from which applicants come — a practice called “redlining.” Likewise, the report warns, “Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants or recipients of credit.” (See materials from the report’s related research conference for scholars’ views on this and other issues.)

    One vexing problem, according to the report, is that digital discrimination is even harder to pinpoint than its offline counterparts, and therefore harder to remedy.

    “Approached without care, data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision-makers, or simply reflect the widespread biases that persist in society. It can even have the perverse result of exacerbating existing inequalities by suggesting that historically disadvantaged groups actually deserve less favorable treatment.” The paper’s authors argue that the most likely legal basis for anti-discrimination enforcement, Title VII, is not currently adequate to stop many forms of discriminatory data mining, and “society does not have a ready answer for what to do about it.”
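The mechanism the report warns about can be made concrete with a small sketch. The following Python example uses entirely invented data: a naive model “trained” on historical loan decisions that denied credit in one neighborhood learns to keep denying it, so neighborhood silently acts as a proxy for the group that was discriminated against.

```python
# Hypothetical sketch of "digital redlining": a model trained on biased
# historical decisions reproduces them, with neighborhood as a proxy.
# All names and numbers below are invented for illustration.
from collections import defaultdict

# Historical records: (neighborhood, income, approved). Past decisions
# denied credit to everyone in neighborhood "B", regardless of income.
history = [
    ("A", 40, True), ("A", 60, True), ("A", 30, False),
    ("B", 40, False), ("B", 60, False), ("B", 80, False),
]

# "Train" a naive model: the historical approval rate per neighborhood.
totals, approvals = defaultdict(int), defaultdict(int)
for hood, income, approved in history:
    totals[hood] += 1
    approvals[hood] += approved

def score(hood):
    return approvals[hood] / totals[hood]

# Two otherwise identical applicants, differing only in neighborhood:
print(round(score("A"), 2))  # 0.67 -> likely approved
print(round(score("B"), 2))  # 0.0  -> digitally redlined
```

The model never sees race or any protected attribute, yet it faithfully reproduces the prejudice baked into its training data; that is exactly why this kind of discrimination is hard to pinpoint after the fact.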

    Edelman and Luca’s 2014 paper “Digital Discrimination: The Case of Airbnb.com” examined listings from thousands of New York City landlords in mid-2012. Airbnb builds a reputation system from the ratings that guests and hosts leave for each other.

    The study’s findings include:

    “The raw data show that non-black and black hosts receive strikingly different rents: roughly $144 versus $107 per night, on average.” However, the researchers had to control for a variety of factors that might skew an accurate comparison, such as differences in geographical location.
    “Controlling for all of these factors, non-black hosts earn roughly 12% more for a similar apartment with similar ratings and photos relative to black hosts.”
    “Despite the potential of the Internet to reduce discrimination, our results suggest that social platforms such as Airbnb may have the opposite effect. Full of salient pictures and social profiles, these platforms make it easy to discriminate — as evidenced by the significant penalty faced by a black host trying to conduct business on Airbnb.”
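As a back-of-envelope check on the figures above, the raw gap between the reported average nightly rents can be computed directly; note how much larger it is than the 12% estimate that survives once location, ratings, and photos are controlled for.

```python
# Raw rent gap from the study's reported averages:
# $144/night for non-black hosts vs $107/night for black hosts.
non_black_rent = 144
black_rent = 107

raw_gap = (non_black_rent - black_rent) / non_black_rent
print(f"raw gap: {raw_gap:.1%}")  # raw gap: 25.7%

# The paper's controlled estimate is roughly 12%: about half of the raw
# difference is explained by confounders such as location and ratings,
# but a substantial penalty remains.
```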

    “Given Airbnb’s careful consideration of what information is available to guests and hosts,” Edelman and Luca note, “Airbnb might consider eliminating or reducing the prominence of host photos: It is not immediately obvious what beneficial information these photos provide, while they risk facilitating discrimination by guests. Particularly when a guest will be renting an entire property, the guest’s interaction with the host will be quite limited, and we see no real need for Airbnb to highlight the host’s picture.” (For its part, Airbnb responded to the study by saying that it prohibits discrimination in its terms of service, and that the data analyzed were both older and limited geographically.)
    http://journalistsresource.org/studie...racial-discrimination-research-airbnb
    Voting 0
  3. If you're on the wrong side of the class divide, recent advances in retail tech will make for depressing reading. For example, some years ago Britain's class system was automated. Now, as you shop, machines can discriminate against you far more efficiently.

    When you phone a big retailer, a machine decides what level of service you get, depending on your status. The software that makes this social judgement sits in front of every incoming call: it identifies your phone number, cross-references it with the customer database, looks up your address to infer what class of person you are, and routes your call according to the class of service it thinks you merit.
    http://www.independent.co.uk/life-sty...to-be-used-for-marketing-8413809.html
    Voting 0
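The routing logic the article describes can be sketched in a few lines. This is a minimal illustration with invented phone numbers, tiers, and queue names, not the retailer’s actual system: look up the caller ID in a customer database, infer a tier, and send the call to the queue that tier “merits”.

```python
# Hypothetical sketch of class-based call routing: caller ID lookup,
# inferred customer tier, tier-dependent service queue. All data is
# invented for illustration.

# Customer database keyed by phone number (fictional entries).
CUSTOMERS = {
    "+441234567890": {"postcode": "SW1A 1AA", "tier": "premium"},
    "+441098765432": {"postcode": "M11 1AA", "tier": "basic"},
}

# Each tier maps to a different level of service.
QUEUES = {
    "premium": "priority agents",
    "basic": "overflow queue",
    "unknown": "general queue",
}

def route_call(caller_id):
    """Route an incoming call based on the caller's inferred tier."""
    tier = CUSTOMERS.get(caller_id, {}).get("tier", "unknown")
    return QUEUES[tier]

print(route_call("+441234567890"))  # priority agents
print(route_call("+441098765432"))  # overflow queue
```

The point of the sketch is that the discrimination is invisible to the caller: two people dialing the same number receive different treatment before a human ever picks up.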



About - Propulsed by SemanticScuttle