mfioretti: discrimination* + algorithms*


  1. A May 2014 White House report on “big data” notes that the ability to determine the demographic traits of individuals through algorithms and aggregation of online data has a potential downside beyond privacy concerns: systematic discrimination.

    There is a long history of denying access to bank credit and other financial services based on the communities from which applicants come — a practice called “redlining.” Likewise, the report warns, “Just as neighborhoods can serve as a proxy for racial or ethnic identity, there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants or recipients of credit.” (See materials from the report’s related research conference for scholars’ views on this and other issues.)

    One vexing problem, according to the report, is that digital discrimination is even harder to pinpoint than traditional discrimination, and therefore harder to remedy.

    “Approached without care, data mining can reproduce existing patterns of discrimination, inherit the prejudice of prior decision-makers, or simply reflect the widespread biases that persist in society. It can even have the perverse result of exacerbating existing inequalities by suggesting that historically disadvantaged groups actually deserve less favorable treatment.” The paper’s authors argue that the most likely legal basis for anti-discrimination enforcement, Title VII, is not currently adequate to stop many forms of discriminatory data mining, and that “society does not have a ready answer for what to do about it.”

    Edelman and Luca’s 2014 paper “Digital Discrimination: The Case of Airbnb.com” examined listings from thousands of New York City landlords in mid-2012. Airbnb builds its reputation system on ratings that guests and hosts leave for one another.

    The study’s findings include:

    “The raw data show that non-black and black hosts receive strikingly different rents: roughly $144 versus $107 per night, on average.” However, the researchers had to control for a variety of factors that might skew the comparison, such as differences in geographical location.
    “Controlling for all of these factors, non-black hosts earn roughly 12% more for a similar apartment with similar ratings and photos relative to black hosts.”
    “Despite the potential of the Internet to reduce discrimination, our results suggest that social platforms such as Airbnb may have the opposite effect. Full of salient pictures and social profiles, these platforms make it easy to discriminate — as evidenced by the significant penalty faced by a black host trying to conduct business on Airbnb.”
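    The “controlling for all of these factors” step above is, in essence, a regression: the raw price gap between groups mixes any group effect with confounders such as location. The following sketch uses invented numbers (not the Edelman–Luca data) and a hypothetical “location score” to show how a large raw gap can shrink once a confounder enters an ordinary-least-squares model:

    ```python
    # Hypothetical illustration only: invented data, not the study's dataset.
    # Shows a raw price gap shrinking once a confounder ("location score")
    # is controlled for with OLS, solved here via the normal equations.

    def solve3(A, b):
        """Solve a 3x3 linear system A x = b by Gaussian elimination."""
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(3):
            piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, 3):
                f = M[r][col] / M[col][col]
                for c in range(col, 4):
                    M[r][c] -= f * M[col][c]
        x = [0.0, 0.0, 0.0]
        for r in (2, 1, 0):
            x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
        return x

    def ols(X, y):
        """Ordinary least squares via the normal equations X'X b = X'y."""
        XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
        Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
        return solve3(XtX, Xty)

    # Columns: intercept, group (1 = disadvantaged group), location score.
    # The disadvantaged group happens to hold worse locations.
    X = [(1, 0, 2), (1, 0, 2), (1, 0, 3), (1, 0, 3),
         (1, 1, 0), (1, 1, 0), (1, 1, 1), (1, 1, 1)]
    # Nightly price generated as 100 + 20*location - 5*group (noise-free for clarity).
    y = [140, 140, 160, 160, 95, 95, 115, 115]

    raw_gap = sum(y[:4]) / 4 - sum(y[4:]) / 4   # naive group-mean comparison: 45
    b0, b_group, b_location = ols(X, y)         # controlled comparison
    print(raw_gap, round(b_group, 2), round(b_location, 2))
    ```

    In this toy data the 45-unit raw gap is mostly a location effect (coefficient 20 per point of location score); the group coefficient itself is only −5. The study’s “roughly 12%” figure is the analogue of that controlled coefficient, not of the raw $144-versus-$107 comparison.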

    “Given Airbnb’s careful consideration of what information is available to guests and hosts,” Edelman and Luca note, “Airbnb might consider eliminating or reducing the prominence of host photos: It is not immediately obvious what beneficial information these photos provide, while they risk facilitating discrimination by guests. Particularly when a guest will be renting an entire property, the guest’s interaction with the host will be quite limited, and we see no real need for Airbnb to highlight the host’s picture.” (For its part, Airbnb responded to the study by saying that it prohibits discrimination in its terms of service, and that the data analyzed were both older and limited geographically.)
    http://journalistsresource.org/studie...racial-discrimination-research-airbnb
  2. There is now intense scrutiny of the actions and habits of employees and potential employees, in the hope that statistical analysis will reveal those with desired workplace traits. Factors such as choice of web browser, or when and where someone eats lunch, could affect their chances.

    This process runs up against anti-discrimination laws in countries like Australia, where employers can't base their decisions on attributes such as race, sex, disability, age, and marital status.
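    To make the collision concrete, here is a deliberately tiny, invented example (hypothetical data and feature names, no real hiring system implied) of how a screening rule that never mentions a protected attribute can still disadvantage one group, because an innocuous-looking feature like browser choice happens to correlate with it:

    ```python
    # Invented data for illustration only.
    # Each candidate: (protected_group, browser). The screening rule below
    # looks only at the browser, never at the group label.
    candidates = [
        ("A", "firefox"), ("A", "firefox"), ("A", "chrome"), ("A", "firefox"),
        ("B", "chrome"),  ("B", "chrome"),  ("B", "chrome"), ("B", "firefox"),
    ]

    def screened_in(candidate):
        # A hypothetical "talent analytics" rule mined from past hires:
        # prefer Chrome users.
        return candidate[1] == "chrome"

    def pass_rate(group):
        pool = [c for c in candidates if c[0] == group]
        return sum(screened_in(c) for c in pool) / len(pool)

    print(pass_rate("A"), pass_rate("B"))  # group A: 0.25, group B: 0.75
    ```

    Because browser choice correlates with group membership in this toy data, group A passes the screen at a third the rate of group B, yet the decision looks neutral from both sides: no protected attribute appears anywhere in the rule, which is precisely why such decisions are so hard to challenge under anti-discrimination law.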

    "Burdon and Harpur argue that it's almost impossible for these laws to be applied when the decisions are made on the basis of talent analytics, because it's usually almost impossible for either data users (employers) or data subjects to know even what data is being used to make decisions," Greenleaf said.

    "This is very important if we're to preserve the hard-won social policies represented by anti-discrimination laws, and prevent the hidden heuristics and emerging employment practices starting to mean that 'data is destiny'."

    Big data's approach of collecting as much data as possible, even data that seems irrelevant, on the chance that it reveals a previously unknown correlation, also collides with the "data minimisation" principle of data-privacy laws, which holds that you collect only the data you need for the task at hand.
    http://www.zdnet.com/why-big-data-eva...sent-to-re-education-camps-7000033862


About - Propulsed by SemanticScuttle