mfioretti: discrimination

Bookmarks on this page are managed by an admin user.

15 bookmark(s)

  1. They have created lists of victims of sexual assault, and lists of people with sexually transmitted diseases. Lists of people who have Alzheimer’s, dementia and AIDS. Lists of the impotent and the depressed.

    There are lists of “impulse buyers.” Lists of suckers: gullible consumers who have shown that they are susceptible to “vulnerability-based marketing.” And lists of those deemed commercially undesirable because they live in or near trailer parks or nursing homes. Not to mention lists of people who have been accused of wrongdoing, even if they were not charged or convicted.

    Typically sold at a few cents per name, the lists don’t have to be particularly reliable to attract eager buyers — mostly marketers, but also, increasingly, financial institutions vetting customers to guard against fraud, and employers screening potential hires.

    There are three problems with these lists.

    + First, they are often inaccurate.
    + Second, even when the information is accurate, many of the lists have no business being in the hands of retailers, bosses or banks. Having a medical condition, or having been a victim of a crime, is simply not relevant to most employment or credit decisions.

    + Third, people aren’t told they are on these lists, so they have no opportunity to correct bad information.


    It’s unrealistic to expect individuals to inquire, broker by broker, about their files. Instead, we need to require brokers to make targeted disclosures to consumers. Uncovering problems in Big Data (or decision models based on that data) should not be a burden we expect individuals to solve on their own.
    http://www.nytimes.com/2014/10/17/opi...rk-market-for-personal-data.html?_r=0
There's now intense scrutiny of the actions and habits of employees and potential employees, in the hope that statistical analysis will reveal those who have desired workplace traits. Factors such as choice of web browser, or when and where they eat lunch, could affect their chances.

    This process runs up against anti-discrimination laws in countries like Australia, where employers can't base their decisions on attributes such as race, sex, disability, age, and marital status.

    "Burdon and Harpur argue that it's almost impossible for these laws to be applied when the decisions are made on the basis of talent analytics, because it's usually impossible for either data users (employers) or data subjects to know even what data is being used to make decisions," Greenleaf said.

    "This is very important if we're to preserve the hard-won social policies represented by anti-discrimination laws, and prevent the hidden heuristics and emerging employment practices starting to mean that 'data is destiny'."

    Big data's approach of collecting as much data as possible, even data that seems irrelevant, on the chance that it reveals a previously unknown correlation, also collides with the "data minimisation" principle of data privacy laws, which says you should collect only the data you need to do the job.
    http://www.zdnet.com/why-big-data-eva...sent-to-re-education-camps-7000033862
  3. I'm pretty sure the people who create advertisements are robots, because I'd rather not think about the type of person you'd have to be to decide that any of these were good ideas.

    Don't go thinking sexist advertising only affects women: At 1:37, you'll find out how it's changed the way men see themselves, and at 2:49, there's an eye-opening experiment that you don't want to miss.
    http://www.upworthy.com/the-people-wh...ave-a-lot-of-explaining-to-do-6?c=bm1
  4. If you're on the wrong side of the class divide, recent advances in retail tech will make for depressing reading. For example, some years ago Britain's class system was automated. Now, as you shop, machines can discriminate against you far more efficiently.

    When you phone a big retailer, a machine decides what level of service you get, depending on your status. The software that makes this social judgement handles all incoming phone calls: it identifies your phone number, cross-references it with the customer database, and, having looked up your address, infers what class of person you are, then routes your call according to the class of service it thinks you merit.
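    The lookup-and-route pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the retailer's actual system; the database, segment labels, and queue names are all hypothetical.

    ```python
    # Hypothetical customer database: caller ID -> customer profile.
    # A real system would query a CRM, not an in-memory dict.
    customer_db = {
        "+442079460123": {"address": "1 Mayfair Place", "segment": "premium"},
        "+441619980456": {"address": "22 Estate Road", "segment": "budget"},
    }

    def route_call(caller_id: str) -> str:
        """Cross-reference the incoming number and pick a service queue."""
        profile = customer_db.get(caller_id)
        if profile is None:
            return "default-queue"      # unknown callers get the standard tier
        if profile["segment"] == "premium":
            return "priority-queue"     # short wait, senior agents
        return "overflow-queue"         # long wait, automated menus

    print(route_call("+442079460123"))  # priority-queue
    print(route_call("+15550000000"))   # default-queue
    ```

    The point of the sketch is how little the caller controls: the routing decision is made before anyone answers, from data the caller never sees.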
    http://www.independent.co.uk/life-sty...to-be-used-for-marketing-8413809.html
  5. You don’t see male heroes wearing these costumes or posing like this. Outside of statistical outliers like Namor, their costumes tend to have full coverage, and when they pose, it’s to inspire fear, not boners.
    http://rosalarian.tumblr.com/post/2325861377/dressed-to-kill
    by M. Fioretti (2012-01-17)


Online Bookmarks of M. Fioretti: Tags: discrimination

About - Propulsed by SemanticScuttle