mfioretti: common sense

Bookmarks on this page are managed by an admin user.

6 bookmarks

  1. Ask Järvinen what difference money for nothing has made to his life, and you are marched over to his workshop. Inside is film-making equipment, a blackboard on which are scrawled plans for an artists’ version of Airbnb, and an entire little room where he makes shaman drums that sell for up to €900. All this while helping to bring up six children. All those free euros have driven him to work harder than ever.

    None of this would have been possible before he received UBI. Until this year, Järvinen was on dole money; the Finnish equivalent of the jobcentre was always on his case about job applications and training. Ideas flow out of Järvinen as easily as water from a tap, yet he could exercise none of his initiative for fear of arousing bureaucratic scrutiny.

    In one talked-about case last year, an unemployed Finn called Christian was caught carving and selling wooden guitar plectrums. It was more pastime than business, earning him a little more than €2,000 in a year. But the sum was not what angered the authorities; it was the thought that each plectrum had taken up time that could have been spent on official hoop-jumping.
    ‘For Iain Duncan Smith, poverty was the rotten fruit of broken families, addiction or debt.’

    That was Järvinen, too, until this year. Just as with so many Britons on social security, he was trapped in a “humiliating” system that gave him barely enough to feed himself, while refusing him even a glimmer of hope of fulfilment.

    So what accounted for his change? Certainly not the UBI money. In Finland, €560 is less than a fifth of average private-sector income. “You have to be a magician to survive on such money,” Järvinen says. Over and over, he baldly describes himself as “poor”.

    His liberation came from the lack of conditions attached to the money.
    Voting 0
  2. What you see when you fire up a device is dependent on a variety of factors: what browser you use, whether you’re on a mobile phone or a laptop, the quality of your display, the lighting conditions, and, especially, your vision.

    When you build a site and ignore what happens afterwards — when the values entered in code are translated into brightness and contrast depending on the settings of a physical screen — you’re avoiding the experience that you create. And when you design in perfect settings, with big, contrast-rich monitors, you blind yourself to users. To arbitrarily throw away contrast based on a fashion that “looks good on my perfect screen in my perfectly lit office” is abdicating designers’ responsibilities to the very people for whom they are designing.

    My plea to designers and software engineers: Ignore the fads and go back to the typographic principles of print — keep your type black, and vary weight and font instead of grayness. You’ll be making things better for people who read on smaller, dimmer screens, even if their eyes aren’t aging like mine. It may not be trendy, but it’s time to consider who is being left out by the web’s aesthetic.
    Tags: , , , , by M. Fioretti (2016-10-20)
    Voting 0
  3. Innovation is a dominant ideology of our era, embraced in America by Silicon Valley, Wall Street, and the Washington DC political elite. As the pursuit of innovation has inspired technologists and capitalists, it has also provoked critics who suspect that the peddlers of innovation radically overvalue innovation. What happens after innovation, they argue, is more important. Maintenance and repair, the building of infrastructures, the mundane labour that goes into sustaining functioning and efficient infrastructures, simply has more impact on people’s daily lives than the vast majority of technological innovations.

    The fates of nations on opposing sides of the Iron Curtain illustrate the good reasons behind the rise of innovation as a buzzword and organising concept. Over the course of the 20th century, open societies that celebrated diversity, novelty, and progress performed better than closed societies that defended uniformity and order.

    In the late 1960s, in the face of the Vietnam War, environmental degradation, the Kennedy and King assassinations, and other social and technological disappointments, it grew more difficult for many to have faith in moral and social progress. In its place arose ‘innovation’, a smaller, morally neutral concept. Innovation provided a way to celebrate the accomplishments of a high-tech age without expecting too much from them in the way of moral and social improvement.
    Voting 0
  4. In an op-ed in the student newspaper, four Columbia University undergrads have called on the school to implement trigger warnings — alerts about potentially distressing material — even for classics like Greek mythology or Roman poetry.

    “Ovid’s ‘Metamorphoses’ is a fixture of Lit Hum, but like so many texts in the Western canon, it contains triggering and offensive material that marginalizes student identities in the classroom,” wrote the four students, who are members of Columbia’s Multicultural Affairs Advisory Board. “These texts, wrought with histories and narratives of exclusion and oppression, can be difficult to read and discuss as a survivor, a person of color, or a student from a low-income background.”

    The April 30 op-ed has stirred debate on campus and online.

    “Grow up, open up, care less about your identity and more about your passions,” wrote one of hundreds of commenters. “Such an insufferable breed of self-centered Care Bears.”

    The op-ed comes at a time of intense debate about trigger warnings, a term that is 20 years old but only recently has become a proxy for broader issues such as political correctness, identity politics, liberal arts education and sexual assault.

    The phrase can be traced back to the treatment of Vietnam War veterans in the 1980s, according to BuzzFeed’s Alison Vingiano. Psychologists started identifying “triggers” that sent vets spiraling into flashbacks of past traumas. With the rise of the Internet in the late ’90s, feminist message boards began using “trigger warnings” to warn readers of content that could stir up painful or paralyzing memories of sexual assault.

    Trigger warnings quickly spread to include discussions of everything from eating disorders to self-injury to suicide. In 2010, sex blogger Susannah Breslin wrote that feminists were using the term “like a Southern cook applies Pam cooking spray to an overused nonstick frying pan.” Breslin argued that trigger warnings were pointless or, even worse, self-defeating. A trigger warning is “like a flashing neon sign, attracting *more* attention to a particularly explicit post, even as it purports to deflect the attention of those to whom it might actually be relevant.”

    By 2012, The Awl’s Choire Sicha argued that the phrase had “lost all its meaning.”

    “In reality, trigger warnings are unrealistic,” argued Breslin, the sex blogger. “They are the dream-child of a fantasy in which the unknown can be labeled, anticipated, and controlled. What trigger warnings promise — protection — does not exist. The world is simply too chaotic, too out-of-control for every trigger to be anticipated, avoided, and defused.”

    “Hypersensitivity to the trauma allegedly inflicted by listening to controversial ideas approaches a strange form of derangement — a disorder whose lethal spread in academia grows by the day,” Harvey Silverglate opined in the Wall Street Journal. “What should be the object of derision, a focus for satire, is instead the subject of serious faux academic discussion and precautionary warnings. For this disorder there is no effective quarantine. A whole generation of students soon will have imbibed the warped notions of justice and entitlement now handed down as dogma in the universities.”
    Voting 0
  5. Bill Maher benefits from the hive mind mentality of so many atheists. You cannot disagree with Bill Maher without simultaneously delivering a slap to atheism — you must not foster divisiveness. You must accept all prominent celebrities who openly embrace atheism as pure paragons of human goodness — it is simply too complicated to think that a person might have a mix of views that are sometimes appealing, sometimes repugnant. So we constantly loft up “heroes” as exemplars, failing to recognize that the essence of atheism has to be a recognition of the flawed humanity of its people, and then we end up with primitive atheists getting defensive and angry at all those critics who point at the awkward reality of those heroes, whether they’re Feynman or Maher or Sanger or whoever.

    The problem is compounded by the fact that these same boosters of the Brave Hero Leader of Atheism simultaneously insist that atheism has no guiding principles or morality or goals — it’s a complete moral cipher that simply says there is no god. So sure, as long as you clearly state that there is no god, you can be sexist or racist or endorse bombing the Middle East or love Ayn Rand with all your heart or believe that the poor deserve their lot since Darwin said “survival of the fittest” (he didn’t), and still be the paradigmatic Good Atheist. In the absence of any moral principle, we can promote even moral monsters, or ascientific promoters of bunkum and quackery, to be our representatives — and if you dare to disagree, you are ‘divisive’ and ‘bickering’ and doing harm to the movement.
    Tags: , , , by M. Fioretti (2014-07-22)
    Voting 0
  6. I'm financially clueless. 

    Actually, I'm financially savvy. I'm just not ruled by it.
    Tags: , , by M. Fioretti (2012-08-16)
    Voting 0


Online Bookmarks of M. Fioretti: Tags: common sense

About - Propulsed by SemanticScuttle