mfioretti: algorithms

Bookmarks on this page are managed by an admin user.

79 bookmark(s)

  1. World-famous mutual fund investor Bill Gross of PIMCO has patented the methodology for his bond fund -- or, as Dealbreaker correctly points out, he "patented a way to count." Indeed, the patent in question, US Patent 8,306,892, is somewhat hideous, describing not much more than the concept of an algorithm that weights regions based on GDP.

    It doesn't take a patent specialist to figure out that this is basically patenting a spreadsheet for weighting countries on a few different factors. It seems to be the exact kind of thing that was disallowed by the Supreme Court under Gottschalk v. Benson. And yet, the USPTO waved it right on through. Kinda makes you wonder what the hell patent examiner Samica L. Norman was thinking in approving such a ridiculous patent.
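
    For a sense of how little the claimed "algorithm" involves, here is a hypothetical sketch of GDP-share weighting. This is illustrative only, with made-up figures; it is not the method actually claimed in US Patent 8,306,892.

```python
# Hypothetical sketch of GDP-based index weighting (illustrative only;
# not the method actually claimed in US Patent 8,306,892).

def gdp_weights(gdp_by_region):
    """Weight each region by its share of total GDP."""
    total = sum(gdp_by_region.values())
    return {region: gdp / total for region, gdp in gdp_by_region.items()}

def index_value(weights, returns):
    """Weighted average of per-region bond returns."""
    return sum(weights[r] * returns[r] for r in weights)

# Example with made-up GDP figures (trillions USD) and made-up returns
gdp = {"US": 21.0, "EU": 15.0, "Japan": 5.0}
weights = gdp_weights(gdp)
returns = {"US": 0.03, "EU": 0.02, "Japan": 0.01}
value = index_value(weights, returns)
```

    The weights are just each region's GDP divided by the total, which is exactly the "spreadsheet for weighting countries" the excerpt describes.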
    https://www.techdirt.com/articles/201...-couldnt-patent-math-good-times.shtml
  2. As much as we may be loath to admit it, the Facebook News Feed is where many of us love to waste our time. And unless your preferences are set to show all the activities and updates of all your friends in chronological order, you're viewing a pre-determined selection of items that Facebook's algorithms have chosen just for you.

    Indeed, algorithms such as PageRank and Facebook's News Feed ranking are creating a so-called filter bubble, a phenomenon in which users become separated from information that disagrees with their viewpoints — effectively isolating them in their own culture of ideological "bubbles." This could result in what Eli Pariser calls "information determinism", where our previous internet-browsing habits determine our future.

    Back in 2010, it was announced that, by using IBM's predictive analysis software (called CRUSH, or Criminal Reduction Utilizing Statistical History), Memphis's police department reduced serious crime by more than 30%, including a 15% reduction in violent crimes since 2006. Inspired, other cities have taken notice, including those in Poland, Israel, and the UK. Pilot projects are currently underway in Los Angeles, Santa Cruz, and Charleston.

    CRUSH works through a combination of data aggregation, statistical analysis, and of course, cutting-edge algorithms. It allows police to evaluate incident patterns throughout a city and forecast criminal "hot spots" to "proactively allocate resources and deploy personnel, resulting in improved force effectiveness and increased public safety."
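
    The core idea of hot-spot forecasting can be sketched in a few lines: bin historical incident locations into grid cells and rank the densest cells. This is a hypothetical sketch with made-up coordinates; CRUSH itself is proprietary IBM software and its internals are not public.

```python
# Minimal sketch of grid-based crime "hot spot" forecasting from
# historical incident locations (illustrative only; CRUSH's actual
# internals are proprietary and not public).
from collections import Counter

def hot_spots(incidents, cell_size=0.01, top_n=3):
    """Bin incident (lat, lon) points into grid cells and rank cells
    by incident count; the densest cells are the forecast hot spots."""
    counts = Counter(
        (int(lat / cell_size), int(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Made-up incident coordinates clustered around two locations
incidents = [(35.149, -90.049)] * 5 + [(35.117, -89.971)] * 3 + [(35.2, -90.0)]
top_cells = hot_spots(incidents, top_n=2)  # two densest grid cells
```

    Real systems add temporal decay, covariates, and statistical models on top, but the output is the same in kind: a ranked list of cells for "proactive" deployment.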

    In the future, these systems will largely take over the work of analysts. Criminals will be tracked by sophisticated algorithms that monitor internet activity, GPS, personal digital assistants, biosignatures, and all communications in real time. Unmanned aerial vehicles will increasingly be used to track potential offenders to predict intent through their body movements and other visual clues.
    http://io9.com/the-10-algorithms-that...urce=facebook.com&utm_campaign=buffer
  3. This new type of governance has a name: algorithmic regulation. In as much as Silicon Valley has a political programme, this is it. Tim O'Reilly, an influential technology publisher, venture capitalist and ideas man (he is to blame for popularising the term "web 2.0") has been its most enthusiastic promoter. In a recent essay that lays out his reasoning, O'Reilly makes an intriguing case for the virtues of algorithmic regulation – a case that deserves close scrutiny both for what it promises policymakers and the simplistic assumptions it makes about politics, democracy and power.

    The numerous possibilities that tracking devices offer to health and insurance industries are not lost on O'Reilly. "You know the way that advertising turned out to be the native business model for the internet?" he wondered at a recent conference. "I think that insurance is going to be the native business model for the internet of things." Things do seem to be heading that way: in June, Microsoft struck a deal with American Family Insurance, the eighth-largest home insurer in the US, in which both companies will fund startups that want to put sensors into smart homes and smart cars for the purposes of "proactive protection".

    An insurance company would gladly subsidise the costs of installing yet another sensor in your house – as long as it can automatically alert the fire department or make front porch lights flash in case your smoke detector goes off. For now, accepting such tracking systems is framed as an extra benefit that can save us some money. But when do we reach a point where not using them is seen as a deviation – or, worse, an act of concealment – that ought to be punished with higher premiums?

    Or consider a May 2014 report from 2020health, another thinktank, proposing to extend tax rebates to Britons who give up smoking, stay slim or drink less. "We propose 'payment by results', a financial reward for people who become active partners in their health, whereby if you, for example, keep your blood sugar levels down, quit smoking, keep weight off, or take on more self-care, there will be a tax rebate or an end-of-year bonus," they state. Smart gadgets are the natural allies of such schemes: they document the results and can even help achieve them – by constantly nagging us to do what's expected.

    Cash-strapped governments welcome such colonisation by technologists – especially if it helps to identify and clean up datasets that can be profitably sold to companies who need such data for advertising purposes. Recent clashes over the sale of student and health data in the UK are just a precursor of battles to come: after all state assets have been privatised, data is the next target. For O'Reilly, open data is "a key enabler of the measurement revolution".

    This "measurement revolution" seeks to quantify the efficiency of various social programmes, as if the rationale behind the social nets that some of them provide was to achieve perfection of delivery. The actual rationale, of course, was to enable a fulfilling life by suppressing certain anxieties, so that citizens can pursue their life projects relatively undisturbed. This vision did spawn a vast bureaucratic apparatus and the critics of the welfare state from the left – most prominently Michel Foucault – were right to question its disciplining inclinations. Nonetheless, neither perfection nor efficiency were the "desired outcome" of this system. Thus, to compare the welfare state with the algorithmic state on those grounds is misleading.
    http://www.theguardian.com/technology...evgeny-morozov-algorithmic-regulation
  4. I can’t help but feel that Crichton has somewhat missed the point of unions. Yes, creating a better workplace is a big part of that purpose, but so – fundamentally – is the ability of workers to organize themselves so they can speak with a collective voice. And the purpose of that is to counterbalance the voice of management or the providers of capital, in order to preserve their rights. It’s about maintaining healthy power dynamics.

    There is no such opportunity for workers in the on-demand economy — no platform for organization, no collective voice, and no power. Sure, if individual workers don’t like the work then they can theoretically leave, but they can and will be replaced immediately. That’s the whole point of the on-demand economy – it’s taking full advantage of the fact that the supply of workers greatly outstrips demand. And that means that the departures of individuals will provide little incentive to on-demand employers to improve wages and working conditions.

    Look at Uber, which strenuously denies that its drivers are its workers at all, which won’t guarantee to pay those drivers’ fines if they’re caught keeping Uber’s business afloat in cities where the service is banned, and which ultimately wants to get rid of those drivers altogether. TaskRabbit now matches tasks to workers by algorithm rather than letting workers bid for them, erasing much of the control its workers had over their work situation. These are the kinds of businesses that are going to be the “champions of workers”?

    I have absolutely no doubt that the workplace of the future will look very different to that of today, and perhaps entirely different to that of a few decades ago. There will probably be fewer jobs to go around, and in many cases we will certainly need to adjust our conceptions of the workplace and the working week. A lot of people like the traditional setup because they care more about what happens after 5pm than the drudgery that comes before, and maybe they’re going to be out of luck.

    However, the workers themselves need to have a say in how this new world develops. The idea that a handful of platforms operating on razor-thin margins will create an equitable world for their workers — that algorithms written by the employers will protect workers’ rights better than the workers themselves and their elected representatives could — would be funny if the reality of this model weren’t so outright terrifying.
    http://gigaom.com/2014/08/18/in-the-o...lgorithms-wont-protect-workers-rights
  5. So, why the distaste for a change that would benefit many of them? It's simple: Twitter's uncurated feed certainly has some downsides, and I can see some algorithmic improvements that would make it easier for early users to adopt the service, but they'd potentially be chopping off the very (and sometimes magical) ability of mature Twitter to surface content from the network. And the key to this power isn't the reverse chronology but rather the fact that the network allows humans to exercise free judgment on the worth of content, without strong algorithmic biases. That cumulative, networked freedom is what extends the range of what Twitter can value and surface, and provides some of the best experiences of Twitter.
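
    The contrast the author draws can be made concrete: a reverse-chronological feed applies no judgment about worth, while a ranked feed lets a platform-chosen score decide what surfaces. The field names and scoring below are hypothetical; Twitter's actual ranking signals are not public.

```python
# Hypothetical sketch contrasting a reverse-chronological feed with an
# engagement-ranked one. Field names and the scoring formula are
# illustrative; Twitter's real ranking signals are not public.

def chronological_feed(tweets):
    """Newest first: no algorithmic judgment about worth."""
    return sorted(tweets, key=lambda t: t["time"], reverse=True)

def ranked_feed(tweets):
    """Engagement first: the platform's score decides what surfaces."""
    return sorted(tweets, key=lambda t: t["likes"] + 2 * t["retweets"],
                  reverse=True)

tweets = [
    {"id": 1, "time": 10, "likes": 2,   "retweets": 0},   # recent, quiet
    {"id": 2, "time": 5,  "likes": 500, "retweets": 90},  # older, viral
    {"id": 3, "time": 8,  "likes": 30,  "retweets": 4},
]
# The chronological feed surfaces tweet 1 first; the ranked feed
# surfaces tweet 2 first, regardless of recency.
```

    The ranked version bakes an editorial bias into the sort key itself, which is exactly the "strong algorithmic bias" the excerpt argues against.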
    https://medium.com/message/the-algori...iveth-but-it-also-taketh-b7efad92bc1f
  6. There's now an intense scrutiny of the actions and habits of employees and potential employees, in the hope that statistical analysis will reveal those who have desired workplace traits. Factors such as choice of web browser, or when and where they eat lunch, could affect their chances.

    This process runs up against anti-discrimination laws in countries like Australia, where employers can't base their decisions on attributes such as race, sex, disability, age, and marital status.

    " Burdon and Harpur » argue that it's almost impossible for these laws to be applied when the decisions are made on the basis of talent analytics, because it's usually almost impossible for either data users (employers), or data subjects, to know even what data is being used to make decisions," Greenleaf said.

    "This is very important if we're to preserve the hard-won social policies represented by anti-discrimination laws, and prevent the hidden heuristics and emerging employment practices starting to mean that 'data is destiny'."

    Big data's approach of collecting as much data as you can, even if it seems irrelevant, because it may reveal a previously unknown correlation, also collides with the "data minimisation" principles of data privacy laws, which say that you only collect the data you need to do the job.
    http://www.zdnet.com/why-big-data-eva...sent-to-re-education-camps-7000033862
  7. What is becoming increasingly clear is that algorithms are an essential part of the process of creation of the money of the common, but that algorithms also have politics (what are the gendered politics of individual 'mining', for example, and of the complex technical knowledge and machinery implied in mining bitcoins?). Furthermore, the drive to completely automate money production in order to escape the fallacies of subjective factors and social relations might cause such relations to come back in the form of speculative trading. In the same way as financial capital is intrinsically linked to a certain kind of subjectivity (the financial predator narrated by Hollywood cinema), so an autonomous form of money needs to be both jacked into and productive of a new kind of subjectivity not limited to the hacking milieu as such, but similarly oriented not towards monetization and accumulation, but towards the empowering of social cooperation.
    http://quaderni.sanprecario.info/2014...on-of-the-common-di-tiziana-terranova
    by M. Fioretti (2014-09-24)
  8. There have been increasingly vocal calls for Twitter, Facebook and other Silicon Valley corporations to more aggressively police what their users are permitted to see and read. Last month in The Washington Post, for instance, MSNBC host Ronan Farrow demanded that social media companies ban the accounts of “terrorists” who issue “direct calls” for violence.

    This week, the announcement by Twitter CEO Dick Costolo that the company would prohibit the posting of the James Foley beheading video and photos from it (and suspend the accounts of anyone who links to the video) met with overwhelming approval. What made that so significant, as The Guardian‘s James Ball noted today, was that “Twitter has promoted its free speech credentials aggressively since the network’s inception.” By contrast, Facebook has long actively regulated what its users are permitted to say and read; at the end of 2013, the company reversed its prior ruling and decided that posting of beheading videos would be allowed, but only if the user did not express support for the act.

    Given the savagery of the Foley video, it’s easy in isolation to cheer for its banning on Twitter. But that’s always how censorship functions: it invariably starts with the suppression of viewpoints which are so widely hated that the emotional response they produce drowns out any consideration of the principle being endorsed.
    https://firstlook.org/theintercept/20...facebook-executives-arbiters-see-read
  9. There’s no denying that the sharing economy can – and probably does – make the consequences of the current financial crisis more bearable. However, in tackling the consequences, it does nothing to address the causes. It’s true that, thanks to advances in the information technology, some of us can finally get by with less – chiefly, by relying on more effective distribution of existing resources. But there’s nothing to celebrate here: it’s like handing everybody earplugs to deal with intolerable street noise instead of doing something about the noise itself.
    http://www.theguardian.com/commentisf...pe-benefits-overstated-evgeny-morozov
  10. Data is everywhere, but European entrepreneurs seemingly dare not touch it. This mentality that people's personal data is a sacrament is hindering EU startups dramatically. In Europe: companies having your data = bad.

    Regardless of who is right or wrong, we as a wearable community need to see data as transactional and conditional for building products of convenience. End users are willing to give this data up as long as your product enhances their quality of life. The real issue we have is collecting data just for the sake of it.

    If you are going to succeed in this market you must understand context. Contextual understanding is determined through collection, interpretation, and product iteration. EU governments aren’t figuring out how to support the betterment of human life with their currently restrictive policies, yet EU startups will die in droves if they don’t collect and smartly use data.
    http://tech.eu/features/2861/europe-wearable-tech-startups


Page 1 of 8 - Online Bookmarks of M. Fioretti: Tags: algorithms

About - Propulsed by SemanticScuttle