2018/09/13: Facebook is a two-billion-strong democratic community and the personal plaything of an unaccountable thirty-something billionaire.
If it comes down to a contest between the membership and the ownership of Facebook, Zuckerberg will probably win, as he gets to set the rules. In the end it is only the regulatory power of the state that can make Facebook safe for democracy.
There were two big risks with turning the state into a giant automaton. The first was that it wouldn't be powerful enough.
The second was that it would too closely resemble the things it was designed to regulate. In a world of machines, the state might go native. It could become entirely artificial.
This is the original fear of the modern age: not what happens when the machines become too much like us, but what happens if we become too much like machines.
The machines that most frightened Hobbes were corporations.
Many of the things that we fret about when we imagine a future world of AIs are the same worries that have been harboured about corporations for centuries.
If we end up trading a surveillance economy for a surveillance state, we've done ourselves no favors.

Evgeny Morozov offered a similar proposal concerning what he termed "the data wells inside ourselves":

 We can use the recent data controversies to articulate a truly decentralised, emancipatory politics, whereby the institutions of the state (from the national to the municipal level) will be deployed to recognise, create, and foster the creation of social rights to data. These institutions will organise various data sets into pools with differentiated access conditions. They will also ensure that those with good ideas that have little commercial viability but promise major social impact would receive venture funding and realise those ideas on top of those data pools.

The simplicity of the mining metaphor is its strength but also its weakness. The extraction metaphor doesn't capture enough of what companies like Facebook and Google do, and hence in adopting it we too quickly narrow the discussion of our possible responses to their power. Data does not lie passively within me, like a seam of ore, waiting to be extracted. Rather, I actively produce data through the actions I take over the course of a day. When I drive or walk from one place to another, I produce locational data. When I buy something, I produce purchase data. When I text with someone, I produce affiliation data. When I read or watch something online, I produce preference data. When I upload a photo, I produce not only behavioral data but data that is itself a product. I am, in other words, much more like a data factory than a data mine. I produce data through my labor - the labor of my mind, the labor of my body.

The platform companies, in turn, act more like factory owners and managers than like the owners of oil wells or copper mines.
Beyond control of my data, the companies seek control of my actions, which to them are production processes, in order to optimize the efficiency, quality, and value of my data output (and, on the demand side of the platform, my data consumption). They want to script and regulate the work of my factory - i.e., my life - as Frederick Winslow Taylor sought to script and regulate the labor of factory workers at the turn of the last century. The control wielded by these companies, in other words, is not just that of ownership but also that of command. And they exercise this command through the design of their software, which increasingly forms the medium of everything we all do during our waking hours.

The factory metaphor makes clear what the mining metaphor obscures: We work for the Facebooks and Googles of the world, and the work we do is increasingly indistinguishable from the lives we lead. The questions we need to grapple with are political and economic, to be sure. But they are also personal, ethical, and philosophical.

====

Tarnoff and Weigel point to Facebook CEO Mark Zuckerberg's recent announcement that his company will place less emphasis on increasing the total amount of time members spend on Facebook and more emphasis on ensuring that their Facebook time is "time well spent." What may sound like a selfless act of philanthropy is in reality, Tarnoff and Weigel suggest, the product of a hard-headed business calculation:

 Emphasising time well spent means creating a Facebook that prioritises data-rich personal interactions that Facebook can use to make a more engaging platform. Rather than spending a lot of time doing things that Facebook doesn't find valuable - such as watching viral videos - you can spend a bit less time, but spend it doing things that Facebook does find valuable. In other words, "time well spent" means Facebook can monetise more efficiently. It can prioritise the intensity of data extraction over its extensiveness.
 This is a wise business move, disguised as a concession to critics. Shifting to this model not only sidesteps concerns about tech addiction - it also acknowledges certain basic limits to Facebook's current growth model. There are only so many hours in the day. Facebook can't keep prioritising total time spent - it has to extract more value from less time.

The analysis is a trenchant one. The vagueness and self-absorption that often characterize discussions of wellness, particularly those emanating from the California coast, are well suited to the construction of window dressing. And, Lord knows, Zuckerberg and his ilk are experts at window dressing. But, having offered good reasons to be skeptical about Silicon Valley's brand of tech humanism, Tarnoff and Weigel overreach. They argue that any "humanist" critique of the personal effects of technology design and use is a distraction from the "fundamental" critique of the economic and structural basis for Silicon Valley's dominance:

 [The humanists] remain confined to the personal level, aiming to redesign how the individual user interacts with technology rather than tackling the industry's structural failures. Tech humanism fails to address the root cause of the tech backlash: the fact that a small handful of corporations own our digital lives and strip-mine them for profit. This is a fundamentally political and collective issue. But by framing the problem in terms of health and humanity, and the solution in terms of design, the tech humanists personalise and depoliticise it.

The choice that Tarnoff and Weigel present here - either personal critique or political critique, either a design focus or a structural focus - is a false choice. And it stems from the metaphor of extraction, which conceives of data as lying passively within us (beyond the influence of design) rather than being actively produced by us (under the influence of design).
Arguing that attending to questions of design blinds us to questions of ownership is as silly (and as condescending) as arguing that attending to questions of ownership blinds us to questions of design. Silicon Valley wields its power through both its control of data and its control of design, and that power influences us on both a personal and a collective level. Any robust critique of Silicon Valley, whether practical, theoretical, or both, needs to address both the personal and the political.

The Silicon Valley apostates may be deserving of criticism, but what they've done that is praiseworthy is to expose, in considerable detail, the way the platform companies use software design to guide and regulate people's behavior - in particular, to encourage the compulsive use of their products in ways that override people's ability to think critically about the technology while provoking the kind of behavior that generates the maximum amount of valuable personal data. To put it into industrial terms, these companies are not just engaged in resource extraction; they are engaged in process engineering.

===

The shift of data ownership from the private to the public sector may well succeed in reducing the economic power of Silicon Valley, but it would also reinforce and indeed institutionalize Silicon Valley's computationalist ideology, with its foundational, Taylorist belief that, at a personal and collective level, humanity can and should be optimized through better programming. The ethos and incentives of constant surveillance would become even more deeply embedded in our lives, as we take on the roles of both the watched and the watcher. Consumer, track thyself! And, even with such a shift in ownership, we'd still confront the fraught issues of design, manipulation, and agency.

Finally, there's the obvious practical question.
How likely is it that the United States is going to establish a massive state-run data collective encompassing exhaustive information on every citizen, at least any time in the foreseeable future? It may not be entirely a pipe dream, but it's pretty close. In the end, we may discover that the best means of curbing Silicon Valley's power lies in an expansion of personal awareness, personal choice, and personal resistance. At the very least, we need to keep that possibility open. Let's not rush to sacrifice the personal at the altar of the collective.
Like the oil barons at the turn of the 20th century, the data barons are determined to extract as much as possible of a resource that's central to the economy of their time. The more information they can get to feed the algorithms that power their ad-targeting machines and product-recommendation engines, the better. In the absence of serious competition or (until Europe's recently introduced General Data Protection Regulation) serious legal constraints on the handling of personal data, they are going to keep undermining privacy in their push to know as much about their users as they possibly can.

Their dominance is allowing them to play a dangerous and outsize role in our politics and culture. The web giants have helped undermine confidence in democracy by underestimating the threat posed by Russian trolls, Macedonian fake-news farms, and other purveyors of propaganda. Zuckerberg at first dismissed claims that disinformation on Facebook had influenced the 2016 election as "pretty crazy." But Facebook itself now says that between June 2015 and August 2017, as many as 126 million people may have seen content on the network that was created by a Russian troll farm.

Why haven't antitrust regulators blocked deals to promote competition? It's mainly because of a change in US antitrust philosophy in the 1980s, inspired by neoclassical economists and legal scholars at the University of Chicago. Before the shift, antitrust enforcers were wary of any deals that reinforced a company's dominant position. After it, they became more tolerant of such combinations, as long as prices for consumers didn't rise. This was just fine with internet companies, since most of their services were free anyway. Critics say trustbusters exercised too little scrutiny.
"Just because the web companies offer products for free doesn't mean they should get a free pass," says Jonathan Kanter, an antitrust lawyer at Paul Weiss.

And thanks to their vast wealth, fining them for any transgressions won't diminish their power.

One radical solution would be to break them up, just as the US government splintered the dominant Standard Oil monopoly in the early 1900s. Some progressive advocacy groups in the US have been running online campaigns with slogans like "Facebook has too much power over our lives and democracy. It's time for us to take that power back," and calling on the FTC to force the social network to sell Instagram, WhatsApp, and Messenger to create competition.

So how to curb the power of the data barons? Rather than waiting for legal battles that may or may not foster more competition, we urgently need to find ways to bolster rivals. That means reducing the vast chasm between the amounts of information held by the web giants and the rest. Regulation can help here: Europe's new data privacy regime requires companies to hold people's data in machine-readable form and let them move it easily to other businesses if they want to. This "data portability" rule will allow startups to get hold of more data quickly.

Some argue that we need to think much more boldly - and not just with the big internet companies in mind. Viktor Mayer-Schönberger, a professor at the University of Oxford, has proposed what he calls a "progressive data-sharing mandate" that would apply to all businesses. This would require a company that has passed a certain level of market share (say, 10 percent) to share some data with other firms in its industry that ask for it. The data would be chosen at random and stripped of all personal identifiers. Intuitively, the idea makes sense: the closer a company gets to dominating its market, the more data it would have to share, making it easier for rivals to compete by building a better product.
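To make the mechanics of such a mandate concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption: Mayer-Schönberger's proposal specifies only the 10 percent threshold, random selection, and the stripping of personal identifiers; the function names, the list of identifier fields, and the rule that the shared fraction equals the market share in excess of the threshold are inventions for this example, not part of the proposal.

```python
import random

# Hypothetical sketch of a "progressive data-sharing mandate".
# The identifier list and the sharing formula are illustrative
# assumptions, not taken from the actual proposal.

MANDATE_THRESHOLD = 0.10  # market share at which sharing kicks in

PERSONAL_IDENTIFIERS = {"name", "email", "user_id", "ip_address"}

def strip_identifiers(record):
    """Return a copy of the record with personal identifier fields removed."""
    return {k: v for k, v in record.items() if k not in PERSONAL_IDENTIFIERS}

def records_to_share(records, market_share, rng=random.Random(0)):
    """Randomly sample records to share, in proportion to dominance.

    Below the threshold nothing is shared; above it, the shared
    fraction grows with market share (one way to make the mandate
    "progressive"). The rng default is seeded for reproducibility.
    """
    if market_share <= MANDATE_THRESHOLD:
        return []
    fraction = market_share - MANDATE_THRESHOLD
    n = round(len(records) * fraction)
    sample = rng.sample(records, n)  # random selection, without replacement
    return [strip_identifiers(r) for r in sample]

# Example: a firm with 50% market share must share 40% of its records,
# anonymized; a 5% firm shares nothing.
records = [{"name": "u%d" % i, "email": "e", "query": "q"} for i in range(100)]
shared = records_to_share(records, 0.50)
```

The sampling step matters: because rivals receive a random, de-identified slice rather than a chosen one, the dominant firm cannot cherry-pick low-value data, which is presumably the point of mandating randomness.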
If our supersmart tech leaders knew a bit more about history or philosophy we wouldn't be in the mess we're in now
WELCOME to Connected Rights, your breach in the hull of digital rights news and analysis. MARK ZUCKERBERG'S APPEARANCE BEFORE EUROPEAN PARLIAMENT LEADERS yesterday was an absolute farce. With the world's attention trained on them, the MEPs spent an hour grandstanding, landing all the predicted blows but at excessive length - the meeting was only supposed to
During the 2008 financial crisis, Rod Blagojevich, the governor of Illinois, took on Bank of America ...
That is, why it is not entirely normal that a grown man - with a wife, two children, and a company he has run for 14 years - justifies himself with the rhetoric of the "inexperienced young man
Facebook's chief executive on the right to privacy, banning Cambridge Analytica and the things he is taking responsibility for
Over the last two years, the capacity to manage mood has been monetised through the sharing of fake news and political feeds attuned to reader preference: you can also make people happy by confirming their biases.
Facebook hired Tavis McGinn to measure Mark Zuckerberg and Sheryl Sandberg's likability. He quit after six months
The stories format for social sharing, which Facebook took from Snapchat, could overtake Facebook's famous feed.
The impact of Facebook's News Feed changes on the media is far less interesting than what the changes - and their stated purpose - say about Facebook itself.
Here is what Mark Zuckerberg learned from his 30-state tour of the US: polarisation is rife and the country is suffering from an opioid crisis. Forgive me if I have to lie down for a moment. Yet it
If everyone is upset with you, as the platform's chief says, are you really doing something right?
The puzzle is this. If I were marking this as a student essay I'd be grading 'Building Global Community' as a B-minus. The terms are vague, obvious questions are begged and left unanswered, and historical context is missing.
Yesterday marked the conclusion of the two-day Summit on Technology and Opportunity, an anti-poverty conference cohosted by the White House, Stanford University, and Mark Zuckerberg's charity. Something is wrong here.
Perhaps not many people will see the connection between today being the first day Gawker is gone, it being the 25th anniversary of the Web, and the message all Facebook users were greeted with this morning.

Gawker is gone because Peter Thiel financed its murder-by-lawyer. It's legal to do this in the US, but until now, as far as I know, no one has crossed this line. Now that the line has been crossed, it's fair to assume it will become standard practice for billionaires like Thiel to finance lawsuits until the publication loses and has to sell itself to pay the judgment.

It's the 25th anniversary of the Web because 25 years ago a generous visionary named Tim Berners-Lee invented something that would benefit humanity more than it would benefit him. Many other visionaries saw it, and because it was open, they were able to build anything they could imagine using it as a basis. And they did, making something like Facebook possible.

Facebook is a silo for web writing. And while it would be easy for them to create paths for ideas to flow in and out of Facebook, at very low cost - they have the features already developed, and use them internally - they refuse to share them with users. I suppose we could just explain this away: they're a very large tech company, and that's what tech companies do. But they also have the chutzpah to pretend to support the open web. They have been happy to accept its bounty and have done nothing to return what they've taken from the commons to the commons.

And finally, remember Peter Thiel, the guy who thinks his wealth entitles him to shut down publications he doesn't like? Not only did he make billions from Facebook stock, he's still on the board of Facebook. Zuckerberg has had plenty of time to ask him to leave, or to fire him, and he hasn't done it. Again, you could just shrug it off and say Zuck is like Thiel, but Zuck is extra special in that he wants you to believe he appreciates the gift of the open web, even as he strangles it.
Today's empires are born on the web, and exert tremendous power in the material world.