2018/09/25: most people in the UK are far less free to express themselves – online or offline – than MPs, activists and journalists, often because of restrictions from their employers. For millions of public sector workers, including those in the police and NHS, there are rules against expressing political views, so that if someone wants to discuss politics online – which seems reasonable – they would need to use an anonymous account.
The same fears apply more widely to those in the private sector: we have heard enough tales of people losing their jobs for mouthing off about their employer, or posting pictures of a raucous night out, to dismiss this risk as hypothetical. People are right to be worried about posting under their own name.
Even those able to post under their own name often create anonymous or protected “alt” accounts to post about their family life, mental health, sex life or to bitch about others – and while we might frown on the latter, it’s hardly criminal or bannable behaviour.
But where the UK leads, dictators can follow: the suggestion from Rayner and others that anonymous accounts are illegitimate allows authoritarian leaders to say the same, suppressing a channel for opposition speech.
2018/09/14: while Brandeis believed that anyone had the right to express their views, he did not believe that anyone had the right to be amplified.
More importantly, he didn’t believe that anyone who had the means to shove a message down someone’s throat had the right to do so.
2018/09/05: Turns out too much free speech - and not enough listening - is a bad thing for democracy.
It’s not speech per se that allows democracies to function, but the ability to agree - eventually, at least some of the time - on what is true, what is important and what serves the public good.
"Democracy can’t operate completely unmoored from a common ground... We need new mechanisms—suited to the digital age—that allow for a shared understanding of facts and that focus our collective attention on the most important problems."
Trump, like so many other politicians and pundits, has found search and social media companies to be convenient targets in the debate over free speech and censorship online. "This is a very serious situation-will be addressed!"
But in this moment, the conversation we should be having (how can we fix the algorithms?) is instead being co-opted and twisted by politicians and pundits howling about censorship and miscasting content moderation as the demise of free speech online. It would be good to remind them that free speech does not mean free reach. There is no right to algorithmic amplification. In fact, that's the very problem that needs fixing.
The algorithms don't understand what is propaganda and what isn't, or what is "fake news" and what is fact-checked. Their job is to surface relevant content (relevant to the user, of course), and they do it exceedingly well. So well, in fact, that the engineers who built these algorithms are sometimes baffled: "Even the creators don't always understand why it recommends one video instead of another," says Guillaume Chaslot, an ex-YouTube engineer who worked on the site's algorithm.
YouTube's algorithms can also radicalize by suggesting "white supremacist rants, Holocaust denials, and other disturbing content," Zeynep Tufekci recently wrote in the Times. "YouTube may be one of the most powerful radicalizing instruments of the 21st century."
The problem extends beyond YouTube, though. On Google search, dangerous anti-vaccine misinformation can commandeer the top results. And on Facebook, hate speech can thrive and fuel genocide.
So what can we do about it? The solution isn't to outlaw algorithmic ranking or make noise about legislating what results Google can return. Algorithms are an invaluable tool for making sense of the immense universe of information online. There's an overwhelming amount of content available to fill any given person's feed or search query; sorting and ranking is a necessity, and there has never been evidence indicating that the results display systemic partisan bias.
It's imperative that we focus on solutions, not politics.
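The distinction the piece draws between hosting speech and amplifying it can be made concrete with a toy sketch. This is a hypothetical illustration only (the names, the `predicted_engagement` score, and the `rank_feed` function are all invented here, not any platform's actual system): an engagement-optimizing ranker surfaces whatever scores highest, with no notion of truth or fact-checking, which is exactly the failure mode described above.

```python
# Hypothetical sketch: feed ranking as selection, not censorship.
# Every post still "exists"; only a few get reach.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_engagement: float  # model score: how likely the user is to click/share

def rank_feed(posts, limit=3):
    """Return the top-`limit` posts by predicted engagement.

    Nothing here checks whether a post is true, fact-checked, or
    propaganda -- the ranker only optimizes for engagement.
    """
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)[:limit]

posts = [
    Post("a", "calm fact-check", 0.2),
    Post("b", "outrage bait", 0.9),
    Post("c", "family photo", 0.5),
    Post("d", "conspiracy rant", 0.8),
]
print([p.author for p in rank_feed(posts)])  # ['b', 'd', 'c']
```

The point of the sketch: nothing is deleted, yet the fact-check never reaches the top of the feed. "Fixing the algorithms" means changing what this scoring function rewards, not outlawing ranking itself.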
The original text below was the basis for Data & Society Founder and President danah boyd's March 2018 SXSW Edu keynote, "What Hath We Wrought?" - Ed. Growing up, I took certain truths to be self-evident.
Facebook was supposed to open up societies like Cambodia. But it's doing just the opposite - with disastrous consequences for its fragile politics.
At a time when anyone can broadcast live or post their thoughts to a social network, we should be living in a utopia of public discourse. We're not.
Why "Mark Zuckerberg, Global News Editor" Could Be A Bad Idea
The very loss of control that boxed me into writing just for my site has been exacerbated by a tool originally built to annotate hip hop lyrics.
While most of the major players are making their lawyers happy by being purposefully vague in public, Ellen Pao's resignation...
My dear liberal pundit, neither you nor I liked George W. Bush. Do you remember that rather puerile declaration he made after September 11:
The wave of outrage and solidarity under the banner of #jesuischarlie
was a cross-partisan demonstration that allowed the content of the magazine struck by the terrible terrorist attack to be spread to the four winds. It is freedom of expression that ceaselessly fills the space of discussion and animates the sharing of cartoons depicting bullet-riddled Qurans
You are not Charlie. ("Vous n'êtes pas Charlie.")
What happened last January 7 in France at the Paris offices of Charlie Hebdo, a satirical weekly known for its ironic and provocative style, has
People profess affection for cartoons that offend a religion - until the targeted religion changes.
If you arrogate to yourself the right to represent a community without consulting it, at least understand the meaning of what that community writes.
Have you ever 'googled' your name? Many people do, and some find search results about themselves that they would rather not have publicly available on the internet. The question is: what do you do when that happens?