mfioretti: history*

Bookmarks on this page are managed by an admin user.

392 bookmark(s)

  1. ‘Inequality’ is a way of framing social problems appropriate to technocratic reformers, the kind of people who assume from the outset that any real vision of social transformation has long since been taken off the political table. It allows one to tinker with the numbers, argue about Gini coefficients and thresholds of dysfunction, readjust tax regimes or social welfare mechanisms, even shock the public with figures showing just how bad things have become (‘can you imagine? 0.1% of the world’s population controls over 50% of the wealth!’), all without addressing any of the factors that people actually object to about such ‘unequal’ social arrangements: for instance, that some manage to turn their wealth into power over others; or that other people end up being told their needs are not important, and their lives have no intrinsic worth. The latter, we are supposed to believe, is just the inevitable effect of inequality, and inequality, the inevitable result of living in any large, complex, urban, technologically sophisticated society. That is the real political message conveyed by endless invocations of an imaginary age of innocence, before the invention of inequality: that if we want to get rid of such problems entirely, we’d have to somehow get rid of 99.9% of the Earth’s population and go back to being tiny bands of foragers again. Otherwise, the best we can hope for is to adjust the size of the boot that will be stomping on our faces, forever, or perhaps to wrangle a bit more wiggle room in which some of us can at least temporarily duck out of its way.

    Mainstream social science now seems mobilized to reinforce this sense of hopelessness. Almost on a monthly basis we are confronted with publications trying to project the current obsession with property distribution back into the Stone Age, setting us on a false quest for ‘egalitarian societies’ defined in such a way that they could not possibly exist outside some tiny band of foragers (and possibly, not even then). What we’re going to do in this essay, then, is two things. First, we will spend a bit of time picking through what passes for informed opinion on such matters, to reveal how the game is played, how even the most apparently sophisticated contemporary scholars end up reproducing conventional wisdom as it stood in France or Scotland in, say, 1760. Then we will attempt to lay down the initial foundations of an entirely different narrative. This is mostly ground-clearing work. The questions we are dealing with are so enormous, and the issues so important, that it will take years of research and debate to even begin understanding the full implications. But on one thing we insist. Abandoning the story of a fall from primordial innocence does not mean abandoning dreams of human emancipation – that is, of a society where no one can turn their rights in property into a means of enslaving others, and where no one can be told their lives and needs don’t matter. To the contrary. Human history becomes a far more interesting place, containing many more hopeful moments than we’ve been led to imagine, once we learn to throw off our conceptual shackles and perceive what’s really there.
  2. Right now, it’s Bitcoin. But in the past we’ve had dotcom stocks, the 1929 crash, 19th-century railways and the South Sea Bubble of 1720. All these were compared by contemporaries to “tulip mania”, the Dutch financial craze for tulip bulbs in the 1630s. Bitcoin, according to some sceptics, is “tulip mania 2.0”.

    Why this lasting fixation on tulip mania? It certainly makes an exciting story, one that has become a byword for insanity in the markets. The same aspects of it are constantly repeated, whether by casual tweeters or in widely read economics textbooks by luminaries such as John Kenneth Galbraith.

    Tulip mania was irrational, the story goes. Tulip mania was a frenzy. Everyone in the Netherlands was involved, from chimney-sweeps to aristocrats. The same tulip bulb, or rather tulip future, was traded sometimes 10 times a day. No one wanted the bulbs, only the profits – it was a phenomenon of pure greed. Tulips were sold for crazy prices – the price of houses – and fortunes were won and lost. It was the foolishness of newcomers to the market that set off the crash in February 1637. Desperate bankrupts threw themselves in canals. The government finally stepped in and halted the trade, but not before the economy of Holland was ruined.

    Yes, it makes an exciting story. The trouble is, most of it is untrue.
  3. Since she first started specializing in old documents, Watson has expanded beyond things written in English. She now has a stable of collaborators who can tackle manuscripts in Latin, German, Spanish, and more. She can only remember two instances that left her and her colleagues stumped. One was a Tibetan manuscript, and she couldn’t find anyone who knew the alphabet. The other was in such bad shape that she had to admit defeat.

    In the business of reading old documents, Watson has few competitors. There is one transcription company on the other side of the world, in Australia, that offers a similar service. Libraries and archives, when they have a giant batch of handwritten documents to deal with, might recruit volunteers. Even today, when computers have started to excel at reading books, handwritten works escape their understanding. Scholars who study medieval manuscripts have been working on programs that might have a chance of helping with this task, but right now a trained eye is still the best and only way to make real progress.
  4. Brexiteers have ostensibly got what they want: Brexit. They assumed we could dictate the terms; we can’t. They assumed we could just walk away; we can’t. They had no more plans for leaving than a dog chasing a car has to drive it. They are now finding out how little sovereignty means for a country the size of Britain in a neoliberal globalised economy beyond blue passports (which we could have had anyway). What we need isn’t a change of leader but a change of direction.

    May is no more personally to blame for the mess we are in with Europe than Anthony Eden was for the mess with the 1956 Suez crisis – which provides a more salient parallel for Britain than the second world war. It took Britain and France overplaying their hand, in punishing Egypt for seizing the Suez canal from colonial control and nationalising it, to realise their imperial influence had been eclipsed by the US and was now in decline.

    “France and England will never be powers comparable to the United States,” the West German chancellor at the time, Konrad Adenauer, told the French foreign minister. “Not Germany either. There remains to them only one way of playing a decisive role in the world: that is to unite Europe … We have no time to waste; Europe will be your revenge.”

    Once again, Britain has overplayed its hand. Preferring to live in the past rather than learn from it, we find ourselves diminished in the present and clueless about the future.
  5. What happened to cause such a profound shift in the human psyche away from egalitarianism? The balance of archaeological, anthropological and genomic data suggests the answer lies in the agricultural revolution, which began roughly 10,000 years ago.

    The extraordinary productivity of modern farming techniques belies just how precarious life was for most farmers from the earliest days of the Neolithic revolution right up until this century (in the case of subsistence farmers in the world’s poorer countries). Both hunter-gatherers and early farmers were susceptible to short-term food shortages and occasional famines – but it was the farming communities who were much more likely to suffer severe, recurrent and catastrophic famines.

    Hunting and gathering was a low-risk way of making a living. Ju/’hoansi hunter-gatherers in Namibia traditionally made use of 125 different edible plant species, each of which had a slightly different seasonal cycle, varied in its response to different weather conditions, and occupied a specific environmental niche. When the weather proved unsuitable for one set of species it was likely to benefit another, vastly reducing the risk of famine.

    As a result, hunter-gatherers considered their environments to be eternally provident, and only ever worked to meet their immediate needs. They never sought to create surpluses nor over-exploited any key resources. Confidence in the sustainability of their environments was unyielding.
    The Ju/’hoansi people have lived in southern Africa for hundreds of thousands of years. Photograph: James Suzman

    In contrast, Neolithic farmers assumed full responsibility for “making” their environments provident. They depended on a handful of highly sensitive crops or livestock species, which meant any seasonal anomaly such as drought or livestock disease could cause chaos.

    And indeed, the expansion of agriculture across the globe was punctuated by catastrophic societal collapses. Genomic research on the history of European populations points to a series of sharp declines that coincided first with the Neolithic expansion through central Europe around 7,500 years ago, then with their spread into north-western Europe about 6,000 years ago.

    However, when the stars were in alignment – weather favourable, pests subdued, soils still packed with nutrients – agriculture was very much more productive than hunting and gathering. This enabled farming populations to grow far more rapidly than hunter-gatherers, and sustain these growing populations over much less land.

    But successful Neolithic farmers were still tormented by fears of drought, blight, pests, frost and famine. In time, this profound shift in the way societies regarded scarcity also induced fears about raids, wars, strangers – and eventually, taxes and tyrants.
    Fruits and tubers gathered by the Ju/’hoansi. Photograph: James Suzman

    Not that early farmers considered themselves helpless. If they did things right, they could minimise the risks that fed their fears. This meant pleasing capricious gods in the conduct of their day-to-day lives – but above all, it placed a premium on working hard and creating surpluses.

    Where hunter-gatherers saw themselves simply as part of an inherently productive environment, farmers regarded their environment as something to manipulate, tame and control. But as any farmer will tell you, bending an environment to your will requires a lot of work. The productivity of a patch of land is directly proportional to the amount of energy you put into it.

    This principle that hard work is a virtue, and its corollary that individual wealth is a reflection of merit, is perhaps the most obvious of the agricultural revolution’s many social, economic and cultural legacies.
    From farming to war

    The acceptance of the link between hard work and prosperity played a profound role in reshaping human destiny. In particular, the ability to both generate and control the distribution of surpluses became a path to power and influence. This laid the foundations for all the key elements of our contemporary economies, and cemented our preoccupation with growth, productivity and trade.

    Regular surpluses enabled a much greater degree of role differentiation within farming societies, creating space for less immediately productive roles. Initially these would have been agriculture-related (toolmakers, builders and butchers), but over time new roles emerged: priests to pray for good rains; fighters to protect farmers from wild animals and rivals; politicians to transform economic power into social capital.
  6. After World War I the U.S. Government deviated from what had been traditional European policy – forgiving military support costs among the victors. U.S. officials demanded payment for the arms shipped to its Allies in the years before America entered the Great War in 1917. The Allies turned to Germany for reparations to pay these debts. Headed by John Maynard Keynes, British diplomats sought to wash their hands of responsibility for the consequences by promising that all the money they received from Germany would simply be forwarded to the U.S. Treasury.

    The sums were so unpayably high that Germany was driven into austerity and collapse. The nation suffered hyperinflation as the Reichsbank printed marks to throw onto the foreign exchange market. The currency declined and import prices soared, raising domestic prices as well. The debt deflation was much like that of Third World debtors a generation ago, and today’s southern European PIIGS (Portugal, Ireland, Italy, Greece and Spain).

    In a pretense that the reparations and Inter-Ally debt tangle could be made solvent, a triangular flow of payments was facilitated by a convoluted U.S. easy-money policy. American investors sought high returns by buying German local bonds; German municipalities turned over the dollars they received to the Reichsbank for domestic currency; and the Reichsbank used this foreign exchange to pay reparations to Britain and other Allies, enabling these countries to pay the United States what it demanded.

    But solutions based on attempts to keep debts of such magnitude in place by lending debtors the money to pay can only be temporary. The U.S. Federal Reserve sustained this triangular flow by holding down U.S. interest rates. This made it attractive for American investors to buy German municipal bonds and other high-yielding debts. It also deterred Wall Street from drawing funds away from Britain, which would have driven its economy deeper into austerity after the General Strike of 1926. But domestically, low U.S. interest rates and easy credit spurred a real estate bubble, followed by a stock market bubble that burst in 1929. The triangular flow of payments broke down in 1931, leaving a legacy of debt deflation burdening the U.S. and European economies. The Great Depression lasted until outbreak of World War II in 1939.

    Planning for the postwar period took shape as the war neared its end. U.S. diplomats had learned an important lesson. This time there would be no arms debts or reparations. The global financial system would be stabilized – on the basis of gold, and on creditor-oriented rules. By the end of the 1940s the United States held some 75 percent of the world’s monetary gold stock. That established the U.S. dollar as the world’s reserve currency, freely convertible into gold at the 1934 parity of $35 an ounce.
    It also implied that once again, as in the 1920s, European balance-of-payments deficits would have to be financed mainly by the United States. Recycling of official government credit was to be filtered via the IMF and World Bank, in which U.S. diplomats alone had veto power to reject policies they found not to be in their national interest. International financial “stability” thus became a global control mechanism – to maintain creditor-oriented rules centered in the United States.

    To obtain gold or dollars as backing for their own domestic monetary systems, other countries had to follow the trade and investment rules laid down by the United States. These rules called for relinquishing control over capital movements or restrictions on foreign takeovers of natural resources and the public domain as well as local industry and banking systems.

    By 1950 the dollar-based global economic system had become increasingly untenable. Gold continued flowing to the United States, strengthening the dollar – until the Korean War reversed matters. From 1951 through 1971 the United States ran a deepening balance-of-payments deficit, which stemmed entirely from overseas military spending. (Private-sector trade and investment was steadily in balance.)
  7. It turns out the Romans were lucky. The centuries during which the empire was built and flourished are known even to climate scientists as the “Roman Climate Optimum.” From circa 200 BC to AD 150, it was warm, wet, and stable across much of the territory the Romans conquered. In an agricultural economy, these conditions were a major boost to GDP. The population swelled yet still there was enough food to feed everyone.

    But from the middle of the second century, the climate became less reliable. The all-important annual Nile flood became erratic. Droughts and severe cold spells became more common. The Climate Optimum became much less optimal.

    The lesson to be drawn is not, of course, that we shouldn't worry about man-made climate change today, which threatens to be more severe than what the Romans experienced. To the contrary, it shows just how sensitive human societies can be to such change — now amplified in speed and scope by human activity.
  8. The Trojan Horse was not a horse, but a ship. That is what one of Italy’s expatriate scholars, the naval archaeologist Francesco Tiboni, has been arguing for about a year. Tiboni holds a research doctorate from the University of Marseille and collaborates with several foreign and Italian universities and institutions.

    The millennia-old misunderstanding would stem from a mistranslation of the post-Homeric texts on which Virgil himself drew (with the help of a translator) to compose the Aeneid. According to Tiboni, the contrivance the Greeks built to breach the walls of Troy was not literally a horse – hippos in Greek – but a type of Phoenician ship that was commonly called a “Hippos”.

    Pliny the Elder appears to explain the origin of the name, reporting that this kind of vessel was invented by a Phoenician shipwright whose name was Hippus. Fittingly, these ships bore a distinctive figurehead: a horse’s head.
    by M. Fioretti (2017-11-03)
  9. The American social structure, as Sacks notes, was based on biblical categories. There was a political realm, but the heart of society was in the covenantal realm: “marriages, families, congregations, communities, charities and voluntary associations.”

    America’s Judeo-Christian ethic celebrated neighborliness over pagan combativeness; humility as the basis of good character, not narcissism. It believed in taking in the stranger because we were all strangers once. It dreamed of universal democracy as the global fulfillment of the providential plan.

    That biblical ethic, embraced by atheists as much as the faithful, is not in great shape these days. As Sacks notes: “Today, one half of America is losing all those covenantal institutions. It’s losing strong marriages and families and communities. It is losing a strong sense of the American narrative. It’s even losing e pluribus unum because today everyone prefers pluribus to unum.…”

    Trump and Bannon have filled the void with their own creed, which is anti-biblical. The American story they tell is not diverse people journeying toward a united future. It’s a zero-sum struggle of class and ethnic conflict. The traits Trump embodies are narcissism, not humility; combativeness, not love; the sanctification of the rich and blindness toward the poor.

    As other relationships wither, many Americans are making partisanship the basis of their identity — their main political, ethnic and moral attachment. And the polls show that if you want to win a Republican primary these days, you have to embrace the Trump narrative, and not the old biblical one.

    The Republican senators greeted Trump on Capitol Hill and saw a president so repetitive and rambling, some thought he might be suffering from early Alzheimer’s. But they know which way the wind is blowing. They gave him a standing ovation.

    Even Alexander Kerensky didn’t abase himself so humiliatingly.

    The people who oppose Trump make a big error: “Let’s Get Togetherism.” This is the belief that if we can only have a civil conversation between red and blue, then everything will be better. But you can’t destroy a moral vision with a process. You need a counter-moral vision.
    by M. Fioretti (2017-11-02)
  10. Determining when radical physical changes in the Earth system happened provides a basis for determining which human activities were responsible, and thus what measures humans might take to prevent the change from reaching catastrophic proportions. In this article I offer an overview of the issues and stakes in the “when it happened” debate.

    In fact, a dozen or more proposals for dating the Anthropocene have been made to the AWG. While they differ substantially from each other, the starting dates under serious consideration fall into two broad groups that can be labelled Early and Recent, depending on whether the proposed starting date is in the distant past, or relatively close to the present.
    An Early Anthropocene?

    The first Early Anthropocene proposal was advanced by U.S. geologist William Ruddiman, who argues that the Anthropocene started when humans began large-scale agriculture in various parts of the world between eight and five thousand years ago. Those activities, he believes, produced carbon dioxide and methane emissions that raised global temperatures just enough to prevent a return to an Ice Age.9

    Other Early Anthropocene arguments suggest dating the Anthropocene from the first large-scale landscape modifications by humans, from the extinction of many large mammals in the late Pleistocene, from the formation of anthropogenic soils in Europe, or from the European invasions of the Americas in the 1500s. Some archeologists propose to extend the beginning of the Anthropocene back to the earliest surviving traces of human activity, which would take in much of the Pleistocene, and others have suggested that the entire Holocene should simply be renamed Anthropocene, since it is the period when settled human civilizations first developed.

    This outpouring of proposals reflects humanity’s long and complex relationships with the earth’s ecosystems—many of the proposed beginnings are significant turning points in those relationships, and deserve careful study. But the current discussion is not just about human impact: “the Anthropocene is not defined by the broadening impact of humans on the environment, but by active human interference in the processes that govern the geological evolution of the planet.”10 None of the Early Anthropocene options meet that standard, and none of them led to a qualitative break with Holocene conditions.

    Even if Ruddiman’s controversial claim that the agriculture revolution caused some global warming is correct, that would only mean that human activity had extended Holocene conditions. The recent shift out of Holocene conditions, to a no-analogue state, would still need to be evaluated and understood. Noted climatologist James Hansen and his colleagues make this argument clearly in a recent paper:

    “Even if the Anthropocene began millennia ago, a fundamentally different phase, a Hyper-Anthropocene, was initiated by explosive 20th century growth of fossil fuel use. Human-made climate forcings now overwhelm natural forcings. CO2, at 400 ppm in 2015, is off the scale … Most of the forcing growth occurred in the past several decades, and two-thirds of the 0.9°C global warming (since 1850) has occurred since 1975.”11

    The Early Anthropocene has been promoted by anti-environmental lobbyists associated with the Breakthrough Institute, because it supports their claim that there has been no recent qualitative change and thus there is no need for a radical response. In their view, today’s environmental crises “represent an acceleration of trends going back hundreds and even thousands of years earlier, not the starting point of a new epoch.”

    Moving in exactly the opposite direction, the IGBP’s 2004 book Global Change and the Earth System included several pages of graphs showing historical trends in human activity (GDP growth, population, energy consumption, water use, etc.) and physical changes in the Earth system (atmospheric carbon dioxide, ozone depletion, species extinctions, loss of forests, etc.) from 1750 to 2000. Every trend line showed gradual growth from 1750 and a sharp upturn in about 1950. The authors said that “the last 50 years have without doubt seen the most rapid transformation of the human relationship with the natural world in the history of the species,” but did not explicitly connect that to dating the Anthropocene.14

    In 2005, Will Steffen, principal author of that book, together with Crutzen and environmental historian John McNeill and others, coined the term Great Acceleration for the dramatic social-environmental changes after 1950. The name was a deliberate homage to The Great Transformation, Karl Polanyi’s influential book on the social, economic, and political upheavals that accompanied the rise of market society in England.15

    In 2007, in a journal article provocatively titled “The Anthropocene: Are Humans Now Overwhelming the Great Forces of Nature?,” Steffen, Crutzen, and McNeill republished the Great Acceleration graphs, and suggested that the second half of the twentieth century should be viewed as Stage 2 of the Anthropocene. Updated versions of the 2004 Great Acceleration graphs were prepared this year by the IGBP. As in the original graphs, all the trend lines show hockey-stick-shaped trajectories.

Online Bookmarks of M. Fioretti: Tags: history

About - Propulsed by SemanticScuttle