mfioretti: open data*


  1. As Becky Hogge notes in her important report on the impact of open data, tracking impact is tricky. By its nature, open data is resistant to traditional impact reporting; in part because we don’t know exactly how it is being used, and in part because the value chain is so diffuse. So, Hogge argues, outside of sweeping statements about potential, at this stage impact is largely indicated by “fragments” of stories from the ground.

    Despite this methodological challenge, we think OpenCorporates has already made many tangible contributions to revealing how power runs through corporate networks. Therefore, in this part of our reading list we’ll keep track of fragments that illustrate the importance of open company data, including anti-corruption investigations, and internal and external impact reports.
    https://blog.opencorporates.com/2017/...ding-list-impact-of-open-company-data
    by M. Fioretti (2018-01-08)
  2. From 1900 to 2010, the amount of materials accumulated in buildings and infrastructure across the world increased 23-fold. We are depleting our resources at unprecedented rates. Instead of extracting dwindling raw materials from nature at ever-increasing cost, the time has come to start re-using materials from buildings and infrastructure in our cities.

    We have been working on identifying the material resources in cities that could be “mined” for re-use. In a case study, we modelled more than 13,000 buildings in central Melbourne, Australia. We estimated the quantities of construction materials as well as the embodied energy, water and greenhouse gas emissions associated with constructing these buildings (if they were built today). We also modelled the replacement of materials over time and into the future.

    The extraction and transformation of resources have broad environmental effects. These include resource depletion, loss of biodiversity, soil and water pollution, and greenhouse gas emissions, which drive climate change.

    Adding to these challenges is the amount of waste generated, especially by the construction sector due to construction, renovation and demolition activities. Every time a construction material is discarded, all the embodied energy, water and emissions that went into producing it also go to waste.

    In our two recent studies, we propose a model that can help us “mine” our cities and quantify the environmental benefits of this urban mining.


    The resulting maps allow us to start thinking of cities as urban mines and places of material production (supply), rather than just consumption (demand).

    We can imagine a new construction project surveying, at the outset, what materials will be available and how best to re-use and incorporate them into the design. This would save large amounts of energy and water, while avoiding greenhouse gas emissions and further ecosystem degradation from raw material extraction (usually far from the city).
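    As a toy illustration of the kind of material-stock accounting described above, the sketch below multiplies floor area by an assumed material intensity and an assumed embodied-energy coefficient. All names and numbers are invented for the example; none are taken from the Melbourne study.

        # Toy material-stock accounting: stock = floor area x material
        # intensity; embodied energy = stock x energy coefficient.
        # All figures below are illustrative assumptions, not study data.
        buildings = [
            {"floor_area_m2": 1200, "type": "office"},
            {"floor_area_m2": 300, "type": "house"},
        ]
        CONCRETE_KG_PER_M2 = {"office": 900, "house": 500}  # assumed intensities
        EMBODIED_MJ_PER_KG = 1.0                            # assumed coefficient

        stock_kg = sum(b["floor_area_m2"] * CONCRETE_KG_PER_M2[b["type"]]
                       for b in buildings)
        print(f"Recoverable concrete: {stock_kg / 1000:.0f} t, "
              f"embodied energy: {stock_kg * EMBODIED_MJ_PER_KG / 1000:.0f} GJ")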
    https://theconversation.com/with-the-right-tools-we-can-mine-cities-87672
  3. According to several studies and real-world experience, this kind of innovative design can deliver savings of at least 10% in management costs, plus savings across a project's entire life cycle, by cutting the recourse to contract variations and by planning necessary maintenance in advance.

    The decree is the result of complex, in-depth work, begun by a commission specially appointed by the minister and made up of representatives of the Ministero delle Infrastrutture e dei Trasporti, Anac, Agid, the Università degli Studi di Brescia, Sapienza di Roma and Federico II di Napoli, the Politecnico di Milano, and the Rete delle Professioni Tecniche. The commission held hearings with the sector's main stakeholders and prepared a first draft of the measure. The draft was then put out to public consultation, and the contributions received were assessed and incorporated into the final text.

    The measure also regulates the preliminary obligations of contracting authorities, which must adopt a training plan for their staff, a plan for acquiring or maintaining the hardware and software that manage decision-making and information processes, and an organisational act spelling out the control and management process, the data managers, and how conflicts are handled.

    Contracting authorities are required to use interoperable platforms based on open, non-proprietary formats, and the decree defines how the data and information produced are used and shared among all those taking part in the design, construction and management of a project.

    From the moment it enters into force, the decree allows contracting authorities that have fulfilled the preliminary obligations to use these specific electronic methods and tools, on an optional basis, for new works and for restoration, requalification or variation projects.

    The use of electronic modelling methods and tools becomes mandatory on 1 January 2019 for works worth 100 million euro or more, and then progressively for smaller amounts in the years after 2019, down to works worth less than 1 million euro, for which the obligation starts on 1 January 2025.
    https://www.corrierecomunicazioni.it/...prechi-digitale-obbligatorio-dal-2019
  4. I owe this post to the many people who use the statistical data produced and published by the Istituto Nazionale di Statistica and who need to do so "machine to machine", exploiting the convenience of an API.

    Not everyone knows that Istat also makes the data published on dati.istat.it, the Institute's data warehouse, available through a REST API with JSON output. It should be said right away that this is an alternative data dissemination channel still in "beta version": it was released in 2011, but it is stable and interesting enough to be used without any particular problems.

    The output JSON is not just any format: it follows the JSON-stat specification (https://json-stat.org/). JSON-stat was born a few years ago from an intuition of Xavier Badosa, a brilliant friend and colleague of mine at the Statistical Institute of Catalonia. We had discussed it on more than one occasion, but after a working session at the OECD in Paris where we explored the idea in depth, we went on to implement JSON-stat on an entire dissemination system: Istat's. That is how apistat.istat.it was born.
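    To make this concrete, below is a minimal sketch of fetching and decoding such a response, based only on the public JSON-stat 2.0 specification. The endpoint URL is a placeholder (see apistat.istat.it for the real dataset addresses), and the sketch assumes every dimension declares an explicit category index.

        import itertools
        import json
        import urllib.request

        URL = "https://apistat.istat.it/..."  # placeholder, not a real endpoint

        with urllib.request.urlopen(URL) as resp:
            ds = json.load(resp)  # expects a JSON-stat 2.0 "dataset" object

        dims = ds["id"]  # dimension names, in storage order

        def codes(dim):
            """Category codes of a dimension, sorted by their position."""
            idx = ds["dimension"][dim]["category"]["index"]
            return idx if isinstance(idx, list) else sorted(idx, key=idx.get)

        # "value" is a flat array in row-major order: the last dimension in
        # ds["id"] varies fastest, which matches what itertools.product does.
        for pos, combo in enumerate(itertools.product(*(codes(d) for d in dims))):
            print(dict(zip(dims, combo)), "->", ds["value"][pos])

    Ready-made decoders exist as well: the pyjstat Python library, for example, can read a JSON-stat response straight into a pandas DataFrame.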
    https://medium.com/@vincpatruno/come-...nale-di-statistica-istat-ca874316f5a9
    by M. Fioretti (2017-10-03)
  5. In short, the authors together seem to find little evidence that tools for "citizen voice" translate into "citizen teeth" to prompt action on the part of governing officials. To put it another way: There is a wide chasm between uptake of these tools by the public and institutional impact. Reflecting on the chapters, Peixoto and Sifry write: "To conclude, the challenges of inclusiveness and government responsiveness are not exclusive to civic technology and are certainly not new. Rather, they are the backdrop against which institutions and democracy have evolved throughout history. Whether civic technology makes a difference or not will ultimately depend on the extent to which it addresses these challenges as they are manifested today."
    http://comminit.com/global/content/ci...outh-assessing-technology-public-good
  6. The city in default? Impossible: it would damage the image (and the rating) of the whole country – The situation, in short, is thoroughly tangled. It already was before mayor Raggi's agreement with the banks, as former mayor Gianni Alemanno knew well when he thought he could get out of the Campidoglio-Atac tangle by transferring the city's claim on the municipal company to the special commissioner's administration, the "bad bank" charged with winding down the 12 billion euro of debt the City of Rome had accumulated over the years. Shown out of the door by Alemanno, however, the old claim came back in through the window under Ignazio Marino, who asked for it back to "fix" the budget of his own administration. It is an old story that now risks blowing up in Raggi's hands: she intends to restructure Atac, which, according to former budget councillor Mazzillo, could push the city administration towards default, that is, an extraordinary debt-recovery procedure under a plan stamped by the Ministero dell'Interno and validated by the Corte dei Conti. According to a source inside the state audit judiciary, however, this option is more theoretical than practical, because it would hit the credibility of the whole country, which, besides the damage to its international image, would pay the price through higher interest rates on public debt of every kind and level.

    All the capital's bailouts – It is no accident, after all, that in the past politicians chose to fix the critical issues in Roma Capitale's budget without much fanfare. In 2008, to cover the hole inherited from Veltroni, then-mayor Alemanno asked for and obtained from Silvio Berlusconi's government an extraordinary commissioner's administration, hiving off most of the capital's old debt (22 billion euro, since fallen to 12). The aim was to let the Campidoglio start again from zero while the heavy legacy of the past was wound down thanks to an annual public contribution of 500 million, later reconfirmed in various forms by every subsequent government. Unfortunately, the Alemanno administration was not so prudent: according to the administrative-accounting audit of Roma Capitale that Marino requested from the Mef, in 2009-2012 Alemanno ran up a deficit of almost 500 million, going as far as tripling transfers to the municipal companies. Marino plugged the hole with the claims on Atac that Massimo Varazzani's extraordinary debt administration returned to the Campidoglio, and with an Irpef income-tax increase in the "Salva-Roma" decree signed by the Renzi-Letta tandem. None of this was enough to put the capital's accounts in order; what they need above all is clean-up and transparency in relations with the city-owned companies. Hence Raggi's decision to tackle the Atac restructuring head-on through a going-concern composition with creditors, with the amounts and payment schedules for creditors still to be defined. But without the backing of the government and of the commissioner's administration, the operation Raggi has entrusted to the new budget councillor Gianni Lemmetti risks being decidedly painful for the city administration.
    http://www.ilfattoquotidiano.it/2017/...ra-mai-fallire-il-campidoglio/3838848
  7. Dukes are the highest-ranking tier of the British aristocracy – a select elite within an elite, ranking above Marquesses, Earls, Viscounts and Barons, whose lands and titles derive from centuries of Royal patronage.

    There are 30 Dukes in the UK today. Five of these are ceremonial titles for members of the Royal family, conferring no wealth or estates. The one other Royal Duke who is a significant landowner is Prince Charles, Duke of Cornwall, whose 135,000-acre estate I’ve written about elsewhere.

    The remaining 24 Dukes are all extremely wealthy men who together own around a million acres of land. Yet my investigations show that the taxpayer continues to subsidise them to the tune of £8 million annually, through our broken farm subsidy system. What’s more, many of the Dukes benefit from tax breaks on their wealth and have constructed elaborate trust fund schemes to avoid inheritance taxes. How the Dukes have survived and prospered into the 21st century is a telling insight into modern English society.
    The Dukes’ landholdings and subsidies

    Britain’s Dukes are some of the largest private landowners in the country.
    https://whoownsengland.org/2017/05/08...tax-breaks-an-8million-annual-subsidy
    by M. Fioretti (2017-05-09)
  8. A comparison of trading data for the Swedish krona and British pound may provide further evidence that some investors could be trading with knowledge of U.K. official statistics before they are published.

    Sweden and Britain, two European countries with widely traded currencies, have very different approaches when it comes to policy on who sees official economic data before it goes out.

    In Sweden, nobody outside the statistics office, not even the country’s prime minister, is allowed to see sensitive data before release, according to Statistics Sweden, the country’s official data provider. In Britain, over a hundred lawmakers, advisers and press officers get to see some numbers up to a day before they come out.

    The British pound often moves sharply in the hour before data is released, but the krona shows no signs of moving ahead of Swedish numbers, an analysis of trading data between January 2011 and March 2017 suggests.

    During the hour before unexpectedly strong or weak U.K. data is made public, the pound moved 0.065% versus the dollar on average in the same direction it subsequently did after those numbers came out, according to an analysis prepared for The Wall Street Journal by Alexander Kurov, associate professor of finance at West Virginia University.
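    As a rough sketch of how such a test can be run, the snippet below averages the pre-release move measured in the direction of the eventual post-release reaction; a positive mean indicates the currency tended to move the "right" way before publication. The file and column names are hypothetical, and this is not the methodology of the WSJ analysis itself.

        import numpy as np
        import pandas as pd

        # Hypothetical input: one row per UK data release, with the % change
        # in GBP/USD in the hour before and the hour after publication.
        releases = pd.read_csv("uk_releases.csv")  # columns: pre_move, post_move

        # Sign the pre-release move by the direction of the later reaction,
        # then average: a positive mean suggests systematic pre-release drift.
        directed = releases["pre_move"] * np.sign(releases["post_move"])
        print(f"Average directed pre-release move: {directed.mean():.3f}%")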
    https://www.wsj.com/articles/are-u-k-...rrency-markets-suggest-yes-1493182801
  9. Organizations across countries use machine-readable data for a number of applications:

    A resource in the development of web and mobile products and services. Organizations create digital applications that present data in accessible ways. For instance, one agribusiness company in Ghana automates the translation of weather data and commodity prices into simple phrases that are texted to farmers in their local languages. Many organizations conduct predictive analytics and forecasting. For example, one Indian geospatial analytics company uses machine-readable geospatial and agricultural data to predict crop acreage and yields.
    A way to optimize organizational decision-making. Several organizations use machine-readable open data to inform their strategy and investments. Census, household and income surveys in particular are critical to many for targeting populations and markets. These data are especially useful when disaggregated by sex, age, location and household income.
    Evidence for research and policy recommendations. Research institutions from Moldova to Zambia use machine-readable data as critical evidence to conduct analyses and support policy recommendations on issues ranging from regional and national economic development, poverty and economic integration, to health and democracy initiatives.
    A tool for advocacy on government spending, elections, and programs. For example, one nonprofit in Ukraine uses public spending data to monitor government finances and programs. Another in Nigeria uses budget data to develop infographics for citizens. Yet another provides a tool to monitor contracts, including for the extractive industries in various countries. Across regions, organizations are training journalists to use government data in their reporting and to monitor elections using open electoral commission data.

    Most of the data used is not (yet) machine-readable.

    While all the organizations in our study used machine-readable data in their work, half of them told us that the majority of the data they need is still only available in PDFs, images, paper reports, or as website text. Over three quarters of the organizations stated that formats were a barrier to data use. This is especially the case when working with large, historic and geospatial datasets. For example, organizations benefit most from geospatial data when it is highly detailed and available in shapefiles, GeoJSON, or CSV - formats that a computer can process directly - rather than in the image form in which it is too often provided. Similarly, census data is especially valuable when it can be accessed in bulk and is available in CSV or other machine-readable formats.
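    As a small illustration of why formats matter, both files below can be consumed directly by a few lines of code, whereas the same content locked in a PDF or an image would first need OCR and manual cleanup. The file names are invented for the example.

        import csv
        import json

        # GeoJSON: a machine-readable geospatial format (hypothetical file).
        with open("health_facilities.geojson") as f:
            features = json.load(f)["features"]  # a GeoJSON FeatureCollection
        print(f"{len(features)} features, first geometry:",
              features[0]["geometry"]["coordinates"])

        # CSV: bulk tabular data, one step away from analysis (hypothetical file).
        with open("census.csv", newline="") as f:
            rows = list(csv.DictReader(f))
        print(f"{len(rows)} rows, columns: {list(rows[0])}")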
    http://blogs.worldbank.org/opendata/m...-it-s-applicable-developing-countries
  10. Significant numbers of scientists do not publish their research data, a survey has found, despite a vast majority believing that having access to other scholars’ raw material would benefit them.

    While 73 per cent of respondents to a global survey of academics conducted by information and analytics company Elsevier and Leiden University agreed that having access to other researchers’ data would be beneficial, 34 per cent admitted that they did not publish their own figures.

    The survey, which attracted 1,162 responses from all scientific fields, found that one in 10 researchers (11 per cent) would never be willing to allow other researchers to access their data. Sixty-four per cent said that they would, while 25 per cent were undecided.

    When academics did publish data, they tended to do so in an appendix to a research article or as a stand-alone article in a data journal, the survey says. Only 13 per cent of scholars published their data in a data repository, which is regarded as being more accessible.

    Person-to-person sharing of data appears to be closely related to collaboration, the report, Open Data: the Researcher Perspective, adds, with only 14 per cent of respondents having passed their material to researchers they do not know.

    Ingeborg Meijer, a senior researcher at Leiden’s Centre for Science and Technology Studies, said that it was “concerning” that researchers did “not feel the responsibility to put more effort into data sharing”.

    “The fact that a third do not publish their data at all is of concern to the open data movement,” she said. “Some of them think that publishing their results as aggregated tables and figures is sufficient, which is the traditional researcher perception. Some journals do require the uploading of data, but it is still easy to get around it if a researcher doesn’t want to share.”

    Scientists who responded to the survey had a number of reasons for not sharing or publishing their data, including “privacy concerns, ethical issues, and intellectual property rights”. Some said that they “do not like the idea that others might abuse or misinterpret their data, let alone take credit for it”. The study’s authors suggest that policies such as data sharing mandates by funders or publishers have had “limited effect” in increasing the practice.

    However, the survey also showed that many researchers feel insufficiently prepared for data sharing, while also stating a lack of incentive to do so. Only 37 per cent said that sharing data is rewarded in their field, while 41 per cent said that they had not received sufficient training. The findings underline previous calls for more guidance for academics confused about open data.

    The report concludes that a “change in the scientific culture is needed”, where researchers are “stimulated and rewarded for sharing data” and where institutions implement and support research data policies, including mandates in some cases.

    “Currently, researchers have many responsibilities and data sharing is not perceived as a responsibility that will help their careers,” it states. “With this shift in culture, the perception of open data practices will transform. Rather than being seen as an extra effort removed from the research itself, research data management may be recognised as an integral part of the daily work of researchers.”
    https://www.timeshighereducation.com/...-resist-shift-open-data#survey-answer
