Tags: open standards*

162 bookmark(s)

  1. As Google looks for ways to keep people using its own mobile search to discover content — in competition with apps and other services like Facebook’s Instant Articles — the company is announcing some updates to AMP, its collaborative project to speed up mobile web pages.

    Today at the Google I/O developer conference, Google announced that there are now over 2 billion AMP pages covering some 900,000 domains. These pages are also loading twice as fast as before via Google Search. Lastly, the AMP network is now expanding to more e-commerce sites and covering more ad formats.

    In Google’s post announcing that AMP pages load faster — which Lunden links to — they also explain some additional capabilities offered to AMP pages:

    Many of AMP’s e-commerce capabilities were previewed at the AMP Conf and the amp-bind component is now available for origin trials, creating a new interaction model for elements on AMP pages.

    Forms and interactive elements were previously verboten in AMP land, but they’re now allowed through a proprietary — albeit open source — and nonstandard fork of HTML largely developed and popularized by one of the biggest web companies out there.
    https://pxlnv.com/linklog/amp-taking-over
    by M. Fioretti (2017-05-30)
  2. Re-decentralizing the web

    Solid (derived from "social linked data") is a proposed set of conventions and tools for building decentralized Web applications based on Linked Data principles. Solid is modular and extensible. It relies as much as possible on existing W3C standards and protocols.
    Table of Contents

    About Solid
    Standards Used
    Platform Notes
    Project directory
    Contributing to Solid

    Pre-Requisites
    Solid Project Workflow

    About Solid

    Specifically, Solid is:

    A tech stack -- a set of complementary standards and data formats/vocabularies that together provide capabilities that are currently available only through centralized social media services (think Facebook/Twitter/LinkedIn/many others), such as identity, authentication and login, authorization and permission lists, contact management, messaging and notifications, feed aggregation and subscription, comments and discussions, and more.
    A Specifications document that describes a REST API that extends those existing standards, contains design notes on the individual components used, and is intended as a guide for developers who plan to build servers or applications.
    A set of servers that implement this specification.
    A test suite for testing and validating Solid implementations.
    An ecosystem of social apps, identity providers and helper libraries (such as solid.js) that run on the Solid platform.
    A community providing documentation, discussion (see the solid gitter channel), tutorials and talks/presentations.

    Standards Used

    The Solid platform uses the following standards.

    RDF 1.1 (Resource Description Framework) (see also RDF Primer) is heavily used in Solid data models. By default, the preferred RDF serialization format is Turtle. Alternative serialization formats such as JSON-LD and RDFa can also be used.
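
    As an illustration of that point (not taken from the Solid documentation), the short Python sketch below builds one small RDF graph with rdflib and emits it in two of the syntaxes mentioned above; the WebID URI and name are invented, and rdflib version 6 or later is assumed so that both serializers are available.

        # Minimal sketch: one RDF graph, two serializations.
        # Assumes rdflib >= 6 (bundles the JSON-LD serializer; serialize() returns str).
        from rdflib import Graph, URIRef, Literal
        from rdflib.namespace import FOAF, RDF

        g = Graph()
        me = URIRef("https://alice.example.org/profile/card#me")  # hypothetical WebID

        g.add((me, RDF.type, FOAF.Person))
        g.add((me, FOAF.name, Literal("Alice")))
        g.bind("foaf", FOAF)

        print(g.serialize(format="turtle"))   # Solid's preferred default syntax
        print(g.serialize(format="json-ld"))  # same triples, alternative syntax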

    The WebID 1.0 (Web Identity and Discovery) standard is used to provide universal usernames/IDs for Solid apps, and to refer to unique Agents (people, organizations, devices). See also the WebID interoperability notes for an overview of how WebID relates to other authentication and identity protocols.

    WebIDs, when accessed, yield WebID Profile documents (in Turtle and other RDF formats).

    The FOAF vocabulary is used both in WebID profiles, and in specifying Access Control lists (see below).
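
    For illustration only (the URIs and names below are invented, not quoted from any spec), a minimal WebID Profile document could look like the Turtle embedded in this Python sketch, which reads it back with rdflib and pulls out the FOAF name and contacts:

        # Hypothetical WebID Profile document, read back as RDF with rdflib.
        from rdflib import Graph, URIRef
        from rdflib.namespace import FOAF

        profile_ttl = """
        @prefix foaf: <http://xmlns.com/foaf/0.1/> .

        <> foaf:primaryTopic <#me> .

        <#me> a foaf:Person ;
            foaf:name "Alice" ;
            foaf:knows <https://bob.example.org/profile/card#me> .
        """

        g = Graph()
        g.parse(data=profile_ttl, format="turtle",
                publicID="https://alice.example.org/profile/card")

        me = URIRef("https://alice.example.org/profile/card#me")
        print(g.value(me, FOAF.name))            # -> "Alice"
        for contact in g.objects(me, FOAF.knows):
            print(contact)                       # WebIDs this profile links to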

    Authentication (for logins, page personalization and more) is done via the WebID-TLS protocol. WebID-TLS extends WebID Profiles to include references to the subject's public keys in the form of X.509 Certificates, using Cert Ontology 1.0 vocabulary. The authentication sequence is done using the HTTP over TLS protocol. Unlike normal HTTPS use cases, WebID-TLS is done without referring to Certificate Authority hierarchies, and instead encourages host server-signed (or self-signed) certificates.
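
    At the HTTP level, the client side of this amounts to presenting a TLS client certificate; the rough Python sketch below (using the requests library) shows the idea, with a placeholder resource URL and certificate/key file names, and with the server-side WebID verification left out:

        # Rough client-side sketch of WebID-TLS using the requests library.
        # URL and file names are placeholders; the certificate is typically
        # self-signed and carries the owner's WebID URI.
        import requests

        resp = requests.get(
            "https://storage.example.org/private/notes.ttl",  # hypothetical resource
            cert=("alice-webid.crt", "alice-webid.key"),      # client certificate + key
        )
        print(resp.status_code)
        # On success the server has dereferenced the WebID from the certificate,
        # matched the certificate's public key against the profile, and applied
        # the resource's access control rules.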

    In Solid, certificate creation is typically done in the browser using the HTML5 keygen element, to provide a one-step creation and certificate publication user experience.

    Authorization and access lists are done using the Basic Access Control ontology (see also the WebAccessControl wiki page for more details).
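
    A simplified example (URIs invented, not quoted from the spec): an ACL document giving the owner full control and everyone else read access could look like the Turtle below. It is written here to a ".acl" companion resource, a convention used by some Solid servers rather than something required by the excerpt above.

        # Sketch only: a Web Access Control document in Turtle, stored next to the
        # resource it protects. All URIs are hypothetical.
        import requests

        acl_ttl = """
        @prefix acl:  <http://www.w3.org/ns/auth/acl#> .
        @prefix foaf: <http://xmlns.com/foaf/0.1/> .

        <#owner> a acl:Authorization ;
            acl:agent    <https://alice.example.org/profile/card#me> ;
            acl:accessTo <https://storage.example.org/private/notes.ttl> ;
            acl:mode     acl:Read, acl:Write, acl:Control .

        <#public> a acl:Authorization ;
            acl:agentClass foaf:Agent ;    # i.e. any agent: public read access
            acl:accessTo   <https://storage.example.org/private/notes.ttl> ;
            acl:mode       acl:Read .
        """

        requests.put(
            "https://storage.example.org/private/notes.ttl.acl",  # assumed ACL location
            data=acl_ttl,
            headers={"Content-Type": "text/turtle"},
            cert=("alice-webid.crt", "alice-webid.key"),          # as in the WebID-TLS sketch
        )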

    Solid uses the Linked Data Platform (LDP) standard (see also LDP Primer) extensively, as a standard way of reading and writing generic Linked Data resources.

    Solid Platform Notes

    Solid applications are somewhat like multi-user applications where instances talk to each other through a shared filesystem, and the Web is that filesystem.

    The LDP specification defines a set of rules for HTTP operations on Web resources, some based on RDF, to provide an architecture for reading and writing Linked Data on the Web. The most important feature of LDP is that it provides us with a standard way of RESTfully writing resources (documents) on the Web, without having to rely on less flexible conventions (APIs) based around sending form-encoded data using POST. For more insight into LDP, take a look at the examples in the LDP Primer document.

    Solid's basic protocol is REST, as refined by LDP with minor extensions. New items are created in a container (which could be called a collection or directory) by sending them to the container URL with an HTTP POST or issuing an HTTP PUT within its URL space. Items are updated with HTTP PUT or HTTP PATCH. Items are removed with HTTP DELETE. Items are found using HTTP GET and following links. A GET on the container returns an enumeration of the items in the container.
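
    A hedged sketch of that read/write cycle against a hypothetical container, again using Python's requests; the container URL, the Slug naming hint and the Turtle payloads are all illustrative, and authentication is omitted:

        # Illustrative LDP-style create/read/update/delete against a container.
        import requests

        container = "https://storage.example.org/notes/"

        # Create: POST a Turtle document into the container; the server assigns
        # the new URL (the Slug header is only a naming hint).
        created = requests.post(
            container,
            data='<> <http://purl.org/dc/terms/title> "First note" .',
            headers={
                "Content-Type": "text/turtle",
                "Slug": "first-note",
                "Link": '<http://www.w3.org/ns/ldp#Resource>; rel="type"',
            },
        )
        note_url = created.headers["Location"]   # URL of the newly created item

        # Read: GET the container for an enumeration of its members, or GET an item.
        listing = requests.get(container, headers={"Accept": "text/turtle"})

        # Update: replace the item with PUT (PATCH would apply a partial change).
        requests.put(
            note_url,
            data='<> <http://purl.org/dc/terms/title> "First note, edited" .',
            headers={"Content-Type": "text/turtle"},
        )

        # Delete: remove the item from the container.
        requests.delete(note_url)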

    Servers are application-agnostic, so that new applications can be developed without needing to modify servers. For example, even though the LDP 1.0 spec contains nothing specific to "social", many of the W3C Social Web Working Group's user stories can be implemented using only application logic, with no need to change code on the server. The design ideal is to keep a small standard data management core and extend it as necessary to support increasingly powerful classes of applications.

    The data model is RDF. This means the data can be transmitted in various syntaxes like Turtle, JSON-LD (JSON with a "context"), or RDFa (HTML attributes). RDF is REST-friendly, using URLs everywhere, and it provides decentralized extensibility, so that a set of applications can cooperate in sharing a new kind of data without needing approval from any central authority.
    https://github.com/solid/solid/blob/master/README.md
  3. Berners-Lee is working to make this a reality through an open source project called Solid. He hopes to create an open technology standard that different applications can use to share data, regardless of what that data is or what type of application needs to read it. Such a standard would enable applications—your hospital’s record-keeping software or a social network—to read and write data from the servers you choose and control, rather than the servers that belong to an individual company.

    The idea that people will eventually migrate from today’s tech giants to more decentralized systems may seem like a stretch. But last year at the Decentralized Web Summit in San Francisco, Berners-Lee pointed out that in the early days of the internet, many people thought proprietary online services like America Online, CompuServe, and Prodigy—all of which sought to tame the chaos of the web and the open internet—would dominate the mainstream market.
    https://www.wired.com/2017/04/tim-ber...r-web-plots-radical-overhaul-creation
    Most texts that are being exchanged only need a small fraction of what common data formats have to offer in terms of formatting, mark-up or layout. A simple file composed of Latin-9 characters has been editable for decades on every computer by means of a simple text editor or any word processor. A small subset of HTML 2 could cater for advanced needs like headlines, bullet lists and hyperlinks. Alternatively, any simple text-based markup language like those used by wikis would work for many tasks. The Wikipedia pages and web-logs ("blogs") of the world are proof that a lot of content can be expressed by simple means.

    Everyone – except vendors of proprietary software – profits from different software products competing with each other, while being secure and interoperable. The minimal principle for data formats promotes all this. It has just one rule: remove everything that is not absolutely necessary. Aim for your design to be simple and elegant. A good solution resembles a set of building blocks from which an infinite number of buildings can be made, just by combining a few types of elements.

    Even though there may be good reasons to choose a data-format which covers several requirements we should ask ourselves each time: “Can this be done more simply?”
    https://fsfe.org/activities/os/minimalisticstandards.en.html
  5. Greer’s archive includes floppy disks, tape cassettes and CD-roms, once cutting-edge technologies that are now obsolete. They are vulnerable to decay and disintegration, leftovers from the unrelenting tide of technological advancement. They will last mere decades, unlike the paper records, which could survive for hundreds of years.

    Buchanan and her team are now working out how to access, catalogue and preserve the thousands of files on these disks, some of them last opened in the 1980s. “We don’t really know what’s going to unfold,” Buchanan says.

    The Greer archivists are facing a challenge that extends far beyond the scope of their collection. Out of this process come enormous questions about the fate of records that are “born digital”, meaning they didn’t start out in paper form. Record-keepers around the world are worried about information born of zeroes and ones – binary code, the building blocks of any digital file.

    "Archives are the paydirt of history. Everything else is opinion" (Germaine Greer)

    Like floppy disks of the past, information stored on USB sticks, on shared drives or in the cloud is so easily lost, changed or corrupted that we risk losing decades of knowledge if we do not figure out how to manage it properly.

    Though the problem applies to everyone – from classic video-game enthusiasts to people who keep photos on smartphones – it is particularly pressing for universities and other institutions responsible for the creation and preservation of knowledge.
    https://www.theguardian.com/books/201...archive-digital-treasure-floppy-disks
    Why did the GDS ban apps? It wasn’t because they weren’t technically savvy enough to build them.

    Cost, he says. Apps are “very expensive to produce, and they’re very very expensive to maintain because you have to keep updating them when there are software changes,” Terrett says. “I would say if you times that by 300, you’re suddenly talking about a huge team of people and a ton of money to maintain that ecosystem”.

    How did the UK reach an increasingly mobile population? Responsive websites, he replies. “For government services that we were providing, the web is a far far better way… and still works on mobile.”

    Sites can adapt to any screen size, work on all devices, and are open to everyone to use regardless of their device. “If you believe in the open internet that will always win,” he says. And they’re much cheaper to maintain, he adds, because when an upgrade is required, only one platform needs recoding.

    From voter registration to driving license applications, citizens use responsive sites with simple designs that are easy to follow. According to estimates by the British Treasury, the GDS saved US$8.2bn (£4.1bn) over four years by taking an approach that emphasized simplicity of design and openness of the service.
    https://govinsider.asia/smart-gov/why-britain-banned-mobile-apps
    The SPID they propose/impose on us is organised in a centralised way, that is, in such a way that the digital identity provider, a third party sitting between the digital service provider and the user, must be involved in every single authentication. The glaring consequence is that the identity provider knows, for each user, which applications they use and when they use them, enabling profiling far heavier than what search engines and/or social applications do today. Have you ever wondered why Google and Facebook are so generous as to let you authenticate with their credentials on applications that have nothing to do with them? Simple: by doing so you are supplying further information about your use of digital services and helping them complete your profile.

    SPID is technically similar to what Google and Facebook are already doing, and to what anyone with a minimum of professional skill could do, but with some barriers to entry introduced by the Italian regulation; these barriers do nothing to limit the accumulation of information about user behaviour, they only limit the number of digital identity providers.
    http://andrea.elestici.com/2016/02/08...-unaltra-inutile-rendita-di-posizione
  8. This article details a Linux user's struggles to submit a grant application when the process requires finicky, proprietary software. It also covers familiar ground made timely by the upcoming elections: the U.S. should prefer open source software and open standards over proprietary alternatives. The grant application required a PDF created by Adobe Acrobat — software Adobe no longer supports for Linux. Once the document was created, attempting to submit it while using Ubuntu fails silently. (On Windows 7, it worked immediately.) The reader argues, "By requiring Acrobat the government gives preference to a particular software vendor, assuring that thousands of people who otherwise would not choose to use Adobe software are forced to install it. Worse, endorsing a proprietary, narrowly supported technology for government data poses the risk that public information could become inaccessible if the vendor decides to stop supporting the software. Last but not least, there are privacy and fairness issues at stake. Acrobat is a totally closed-source program, which means we have to take Adobe's word for it that nothing sketchy is going on in its code. ... It would seem to be in the interest of the public for the government to prefer an open source solution, since it is much harder to hide nefarious features inside code that can be publicly inspected."
    http://thevarguy.com/open-source-appl...-and-open-standards-tale-personal-woe
    by M. Fioretti (2016-01-27)
  9. The Web is always in trouble for some reason or other. I remember when Microsoft came after Netscape and threatened to lock Web standards into IE. Only the Web is so big, with such reach to billions of users, that no one owns it. This means it will always be contested ground.

    But the Web today faces a primal threat.

    Some say the threat to the Web is “mobile”, but the Web is co-evolving with smartphones, not going away. Webviews are commonplace in apps, and no publisher of note is about to replace its primary website with a walled-garden equivalent. Nor can most websites hope to develop their own apps and convert their browser users to app-only users.

    I contend that the threat we face is ancient and, at bottom, human. Some call it advertising, others privacy. I view it as the Principal-Agent conflict of interest woven into the fabric of the Web.

    You use a browser to find and contribute information, but you generally do not pay for the websites that host that information. Across billions of people, for most sites in most countries, it isn’t realistic to expect anything but a free Web. And as Ben Thompson points out, “free” means ad-supported in the main. Yes, successful sites and apps may convert you to a paying customer, but most won’t.

    You might object: “Hey, I’m ready to pay for websites I support”. I’m with you, but many people are not so well-off that they can support most of the commercial sites they use. Also, the Web missed an opportunity back in the early days to define payments and all they entail as a standard.

    Once you grant this premise, that the Web needs ads in the large, it follows that your browsing habits will be surveilled, to the best of the ad ecosystem players’ abilities. Also, depending on how poorly ads are designed and integrated, you may become blind or averse to them. Since the ‘90s, I’ve seen several races to the bottom along these lines.
    https://www.brave.com/#about
    I see a need for funding of ODF implementation, innovation and interoperability testing, driven by a genuine long-term vision for ODF, and which is not tied to a particular vendor, even if a vendor supplies some of the resources needed. Unfortunately, in the short term, there is little incentive for anyone to fund any of this - SUN used to support Open Office and IBM used to actively promote ODF, but SUN is no more and IBM seems to have lost active interest. What we have, a de facto standardisation around MS Office formats, works well enough, although with some risk - and Office can probably read ODF documents well enough to save them as MS formats, if necessary. There are long term costs and risks associated with being tied into a "standard" owned by a single vendor - I see this as a governance issue - but (these days especially), short term considerations usually win out.

    Who could provide this funding and the resources necessary for active ODF interoperability testing for Office packages? The big vendors, for whatever reason, seem satisfied with the status quo. Even national governments who support ODF, such as Italy, Germany and the UK, seem to allow the continuing use of MS Office formats, in practice. The Apache Software Foundation doesn't allow the Apache Open Office project to accept the sort of donations it would need to drive a wider ODF vision - which will probably drive it into the Apache "attic" for dormant projects; it has done good work, which will continue to be available, but it doesn't have many developers left. There is Libre Office, of course, which seems to be better marketed and has more of a "buzz" to it, but I don't meet lots of people using it outside of the project enthusiasts, although it claims to be "one of the friendliest and fastest growing projects in the free and open source world". And even Libre Office (like Open Office) often fails to format PowerPoint presentations correctly.

    Perhaps my readers can comment on whether they see ODF as ever becoming ubiquitous - and suggest who might fund it without taking it over. In the meantime, I guess I'll be off exploring Libre Office, which seems to be the main hope for popularizing a vendor-independent document environment... Although, I must admit that I still find that Open Office supports most of what I need to do - I just can't send anyone an Open Office ODF document with any confidence that the recipient will be able to read it.
    http://www.it-director.com/blogs/the-...sis-now-that-the-standards-war-is-won
    by M. Fioretti (2015-12-01)


Online Bookmarks of M. Fioretti: tagged with "open standards"
