Tags: open standards

165 bookmark(s)

  1. Software matters more than formats (much)

    Too often people try to design a format first, and then make software that conforms to the format. You might get some good demos, but there's not much chance of developing a standard that way.

    Users matter even more than software

    People choose to interop because it helps them find new users. If you have no users to offer, there won't be much interest in interop.

    One way is better than two

    No matter how much better the new way is, you'll still have to support the old way.

    Fewer formats is better

    If you can replace two formats with one, without breakage or loss of interop, then I say go for it.
    Removing complexity from the world is always good.
    Think of this like code factoring, but on a larger scale.
    This is 1/2 of Postel's robustness principle -- be conservative in what you send.

    Fewer format features is better

    If you want to add a feature to a format, first carefully study the existing format and namespaces to be sure what you're doing hasn't already been done. If it has, use the original version. This is how you maximize interop.

    Perfection is a waste of time

    I've witnessed long debates over whether one name is better than another.
    I once led a standards discussion beginning with this rule: we always had to come up with the worst possible name for every element. That way, when someone said "I think foo is better" (and they did), we could all laugh and say that's exactly why we won't use it.
    It totally doesn't matter what we call it. We can learn to use anything. There are more important things to spend time on.
    Think of people whose first language isn't English. To them the names we choose are just symbols; they don't connote anything.

    Write specs in plain English

    I write for people who have brains, like to think, are educated, and care about interop. I understand that people reading specs are not computers.

    Explain the curiosities

    I also try to explain why things are as they are because people seem to be interested. But only after explaining how it works and providing an example.

    If practice deviates from the spec, change the spec

    In writing the spec for RSS 0.91, I found that a lot of the limits imposed by the earlier spec were being ignored by developers. So I left the limits out of the 0.91 spec. No one complained.
    After RSS 2.0, the format was frozen, so there were no more adjustments based on practice.

    No breakage

    Version 2 of your format should be backward compatible. This means that a version 1 file is a valid version 2 file.
    Don't break the installed base. (Not that you can. There are still lots of people running XP even though Microsoft said it was over. And that's a commercial product, not a standard.)
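    The no-breakage rule can be sketched in code. In this hypothetical feed format (the field names are assumptions for illustration), every field added in version 2 is optional with a default, so any version-1 file parses as a valid version-2 file:

```python
import json

# Hypothetical feed format: version 1 had only "title" and "items".
# Version 2 adds an optional "language" field. Because the new field
# has a default, every version 1 document is also a valid version 2
# document -- the installed base keeps working.

V2_DEFAULTS = {"language": "en"}

def parse_feed_v2(text):
    """Parse a feed document, accepting both v1 and v2 files."""
    doc = json.loads(text)
    for field, default in V2_DEFAULTS.items():
        doc.setdefault(field, default)  # fill in fields v1 files lack
    return doc

v1_file = '{"title": "News", "items": []}'                       # written for v1
v2_file = '{"title": "News", "items": [], "language": "fr"}'     # written for v2

print(parse_feed_v2(v1_file)["language"])  # -> en (default applied)
print(parse_feed_v2(v2_file)["language"])  # -> fr
```

    The design choice is the one the rule demands: version 2 may add, but never redefine or require, anything version 1 lacked.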

    Freeze the spec

    At some point, when the new ideas have slowed to a trickle, and as a base of compatible software develops, freeze the spec, but provide an extension mechanism so new ideas have an outlet.
    Developers need a foundation to build on, one that is fixed and isn't moving.

    Keep it simple

    Beware of open formats that are impossible to fully support.
    XML-RPC could be fully supported in a few days. You could never fully support SOAP. I believe this is no accident. Large companies crafted SOAP so they could say they were open without interoperating with competitors. The goal of XML-RPC was to make it easy to interop.
    by M. Fioretti (2018-03-07)
  2. Since she first started specializing in old documents, Watson has expanded beyond things written in English. She now has a stable of collaborators who can tackle manuscripts in Latin, German, Spanish, and more. She can only remember two instances that left her and her colleagues stumped. One was a Tibetan manuscript, and she couldn’t find anyone who knew the alphabet. The other was in such bad shape that she had to admit defeat.

    In the business of reading old documents, Watson has few competitors. There is one transcription company on the other side of the world, in Australia, that offers a similar service. Libraries and archives, when they have a giant batch of handwritten documents to deal with, might recruit volunteers. Even today, when computers have started to excel at reading books, handwritten works escape their understanding. Scholars who study medieval manuscripts have been working on programs that might have a chance of helping with this task, but right now a trained eye is still the best and only way to make real progress.
  3. A video game’s look and feel is often highly dependent on specific hardware setups, and for most of the medium’s history, those setups often involved a CRT. The iconic black scanlines we associate with old games, for instance, exist because consoles would tell a TV to only draw every other line — thus avoiding the flickering that interlaced video could produce, and smoothing out the overall image. (For more detail, retro gaming enthusiast Tobias Reich maintains an exhaustive guide about scanlines and other CRT rendering issues.) Old games may look torn or feel laggy on a new TV. That’s in part because LCD screens process an entire frame of an image and then display it, rather than receiving a signal and drawing it right away.

    Some games are completely dependent on the display technology. One of the best-known examples is Duck Hunt, which uses Nintendo’s Zapper light gun. When players pull the trigger, the entire screen briefly flashes black, then a white square appears at the “duck’s” location. If the optical sensor detects a quick black-then-white pattern, it’s a hit. The entire Zapper system is coded for a CRT’s super fast refresh rate, and it doesn’t work on new LCD TVs without significant DIY modification.
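    The Zapper's hit test described above can be sketched as a frame-by-frame brightness check; the sampling model and thresholds here are assumptions for illustration, not the console's actual logic:

```python
# Sketch of the Zapper-style hit test, assuming the light sensor is
# sampled once per frame: a hit is a dark frame followed immediately
# by a bright frame (the white target square). Thresholds are made up.

DARK, BRIGHT = 0.2, 0.8   # assumed brightness thresholds (0..1)

def is_hit(samples):
    """True if any dark frame is immediately followed by a bright frame."""
    return any(a < DARK and b > BRIGHT
               for a, b in zip(samples, samples[1:]))

print(is_hit([0.5, 0.1, 0.9, 0.5]))  # True: black flash, then white square
print(is_hit([0.5, 0.5, 0.5]))       # False: no flash pattern
```

    A check this tight on frame timing is exactly why the scheme breaks on LCDs, which buffer a whole frame before displaying it.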

    A less extreme — but much more popular — case is Super Smash Bros. Melee, a 2001 Nintendo GameCube title that’s become one of the most beloved fighting games of all time. Originally designed for casual players at parties, Melee upends the conventions set by series like Street Fighter and Mortal Kombat: instead of memorizing combos to chip down an opponent’s health bar, players try to knock each other off the screen using careful positioning and improvised, super fast moves. Despite its age, and the increasing difficulty of finding a copy, it’s a mainstay at fighting game tournaments.
  4. As Google looks for ways to keep people using its own mobile search to discover content — in competition with apps and other services like Facebook’s Instant Articles — the company is announcing some updates to AMP, its collaborative project to speed up mobile web pages.

    Today at the Google I/O developer conference, Google announced that there are now over 2 billion AMP pages covering some 900,000 domains. These pages are also loading twice as fast as before via Google Search. Lastly, the AMP network is now expanding to more e-commerce sites and covering more ad formats.

    In Google’s post announcing that AMP pages load faster — which Lunden links to — they also explain some additional capabilities offered to AMP pages:

    Many of AMP’s e-commerce capabilities were previewed at the AMP Conf and the amp-bind component is now available for origin trials, creating a new interaction model for elements on AMP pages.

    Forms and interactive elements were previously verboten in AMP land, but they’re now allowed through a proprietary — albeit open source — and nonstandard fork of HTML largely developed and popularized by one of the biggest web companies out there.
    by M. Fioretti (2017-05-30)
  5. Re-decentralizing the web

    Solid (derived from "social linked data") is a proposed set of conventions and tools for building decentralized Web applications based on Linked Data principles. Solid is modular and extensible. It relies as much as possible on existing W3C standards and protocols.
    Table of Contents

    About Solid
    Standards Used
    Platform Notes
    Project directory
    Contributing to Solid

    Solid Project Workflow

    About Solid

    Specifically, Solid is:

    A tech stack -- a set of complementary standards and data formats/vocabularies that together provide capabilities that are currently available only through centralized social media services (think Facebook/Twitter/LinkedIn/many others), such as identity, authentication and login, authorization and permission lists, contact management, messaging and notifications, feed aggregation and subscription, comments and discussions, and more.
    A Specifications document that describes a REST API that extends those existing standards, contains design notes on the individual components used, and is intended as a guide for developers who plan to build servers or applications.
    A set of servers that implement this specification.
    A test suite for testing and validating Solid implementations.
    An ecosystem of social apps, identity providers and helper libraries (such as solid.js) that run on the Solid platform.
    A community providing documentation, discussion (see the solid gitter channel), tutorials and talks/presentations.

    Standards Used

    The Solid platform uses the following standards.

    RDF 1.1 (Resource Description Framework) (see also RDF Primer) is heavily used in Solid data models. By default, the preferred RDF serialization format is Turtle. Alternative serialization formats such as JSON-LD and RDFa can also be used.

    The WebID 1.0 (Web Identity and Discovery) standard is used to provide universal usernames/IDs for Solid apps, and to refer to unique Agents (people, organizations, devices). See also the WebID interoperability notes for an overview of how WebID relates to other authentication and identity protocols.

    WebIDs, when accessed, yield WebID Profile documents (in Turtle and other RDF formats).

    The FOAF vocabulary is used both in WebID profiles, and in specifying Access Control lists (see below).

    Authentication (for logins, page personalization and more) is done via the WebID-TLS protocol. WebID-TLS extends WebID Profiles to include references to the subject's public keys in the form of X.509 Certificates, using Cert Ontology 1.0 vocabulary. The authentication sequence is done using the HTTP over TLS protocol. Unlike normal HTTPS use cases, WebID-TLS is done without referring to Certificate Authority hierarchies, and instead encourages host server-signed (or self-signed) certificates.

    In Solid, certificate creation is typically done in the browser using the HTML5 keygen element, to provide a one-step creation and certificate publication user experience.

    Authorization and access lists are done using the Basic Access Control ontology (see also the WebAccessControl wiki page for more details).

    Solid uses the Linked Data Platform (LDP) standard (see also LDP Primer) extensively, as a standard way of reading and writing generic Linked Data resources.

    Solid Platform Notes

    Solid applications are somewhat like multi-user applications where instances talk to each other through a shared filesystem, and the Web is that filesystem.

    The LDP specification defines a set of rules for HTTP operations on Web resources, some based on RDF, to provide an architecture for reading and writing Linked Data on the Web. The most important feature of LDP is that it provides us with a standard way of RESTfully writing resources (documents) on the Web, without having to rely on less flexible conventions (APIs) based around sending form-encoded data using POST. For more insight into LDP, take a look at the examples in the LDP Primer document.

    Solid's basic protocol is REST, as refined by LDP with minor extensions. New items are created in a container (which could be called a collection or directory) by sending them to the container URL with an HTTP POST or issuing an HTTP PUT within its URL space. Items are updated with HTTP PUT or HTTP PATCH. Items are removed with HTTP DELETE. Items are found using HTTP GET and following links. A GET on the container returns an enumeration of the items in the container.
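    The REST conventions above can be sketched as a plain mapping from LDP operations to HTTP methods and URLs; the container URL is a hypothetical example, and the function only builds the request an LDP client would issue:

```python
# Minimal sketch of Solid/LDP's REST conventions, assuming a
# hypothetical container at https://example.org/inbox/. Each LDP
# operation maps onto a plain HTTP method on a resource or container
# URL; this helper returns the (method, url) pair a client would send.

CONTAINER = "https://example.org/inbox/"   # hypothetical container URL

def ldp_request(operation, item=None):
    """Map an LDP operation to the HTTP method and URL it uses."""
    if operation == "create":              # POST a new item to the container
        return ("POST", CONTAINER)
    if operation == "list":                # GET the container enumerates items
        return ("GET", CONTAINER)
    url = CONTAINER + item                 # item resource inside the container
    return {"read": ("GET", url),
            "update": ("PUT", url),        # or PATCH for partial updates
            "delete": ("DELETE", url)}[operation]

print(ldp_request("create"))           # ('POST', 'https://example.org/inbox/')
print(ldp_request("update", "msg1"))   # ('PUT', 'https://example.org/inbox/msg1')
```

    Note how little is format-specific here: the whole protocol surface is ordinary HTTP verbs on URLs, which is what lets servers stay application-agnostic.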

    Servers are application-agnostic, so that new applications can be developed without needing to modify servers. For example, even though the LDP 1.0 spec contains nothing specific to "social", many of the W3C Social Web Working Group's User Stories can be implemented using only application logic, with no need to change code on the server. The design ideal is to keep a small standard data management core and extend it as necessary to support increasingly powerful classes of applications.

    The data model is RDF. This means the data can be transmitted in various syntaxes like Turtle, JSON-LD (JSON with a "context"), or RDFa (HTML attributes). RDF is REST-friendly, using URLs everywhere, and it provides decentralized extensibility, so that a set of applications can cooperate in sharing a new kind of data without needing approval from any central authority.
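    One of those syntaxes, JSON-LD, can be illustrated with a tiny profile document: RDF expressed as plain JSON, with a "@context" mapping short names to full vocabulary URLs (here FOAF, which Solid uses in WebID profiles). The WebID URLs below are made-up examples:

```python
import json

# A small JSON-LD document: RDF as JSON plus a "@context" that maps
# the short prefix "foaf" to the full FOAF vocabulary URL. The WebID
# URLs are hypothetical.

profile = {
    "@context": {"foaf": "http://xmlns.com/foaf/0.1/"},
    "@id": "https://alice.example.org/profile#me",    # the subject's WebID
    "foaf:name": "Alice",
    "foaf:knows": {"@id": "https://bob.example.org/profile#me"},
}

# Because it is plain JSON, any JSON tool can store or transmit it;
# RDF-aware tools additionally see the triples the @context implies.
print(json.dumps(profile, indent=2))
```

    This is the decentralized extensibility the paragraph describes: anyone can add terms from another vocabulary by extending the context, with no central registry involved.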
  6. Berners-Lee is working to make this a reality through an open source project called Solid. He hopes to create an open technology standard that different applications can use to share data, regardless of what that data is or what type of application needs to read it. Such a standard would enable applications—your hospital’s record-keeping software or a social network—to read and write data from the servers you choose and control, rather than the servers that belong to an individual company.

    The idea that people will eventually migrate from today’s tech giants to more decentralized systems may seem like a stretch. But last year at the Decentralized Web Summit in San Francisco, Berners-Lee pointed out that in the early days of the internet, many people thought proprietary online services like America Online, Compuserve, and Prodigy—all of which sought to tame the chaos of the web and the open internet—would dominate the mainstream market.
    Most texts that are exchanged need only a small fraction of what common data formats have to offer in terms of formatting, mark-up or layout. A simple file composed of Latin-9 characters has been editable for decades on every computer by means of a simple text editor or any word processor. A small subset of HTML 2 could cater for advanced needs like headlines, bullet lists and hyperlinks. Alternatively, any simple text-based markup language like those used by wikis would work for many tasks. The Wikipedia pages and web-logs ("blogs") of the world are proof that a lot of content can be expressed by simple means.

    Everyone – except vendors of proprietary software – profits from different software products competing with each other while being secure and interoperable. The minimal principle for data formats promotes all this. It has just one rule: remove everything that is not absolutely necessary. Aim for your design to be simple and elegant. A good solution resembles a set of building blocks from which an infinite number of buildings can be made, just by combining a few types of elements.

    Even though there may be good reasons to choose a data format which covers several requirements, we should ask ourselves each time: “Can this be done more simply?”
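    How little markup suffices can be shown with a toy converter covering exactly the needs named above: headlines, bullet lists and hyperlinks. The syntax here ("= Title", "* item", "[url label]") is an assumption for the sketch, not any particular wiki's rules:

```python
import re

# Toy line-at-a-time renderer for a minimal wiki-style markup:
#   "= Title"      -> headline
#   "* item"       -> bullet-list item
#   "[url label]"  -> hyperlink
# Everything else passes through unchanged.

def render(line):
    if line.startswith("= "):
        return "<h1>%s</h1>" % line[2:]
    if line.startswith("* "):
        return "<li>%s</li>" % line[2:]
    # [url label] -> <a href="url">label</a>
    return re.sub(r"\[(\S+) ([^\]]+)\]", r'<a href="\1">\2</a>', line)

print(render("= Hello"))        # <h1>Hello</h1>
print(render("* first point"))  # <li>first point</li>
print(render("see [http://example.com here]"))
```

    A dozen lines cover most of what everyday documents actually use, which is the minimal principle's point.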
  8. Greer’s archive includes floppy disks, tape cassettes and CD-roms, once cutting-edge technologies that are now obsolete. They are vulnerable to decay and disintegration, leftovers from the unrelenting tide of technological advancement. They will last mere decades, unlike the paper records, which could survive for hundreds of years.

    Buchanan and her team are now working out how to access, catalogue and preserve the thousands of files on these disks, some of them last opened in the 1980s. “We don’t really know what’s going to unfold,” Buchanan says.

    The Greer archivists are facing a challenge that extends far beyond the scope of their collection. Out of this process come enormous questions about the fate of records that are “born digital”, meaning they didn’t start out in paper form. Record-keepers around the world are worried about information born of zeroes and ones – binary code, the building blocks of any digital file.

    Archives are the paydirt of history. Everything else is opinion
    Germaine Greer

    Like floppy disks of the past, information stored on USB sticks, on shared drives or in the cloud is so easily lost, changed or corrupted that we risk losing decades of knowledge if we do not figure out how to manage it properly.

    Though the problem applies to everyone – from classic video-game enthusiasts to people who keep photos on smartphones – it is particularly pressing for universities and other institutions responsible for the creation and preservation of knowledge.
    Why did the GDS ban apps? It wasn't because they weren't technically savvy enough to build them.

    Cost, he says. Apps are “very expensive to produce, and they’re very very expensive to maintain because you have to keep updating them when there are software changes,” Terrett says. “I would say if you times that by 300, you’re suddenly talking about a huge team of people and a ton of money to maintain that ecosystem”.

    How did the UK reach an increasingly mobile population? Responsive websites, he replies. “For government services that we were providing, the web is a far far better way… and still works on mobile.”

    Sites can adapt to any screen size, work on all devices, and are open to everyone to use regardless of their device. “If you believe in the open internet that will always win,” he says. And they’re much cheaper to maintain, he adds, because when an upgrade is required, only one platform needs recoding.

    From voter registration to driving licence applications, citizens use responsive sites with simple designs that are easy to follow. According to estimates by the British Treasury, the GDS saved US$8.2bn (£4.1bn) over four years by taking an approach that emphasized simplicity of design and openness of the service.
    The SPID being proposed to us (or rather imposed on us) is organized in a centralized way, meaning that the digital identity provider – a third party standing between the digital service provider and the user – must be involved in every single authentication. The obvious consequence is that the digital identity provider knows, for every user, which applications they use and when they use them, enabling far heavier profiling than what search engines and/or social applications do today. Have you ever wondered why Google and Facebook are so generous as to let you authenticate with their credentials on applications that have nothing to do with them? Simple: by doing so you are supplying yet more information about your use of digital services, and helping them complete your profile.

    SPID is technically similar to what Google and Facebook are doing, and to what anyone with a minimum of professional skill could do, but with some entry barriers introduced by Italian law – barriers that serve not to limit the accumulation of information about user behaviour, but only to limit the number of digital identity providers.

Online Bookmarks of M. Fioretti: tagged with "open standards"

About - Propulsed by SemanticScuttle