mfioretti: file formats

Bookmarks on this page are managed by an admin user.

93 bookmarks

  1. Since she first started specializing in old documents, Watson has expanded beyond things written in English. She now has a stable of collaborators who can tackle manuscripts in Latin, German, Spanish, and more. She can only remember two instances that left her and her colleagues stumped. One was a Tibetan manuscript, and she couldn’t find anyone who knew the alphabet. The other was in such bad shape that she had to admit defeat.

    In the business of reading old documents, Watson has few competitors. There is one transcription company on the other side of the world, in Australia, that offers a similar service. Libraries and archives, when they have a giant batch of handwritten documents to deal with, might recruit volunteers. Even today, when computers have started to excel at reading books, handwritten works escape their understanding. Scholars who study medieval manuscripts have been working on programs that might have a chance of helping with this task, but right now a trained eye is still the best and only way to make real progress.
    https://www.atlasobscura.com/articles...ource=twitter.com&utm_campaign=buffer
  2. A video game’s look and feel is often highly dependent on specific hardware setups, and for most of the medium’s history, those setups often involved a CRT. The iconic black scanlines we associate with old games, for instance, exist because consoles would tell a TV to only draw every other line — thus avoiding the flickering that interlaced video could produce, and smoothing out the overall image. (For more detail, retro gaming enthusiast Tobias Reich maintains an exhaustive guide about scanlines and other CRT rendering issues.) Old games may look torn or feel laggy on a new TV. That’s in part because LCD screens process an entire frame of an image and then display it, rather than receiving a signal and drawing it right away.

    Some games are completely dependent on the display technology. One of the best-known examples is Duck Hunt, which uses Nintendo’s Zapper light gun. When players pull the trigger, the entire screen briefly flashes black, then a white square appears at the “duck’s” location. If the optical sensor detects a quick black-then-white pattern, it’s a hit. The entire Zapper system is coded for a CRT’s super fast refresh rate, and it doesn’t work on new LCD TVs without significant DIY modification.
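
    A minimal simulation of that black-then-white hit test (the screen and sensor below are Python stand-ins for the real phosphor and photodiode, not Nintendo's actual code):

        # Hypothetical re-creation of the Zapper logic. The "screen" maps
        # pixel coordinates to brightness; the "sensor" samples whatever
        # pixel the gun is aimed at.
        def render(white_rect=None):
            screen = {}                              # default brightness 0 = black
            if white_rect:
                x0, y0, x1, y1 = white_rect
                for x in range(x0, x1):
                    for y in range(y0, y1):
                        screen[(x, y)] = 1           # white square at the duck
            return screen

        def sensor_sees_light(screen, aim):
            return screen.get(aim, 0) == 1

        def zapper_hit_test(duck_rect, aim):
            if sensor_sees_light(render(), aim):     # frame 1: all black; light here
                return False                         # means the gun points at a lamp
            return sensor_sees_light(render(duck_rect), aim)  # frame 2: white square

        print(zapper_hit_test((10, 10, 20, 20), aim=(15, 15)))  # True: on target
        print(zapper_hit_test((10, 10, 20, 20), aim=(40, 40)))  # False: missed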

    A less extreme — but much more popular — case is Super Smash Bros. Melee, a 2001 Nintendo GameCube title that’s become one of the most beloved fighting games of all time. Originally designed for casual players at parties, Melee upends the conventions set by series like Street Fighter and Mortal Kombat: instead of memorizing combos to chip down an opponent’s health bar, players try to knock each other off the screen using careful positioning and improvised, super fast moves. Despite its age, and the increasing difficulty of finding a copy, it’s a mainstay at fighting game tournaments.
    https://www.theverge.com/2018/2/6/169...vs-crt-restoration-led-gaming-vintage
  3. Most texts being exchanged need only a small fraction of what common data formats offer in terms of formatting, mark-up or layout. A simple file composed of Latin-9 characters has been editable for decades on every computer by means of a simple text editor or any word processor. A small subset of HTML 2 could cater for advanced needs like headlines, bullet lists and hyperlinks. Alternatively, any simple text-based markup language like those used by wikis would work for many tasks. The Wikipedia pages and web logs ("blogs") of the world are proof that a lot of content can be expressed by simple means.
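
    As a toy illustration in Python (the three-rule syntax below is invented for the example, not any particular wiki's), such a minimal markup needs only a few lines of code to process:

        import re

        # Hypothetical three-element markup: "= " headlines, "* " bullet
        # items and "[url label]" hyperlinks -- roughly the minimal subset
        # the text argues covers most documents.
        def to_html(text):
            html = []
            for line in text.splitlines():
                line = re.sub(r"\[(\S+) ([^\]]+)\]", r'<a href="\1">\2</a>', line)
                if line.startswith("= "):
                    html.append("<h1>%s</h1>" % line[2:])
                elif line.startswith("* "):
                    html.append("<li>%s</li>" % line[2:])
                else:
                    html.append("<p>%s</p>" % line)
            return "\n".join(html)

        print(to_html("= Minimal formats\n* simple\n* [https://fsfe.org FSFE]"))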

    Everyone – except vendors of proprietary software – profits from different software products competing with each other while being secure and interoperable. The minimal principle for data formats promotes all of this. It has just one rule: remove everything that is not absolutely necessary. Aim for a design that is simple and elegant. A good solution resembles a set of building blocks from which an infinite number of buildings can be made just by combining a few types of elements.

    Even though there may be good reasons to choose a data format which covers several requirements, we should ask ourselves each time: “Can this be done more simply?”
    https://fsfe.org/activities/os/minimalisticstandards.en.html
  4. Greer’s archive includes floppy disks, tape cassettes and CD-ROMs, once cutting-edge technologies that are now obsolete. They are vulnerable to decay and disintegration, leftovers from the unrelenting tide of technological advancement. They will last mere decades, unlike the paper records, which could survive for hundreds of years.

    Buchanan and her team are now working out how to access, catalogue and preserve the thousands of files on these disks, some of them last opened in the 1980s. “We don’t really know what’s going to unfold,” Buchanan says.

    The Greer archivists are facing a challenge that extends far beyond the scope of their collection. Out of this process come enormous questions about the fate of records that are “born digital”, meaning they didn’t start out in paper form. Record-keepers around the world are worried about information born of zeroes and ones – binary code, the building blocks of any digital file.

    “Archives are the paydirt of history. Everything else is opinion” – Germaine Greer

    Like floppy disks of the past, information stored on USB sticks, on shared drives or in the cloud is so easily lost, changed or corrupted that we risk losing decades of knowledge if we do not figure out how to manage it properly.

    Though the problem applies to everyone – from classic video-game enthusiasts to people who keep photos on smartphones – it is particularly pressing for universities and other institutions responsible for the creation and preservation of knowledge.
    https://www.theguardian.com/books/201...archive-digital-treasure-floppy-disks
  5. That’s good and interesting software criticism! But also testament to how long Photoshop has stuck around (since 1990), how it can’t be avoided, and how every Photoshop file carries with it the terrible burden of technology history, which this poor programmer had to face down in their spare time. What you see on your desktop is a simple preview icon; what lurks inside is the result of terrible imperatives. And Photoshop is so important that the Computer History Museum has preserved its source code. Grady Booch, a very famous programmer, looked through the source code. He said:

    Having the opportunity to examine Photoshop’s current architecture, I believe I see fundamental structures that have persisted, though certainly in more evolved forms, in the modern implementation. Tiles, filters, abstractions for virtual memory (to attend to images far larger than display buffers or main memory could normally handle) are all there in the first version. Yet it had just over 100,000 lines of code, compared to well over 10 million in the current version! Then and now, much of the code is related to input/output and the myriad of file formats that Photoshop has to attend to.
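
    A rough sketch of the tile abstraction Booch mentions: an image split into fixed-size tiles that are materialized only when touched, so a picture far larger than main memory stays cheap until you paint on it (this class is illustrative, not Photoshop's actual design):

        TILE = 256  # tile edge length in pixels

        # Illustrative tiled image: untouched regions allocate no memory.
        class TiledImage:
            def __init__(self, width, height):
                self.width, self.height = width, height
                self.tiles = {}  # (tile_x, tile_y) -> bytearray, made on demand

            def _tile(self, x, y):
                key = (x // TILE, y // TILE)
                if key not in self.tiles:
                    self.tiles[key] = bytearray(TILE * TILE)  # alloc on first touch
                return self.tiles[key]

            def set_pixel(self, x, y, value):
                self._tile(x, y)[(y % TILE) * TILE + (x % TILE)] = value

            def get_pixel(self, x, y):
                return self._tile(x, y)[(y % TILE) * TILE + (x % TILE)]

        img = TiledImage(100_000, 100_000)  # ~10 GB as a flat array of bytes
        img.set_pixel(70_000, 12_345, 255)
        print(img.get_pixel(70_000, 12_345), len(img.tiles))  # 255, 1 tile allocated
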
    https://posts.postlight.com/fun-photo...-format-facts-edbc1374c715#.tw13q7mrr
  6. The shift in design and production that Singh revealed includes 3D design software, 3D printing, crowdfunding sites such as Kickstarter, and the software behind generative design. Generative design is a method of design in which software creates a product based on a set of rules or an algorithm. “In building a product, you’re capturing requirements,” explained Singh. “What temperature does it need to withstand? How big does it need to be? How strong? You capture the requirement before you design the product. From those requirements, you begin the design.” She noted that the computer will tell you what the product should look like and what material it should be made from.

    You start from functional requirements, such as the product needing to sustain a particular load, and then add the environment in which the product needs to work. “That gets you into generative modeling,” said Singh. “You tell the computer how you need the design to function, and then you let the computer create the design. You don’t need any drafting skills.”

    According to Singh, the design will tell you how to manufacture the product. “We told the computer the product needs to carry a certain load, and it had to attach to a wall,” said Singh. “The computer comes back with a simulated 3D design.” She noted that the new processes do not mean the end of traditional design. “We’re seeing a convergence of art and manufacturing. Generative design has a component of manufacturability already in it.”
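
    A minimal sketch of that requirements-in, geometry-out loop (the bracket model, strength formula and all numbers here are invented for illustration; real generative design uses far richer solvers):

        # Toy generative-design loop: enumerate candidate wall-bracket
        # cross-sections, keep those that carry the required load, and
        # return the lightest. The physics is deliberately crude.
        def generate_bracket(required_load_n):
            candidates = []
            for width_mm in range(5, 101, 5):
                for depth_mm in range(5, 101, 5):
                    capacity_n = 0.8 * width_mm * depth_mm ** 2  # crude strength model
                    if capacity_n >= required_load_n:            # meets the requirement
                        mass = width_mm * depth_mm               # proxy for material used
                        candidates.append((mass, width_mm, depth_mm))
            return min(candidates)                               # lightest valid design

        mass, width_mm, depth_mm = generate_bracket(required_load_n=50_000)
        print(f"lightest valid bracket: {width_mm} mm x {depth_mm} mm (mass proxy {mass})")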

    As Many Changes as You Wish

    The new design process is revision friendly, according to Singh. She notes that now products often go through multiple revisions, with design changes affecting form, shape, materials, and manufacturing. “We go back to the drawing board and we redesign and redesign,” said Singh. “You tell your design what functional aspects you need and it tells you what it needs to look like and it offers options for the user. The user requirement is always present.”

    Attention to user requirements is one of the predominant factors in new design and manufacturing processes. “As you bring a product to market, the customer expectations are always changing,” said Singh. “People will spend more money for ear buds that are specifically designed for your ears and custom created on the spot.”
    http://www.designnews.com/author.asp?...nd_186,kw_2,aid_278950&dfpLayout=blog
    by M. Fioretti (2015-10-29)
  7. "We save it as a picture as it's longer life than a file. You don't rely on PowerPoint or Word. In 50 years they can still just look at it,"
    http://www.theinquirer.net/inquirer/n...preserve-human-history-argues-vatican
  8. That is a great question. And you’re right, up until now I have not covered what I feel is the best file format(s) to save scanned photos with. But, as you astutely noticed, I did sort of allude to my personal choice in a couple of my posts, especially in some of the images I used in the 3-part “naming convention” series you brought up, What Everyone Ought to Know When Naming Your Scanned Photos.

    I think your question actually deserves a slightly more complex answer than I could normally get away with. Had you simply asked, “Which do you prefer for scanning photos, the TIFF or PNG format?”, I would feel comfortable quickly answering that, in my humble opinion, the TIFF format is far superior for scanning photos. But since you brought up your interest in “archiving” your photographs, I want to elaborate a bit more on why our personal goals for scanning need to be considered when making the final decision about which file format to use for our master image files.
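
    For concreteness, both candidate formats can be written losslessly; a short sketch with the Python Pillow library (the file names are placeholders, and the synthetic image stands in for a real scan):

        from PIL import Image

        # Stand-in for a real scan; in practice, Image.open("your_scan_file").
        scan = Image.new("RGB", (3000, 2000), "white")

        scan.save("photo_master.tif", compression="tiff_lzw")  # lossless LZW TIFF master
        scan.save("photo_copy.png")                            # lossless PNG copy
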
    http://www.scanyourentirelife.com/tif...-file-format-hurt-save-scanned-photos
  9. Office Open XML: the history and secrets of a standard that is not a standard
    http://www.libreitalia.it/office-open...ource=twitter.com&utm_campaign=buffer
  10. Last July, the U.K. Cabinet Office formally adopted ODF, the OpenDocument Format developed by OASIS and adopted by ISO/IEC, as an approved open format for editable public documents. It did not give the same approval to OOXML, another XML-based document format that was based on a contribution from Microsoft to ECMA, another standards organization. OOXML was also in due course adopted by ISO/IEC. The Cabinet Office decision came ten years after the largest standards war of the decade was launched by a similar, but later reversed, decision by the Commonwealth of Massachusetts.

    As that war heated up, both sides (ODF was supported by IBM, Oracle, Motorola, Google and others) recruited as many allies as they could. One of those recruited by Microsoft was the U.S. Library of Congress.

    What should we make of such different decisions?

    The answer may be different than you are likely to imagine. In the UK, the decision was consciously made with the goal of changing the marketplace to favor competition and choice, and was backed by overwhelming public support. It was also taken in the context of an ongoing public/private debate across the EU involving open standards, open data, and open source as instruments of policy.

    The Library of Congress action, on the other hand, was taken in the context of…well, nothing. The US government, at every level, as well as the states, has at times avoided, and in most cases been simply oblivious to, the open format debate. Indeed, the Library of Congress announcement includes the following statement:
    http://www.consortiuminfo.org/standar...og/article.php?story=2015021111505690


Page 1 of 10. Online Bookmarks of M. Fioretti: Tags: file formats

Propulsed by SemanticScuttle