Is TIFF a legacy format?
The most recent version of the TIFF specification, 6.0, dates from 1992. Adobe updated it with three technical notes, the latest coming out in 2002. Since then there has been nothing.
The format is solid, but the past quarter-century has given us reasons to enhance it. BigTIFF is a variant of the format that accommodates larger files. Older TIFF readers can't handle it, but the changes mostly concern offset and data sizes and are easy to add to a TIFF interpreter. The format sits in a kind of limbo, since Adobe owns the spec but is no longer updating it. New tags have achieved consensus acceptance but have no official status. AWare Systems maintains a list of known tags but has no reliable way to say which ones are private and which are generally accepted. There's no way to add a new compression or encryption algorithm, or any other new feature, and give it official status.
The Libtiff source code repository is now on Gitlab. The old CVS repository on maptools.org will be maintained for historical purposes but won’t get any updates. One reason for choosing Gitlab rather than Github is that there’s already a libtiff …
Libtiff 4.0.9 has been released. According to the email announcing it:
A great many security improvements have been implemented by Even Rouault.
Many thanks to OSS Fuzz, team OWL337, Roger Leigh, and of course Even Rouault.
Obligatory reminder: Don’t download from libtiff dot org. It’s many years out of date.
My brief post yesterday on the TI/A initiative provoked a lively discussion on Twitter, mostly on whether archival formats should allow compression. The case against compression rests on the premise that archives should be able to deal with files that have a few bit errors in them. This is a badly mistaken idea.
A project to define an archive-safe subset of TIFF has been going on for a long time. Originally it was called the TIFF/A initiative, but Adobe wouldn’t allow the use of the TIFF trademark, so it’s now called the TI/A initiative.
So far it’s been very closed in what it presents to the public. It’s easy enough to sign up and view the discussions; I’ve done that, having professional credentials but no inside connections. Still, it bothers me that the project has gone so long offering the public nothing more than a white paper, with no progress reports.
I’m not going to make anything public which they don’t want to, but I’ll just say that I have some serious disagreements with the approach they’re taking. When they finally do go public, I’m afraid they won’t get much traction with the archival community. Some transparency would have helped to determine whether I’m wrong or they’re wrong.
Libtiff is still offline at remotesensing.org, but there’s a mirror of the source available on GitHub. I held off on mentioning it in this blog till Bob Friesenhahn confirmed it’s reliable.
Posted in Links
Tagged software, TIFF
The Libtiff library, which has been a reference implementation of TIFF for many years, has disappeared from the Internet. It was located at remotesensing.org, a domain whose owner apparently was willing to host it without having any close connection to the project. The domain fell into someone else’s hands, and the content changed completely, breaking all links to Libtiff material. Malice doesn’t seem to be involved; the original owner of remotesensing.org just walked away from the domain or forgot to renew it. Who owns it now is unknown, since it’s registered under a privacy shield.
Originally Libtiff was hosted on libtiff.org, but that domain fell into the hands of an owner with no interest in the project; I don’t know how. It still holds Libtiff code, but it’s many years out of date.
As I’m writing this, people on the Libtiff list are trying to figure out exactly what happened. There’s talk of trying to get libtiff.org back, though that may or may not be possible.
For the moment, there’s no primary source for Libtiff on the Web. I hope to post more information later.
TIFF is a very popular image format, but it can’t handle really huge files. “Really huge” means files bigger than 4 gigabytes, or, more precisely, files in which a data offset can’t be represented as an unsigned 32-bit integer. That limitation doesn’t come up often, but some applications, such as medical scans, need enough detail to push past it.
A dozen years ago, members of the TIFF community at AWare Systems came up with a simple idea: Create a variant of TIFF with 64-bit offsets instead of 32 bits. The result was BigTIFF.
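The difference is visible right in the file header: a classic TIFF file carries version number 42 and a 32-bit offset to its first IFD, while a BigTIFF file uses version 43, records its offset size (always 8), and stores a 64-bit offset. Here's a minimal sketch of telling the two apart; the function name and return shape are my own choices, not part of any spec:

```python
import struct

def read_tiff_header(blob: bytes):
    """Parse the opening bytes of a TIFF or BigTIFF file.

    Returns (byte_order, is_bigtiff, first_ifd_offset).
    """
    order = blob[:2]
    if order == b"II":
        endian = "<"   # little-endian ("Intel")
    elif order == b"MM":
        endian = ">"   # big-endian ("Motorola")
    else:
        raise ValueError("not a TIFF file")

    (version,) = struct.unpack(endian + "H", blob[2:4])
    if version == 42:
        # Classic TIFF: bytes 4-7 hold a 32-bit offset to the first IFD.
        (offset,) = struct.unpack(endian + "I", blob[4:8])
        return order, False, offset
    if version == 43:
        # BigTIFF: offset size (must be 8), a zero pad, then a 64-bit offset.
        offsize, pad = struct.unpack(endian + "HH", blob[4:8])
        if offsize != 8 or pad != 0:
            raise ValueError("malformed BigTIFF header")
        (offset,) = struct.unpack(endian + "Q", blob[8:16])
        return order, True, offset
    raise ValueError(f"unknown TIFF version {version}")
```

Since the header logic is nearly identical in both cases, this is why adding BigTIFF support to an existing TIFF interpreter is mostly a matter of widening offset fields from 32 to 64 bits.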
The work on the TI/A project, to define an archive-friendly version of TIFF analogous to PDF/A, is still going on, even though hardly any of it is publicly visible. Marisa Pfister’s departure from the project, along with her position at the University of Basel, was unfortunate, but others are continuing a detailed analysis of TIFF files held at various archives. This will help them learn which features and tags are actually used.
The target of March 1, 2016, for a submission to ISO has been crossed out, and nothing has replaced it, but we can still hope it will happen.
How big a concern is physical degradation of files, aka “bit rot,” to digital preservation? Should archives eschew data compression in order to minimize the effect of lost bits? In my experience, hardly anyone has raised that as a major concern, but some contributors to the TI/A initiative consider it important enough to affect their recommendations.
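The worry behind the anti-compression position is that a single flipped bit in a compressed stream can corrupt everything after it, while the same bit in uncompressed data damages only one pixel. A quick sketch of that behavior, using Python's zlib as a stand-in for any DEFLATE-based TIFF compression (the data is just a placeholder pattern):

```python
import zlib

# A patterned stand-in for uncompressed image data.
original = bytes(range(256)) * 64          # 16 KiB

# Compress the data, then flip a single bit in the middle of the stream.
corrupted = bytearray(zlib.compress(original))
corrupted[len(corrupted) // 2] ^= 0x01

try:
    restored = zlib.decompress(bytes(corrupted))
    damaged = restored != original          # silent corruption
except zlib.error:
    damaged = True                          # decompression aborted outright
```

That fragility is real, but the usual answer is that archives should prevent bit errors with checksums and redundant copies rather than design formats around tolerating them.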