Ars Technica published a story that should strike sadness into the hearts of cultural aficionados of the 1990s and fear into the minds of investors in giant media companies.
The gist of the story is that present-day engineers called in to help re-mix and re-master old content from the 1990s for new digital delivery formats are horrified by a problem they keep discovering. The 1990s was the decade in which digital recording to spinning hard drives via software platforms like Pro Tools, Cakewalk and other products became widespread. In the early 1990s, the ADAT digital tape format created by Alesis was king, but by roughly 1993 or 1994 most pro studios had switched to hard-drive based digital recording. HDD-based recording supported virtually unlimited tracks and programmable control of mixing parameters, eliminating the need for five people to sit at a console adjusting levels, left/right balance, reverb send levels and the rest to create a final mix.
Once a studio and recording engineer were done with mixing and mastering, those HDDs were treated like removable media and typically shipped to tightly secured, climate-controlled storage spaces to protect the drives in case the original tracks were ever needed again.
Now, thirty-plus years later, engineers trying to locate those HDDs, plug 'em into something, spin 'em up and use the data to re-mix the content into formats suitable for streaming, 5.1 sound, etc. are running into two MASSIVE problems. One, per the Ars Technica story, at least 20% of the drives will not function: either they cannot be read due to electronics failures or they won't even spin up. Two, in many cases, the software used to create the data is no longer available in its original version, or that software cannot run on modern hardware.
Media companies expecting to milk profits from back catalogs for another thirty years by re-selling what the public has already purchased in droves may find those future revenue streams severely crippled as more of this old content is examined and found to have rotted away.
The media world has already suffered one catastrophe affecting roughly thirty years of content from the 1950s to the 1980s: master tapes destroyed in the 2008 fire at a vault on the Universal Studios lot.
As a fan of music, this chokes me up a bit. In my narrow-minded, bigoted opinion, the 1990s was the last decade that produced a respectable amount of good music across multiple genres every single year (though it did tail off starting in 1996 for business reasons…). The idea that content from that last “golden decade” could already be lost beyond recovery is sad. If you have a lot of music from that era on CD but are thinking of dumping those old clunky CDs for a Spotify library, I would think again.
There’s another takeaway in this for everyone…
Digital rot takes many forms.
- Failure of physical media – Hard drive motors can fail. Hard drive electronics can fail, preventing the bits from being read. Factory-pressed CDs can develop corrosion inside the disc that attacks the reflective layer and starts corrupting data. CD-Rs written in a home PC record by burning marks into a dye layer in the first place, and that dye can break down in as little as 2-3 years. The NAND flash memory in USB drives can fail too.
- Obsolete file formats – Software applications change over time, and their file formats change with them. Most are “consecutively compatible,” so version 10 might be able to read files from version 9, 8 and maybe 7, but it is RARE that the current version of any app can read EVERY prior version of the files that application ever created.
- Inoperable software – From a horsepower standpoint, a 2024 PC is 100x faster than a 1995 PC, but software sold in 1995 may have so many ties to the operating system of its era that it cannot run on any modern OS. It is unlikely anyone will build an emulator that mimics a 30-year-old operating system while still providing the I/O performance a recording application needs.
In a nutshell, if you have megabytes or gigabytes of “data” you want to preserve in the form of family photos, family financial documents, old career documents you’ve taken with you, etc., there are some “best practices” for safeguarding it. Think of your house as a mini data center, cuz that’s what we’re all becoming…
- Organize data on your PCs so YOUR data is not mixed in with byzantine directories housing actual software applications. On Windows, I install most apps under C:/Apps and write all documents under C:/Docs so I can drag the entire C:/Docs directory to a USB and not waste time backing up software.
- Create multiple, FULL backups on a consistent interval, maybe every month or six months.
- Label each backup with the full yyyy-mm-dd timestamp and identify the machine from which the data was copied (the backup sketch after this list automates both).
- For data files created by key applications (like Office Suite stuff, video editors, music editors, etc.), each time you buy an upgrade to that application, upgrade ALL of your old files in old formats to the latest version and save them with new names reflecting the new file format (see the conversion sketch after this list for one way to batch this).
- Test your prior backup devices every time you create a new backup (the verification sketch after this list shows one way: re-hash every file and compare against a saved manifest).
- If costs permit, don’t OVERWRITE old backups, just create new backups to new devices. This is more practical using USB drives since a 1 TB drive can be had for $19.99.
- If you really need to be paranoid, keep one backup generation “offsite” from your house, at a friend’s or a parent’s or a child’s house. Make sure you can trust them, though.
- You can use a cloud backup service, but I’m not personally a fan of that, so I can’t provide specific suggestions here.
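To make the backup bullets concrete, here is a minimal Python sketch of a full, dated backup. It assumes your documents live under C:/Docs (per my layout above) and that your USB drive mounts as E:/Backups; both paths, and the manifest file it writes, are my own assumptions to adjust for your setup. It copies everything into a folder named with today's yyyy-mm-dd date plus the machine name, then records a SHA-256 hash for every file so the copy can be checked later.

```python
# backup_docs.py -- minimal full-backup sketch (paths are assumptions; adjust
# SOURCE and DEST_ROOT to match your own folder layout and USB drive letter).
import hashlib
import platform
import shutil
from datetime import date
from pathlib import Path

SOURCE = Path("C:/Docs")        # where your documents live (see the bullet above)
DEST_ROOT = Path("E:/Backups")  # hypothetical USB drive / backup folder

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large videos don't blow up memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def full_backup(source: Path, dest_root: Path) -> Path:
    """Copy everything under `source` into a folder labeled with today's
    yyyy-mm-dd date and this machine's name, then write a SHA-256 manifest."""
    label = f"{date.today():%Y-%m-%d}_full_{platform.node()}"
    dest = dest_root / label
    shutil.copytree(source, dest)  # full copy; fails loudly if the folder already exists

    manifest = dest_root / f"{label}.manifest.txt"
    with open(manifest, "w", encoding="utf-8") as out:
        for path in sorted(dest.rglob("*")):
            if path.is_file():
                out.write(f"{sha256_of(path)}  {path.relative_to(dest)}\n")
    return dest

if __name__ == "__main__":
    print("Backup written to", full_backup(SOURCE, DEST_ROOT))
```

The manifest is just one "hash, two spaces, relative path" line per file, so it will still be readable in Notepad thirty years from now.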
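And a companion sketch for the testing bullet: it re-reads every file on an old backup and compares the hashes against the manifest the backup sketch wrote. Any missing or mismatched file means the drive, or the data on it, is starting to rot and it's time to make a fresh copy. The example paths at the bottom are hypothetical.

```python
# verify_backup.py -- re-hash an old backup against its manifest to confirm the
# drive still reads cleanly (pairs with the manifest format from the sketch above).
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(backup_dir: Path, manifest_file: Path) -> bool:
    """Return True only if every file in the manifest is present and unchanged."""
    ok = True
    for line in manifest_file.read_text(encoding="utf-8").splitlines():
        expected, rel_path = line.split("  ", 1)
        target = backup_dir / rel_path
        if not target.is_file():
            print("MISSING:", rel_path)
            ok = False
        elif sha256_of(target) != expected:
            print("CORRUPT:", rel_path)
            ok = False
    return ok

if __name__ == "__main__":
    # Hypothetical paths -- point these at one of your older backup sets.
    backup = Path("E:/Backups/2024-05-01_full_MYPC")
    manifest = Path("E:/Backups/2024-05-01_full_MYPC.manifest.txt")
    good = verify(backup, manifest)
    print("Backup verified OK" if good else "Backup has problems -- make a fresh copy")
```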
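Finally, a sketch of the format-migration bullet, using LibreOffice's command-line converter as a stand-in (my choice of tool, not anything from the Ars story): it assumes LibreOffice is installed, that soffice is on your PATH, and the folder names are hypothetical. The same pattern works for any application that offers a batch or command-line "save as" option.

```python
# convert_docs.py -- sketch of bulk-migrating old .doc files to .docx using
# LibreOffice's command-line converter (assumes LibreOffice is installed and
# `soffice` is on your PATH; the folder paths are hypothetical).
import subprocess
from pathlib import Path

OLD_FILES = Path("C:/Docs/old_word_files")   # hypothetical folder of legacy .doc files
CONVERTED = Path("C:/Docs/converted_docx")   # new copies land here; originals are untouched

CONVERTED.mkdir(parents=True, exist_ok=True)
for doc in OLD_FILES.glob("*.doc"):
    # --headless runs without the GUI; --convert-to picks the target format.
    subprocess.run(
        ["soffice", "--headless", "--convert-to", "docx",
         "--outdir", str(CONVERTED), str(doc)],
        check=True,
    )
    print("Converted", doc.name)
```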