Authored by: betajet on Friday, December 07 2012 @ 10:38 AM EST
... and OCR scanning keeps getting better. I need to resurrect and re-edit some
old documents that lived on a Macintosh (now deceased, so I can't get at its
hard drive), with backups on 800K DS/DD Mac-format floppies. I don't think
anybody still makes hardware that can read those, and fat chance of the data
being readable after 20 years anyway.
But I do have high-quality hard copy and a copy shop nearby that can scan it
for me. So once again, good old papyrus remains the archival medium of choice.

Authored by: Wol on Friday, December 07 2012 @ 06:34 PM EST
That bit us rather badly at work some years ago. We had a bunch of 9-track tapes
(that dates it a bit!).
When we retired the system, we didn't think to copy the data tapes. Oops.
Even worse, we were "forced" to migrate databases, and while the new
one (UniVerse) was a clone of the old one (INFORMATION) - so much so that the
programs mostly ran after just a simple recompile - the actual file format was
rather different.
And then our researcher came to me and said "how far back can you go with
historic data?". Help!
Cheers,
Wol

Authored by: Anonymous on Saturday, December 08 2012 @ 01:21 PM EST
Migrate to newer technologies as they show up. I migrate my HD-stored stuff
every few years to keep the magnetic domains fresh. When I store it, I make
checksum files and verify the contents when it comes time to migrate. Depending
on importance I make two or three copies, obviously on different hard disks.
One set is always stored off site.
When I migrate from drive to drive, I copy everything, then use the original
checksum file to verify the contents transferred OK. Any file that didn't (it
has happened a few times already) gets re-checked against the original and
copied again if the original is still good; otherwise I go to the backup set.
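A minimal sketch of that checksum-and-verify step in Python, assuming SHA-256
and a sha256sum-style two-space manifest; the function names and manifest
layout here are illustrative, not anything from my actual scripts:

#!/usr/bin/env python3
# Sketch of the checksum-then-verify migration described above.
import hashlib
import os

def sha256_of(path, bufsize=1 << 20):
    # Hash a file in chunks so big archives don't need to fit in RAM.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(bufsize), b""):
            h.update(chunk)
    return h.hexdigest()

def make_manifest(root, manifest):
    # Record "digest  relative/path" for every file under root.
    with open(manifest, "w") as out:
        for dirpath, _, names in os.walk(root):
            for name in names:
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, root)
                out.write(f"{sha256_of(full)}  {rel}\n")

def verify(root, manifest):
    # Return the relative paths whose copy is missing or corrupted.
    bad = []
    with open(manifest) as f:
        for line in f:
            digest, rel = line.rstrip("\n").split("  ", 1)
            full = os.path.join(root, rel)
            if not os.path.isfile(full) or sha256_of(full) != digest:
                bad.append(rel)
    return bad

You'd run make_manifest against the source drive when the data is first
stored, copy everything to the new drive, then run verify against the copy;
whatever it returns gets re-copied from the original (if the original still
checks out) or pulled from the second backup set.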
There are also programs like parchive that generate a set of redundancy files
for a group of files. They allow correction of some bad blocks of data before
you need to go to the second backup HD set. I now use them for important
stuff, but should probably turn them on by default in my backup scripts. With
a quad-core processor I have compute cycles to spare when generating my backup
HDs; the scripts that do all the work run in the background while I do other
things, like writing up this post.
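For the parchive part, a small wrapper around the par2 command-line tool
(par2cmdline) could look like the sketch below. The 10% redundancy figure and
the file names are just examples, and the exit-code handling assumes par2's
usual convention of returning non-zero when verification finds damage:

import subprocess

def par2_create(parfile, files, redundancy=10):
    # "par2 create -rN out.par2 file..." writes recovery blocks
    # covering the listed files at roughly N% redundancy.
    subprocess.run(["par2", "create", f"-r{redundancy}", parfile, *files],
                   check=True)

def par2_check(parfile):
    # "par2 verify" reports damage through a non-zero exit status;
    # "par2 repair" then tries to rebuild the bad blocks from the
    # recovery files before you have to fall back to the second HD set.
    if subprocess.run(["par2", "verify", parfile]).returncode != 0:
        subprocess.run(["par2", "repair", parfile], check=True)

Generating the recovery files is the CPU-heavy part, which is why it can run
in the background on spare cores.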