4 Oct 2008 mca   » (Journeyer)

Ping.

Life is pretty good just now, but doesn't involve much hacking on anything. A few items...

  • Got married earlier this year. 8-) I'm now mcra, but changing user ID generally looks like a lot of trouble.

  • Rediscovered my desk during a recent archaeological expedition into the study. There's still a desktop machine under it, holding about three copies of each filesystem off the knackered disk, in various states of attempted repair.
    • reiserfsck had already brought the leftovers back to a self-consistent state.
    • Self-consistency is all well and good, but it doesn't show where the holes are.
    • lost+found is OK, in its tedious way, because if I want the data badly enough I can probably figure out where each file belongs. Vast swathes of md5sums match old junk in ancient backups, telling me that I didn't need most of those files anyway.
    • ...but what happened to the inodes with bad sectors? It looks like the valid data just gets dropped, giving no chance to look at the leftovers. I started another recovery, this time filling the lost sectors with a breakage marker instead of marking them bad (a rough sketch of the idea follows this list). This doesn't work on the superblock and root directory though...
    • Fishing through old backups is a sorry fallback position.
    • unison-synced copies of the data are much better, when they're recent and complete. My old CVS repository now appears to be intact! (I'll skip the migration to Subversion and go straight to Git.)
    • I'm wondering whether I want to translate some old ~/.unison/ar* archives into a form that shows me the missing files, when I only have half the sync.
    • I did start fishing through the fsck logs looking for info about files that got left in the bitbucket. I'm as interested in filesystems as the next general-purpose geek, you understand, but I'm not sure I wanted the in-depth practical just now.
    • All in all, it's shocking that 299.5 KiB of bad sectors (599 sectors of 512 bytes, roughly 0.0004% of the disk) can do so much damage to a 68 GiB filesystem. I guess the popular sectors took the worst hammering.
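
Since I mention the breakage-marker idea above, here is roughly what I have in mind, as a minimal sketch rather than anything I'd trust a disk to: copy the failing device into an image, and wherever a read errors out, write an easy-to-find filler instead of dropping the data. The paths, the marker text and the 512-byte sector size are all invented for illustration; a proper tool like GNU ddrescue does this job far more carefully.

```python
#!/usr/bin/env python3
"""Sketch: copy a failing device/image sector by sector, replacing
unreadable sectors with a visible marker instead of losing them."""

import os
import sys

SECTOR = 512                                       # assumed sector size
MARKER = b"<<BAD-SECTOR>>".ljust(SECTOR, b"*")     # filler that is easy to search for

def rescue(src_path, dst_path):
    src = os.open(src_path, os.O_RDONLY)
    size = os.lseek(src, 0, os.SEEK_END)           # total size in bytes
    bad = 0
    with open(dst_path, "wb") as dst:
        for offset in range(0, size, SECTOR):
            os.lseek(src, offset, os.SEEK_SET)
            try:
                block = os.read(src, SECTOR)
            except OSError:                        # EIO: the drive gave up on this sector
                block = MARKER[: min(SECTOR, size - offset)]
                bad += 1
            dst.write(block)
    os.close(src)
    print(f"{bad} unreadable sectors replaced with marker")

if __name__ == "__main__":
    rescue(sys.argv[1], sys.argv[2])               # e.g. rescue /dev/sdb1 sdb1.img
```

The point of the visible marker is that the holes then show up when I diff or md5sum the image against old backups, instead of quietly vanishing the way they seem to during the fsck repairs above.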

  • Strangely, the idea of automated push-backups from all my machines has come alive again.
    • This time I've chosen rsync -axSWH with an ssh private key per filesystem to give access to the destination tree. Writing into a root-owned encfs seems to work, in the sense that special files, hard links and owners can be preserved. [later] But not sparse files, which will explode to full size.
    • Last time I preferred an rsync pull, with an ssh key per source machine, and a custom set of --excludes per filesystem. I think I didn't set up the cronjob because Life got Busy. Sigh.
    • gibak looks like a good idea, but pruning out large temporary/intermediate files or decimating the history could be hard/slow/large. Maybe some neat loopy branch structure would work? A problem for another day.
    • What I need now is something simple and flexible. cp -al is looking tempting (roughly sketched below, after this list).
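
For the record, the cp -al trick amounts to something like the sketch below: recreate the previous snapshot's directory tree, but hard-link every file instead of copying its data, so an unchanged file costs nothing. The snapshot paths are invented, and it skips the permission/ownership handling that cp -a would give you.

```python
#!/usr/bin/env python3
"""Sketch of what `cp -al old new` does: rebuild the tree, hard-linking files."""

import os

def link_tree(prev, new):
    for root, _dirs, files in os.walk(prev):
        rel = os.path.relpath(root, prev)
        target = new if rel == "." else os.path.join(new, rel)
        os.makedirs(target, exist_ok=True)         # note: directory modes/owners not copied
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target, name)
            if os.path.islink(src):
                os.symlink(os.readlink(src), dst)  # keep symlinks as symlinks
            else:
                os.link(src, dst)                  # hard link: no data duplicated

if __name__ == "__main__":
    # hypothetical snapshot directories
    link_tree("/backups/home/2008-10-03", "/backups/home/2008-10-04")
```

The usual rotation would then be to push the live data over the fresh tree with rsync --delete: files that changed get rewritten (and so drop their link back to the old snapshot), while everything else stays shared between the two dates.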

  • Looking forward (?!) to a similar expedition into $MAIL. I expect to find many lost things, neglected and overdue things, and many incomprehensible notes-to-self.
  • Discovered Advogato now has a password reminder mailer. Woo! I can just log in, instead of messing with the cookie from some ancient home directory. It's a great relief, actually, to see pieces of the World just quietly carrying on while I'm not getting involved.
  • Above all, it's time to start making things again.
