Mar 05 2013

I’ve discovered the fantastic yearbox plugin for Dokuwiki, which is superb for my electronic lab notes. However, implementing it neatly required me to change the filename of every log entry from the last few years. In my old Dokuwiki system, each log was stored in year and month subdirectories, with just a two-digit filename for the day.

I needed to keep the directory structure, but change each filename from two digits to the form “2013-01-07-ljr.txt”.

It turns out to be fairly straightforward with some clever bash tricks.  I changed into the log directory and used this command:

for i in $(find . -name '??.txt'); do mv "$i" "${i%/*}/${i:2:4}-${i:7:2}-${i:10:2}-ljr.txt"; done

Here the first part finds all files with names of the form “??.txt”. Then each file is renamed with the “mv” command. The ${i%/*} gives the file path with everything after the last “/” removed (i.e. it gives the subdirectory of the file). The remaining brace expansions, such as ${i:2:4}, extract substrings of the file path by offset and length.
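Those two expansions can be tried interactively before trusting them with real files; here is a minimal sketch using a made-up path of the same shape as the old log files:

```shell
# A made-up path with the same shape as the old logs: ./year/month/day.txt
i="./2013/01/07.txt"

echo "${i%/*}"     # strip from the last "/" onward: ./2013/01
echo "${i:2:4}"    # 4 characters starting at offset 2: 2013
echo "${i:7:2}"    # 01 (the month)
echo "${i:10:2}"   # 07 (the day)

# Assembled new name, exactly as in the rename command above:
echo "${i%/*}/${i:2:4}-${i:7:2}-${i:10:2}-ljr.txt"   # ./2013/01/2013-01-07-ljr.txt
```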

All files stayed in their own directories, and were renamed with additional information from their directory path.  Easy!  Well, not unless you know the details of string manipulation in bash – which I don’t.  I found enough information to do this after looking here and here.

Mar 05 2013

I often use the “-” symbol to separate parts of a long filename.  For example, my photos are systematically named things like “20130214-1123-34_some_event.jpg”.  For filename maintenance I regularly use the rename command, which allows me to do things like

rename some another *

which would replace the word “some” with the word “another” for every filename in the current directory. When I need to replace a chunk of filename starting with the “-” character, the rename command thinks I’m passing it an option. I’ve just found a helpful answer that shows how the “end-of-options” signal fixes this problem. This signal is simply a double dash “--”. For example, this command

rename -- -11 -10 *

would change the timezone on a bunch of photo filenames.
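The same end-of-options signal is honoured by most GNU tools, so it can be tested safely on any dash-prefixed filename. A small sketch, using a throwaway directory and a made-up filename:

```shell
cd "$(mktemp -d)"               # work somewhere disposable
touch -- -11-photo.jpg          # "--" lets touch create a name starting with "-"

# mv -11-photo.jpg -10-photo.jpg   # would fail: mv reads "-1" as an option

mv -- -11-photo.jpg -10-photo.jpg  # "--" ends option parsing; the rename succeeds
ls
```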

Feb 02 2010

At ASA Convention, a number of us took many photographs.  Late on the last night, I helped compile the highlights into a souvenir data DVD.  Using the nifty little exiv2 utility, I was able to adjust the image timestamps and retrospectively synchronise the clocks in every camera.  This makes for very enjoyable photo browsing, as you can view chronologically regardless of which camera was used.

Unfortunately I was too rushed to place attribution in the filenames, and so it is not obvious who took each photo.  Luckily we all used different camera models, and that information is still present in the exif metadata.  I used this to split the photos into directories and embed the photographer in both metadata and filename.

I was shooting on my Nikon D200, and so was able to separate my photos by:

for ii in *.jpg; do if grep -q "NIKON D200" "$ii" ; then mv "$ii" lachlan/ ; fi ; done

This says: for each jpg file, check if it contains “NIKON D200” in the exif metadata and, if it does, move it to the subdirectory lachlan/.

All you need to do is find the camera model string to search for (my friends with Canons had things like “Canon EOS 30D”).  Now I’m going to append photographer names to the filenames, and then return them to their original chronological directories.
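The loop extends naturally to several photographers at once. A sketch of that, run on stand-in files rather than real jpgs (the “canon_friend” directory name is made up; the camera strings are the ones mentioned above):

```shell
cd "$(mktemp -d)"
# Stand-ins for real photos: plain files containing the camera-model string,
# just as a jpg carries it inside its exif metadata.
printf 'NIKON D200 ...image data...'    > dsc_0001.jpg
printf 'Canon EOS 30D ...image data...' > img_0001.jpg
mkdir -p lachlan canon_friend

# Route each file to its photographer's directory by camera model.
for ii in *.jpg; do
    if   grep -q "NIKON D200"    "$ii"; then mv "$ii" lachlan/
    elif grep -q "Canon EOS 30D" "$ii"; then mv "$ii" canon_friend/
    fi
done
ls lachlan canon_friend
```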

[For Facebook Notes readers: this post is redirected from my personal website]

Nov 10 2009

I jotted this down a year ago when I needed to produce a set of handout notes for a 3rd year physics lecture I took. Just last week, after taking a similar set of lectures, I wanted to find it but couldn’t. Murphy’s Law has come into effect, and my jotted note has turned up now that my need for it has passed.

The Beamer class for LaTeX is a great way to produce very nice presentation slides with useful features such as automatic progress markers and internal hyperlinks. Being LaTeX, it is also possible to completely change the output formatting by simply altering certain document settings. This allows me to produce slides that have black backgrounds for better projection onto a screen, and then change a single line (specifying the colour theme) to get a white-background version optimised for printing on paper.
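The preamble switch might look something like this (a sketch only; the exact colour settings are placeholders for whichever dark/light pair a given deck uses):

```latex
\documentclass{beamer}
\usetheme{default}

% Projection version: light text on a black background.
\setbeamercolor{background canvas}{bg=black}
\setbeamercolor{normal text}{fg=white}

% Printing version: comment out the two lines above, or swap in a
% light colour theme instead, e.g.:
% \usecolortheme{dove}
```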

To make it even more efficient to print, I used the following command to fit 3 slides to an A4 page:

pdfnup --frame false --nup 1x3 --paper a4paper --orient auto --pages all --trim "0 0 0 0" --delta "1cm 1cm" --offset "0 0" --scale 0.91 --turn true --noautoscale false --openright false --column false --columnstrict false --tidy true --outfile main3up.pdf main.pdf

Sep 24 2009

A few months ago I purchased a “Google Phone”: an HTC Magic running the Linux-based Android operating system. I am incredibly pleased with this device.

Today I wanted to convert some video that I recorded on a Nikon Coolpix camera into a format that would play nicely on my Android phone. The resolution needed to be scaled down for the small screen, and I knew that h264 was the best video codec. After some searching online, and a bit of experimenting, I found that this worked wonderfully:

ffmpeg -i input.avi -aspect 3:2 -s 400x300 -vcodec libx264 -b 480k -r 30 -acodec libfaac -ac 1 -ab 32k -padtop 10 -padbottom 10 -padleft 40 -padright 40 -sameq -pass 1 output.mp4


Jun 16 2009

Even when shooting images in raw format, it is typically easier to do simple sorting and sharing with jpg files. With Nikon *.nef raw files, it is possible to extract a full-resolution jpg image using the nefextract script. However, exif metadata is not automatically copied to this extracted jpg.

The tool that I use to manipulate my images according to their exif metadata is exiv2, and I can quite simply copy metadata from raw files to their corresponding jpgs (matching filenames, and all in the one working directory) with:

exiv2 insert -l./ -S.nef *.jpg

After sorting through the jpgs and deleting all but those worth keeping, I wanted to automatically remove the raw files of those deleted images. Sure enough, this is easily done with some bash shell magic:

for file in *.jpg; do mv "raw/${file%%.jpg}.nef" rawKeep/ 2> /dev/null ; done

This command says “for each jpg file you find here, move the matching nef file from the subdirectory raw/ to the subdirectory rawKeep/”. I can then delete any files left in the raw/ subdirectory, as they mustn’t have a matching jpg.
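The matching logic is easy to rehearse on dummy files before running it against real images (all filenames below are made up):

```shell
cd "$(mktemp -d)"
mkdir raw rawKeep
touch photo1.jpg photo2.jpg                          # the jpgs judged worth keeping
touch raw/photo1.nef raw/photo2.nef raw/photo3.nef   # photo3's jpg was deleted

# Move each nef whose jpg survived; misses are silenced by the redirect.
for file in *.jpg; do mv "raw/${file%%.jpg}.nef" rawKeep/ 2> /dev/null ; done

ls raw/          # only photo3.nef remains: it has no matching jpg
rm raw/*.nef     # so it is safe to delete
```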

People often ask me why I persevere with the “command line” (more technically, the “shell”). It seems that they assume tools with graphical interfaces are more powerful and faster. These two routine tasks demonstrate yet again that the shell really is the most efficient way to do many common jobs.

Jan 14 2009

Last week I had reason to duplicate some video DVDs.  Yes, it was legitimate; I wanted copies of personal video footage to share with family.  After fiddling around a bit I found a very straightforward way to do this task using simple GNU/Linux tools.

The first step is to copy the DVD to the hard drive (I only have one DVD drive).
cat /dev/sr0 > dvd.iso
where /dev/sr0 is my DVD device.

Then, after changing to a blank DVD, simply
growisofs -dvd-compat -Z /dev/dvd=dvd.iso
and it worked.
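It is also worth confirming that the copy matches the original image. A checksum-comparison sketch, demonstrated here on ordinary files (for a real disc, the second input would be the DVD device instead of a file):

```shell
cd "$(mktemp -d)"
printf 'pretend iso contents' > dvd.iso   # stand-in for a real disc image
cp dvd.iso burned_copy.iso                # stand-in for reading the burned disc back

# Matching checksums mean the copy is byte-for-byte faithful.
md5sum dvd.iso burned_copy.iso
```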

Aug 15 2008

Last week I learnt a very important lesson: a “backup” is not actually a backup if it is the only copy you have, it is at most an archive. In the process of tidying up the files on my external “backup” hard disk I deleted a few directories of photos from the beginning of this year. As I pressed the Enter key I was sure that I had a copy of those photos still on my laptop; but fractions of a second later I experienced a piercing wave of doubt. It was already too late.

After checking my laptop and finding that the doubt was justified, I remembered with relief that before heading over to Europe I had copied all my photos onto DVDs and left them in my office at uni (just in case our house burnt down or something). I went to sleep mostly certain that my accidentally deleted files were safe on discs at uni.

As you probably suspect, I did not have a copy of the photos on DVD. My DVD backups only went to the end of 2007, and I had deleted files from the first 2 months of 2008.

It is not a tragedy, as the main photos of consequence were from ASA Convention and I do have the best of my photos on the official DVD. However, it provided me with significant incentive to learn about data recovery on ext3 formatted partitions. I’ve included some of my discoveries below.

Mar 06 2008

In preparation for my trip to Europe, I needed to free up some space on my hard disk. It was a great opportunity to find out how to burn data DVDs.

I found this DVD creation article on the Gentoo wiki, and the following command worked brilliantly for me:

growisofs -dvd-compat -Z /dev/sr0 -joliet-long -R -V "<volume_name>" dvd/

where /dev/sr0 is my device and dvd/ the directory containing the desired contents of the DVD.

Feb 29 2008

I am an avid user of bash, the standard command line environment (technically “shell”) for GNU/Linux. In today’s graphical-rich computing culture, many people notice my command terminal and assume I must be a stubborn nostalgist of the digital dark ages. The truth is that the command line allows many regular tasks to be performed more efficiently, and makes some things possible that are simply not available any other way.

The extra power of the command line comes at the cost of learning its ways, which are not always obvious or self-explanatory. Today I found two very good articles: one on increasing bash productivity using vi editing commands, and one on leveraging the command-line history to save time and effort. Both of these articles have very good downloadable cheat-sheets.
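Switching bash into vi editing mode is a one-liner (typically placed in ~/.bashrc so it applies to every new shell):

```shell
set -o vi   # use vi-style keybindings at the bash prompt
# In vi mode: press Esc, then k/j to walk through history, or / to search it.
```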