It's been some time since I've had any real system administration responsibilities, and it amazes me how much my memory can degrade from neglect. Today I had to move a web site from one system to another. The hostname changed too, and unfortunately, this site uses lots of fully qualified links and image sources. I considered my options for changing the name across all files and directories. My first thought was that it wouldn't be too bad to type up a Perl script that could recursively descend each directory and rewrite each of the files, but my very next thought was "that seems like an awful lot of work for a fairly simple thing." As I looked up from the monitor to think about better approaches, my eyes drifted over the bookshelf and alighted on an old copy of sed & awk
by Dougherty & Robbins. Well, duh: that's exactly what sed is designed to do. And nowadays most implementations have the handy -i option, which makes changes in place without an explicit interim file. That meant I could reduce the work of typing in the Perl script to a single line at the command prompt:
$ find . -name "*.htm" -o -name "*.html" | xargs \
sed -i.old 's/www\.example\.org/fumc.example.com/g'
That can all go on one line without the backslash. It uses find to traverse the directory structure looking for files that end in either ".htm" or ".html". For each file it finds, it executes the sed command, which says to substitute the old hostname with the new one and make a backup of the original file with the ".old" extension.
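One caveat with piping find into xargs: filenames containing spaces get split apart. A NUL-delimited variant avoids that, and grouping the two -name tests with escaped parentheses keeps -print0 applying to both. Here is a sketch under a throwaway /tmp directory (assumes GNU-style find, xargs, and sed -i):

```shell
# Sketch: whitespace-safe version of the one-liner.
# Set up a throwaway example file with a space in its name.
mkdir -p /tmp/sitefix && cd /tmp/sitefix
printf '<a href="http://www.example.org/x">x</a>\n' > "old page.html"

# \( ... \) groups the two -name tests; -print0 and xargs -0
# pass NUL-delimited names, so the space in the filename survives.
find . \( -name "*.htm" -o -name "*.html" \) -print0 |
  xargs -0 sed -i.old 's/www\.example\.org/fumc.example.com/g'

cat "old page.html"       # now points at fumc.example.com
cat "old page.html.old"   # untouched backup of the original
```

The backup file keeps the old hostname, so a stray `find . -name "*.old" -delete` cleans up once you're satisfied with the result.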
There are probably many ways to solve this one. Leave me a comment explaining how you would do it.