By Nathan L. Walls


Articles tagged “development”

The Airport Cemetery

My wife, Robin, and I visited the cemetery at Raleigh-Durham International airport after lunch Saturday, prompted by a discussion I had on Twitter earlier in the week with aviation geeks and meteorologist Nate Johnson.

Nate started with his surprise that Chicago’s O'Hare International Airport has a cemetery. That was also news to me, but I was reminded of the small cemetery at RDU. I figured Nate also knew it. But, no, it was news to him, and I suspect it’d be a small surprise to a lot of folks.

Robin and I have used RDU’s ParkRDU Economy Lot 4 for our occasional trips out of town, and on the drive in, we’ve passed Cemetery Road and seen a little bit of fencing. So, I knew it was there. But it’s out of the way, and folks accustomed to coming and going from the airport via Aviation Parkway or Airport Blvd might never pass by. Even if you drive up to the Observation Park and then out to Lumley Road, you might miss it.

Here it is in Google Maps’ satellite view:

Robin and I were in the area and, given the discussion from earlier in the week, we decided to drop by. There’s a chainlink fence around the cemetery and a small driveway, enough for two or three cars. There’s a pedestrian gate in the chainlink fence and a double swinging gate up a gravel and grass incline to allow vehicle traffic.

We walked around, looking at headstones and took a few photographs.

Here’s a view of the headstones looking diagonally SE to NW across the cemetery towards Runway 5L-23R:

Mt. Hermon Baptist Church Cemetery

This view is SW to NE, where cars parked in Lot 4 are visible in the background:

Mt. Hermon Baptist Church Cemetery

Finally, the sign that offered a clue about the history of the cemetery:

Mt. Hermon Baptist Church Cemetery III/RDU

The name Mt. Hermon Baptist Church jogged a memory. I thought I remembered a church by that name north of the airport, off of Leesville Road, just into Durham County. Looking at a map on my iPad, I could also see a Mt. Hermon Road running north and south that terminated on the north side of the Glenwood Avenue interchange with I-540. But looking further, there was a continuation of the road on the south side of the interchange, crossing to Lumley Road and continuing as Commerce Blvd on the airport itself. (View on Google Maps)

That struck me as interesting and probably meant the road was contiguous at one time, before I-540 was built, with construction beginning in 1992. Later in the afternoon, I went out for a walk and thought about where I might find a map of northwest Wake County from before I-540’s construction. I figured I’d end up at the library looking for county property maps (and that will still be valuable), but, for whatever reason, I instead remembered the US Geological Survey’s topographic quadrangles. If I could get a past version of one of those, I might learn something.

As it happens, the USGS does have historical quads online in a variety of formats (PDF, TIFF, JPEG, etc.) and scales. Using their TopoView tool, I was able to narrow down available maps for the area and then look at past dates. It turns out there’s a 1982 edition map of the SE Durham quadrangle using 1973 survey data (large JPG).

Looking at the lower-right corner, a few things stand out. Mount Hermon Road, labeled as Route 1646, is fully connected, with an interchange at Glenwood Avenue (U.S. 70). Interstate 540 is absent. Following Mt. Hermon Road south from Glenwood, we see the cemetery and a Mt Hermon Church on the map. Continuing south, Runway 5L-23R does not yet exist. (It appears on the 1993 edition, but not on the 1991 edition 1:100,000 scale wider map from 1990 survey data, which is interesting because RDU history says the runway opened in 1986.)

The church congregation appears to have moved from what is now a taxiway around the General Aviation apron to Olive Branch Rd in Durham County. The cemetery is still taking new burials; the newest we found is from February 2016.

This leaves me wondering when the church moved. The 1973 map shows a small cluster of buildings labeled Hermon, apparently an unincorporated settlement. Digging into that might require looking at past census data and property tax records at the county level.

I’m fascinated by all of this and wonder how it played out. There’s some interesting research yet to do to learn at least part of the fuller history here. I’ll have a follow-up post when I do.

Benefits of “throwaway” scripting

I like listening to concerts from some of my favorite artists like Mogwai, Explosions in the Sky and Hot Chip. Some artists have a definitive place to go for concert recordings, such as Reflecting in the Chrome for Nine Inch Nails.

For most artists, I end up visiting YouTube to find a concert, and recently I’ve found a bunch there.

While watching on YouTube is great, I would like to listen to these concerts through iTunes or on my phone.

I looked up how to convert YouTube video to audio and found this MetaFilter thread.

I ended up using the following idea, highlighted in this comment:

youtube-dl UUGB7bYBlq8
ffmpeg -i UUGB7bYBlq8.WHATEVER -vn \
  -acodec copy 'Artist - Title I Want.mp4'

Three keys here:

  1. Get the IDs of the videos I wanted to convert from YouTube. I did this manually
  2. Install youtube-dl, which I did through Homebrew
  3. Install ffmpeg, also through Homebrew

While there are plenty of online or graphical tools one could use to convert YouTube videos to audio, the benefit of a command line tool is that I could then use these tools in a couple of Ruby scripts.

A lot of times, writing code involves writing tests and solving a problem through an application. Theoretically, I could have done that here. But, that felt like overkill because, right now, I have eight or so concert videos.

I wrote two scripts to help me. The first is download.rb:

#!/usr/bin/env ruby

file_list = "concerts.md"
files = File.readlines(file_list)

files.each do |file|
  `youtube-dl #{file.chomp}`
end

The file concerts.md lives in the same directory and just contains a list of YouTube video IDs, one per line.

Once these were all downloaded, I needed another script to convert the video files to audio files. I also wanted to name the resulting files. I could do both with a simple data structure. So, I wrote converter.rb.
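The script itself isn’t shown here, but a minimal sketch of the idea looks roughly like this. The source file names and destination titles are hypothetical placeholders, and the real converter.rb may differ:

```ruby
#!/usr/bin/env ruby

# Sketch of converter.rb: map each downloaded video file to the audio
# file name I want, then shell out to ffmpeg for each one.
# File names and destinations below are hypothetical examples.
CONCERTS = [
  { source: "UUGB7bYBlq8.mp4", destination: "Artist - Title I Want.mp4" },
  { source: "aBcDeFgHiJk.mp4", destination: "Other Artist - Other Show.mp4" },
]

# Build the ffmpeg invocation for one concert: strip the video stream
# (-vn) and copy the audio stream without re-encoding (-acodec copy).
def ffmpeg_command(concert)
  %(ffmpeg -i "#{concert[:source]}" -vn -acodec copy "#{concert[:destination]}")
end

CONCERTS.each do |concert|
  system(ffmpeg_command(concert))
end
```

Using `system` rather than backticks means a missing ffmpeg or input file fails quietly instead of raising.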

Neither of these two files is doing anything particularly difficult. I’m just running those command line utilities. But, I’m not having to run them repeatedly. I was able to use ls and Vim to get the file names into converter.rb, then regular expressions to coerce the file listing into a data structure. I filled out the :destination keys manually. That felt like a pretty decent balance of effort to automation.

If I use this workflow much more, I may improve both of these scripts into something more mature. But without waiting for that to happen, I was able to take care of some very pragmatic automation right now and save myself some tedium.

I’ll take that.

Tool Sharpening: March 11, 2015

I’m presenting Intermediate Git on March 31. It’s a one-day, hands-on workshop to build skills from beginner to intermediate with Git on the command line. Cost: $449. You can register here.

Environment + Process tweaks

I separated “Reading and Learning” posts from “Tool Sharpening” posts, as I mentioned I was going to do. These are separate concerns, and it’ll be easier to write each type of post separately. My expectation is I’ll have roughly twice as many Reading and Learning posts as Tool Sharpening posts.

Also, with my new gig, I’m a lot more comfortable working on a Pull Request model. Even if I’m the only person working on a project, it’s still good process reinforcement. I’m also using a rebase model. I’ve never cared for merge commits. By contrast, reducing a feature branch to one or two very descriptive commits feels pretty good. I’ll use small commits in the moment when I’m taking small steps to implement a feature. Once I finish the feature, those small steps lose their utility. They become noise. The Pull Request model lets me strip away that noise and tell the larger story about what the commit is doing and why it’s there.

I also:

  • Adjusted the TextExpander shortcuts I use to generate much of each podcast line to account for wording changes I’ve made
  • Refactored my TextExpander shortcuts for podcasts to create the Markdown link for podcasts with a predictable URL structure, which thankfully, is most of the ones I listen to
    • A further refactoring to do is to transition to a single TextExpander shortcut that calls a Ruby script with which I could retrieve the show notes page and grab the episode title
    • This would allow me to more quickly scrobble the podcasts I listen to, even though there’s not a direct way to get the information out of Overcast.
  • Created some TextExpander shortcuts for fitness and exercise
  • I added some additional Composure shell commands for Git
    • grbc is git rebase --continue
      • Useful for resolving conflicts on a merge or, more likely with the way I work, a rebase before a merge
    • gmt is git mergetool
      • Get me into merge conflict resolution as quickly as possible
    • fbc is git push origin :$1 && git branch -d $1 and allows me to clean-up a feature branch after I’ve merged it to master
      • E.g. fbc nlw-new-feature
      • fbc stands for feature branch cleanup
    • gsu is git branch --set-upstream-to=origin/$1 $1
      • Useful for when I’m working with another developer on a feature branch I started and pushed, but now need to pull changes the other developer made on the feature branch
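Stripped of the Composure metadata, fbc amounts to a small shell function. A plain-shell sketch (the real definition lives in my Composure setup):

```shell
# Plain-shell sketch of fbc (feature branch cleanup); the real version
# is defined through Composure, so this omits the metadata calls.
fbc () {
    # Delete the branch on origin, then the local copy. git branch -d
    # refuses to delete a branch that isn't merged yet, which is a
    # useful safety check here.
    git push origin ":$1" && git branch -d "$1"
}
```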

Project work

  • Moved Accomplished over to GitHub and fixed up the initial batch of tests
  • Set-up a new Digital Ocean server ($10 credit for using the link) that this blog will eventually be migrated to. Outside of the initial machine creation and user addition, I’ll be doing as much of the set-up and management as I can through Ansible

For some more background on what’s going on here, see the first tool sharpening post.

Tool Sharpening: Jan 1, 2015

For some background on what’s going on here, see the first tool sharpening post.

Environment + Process tweaks

I added several zsh aliases for some Git actions that are becoming more common for me when working on feature branches.

alias gsl='git stash list'
alias gssv='git stash save'
alias gss='git stash show'
alias gssp='git stash show -p'
alias gsp='git stash pop'

These are particularly useful when I’m in the middle of work on a repository and I want to do anything else, like run a git bisect or examine another branch.
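To make that concrete, here’s a hypothetical walkthrough of the stash-and-resume flow, run in a throwaway repository so it’s safe to try anywhere. The aliased commands are noted in the comments:

```shell
#!/bin/sh
# Walkthrough of the stash-and-resume flow, using a throwaway repository.
repo=$(mktemp -d)
cd "$repo" || exit 1
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"
echo "v1" > notes.txt
git add notes.txt
git commit -qm "initial commit"

echo "work in progress" >> notes.txt   # uncommitted, half-finished work

git stash save "wip: mid-feature"      # gssv: set the work aside
git stash list                         # gsl: confirm it landed
# ...now free to run git bisect, check out another branch, etc....
git stash pop                          # gsp: pick the work back up
```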

Another use case I have is searching, with find or with ag, then opening resulting files in BBEdit.

A typical case looks like this:

ag -l --print0 foo | xargs -0 bbedit

In this example, any file in the current directory or below containing the term ‘foo’ will be opened in BBEdit. The find version is very similar for cases when I want to open all files with a name containing ‘foo’.

find . -name '2014*' -print0 | xargs -0 bbedit

Rather than type these commands out every time, I want to type a much shorter command and the search string, like so:

bbf foobar

or

bbg foobar

From experience, it would be easiest for me to implement each of these in Ruby or Perl. Instead, I chose to implement these as zsh shell functions. I used Erich Smith’s Composure shell library to build the functions. The resulting code looks like this:

# author, about, param, example and group are Composure functions
# ag is the Silver Searcher (brew install ag)
bbf () {
    author 'Nathan L. Walls'
    about 'Function to find files matching a term and open them in BBEdit'
    param '1: Term to search for via `find . -name`'
    example 'bbf 2014*'
    group 'utility'

    find . -name "$*" -print0 | to_bbedit
}

bbg () {
    author 'Nathan L. Walls'
    about 'Func. to open files containing a term in BBEdit'
    param '1: Term to search for via `ag -l`'
    example 'bbg foo'
    group 'utility'

    ag -l --print0 "$*" | to_bbedit
}

to_bbedit () {
    author 'Nathan L. Walls'
    about 'Func. to open null-separated files in BBEdit through xargs, via a pipe'
    param 'None, implied to come through xargs'
    example 'find . -print0 | to_bbedit'
    group 'utility'

    xargs -0 bbedit
}
One other use case for this sort of thing doesn’t necessarily require opening files: seeing which files in my Ruby projects contain the word “focus”, excluding spec_helper.rb. This way, I can keep from committing code with focus: true in place and save myself the step of rerunning tests and amending a commit. That looks like this:

alias foc='ag -l focus --ignore spec_helper.rb'

Or, at least it did until I thought about it a little more and wanted to make it a function as well. Now, it looks like this, instead:

foc () {
    author 'Nathan L. Walls'
    about "Find errant 'focus: true' statements before committing"
    param 'None'
    example 'foc'
    group 'testing'

    ag -l 'focus: true' --ignore spec_helper.rb
}

Why not use an alias here? Partly because I’m enjoying using Composure and partly because I believe I might make further use of this foc () function elsewhere.

There’s still a bit more I could do with these functions, namely adding some error checking, but these will get the job done for me well enough.
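For example, the error checking on bbg might start with a simple guard clause. This is a sketch with the Composure metadata calls omitted so it stands alone:

```shell
bbg () {
    # Guard clause: with no pattern, ag would have nothing sensible to
    # search for, so bail out with a usage message instead.
    if [ $# -eq 0 ]; then
        echo "usage: bbg <term>" >&2
        return 1
    fi
    ag -l --print0 "$@" | xargs -0 bbedit
}
```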

I also:

  • Tweaked BBEdit 11 settings to limit the scope of clippings used for given languages, which keeps me from getting cross-language clipping pollution. In my case, there was a large set of PHP clippings coming up when I was looking for Ruby clippings. And now, I’m not
  • Found a way to Save All open documents in BBEdit, Cmd-Opt-S
  • Added TextExpander shortcuts related to writing Git commit messages
  • Updated my work machine to the dotfiles set-up I highlighted in the previous tool-sharpening entry
  • Created a Cider profile to get my Homebrew set-up into source control
  • Updated Git to address a Mac OS X vulnerability

Project work

  • No project work since the last entry

Skill improvements

See above regarding working with Composure to write shell functions. Overall, I’m finding Composure a really nice little framework for iterating on commands quickly and making them understandable later. Writing these commands and trying to remember Smith’s presentation on Composure at the Triangle Ruby Brigade back in August 2014 led my coworker Steve to find Smith’s lightning talk.

Reading and Learning: Dec. 30, 2014

For some background on what’s going on here, see the first tool sharpening post.

This is a longer post, simply because of how much time has passed since the previous entry, largely due to end-of-year holiday preparation and celebrations. I hope you enjoyed your holidays with family and friends as I did.

I’m also splitting out what I read, watched and listened to from technical tool sharpening. These are different concerns on different schedules, and just as I want to sustain and improve both sets of practices, I don’t want them unnecessarily coupled. So, here are the articles, podcasts and presentations, with the technical pieces to follow separately.

Articles read

Screencasts, podcasts and presentations
