Bash Script – Restarting OpenVPN Connections

March 16th, 2020

I use Sanoid and Syncoid to take snapshots of my CentOS-based ZFS storage system and copy them to replicas. Generally speaking, I use re-purposed Thecus N5550 storage units for the remote hardware. I have a long history with Thecus hardware, and while I’m not a fan of their software, I am a big fan of removing it and installing Linux. I usually take out the 2GB of memory it comes with and swap it for a pair of 4GB modules. The only downside, especially for systems I install in remote locations, is that I don’t have any out-of-band management.

I have the machines configured to boot up and connect to my OpenVPN server. This works great, but after a while the units drop offline. The Thecus hardware is good, but I’m asking it to do a lot more than it was built for, running more RAM than Thecus says is possible. When it goes offline, I just power it off and then back on... or really, I ask someone nearby to do that for me.

Recently though, I started to wonder whether this was a machine lock-up or whether OpenVPN was just getting confused. I let a system stay ‘offline’ for a week or so until I had a chance to get on site. I SSH’d into the machine locally, and lo and behold, the system was running great and OpenVPN had no idea it wasn’t connected. This is a problem I can solve…

I wanted to set up a script to check whether the system could ping an address across the VPN, and if it couldn’t, restart the OpenVPN client.

Ping, on Linux, has a feature, the ‘-w’ switch, where it will keep pinging an address for up to a certain amount of time. It completes immediately if a ping is successful, but keeps trying, without any output, until the wait time is over, and only then reports the failure. This gives me a clean output to put into a script. I didn’t want to restart the OpenVPN service for a single missed ping, since that happens all the time; I wanted to be sure it was down.



#!/usr/bin/env bash
target=    # Enter the IP address you want to ping across the VPN

count=$( ping -w 30 -c 1 $target | grep icmp* | wc -l )

if [ $count -eq 0 ]
then
        echo "Restarting VPN"
        systemctl restart (Service Name for your OpenVPN Client Config)
else
        echo "VPN is Running"
fi

(I’m sure I got the bones of this script online, but I can’t find it again to link to.)

For the script to work you enter the IP you want to ping on the line for the ‘target’ variable. The script will then try to ping that address. If it gets a response back, the ping count will not be zero and the script drops to the ‘else’ line, which just writes its output to the system log and closes. If, after the wait time, it hears nothing, which is the ‘0’ output, it runs the command to restart the OpenVPN client process. This service name will be different depending on what you called your config file.
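As an aside, ping’s exit status can express the same check without counting grep matches. This is just a variant sketch, not the script above; ‘check_vpn’ is my own hypothetical helper name:

```shell
#!/usr/bin/env bash
# Variant sketch: rely on ping's exit status instead of grepping its output.

check_vpn() {
    # $1 = target address, $2 = deadline in seconds (defaults to 30).
    # ping exits 0 if a reply arrives within the deadline, non-zero otherwise.
    ping -w "${2:-30}" -c 1 "$1" > /dev/null 2>&1
}

# Usage in the cron script (10.8.0.1 is a hypothetical VPN-side address):
# if check_vpn 10.8.0.1; then
#     echo "VPN is Running"
# else
#     echo "Restarting VPN"
#     systemctl restart (Service Name for your OpenVPN Client Config)
# fi
```

Either form works; the exit-status version just saves a pipeline.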

I then have this run, via crontab, every fifteen minutes:

*/15 * * * * (user with permissions to restart the service) (/location/of/script/)

I got all of the OpenVPN endpoints back on the network over the weekend and rolled out the automation. As luck would have it, my internet connection dropped a few hours later. One of the endpoints re-joined without any delay. The other did not. I waited for the next quarter hour, when the script would fire, and voila: it ran, detected the down connection, and rejoined. Success.

Maybe this will save you some time too.

Behringer X-Live – Splitting 32 Channel WAV Files and Deleting Silence

March 11th, 2020

At my church we use a Behringer X32 mixer to run Sunday services, and we added an X-Live card, instead of the included USB audio interface card, so we can record our services, multi-tracked, direct to an SD card. This has saved a lot of recording overhead, since we don’t need a PC, display, etc., but it’s also created some hassles.

The X-Live card works great, but it does lack some flexibility we’d like. For instance, you can record 8/16/32 channels off the board, but you can’t really pick which ones. For the mix of channels we need, I tell the X-Live to just capture all 32 channels. This works great, but it doesn’t record 32 individual files; it records all 32 channels into one multi-channel file. From an SD card bandwidth standpoint, it’s much easier to write one big file than it is to write 32 separate files. When the file hits the 4GB limit of the SD card’s filesystem, which at these settings is a bit over 12 minutes of audio, the X-Live closes that file and opens a new one seamlessly. There is no gap, pop, or pause between the two files.

When it comes time to work with the files, though, you have to do some cleanup before you can really get started. Your DAW, like Pro Tools, Reaper, Cubase, etc., will easily open the files the X-Live creates, but there is no clean way to import this one multi-track file as 32 channels in your DAW. It imports as one track, as though the file were a really massive surround sound mix.

For a while I was opening the files in (the excellent, free and open source audio editor) Audacity, removing the silent tracks by hand and then exporting out just the tracks I need. It’s tedious, and I would often get behind in the cleanup, which meant I was using up a lot of storage. For just the music from a two-service Sunday I was looking at around 30GB of data.

There had to be a better way…

The first step was to use a tool to break the 32-channel file into 32 separate files. I had tried to do this once before with ffmpeg. I knew it was possible, but I hadn’t been able to get it working the way I wanted. I invested 5 or 6 hours this past weekend trying again and finally had some more success.

I first started with the ffprobe command to give me the details of the file.

ffprobe (filename)

For my use, it was really only the very last line of its output I needed to see:

Stream #0:0: Audio: pcm_s32le ([1][0][0][0] / 0x0001), 44100 Hz, 32 channels, s32, 45158 kb/s

The important details are the name of the stream, #0:0; the audio codec, pcm_s32le; the sample rate, 44.1k (we run our X32 at the 44.1k sample rate, for easier playback from USB thumbdrives, but you may have yours set up for 48k); and finally, the number of channels, 32.
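If you want to pull those numbers out programmatically, say to sanity-check a file before splitting it, the usual text tools will do it. A sketch, assuming the stream line has already been captured from ffprobe:

```shell
#!/usr/bin/env bash
# Hypothetical: $info holds the ffprobe stream line shown above.
info='Stream #0:0: Audio: pcm_s32le ([1][0][0][0] / 0x0001), 44100 Hz, 32 channels, s32, 45158 kb/s'

# Pull out the sample rate and channel count.
rate=$(echo "$info" | sed 's/.*, \([0-9]*\) Hz.*/\1/')
channels=$(echo "$info" | sed 's/.*, \([0-9]*\) channels.*/\1/')

echo "$rate Hz, $channels channels"
```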

We now have the raw information we need to build our ffmpeg command. It’s quite long though, so I created the command as a bash script on CentOS 8.

You run the command by calling the script, which I called ‘Split’. It needs two arguments to run, which are noted in the script by $1 and $2. The first is the filename you want to split; the second is the filename prefix for the output. I usually have 2 or more files to split from the same recording, and I wanted to be able to split them into the same folder without risk of overwriting any of the files.

#!/usr/bin/env bash
# To run, use the script name, then the file name, and then the prefix for the saved files.
# i.e. 'Split 00000001.wav 01'
# This will process the file 00000001.wav and create 32 files, 1-32, prefixed by '01-'
# 01-01.wav
# 01-02.wav
# 01-03.wav
ffmpeg -i $1 -ar 44100 -acodec pcm_s32le -map_channel 0.0.0 $2-01.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.1 $2-02.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.2 $2-03.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.3 $2-04.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.4 $2-05.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.5 $2-06.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.6 $2-07.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.7 $2-08.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.8 $2-09.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.9 $2-10.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.10 $2-11.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.11 $2-12.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.12 $2-13.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.13 $2-14.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.14 $2-15.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.15 $2-16.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.16 $2-17.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.17 $2-18.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.18 $2-19.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.19 $2-20.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.20 $2-21.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.21 $2-22.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.22 $2-23.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.23 $2-24.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.24 $2-25.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.25 $2-26.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.26 $2-27.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.27 $2-28.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.28 $2-29.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.29 $2-30.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.30 $2-31.wav \
-ar 44100 -acodec pcm_s32le -map_channel 0.0.31 $2-32.wav

What I’m doing with the script is taking the X32’s wav file and splitting each ‘channel’ into its own file.

ffmpeg -i $1 is running the program ffmpeg, and then using as an input (-i) the first variable you entered when you ran the command. From there, we just map each channel to a new file, and we tell ffmpeg what format the new file should be in.

-ar 44100 tells ffmpeg to output 44.1k, just like the source file. You should change this to match your ffprobe output. If your X32 is set to 48k, swap this for 48000 instead.

-acodec pcm_s32le tells ffmpeg to output 32-bit files. I was surprised the X-Live recorded in 32 bits, but if you don’t specify anything, ffmpeg will default to 16 bits.

-map_channel 0.0.X tells ffmpeg which stream/track of the file to select.

And finally, $2-XX.wav tells ffmpeg to use the 2nd variable you entered when you ran the command as a prefix to the filename it creates. I also use this file naming process to map wav files to the channels on the board. We don’t have a channel zero, for instance; we have channel one. So I map stream 0 to wav file 1, and so on, for my sanity down the road.

The last detail, the \ at the end of each line, just tells Bash that the command isn’t done and to keep going.
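As an aside, those 32 near-identical lines can also be generated with a loop, which is easier to maintain if you ever change the sample rate or codec. A sketch, where ‘build_split_args’ is my own hypothetical helper name (it assumes the prefix contains no spaces):

```shell
#!/usr/bin/env bash
# Emit the same 32 sets of ffmpeg arguments, one channel per line.
build_split_args() {
    # $1 = output file prefix (the Split script's $2).
    local ch
    for ch in $(seq 0 31); do
        # Streams are numbered 0-31; output files are numbered 01-32,
        # matching the board-channel naming described above.
        printf -- '-ar 44100 -acodec pcm_s32le -map_channel 0.0.%d %s-%02d.wav\n' \
            "$ch" "$1" "$((ch + 1))"
    done
}

# The split command then becomes:
# ffmpeg -i "$1" $(build_split_args "$2")
```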

Once you run the script it will create all 32 files simultaneously. On the very low power server I’m running this on, it takes 20-30 minutes. I also tried running 32 separate ffmpeg commands, instead of 1 command saving out 32 streams, and it took about 50% longer. On a faster system, with fast storage, I’m sure you could cut way down on the time. (I just did a test of the same script on my 2019 Macbook Pro and it ran more than 10X faster. The machine I normally run this on is a repurposed Thecus N5550 running CentOS 8 on a ZFS RAIDZ2 of 4x 3TB hard disks.)

Now comes the part that you need to be careful with. The next script ‘listens’ to the files and DELETES the ones that are below a certain volume level.

I found another open source bit of software called SoX, an audio processing tool. It can insert effects, mix files together... basically anything except what the above ffmpeg script does.

If you run the software in ‘stat’ mode, though, you can get it to output some data about a file. I ran the command against a number of files I knew to be as silent as the X32 can make them, and against a number of files with some amount of known signal.

In my experience, none of the files are actually silent in a true digital sense. The X-Live is recording what it sees, and even a channel we’re not using has some self-noise to it before it hits the recorder. The trick is to set a threshold you’re comfortable with. Through some trial and error, I’ve set the script up to consider anything where the max volume on a channel is below 0.0001 as silent. In my testing, for the X32 we use, these are the volumes I see reported back for tracks I know to not be in use:


For tracks I know to have audio of various levels like room mics, vocalists, drums, piano:


There is a pretty big difference, thank goodness! Based on that data, looking for audio below 0.0001 felt like a safe limit, and that is the value I inserted in the script. By all means, look for and set your own safe value.

For the script itself, I was lucky enough to find this blog post by David Hilowitz. His goal is the same as mine, deleting silent WAV files, but his execution is a bit different. I used his script as my starting point. I changed variable names, and I just delete the data I don’t want where he, wisely, makes a list of the blank files and listens through them.

David – If you’re still out there: Thanks! You wrote your post many years ago, but it still works great. The only tweak needed to get it running, as some of the commenters mentioned, is the need for ‘//’ before the g in the sed command.

For the script below, which I call ‘Delete_Silent’ on my system, I have commented out the line to delete audio, and replaced it with a line to echo the file name to the terminal. I used this for testing, until I was happy, and now I have commented out the ‘echo’ line and use the rm line. The output of the script now just reports to me the files it HAS DELETED. Be careful!

The script will look at, and process, all of the .wav files in a folder. It runs each file through SoX to get a ‘Maximum Amplitude’ value, cleans up the output from SoX to just produce a number, and then passes that number through a greater-than comparison. If the volume returned is lower than the threshold you set in the MAX variable, it deletes the file and shows the name of the deleted file.
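To see what that cleanup stage does on its own, here is the pipeline run against a canned line of the kind ‘sox file.wav -n stat’ prints (the amplitude value is made up):

```shell
#!/usr/bin/env bash
# Hypothetical sox stat line; the real one comes from: sox "$wav" -n stat 2>&1
line='Maximum amplitude:     0.327632'

# Same cleanup as the script: keep the value, strip the label and the spaces.
V=$(echo "$line" | grep "Maximum amplitude" | cut -d ":" -f 2 | sed 's/ //g')
echo "$V"
```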

#!/usr/bin/env bash
# Check for silent, or near silent, WAV files.
# The process will run on all WAV files in the current folder.
MAX=0.0001

for wav in *.wav
do
V=$(sox "$wav" -n stat 2>&1 | grep "Maximum amplitude" | cut -d ":" -f 2 | sed 's/ //g')

if [[ $(echo "if ($V > $MAX) 1 else 0" | bc) -eq 0 ]]
then echo "I think $wav is silent";
# then rm -rfv "$wav";
fi
done

There you have it. You can now take those massive X-Live WAV files and pare them down to just the data you need. So much disk space saved!

The next step for me is to have a process look for new files in a folder and process and organize them automatically. But, that’s a post for another day…


MacOS Catalina and Beyond – Updating $PATH in ZSH (2020)

February 3rd, 2020

I’ve just moved from Fedora, on my primary work laptop, to MacOS. It’s a change I’m pretty happy about, not because I had any trouble with Fedora, but because the quality of my hardware has been upgraded in a big way. This transition has been pretty smooth overall, but I have needed to solve a few small details to make sure I’m not losing any features.

One of those has been adding the tool MTR back to my machine. This was pretty straightforward, as it’s a part of Brew, a sort of package manager for MacOS. The only issue was that once MTR was installed, its location wasn’t in the $PATH for the default shell used in MacOS Catalina, called Z Shell, or zsh.

I could have put a symlink in place to work around that, but instead I added its location, ‘/usr/local/sbin’, to my path.

This was as simple as creating a .zshrc file in my home directory and making sure the folder was in the list.

Step one is to pull your current $PATH information:

echo $PATH

This should respond back with the current list of folders in your default path. It looked like this on my machine:

/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
From the terminal, I then created a new file in my Home directory called ‘.zshrc’, and added in the data required. The leading period is very important. It marks the file as hidden from normal view, and zsh looks for a file with this exact name.

vi ~/.zshrc

Once in vi, enter insert mode by pressing the letter i on the keyboard. Then, we build the command we need. On the first line of the file, create the command to update your path:

export PATH={your current PATH information from the previous command}

And then append the new path you want to add, following the same format, which includes a colon between each path:

export PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin
Then, you can save and quit. To do this in vi, press the escape key to leave insert mode, and then enter ‘:wq’ (colon w q), which tells vi to save and quit. Then, to be sure we typed it all correctly, let’s look at the output with the cat command:

cat ~/.zshrc

You should see an output like this:

export PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin

With that complete, fully quit Terminal and then relaunch. You should see the updated information when you re-run our first command:

echo $PATH
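A small variant, if you’d rather not hard-code the whole list: .zshrc can append the new folder to whatever $PATH the system already set, and skip it if it’s already there. A sketch:

```shell
# In ~/.zshrc: append /usr/local/sbin only if it isn't already in PATH.
case ":$PATH:" in
    *:/usr/local/sbin:*) ;;                          # already present, do nothing
    *) export PATH="$PATH:/usr/local/sbin" ;;
esac
```

This way an OS update that changes the default path won’t be shadowed by a stale hard-coded list.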

There are lots of other items you can add to this file to customize your terminal, but that’s for another day…


Backup Hosted Email – OfflineIMAP with Fastmail

January 30th, 2020

A few years ago I finally let go and started using a hosted email provider. Running a secure email platform is something I’ve done for years, both professionally and personally, but I was tired of power outages at home taking my email offline, and it’s far less expensive to move to hosted email than to convert my home into a redundant datacenter.

After some research, I landed on Fastmail. Pricing is very reasonable and they allow me to use all of the domains I have in my one account. To date, and it’s been a couple of years now, I’ve never lost access to my email due to their back-end going down. And, they are always pushing the boundaries with security and new features to improve the service. I’ve been very happy.

One of my favorite things to do is set up a dedicated email alias whenever I sign up for something new online. Then, if that company sells my email address to spammers, etc., I can always tell who it was and delete the alias. I’m sure other providers can do that too, but it’s a wonderful thing.

Since I’m a sysadmin at heart though, I can’t just trust my email to them and not think about it again. I could re-point my domains pretty quickly if something were to happen to them as a company, but I also want to maintain my own totally offline backup of my IMAP mailbox.

Enter OfflineIMAP.

I run a script every day on one of my Linux servers that syncs down all of my email to a local machine, which I then further backup and protect.

This process begins with a BASH script:

#!/usr/bin/env bash
# First step is to make sure we're not running already. If so, wait 2 seconds and try again. Nearly pointless, but good to have, especially in testing.
while pkill --signal 0 offlineimap
do
        sleep 2
done
rm -f /home/topslakr/sent # This is Mutt's cache of sent emails, which I don't maintain long term.
offlineimap -c /home/topslakr/.offlineimaprc > /home/topslakr/Email_Sync.tmp 2>&1 & # Runs backup command and sends to background, writing to the .tmp file.
sleep 60 # Wait 1 minute for process to complete. It usually takes less than 10 seconds.
cat /home/topslakr/Email_Sync.tmp | mutt -s "Email Sync" {Your Email Address} # Emails the output of OfflineIMAP's run to me
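As an aside, the ‘are we already running?’ guard can also be handled with flock, which skips a run entirely if the previous one hasn’t finished instead of waiting. A variant sketch; the lock file path is my own choice:

```shell
#!/usr/bin/env bash
# Variant guard: take an exclusive lock; if a previous run still holds it, just skip.
(
    flock -n 9 || exit 0    # lock busy: a sync is already running
    # offlineimap -c /home/topslakr/.offlineimaprc
    echo "sync would run here"
) 9> /tmp/offlineimap.lock
```

Skipping, rather than waiting, suits a job that cron will fire again soon anyway.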

For that script to work, you’ll need to setup configuration files both for OfflineIMAP itself, and mutt, a text based email client on Linux.

Firstly, here is my config file for OfflineIMAP. This file, on my CentOS 7 system, is located in my home directory and called ‘.offlineimaprc’.

[general]
accounts =

[Account ]
localrepository = Local
remoterepository = Remote
status_backend = sqlite

[Repository Local]
type = Maildir
localfolders =

[Repository Remote]
type = IMAP
remotehost =
remoteport = 993
ssl = yes
cert_fingerprint = ddac83e619367e9e5f6f0142decba6872d7319f2
holdconnectionopen = yes
remoteuser = User Name
remotepass = Single Use Password

“[General]” defines the accounts OfflineIMAP is aware of. You can give an account any name you like, and you then define its properties in the following [Account ___] section. The settings I have listed for General and Account will likely work for you as well.

Repository Local is the spot on your system where email is stored. It will build the folder structure you have in your IMAP account and put the messages within it.

Repository Remote is the details of the remote IMAP server itself. We’ll dig into ‘cert_fingerprint’ below. Make sure you set your username and password. With Fastmail you will need to set an app password in your account for this; your normal Fastmail password will not work.

The final piece that I use is mutt to send me a status email. This isn’t required. I like to get notifications about routine jobs that run, so I can keep an eye on them. In this case, I take the status output of OfflineIMAP and email it to myself each day so I know it ran.

This is pretty simple. You just need to put a few details into your .muttrc file. My .muttrc file looks like this:

set ssl_starttls=yes
set ssl_force_tls=yes
set smtp_url = "smtp://[fastmail Login]:[Another App Password]"

Pretty simple. I use a different app password for each piece, but it’s probably possible to use just one.

So, the ‘cert_fingerprint’ line. When you set up OfflineIMAP you add to your config a unique identifier from the SSL cert installed on the remote IMAP server. OfflineIMAP will then only sync if that certificate remains in place. If the remote side gets compromised, or someone intercepts the traffic and tries to decrypt it, OfflineIMAP will not run. Also, when Fastmail updates their SSL certificates, OfflineIMAP will fail. It’s really easy to get that identifier though, using a tool called ‘gnutls-cli’.

Simply run the tool with the address of the IMAP server:

gnutls-cli -p 993 imap.fastmail.com

In the return from that command, look for the ‘SHA-1 fingerprint’ for the address you submitted. Oftentimes the return will give you data for the certificate on the server, and for the other certificates in the chain.

For this command, this is the relevant data, and it’s not unique to you. This is the current (01/30/2020) SHA-1 fingerprint for Fastmail’s IMAP server:

- subject `C=AU,ST=Victoria,L=Melbourne,O=FastMail Pty Ltd,CN=*', issuer `C=US,O=DigiCert Inc,CN=DigiCert SHA2 Secure Server CA', RSA key 2048 bits, signed using RSA-SHA256, activated `2020-01-22 00:00:00 UTC', expires `2021-02-24 12:00:00 UTC', SHA-1 fingerprint `ddac83e619367e9e5f6f0142decba6872d7319f2'
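If you ever want to pull that fingerprint out in a script, say to compare it against the one in your .offlineimaprc, a 40-character hex match does the job. A sketch run against the line above:

```shell
#!/usr/bin/env bash
# Hypothetical: $certline holds the tail of the gnutls-cli output shown above.
certline="SHA-1 fingerprint \`ddac83e619367e9e5f6f0142decba6872d7319f2'"

# A SHA-1 fingerprint is exactly 40 hex characters.
fp=$(printf '%s\n' "$certline" | grep -oE '[0-9a-f]{40}')
echo "$fp"
```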

Good luck with your backups!


Books Read: 2019

January 17th, 2020

As I did last year, below is a listing of all the books I read in the previous year, and my thoughts on them. For the first time, all of the books I read last year were in digital form.

I began the year finishing Charles Dickens’s ‘A Christmas Carol’. I read this as part of a collection of his Christmas stories.

In keeping with my love of murder mysteries, I read Introducing Agatha Raisin: The Quiche of Death/The Vicious Vet, the first of the Agatha Raisin books by M. C. Beaton. We’ve watched the TV series and they seemed like some light reading after the holidays. I enjoyed the book, but it wasn’t my favorite.

I then read a book I’ve been meaning to read for ages, The Hidden Life of Trees. This book was fascinating! I learned a huge amount, and as each chapter ended I was always thinking ‘Surely, that must be it…’ and then was surprised with another chapter about trees that I would never have imagined in my wildest dreams. Stunning book. Worth a read for sure.

The next book ties into a major theme in my life, which started early in 2019. Having seen the documentary Minimalism: A Documentary About the Important Things, I started to make some changes in my relationship to the stuff I own, and I read Everything That Remains: A Memoir by The Minimalists as a part of that. I’m not looking to become a monk-like minimalist, but I own a lot less stuff now than I did at the beginning of 2019 and I’m much happier for it. This book, like the documentary, strikes a very relaxed and non-judgemental tone to help you look at life in a different way. I liked it, and I’ll probably read it again.

I then moved toward some aviation books, beginning with The Electra Story: The Dramatic History of Aviation’s Most Controversial Airliner. This was a fun book about an aircraft that lived a very interesting life. Good book, if you’re aviation inclined.

From there I read The Long Way Home. This is a book about a flight in a Boeing Clipper ship aircraft in the days just after the Pearl Harbor attack. Pan Am built their name with planes like these, offering routes no other form of transport could offer. They were, at the time, the only real way to travel relatively quickly around the world.

They accomplished this by completing flights between islands, and sometimes landing in the middle of the ocean near a fuel boat, to bridge the long gaps. We are able to do this much more easily today with efficient jet engines, but these aircraft flew pretty slowly and spent long hours in the air to cover a distance we could cover in a fraction of the time. Since these planes could do what no others could, the pilots were given special instructions on where to report if war broke out. When they landed at one of their stops they received the report of war and opened their instructions to find out where their aircraft, which Pan Am had prearranged to lend to the military, was to report. The story follows their journey, trying to dodge enemy territory and pushing the plane and its crew to their very limits. It’s a harrowing tale, and all the more fascinating since it really happened.

This story, which involved a Pan Am aircraft, naturally led me to wonder about Pan Am themselves, and I read SKYGODS: The Fall of Pan Am, which takes you through not just the fall, but the whole story of the airline, and their unique leader.

On the recommendation of a friend, I then read Extreme Ownership: How U.S. Navy SEALs Lead and Win. While I appreciated the content in the book, I don’t resonate with the military point of view of the authors. Nothing against the military, or the authors’ careers; it’s just not something I have any direct experience with. The book is broken into three pieces per lesson: an office/work based situation, a military operation illustrating that same lesson, and then a more direct explanation, or expansion, on the lesson with some practical items to try. I got a lot out of it, and I would recommend it, but I can’t say I have a lot in common with a Navy SEAL.

I also read Be the Master: Achieve Success & Help Others around the same time. This book was also very good, and more directly relatable to my experience. It helps to formalize in your mind the process of learning a skill, mastering it, and then passing it on, not just learning something and keeping it for yourself. I do see a lot of people deal with the fear of losing their job because others can complete the same tasks they can. Among the many points made in the book, he says that by sharing your skills you enable yourself to be promoted and take on new things, instead of just stagnating where you are today.

After all that stuff, it seems inevitable that I drifted back toward murder mysteries. I read three of Agatha Christie’s Hercule Poirot stories, Murder on the Links, The Murder of Roger Ackroyd, and The Big Four. They were, of course, excellent.

I then read one more book on leadership, Managing Humans: Biting and Humorous Tales of a Software Engineering Manager, by Michael Lopp. I’ve followed him online for years so it was good to actually read his most well known book. I like his style, and it was good to finally read it.

I then went back to Agatha Christie, and I read all four of the ‘Tommy & Tuppence’ stories. While they were all good, they were all very different. Agatha wrote these stories throughout her career, so while the stories themselves were enjoyable, it was also interesting to see how her writing style developed and changed. The first story, her second published book, was written in 1922, and the final one was published 51 years later in 1973. A few stories were published after the last Tommy and Tuppence, but were written many years earlier.

I then read a memoir by Glyn Johns, Sound Man. He is a talented sound engineer who worked with a lot of big names, like The Who, the Rolling Stones, etc., and his memoir is a pretty frank look at his life as it relates to his craft.

I then read one more book about minimalism, Minimalism: Live a Meaningful Life, which was great. It’s a continuation of the book I had read earlier in the year, and just as good. Minimalism isn’t a set of hard and fast rules; it’s more of a thought process, and they do a great job making that approachable.

By now we’re moving into autumn, and with the shorter days it was time to move toward warm quilts and mystery stories.

First up was another book by Faith Martin. I had read a dozen of her murder mystery books last year, and read A Fatal Obsession, which is book 1 of a series following a different main character. I will no doubt dip further into this series in the future.

From there, I read the complete canon of Sherlock Holmes stories. All of the stories are available in the public domain, so I was able to grab them for my Kindle from Project Gutenberg, which is a great resource.

Having seen so many adaptations of Sherlock over the years, it was great to actually read them all. I cannot help but use Jeremy Brett, however, as the model of my mind’s Holmes character. He was incredible in that role, and I’m all the more impressed now having read the descriptions of Holmes in the books.

Reading through all of those stories took around two months, and then I read a couple of books based on recommendations.

A friend of mine asked me to read The Alchemist. I enjoyed the book, and it was a big departure from the kind of thing I normally read. It’s well written, especially considering the English text is translated from the author’s native Portuguese. It’s a story about finding your purpose, and staying focused on it through a lifetime. I can understand why it resonated so strongly with the person who recommended it to me.

I then saw a TV special, with comedian and former teacher Greg Davies, called Greg Davies: Looking for Kes. I like Davies’ shows generally, so I watched it without knowing what it was about. It’s a documentary following Davies as he looks at the roots of a book called A Kestrel for a Knave, which many young British students read in school. It’s a coming-of-age story of a boy in a coal mining town who doesn’t want to end up a miner himself. I enjoyed the story, and it was interesting to read it having already watched a show that delved into its background.

With that one complete, it was time to dig back into Dickens’ ‘A Christmas Carol’ once more. I read through that story, and then pressed on to the next of the 4 stories in the collection, ‘The Chimes’. This one was horrible and I couldn’t get through it… I gave it up a week into January and will certainly skip past it next Christmas season!

Keep reading!


Switching from Nikon to Olympus: Part 3 – Trading Up.

January 15th, 2020

As I’ve been writing about over these past posts, I’ve decided to make the move from Nikon to Olympus. I spent about a week with some rental equipment, and while I didn’t love all of the gear I rented, I did love the micro four thirds system.

I spent some time evaluating my Nikon gear to figure out how much money I would get from its sale. I used KEH for this process, since it’s a lot easier and a lot less work than trying to get it all listed for sale myself on eBay.

As I began this process I was initially considering selling off just my Nikon digital gear, and holding onto some of my well loved film cameras and lenses. But, as part of a broader push to reduce the amount of stuff I own, I chose to sell it all. I sold off every DSLR, film SLR, lens, flash, and accessory I had accumulated over more than a decade. It was bittersweet to pack up that box!

With that complete, and a rough idea of the amount of money I would get for the sale, I began the process of selecting the items I’d need to build an Olympus micro four thirds camera system.

Firstly, I chose the Olympus OM-D EM-1 Mark II as my camera body. In my testing with the EM-5 Mark II I learned that, for my shooting, I would need the more powerful AF system, and the grip built into that camera would also be most welcome. I didn’t rent the body first, but I knew I could send it back for a refund if I really didn’t like it.

For lenses, I started with the Olympus 17mm F/1.8 lens. I had rented the lens, and was very happy with it. It’s very small and light and performed great. The Olympus primes are a little more expensive than I was expecting, in relation to their Pro zoom counterparts, but I ordered this lens used and saved some money that way.

I also knew I wanted to order the Olympus 12-40mm F/2.8 lens. One of the things that drew me to micro four thirds was the much lower cost of extremely high-end lenses. In the Nikon system, to buy the Pro 24-70mm F/2.8 lens it would have cost more than $2000. This Olympus lens has a longer effective focal length, 24-80mm, the same fast aperture, less than half the weight and an $850 price tag. It’s the first time I’ve been able to consider owning top tier lenses for a camera, instead of using the slower variable aperture zoom lenses. This lens was part of my rental, and I loved it.

When I was making my purchase, in early 2019, this lens was available as a kit with the Olympus OM-D EM-1 Mark II, and the two together were less expensive than buying them used. That was a great surprise. I am very happy to buy used, but getting a good deal, and getting it new, is always nice.

From there, I needed to balance my options. In the Nikon system I had the 35mm focal length equivalent of 15mm through to 450mm covered across a variety of zoom and prime lenses. On the Olympus side, I didn’t want to spend a lot more money than my Nikon gear sale brought in, but I wanted to make sure I could still handle most of the shooting situations I find myself in. And, I wanted to do it with the higher tier of lens, wherever possible.

Olympus offers an incredible zoom lens in their Pro line, the Olympus M.Zuiko Digital ED 40-150mm F2.8 PRO Lens. This lens offers the equivalent focal range of 80-300mm with a fast F/2.8 aperture and is smaller than most of the zoom lenses I used in my Nikon days. That would get me to 300mm, but I do use the longer end of my zooms when shooting wildlife, and live music. Pairing this lens with the Olympus MC-14 1.4x Teleconverter gets me to 420mm equivalent, and at a still very good F/4.0. This was a bit more money than I was hoping to spend in my initial purchase, but a year into my Olympus system, I’ve not regretted it for a single second.

I also added a few accessories to the initial kit, including a remote shutter release cable and some Flashpoint flashes to allow for flash photography in larger spaces than the included flash would be able to really help with. This also gives me the ability to do off camera flash, which I used with my Nikon system. I do keep the small included flash in my day bag though. It’s so small, and doesn’t require its own batteries. It’s great for a bit of fill flash, and since you can rotate the head around, it can be used for bounce in smaller rooms.

All totaled, I ended up investing (…let’s be honest, the word is SPENDING) around $1000 above the sale of my Nikon gear, and I had a lot less actual stuff to show for it. But, I have the right stuff now, not just lots of stuff.

The question is, am I happy with the new system? The answer is, unequivocally YES. I haven’t had this much fun, this much enjoyment, from photography since my very first DSLR, the Nikon D40. I am so happy with the image quality. Sharpness is incredible and I’m just not seeing any issues with sensor noise. A properly exposed image, even at ISO6400 in the ‘noisy’ micro four thirds system, looks great and I couldn’t be more pleased.

Beyond just pure image quality, I’m also really impressed with the computational photography features Olympus puts into their cameras. Being able to shoot HDR images in camera is awesome. I’m not one for the ultra processed HDR style images, but having a RAW file with that extra data can be really helpful. Being able to use their live composite mode means you can shoot star images, with star trails, in camera so you can see what the final image will look like as it develops. No guesswork. No wondering if you captured what you needed. It’s great. I’ll dig more into this in some future posts, but I’m having a lot of fun trying out these features.

Having such a powerful and small camera system has been awesome this year and I’m just as excited about the system today as I was 12 months ago when I took the plunge. I find myself going out for a hike with the camera much more often, and I’m much happier with the images I’m taking.

Look for more posts this year about how my camera kit has grown, and how I’ve been using it.


Switching from Nikon to Olympus: Part 2 – Before I make the jump, I’ll test out the system

January 9th, 2020

With a decision made to seriously consider Olympus, as I wrote about previously, it was time to rent some gear and give it a whirl. I looked over my calendar for a period of time that would give me a chance to use the system in a variety of situations and settled on a 6 day period around Christmas. This allowed me to shoot some event photography, some family gatherings, as well as some shooting while hiking.

I placed my order with a local rental company, BorrowLenses. Their website is really slow to return search results, but they had a deal going, so I ordered an Olympus OM-D E-M5 Mark II and a lens kit that would replace the lenses I used with my Nikon gear. Bear in mind that the micro four thirds sensor is smaller than full frame, so you have to apply a 2x multiplier to the focal length to get the equivalent framing of a full frame lens. For instance, the standard 50mm lens on full frame is a 25mm lens in micro four thirds. You get the same light gathering at a given aperture, but deeper depth of field, with micro four thirds.
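The crop-factor arithmetic above can be sketched in a few lines of shell (the only assumption is the 2x factor for micro four thirds):

```shell
#!/bin/bash
# Micro four thirds has a 2x crop factor relative to full frame,
# so multiplying a lens's focal length by 2 gives the framing
# you'd get from that focal length on a full frame body.
crop=2

for mft in 12 17 25 40; do
  echo "${mft}mm MFT frames like $(( mft * crop ))mm on full frame"
done
```

Running it confirms the examples in this post: a 25mm lens frames like a 50mm, and the 12-40mm zoom covers a 24-80mm equivalent range.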

The Olympus 12-40mm F/2.8 Zoom – This is the Olympus equivalent of the standard 24-70mm F/2.8 Pro zoom most manufacturers make. This lens works out to be 24-80mm in full frame terms. It’s part of Olympus’ pro line of lenses and offers full weather sealing, dust protection, all-metal construction and a manual focus clutch system.

The Olympus 17mm F/1.8 Prime – My favorite way to shoot in the Nikon system, for normal non-photography day trips, family events, etc., was with the Nikon FM body and a 28mm F/2.8 lens. It’s wide enough to photograph people around a table but without too much distortion, and a fast enough lens to work in low light. There are micro four thirds equivalent lenses, but not at the purchase price I’m willing to consider. This 35mm full frame equivalent would, I hoped, make a good stand-in. It’s super compact, and plenty fast.

The Panasonic Leica 100-400mm f/4-6.3 – This lens is rated highly for optics and is the full frame equivalent of a 200-800mm lens: huge reach, and relatively compact size.

The equipment arrived the day before a dress rehearsal for a large choir performance I was a part of. When these events happen I usually use dress rehearsal to capture a few images of the full choir for posterity and for use in marketing the event. Normally, I struggle with my Nikon APS-C sensor and some slow zooms to try and capture the full choir. It’s a dark environment, and I didn’t own any fast wide lenses for my Nikons. I’ve never been that happy with the images. They are fine for posting on the web and printing for small materials, but they are noisy and tend to have a lot of distortion at the edges.

During the rehearsal I grabbed the E-M5, mounted the 12-40mm lens and fired a couple shots wide open to see what I could do. The camera was in auto ISO and it chose ISO6400 to get a reasonably fast shutter speed. On the camera’s LCD, the image looked on par with, or better than, what I would expect from my Nikons. That made sense: with my slower Nikon lenses I had to crank the ISO up to 12800, while here I was able to shoot at F/2.8, instead of F/5.6, and use a lower ISO. I’m sure my images in the Nikon system would have looked better with faster glass as well, so if I was going to make the switch I wanted to see some real improvement, not just the benefits of faster glass.

I kept the lens at 12mm, F/2.8 and dropped the ISO down to 200, which is the base ISO for the E-M5. I stabilized myself, pressed the camera to my eye and used my best shooting technique to see how much I could get out of the in body stabilization of the E-M5. For proper exposure I needed a shot at 1/6th of a second. With the camera in continuous shutter, I fired off 3 or 4 shots.

When I looked at the back of the camera, every single one was sharp on the LCD screen. That was a surprise for sure. I expected maybe one shot of reasonable sharpness that I could, again, use for small items or the web. Even when I looked back on the PC later, the shots looked great. And, due to the longer shutter speed, the director’s arms are quite obviously in motion. I really liked the shot. And it’s a shot that would not have been possible on my Nikon without a tripod.

Olympus OM-D E-M5 12-40mm @ 12mm F/2.8 1/6th

That next Saturday it was time to pack the kit in my hiking bag and put a few miles in. I hike every Saturday, doing 3-6 miles, and I usually carry around two lenses and my D500. I packed the 12-40mm, and the 100-400mm and set off. It was a cool day, but no snow yet. I hiked over to an open field a mile or so from the road and found that someone had hung bird feeders on many of the trees. With all the birds flying and perching, it was a perfect opportunity to try out the long zoom lens and the AF system on the camera. The E-M5 Mark II doesn’t have as robust a system as the Olympus OM-D E-M1 II, but I wanted to see how much I really needed that system.

The big zoom on the small camera was awkward, to say the least. The form factor of the Olympus OM-D E-M5 II is very similar to, if a bit smaller than, the Nikon FM I loved. In both cases though, I missed having a grip to wrap my fingers around. This isn’t a huge issue for casual shooting, but trying to stay stable while supporting the camera with a large lens mounted didn’t feel good, and very quickly made my hand sore. The E-M5 Mark II has several grip options available, but I didn’t rent any.

Beyond comfort, which can be dealt with, I really didn’t like the lens. It was reasonably sharp, but it was a hassle to use. The zoom ring was just too tough to rotate. I thought perhaps my rental had seen some miles and was getting worn out, but many of the online reviews for the lens said the same thing. The lens was too big and too hard to handle. I was willing to trade a very slow F/6.3 for an 800mm equivalent field of view, over something faster but not as long, but this lens was no fun to use at all.

Auto focus performance was on par with what I expected for fast moving birds. I had some shots in focus, but many shots were just barely out of focus; the AF system in the EM-5 Mark II is not designed to handle sports/action and wasn’t able to keep up with birds in flight. This answered, for me, the question of whether I needed to step up to the EM-1 Mark II instead, for its superior AF system.

Here are a couple of quick shots made while out walking with the Olympus OM-D EM-5 Mark II and, I think, the 12-40MM F/2.8 lens. I shot them in RAW, and then applied some of Olympus’ filters to them in post processing. You can, if you choose, apply these filters in camera, and they can be visible in the viewfinder as well while you shoot.

Olympus OM-D EM-5 Mark II – Pop Filter II

Olympus OM-D EM-5 Mark II – Diorama

Later that evening we headed over to the seacoast for some Christmas sightseeing and a family dinner. I grabbed my Nikon D500, with the 35mm F/1.8 lens (~50mm equivalent), and my wife packed the Olympus with the 17mm prime lens. I was immediately struck by the size, the HEFT, of the Nikon! We traded cameras back and forth a bit that evening and I was very happy with the abilities of the E-M5 in low light with AF, and the images looked pretty good as well. My only issue was down to my lack of familiarity with the controls of the E-M5, but in time I’m sure I’d adapt. I can operate my D500 by feel alone, and I’m just not there yet with the rental camera.

Then, for Christmas Eve church services I used all three lenses to see what I could do in such low light. Below is a worst case scenario shot with the EM-5 Mark II at ISO6400 and the 100-400MM lens at F/6.3. I chose a shutter speed of 1/80th of a second, which sadly wasn’t fast enough to stop motion in the image. You can see in the image that the subject is moving, but for online viewing, and printing at smaller sizes, I think this looks great. It’s a very usable image, and a lot of fine detail remains in the hair, etc. (Please forgive any color casts, I’m color blind and I can’t always make sense of the colors I’m seeing to make them look natural.)

Olympus OM-D EM-5 Mark II with Panasonic Leica 100-400 @f/6.3 1/80 second

The rental also came with the tiny shoe mount flash that comes with the E-M5 II. It mounts in the hot shoe, and has an extra connector at the base that allows it to use the camera’s battery for power. Super compact. I wouldn’t want to shoot a big event with this small flash, but as fill flash, it’s excellent. I won’t share any images here, but I made good use of it Christmas morning and it worked great. I was really pleased.

So, to sum it all up, micro four thirds worked great, and the only thing I feel like I’m giving up is size! I was really impressed with the system overall. Noise wasn’t nearly as big of an issue as I was expecting, and I was very happy with the quality of several of the lenses I rented. The long Panasonic Leica zoom lens was too large, too slow, and too stiff for me. And, if I’m honest, the EM-5 was too small for me. That brings me to two solid reasons to buy the Olympus OM-D EM-1 Mark II over the EM-5: the size of the body and the AF system.

I really liked the 12-40mm F/2.8 lens. I’ve never shot much with pro level fast glass, and having the Olympus lens be so sharp, small and light was a real eye opener for me. The 17mm F/1.8 was a great replacement for my ‘walking around’ lens and is definitely on my short list.

All that remains to do now is actually take the plunge and abandon Nikon, the only SLR or DSLR manufacturer I’ve ever used, and step into the unknown!

More on that next time…


Centos 7 – NextCloud-Client Installation Issues (September 2019)

September 16th, 2019

Quick post, which will hopefully be helpful to someone..

For the past couple of weeks I’ve been unable to update one of my Centos 7 servers. I chased it down this AM to the Nextcloud-Client software I use on that machine.

Long story short, there is a qt update in EPEL that isn’t compatible with Centos 7.6.1810. Upstream (RedHat) has already released RHEL 7.7, but Centos has not yet caught up. EPEL is tracking upstream, and this qt update will work once Centos 7.7 is released.

In the meantime, I stumbled across this page: Nextcloud-client Currently Not Installable From EPEL In CentOS7

If you already have the Nextcloud-client installed, simply exclude the qt package from updates by adding ‘exclude=qt5-qtwebkit’ as a new line in your /etc/yum.conf file. If you already have an exclude line, just put a comma after the last package you’re excluding and add qt5-qtwebkit.
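As a sketch, both cases can be scripted. To keep this safe to try anywhere, the example below works on a demo copy at /tmp/yum.conf.demo rather than the real /etc/yum.conf (swap in the real path, as root, once you’re happy with it):

```shell
#!/bin/bash
# Demo copy instead of the real /etc/yum.conf.
conf=/tmp/yum.conf.demo
printf '[main]\ngpgcheck=1\n' > "$conf"

if grep -q '^exclude=' "$conf"; then
  # An exclude line already exists: append the package to it.
  sed -i 's/^exclude=.*/&,qt5-qtwebkit/' "$conf"
else
  # No exclude line yet: add one.
  echo 'exclude=qt5-qtwebkit' >> "$conf"
fi

grep '^exclude=' "$conf"
```

Either branch leaves yum ignoring the qt5-qtwebkit update until Centos 7.7 lands.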

If you don’t have the client already installed, you will have to grab the ‘archive’ package linked in the above article.

Now that I have the updated exclusion in my yum.conf, running updates completes without issue.

Just remember to remove the exclusion when Centos 7.7 is released!


Switching from Nikon to Olympus: Part 1 – Isn’t Micro Four Thirds a Toy format?

February 8th, 2019

Though I’ve not been a very regular blogger, I have done a decent job of documenting my various photography gear related changes here since my first DSLR, a Nikon D40, back in 2006. From that first DSLR, through my foray into 35mm film cameras, medium format, and the various other DSLRs I moved to along the way, those milestones have been written about here. And, with the exception of the medium format gear, all of that equipment has always been from Nikon.

Perhaps to my detriment, I’ve always been 100% loyal to Nikon. SLRs, DSLRs, lenses and accessories, I always bought Nikon gear. To a certain extent this stems from the idea that it makes sense to shoot the same brand of camera as your group of friends, which served me well; it was great to be able to share around lenses/gear as needed.

On the flip side though, Nikon has never made APS-C lenses a priority, and as they fall further behind with, to their mind, their more important full frame equipment, their DX lenses have really suffered. When Nikon released their first DSLRs, like the D1 and D2 series of cameras, they did make some inroads with professional quality DX lenses. Since those initial releases though, they have not updated or refreshed those lenses, and major features like VR and newer glass coatings are lacking. As I was rebuilding my gear a couple years ago around the Nikon D500, their high-end sports and wildlife camera, I had zero on brand options for higher end glass. Since the reason I bought the D500 was to minimize weight, without sacrificing camera features, buying their full frame lenses wasn’t an option. Their weight, size, and price are just ridiculous. Have you seen some of the glass they announced for their mirror-less Z Series cameras? Sure, the camera is a bit smaller, but the lenses are bigger!

Enough is enough. I want a smaller & lighter setup, access to stunning lenses, and a company committed to a system with the innovation to show for it. The hunt was on!

I did a quick pass of all the camera brands I could think of. Fuji, Canon, Sony, Pentax, Panasonic, Olympus, etc., and did a quick scan of their current range of gear to see what I could find. Fuji has a developing business in APS-C sensor cameras, but lenses were still very large, and expensive. Canon has a LOT of product lines. They do offer a smaller line of mirror-less cameras, but they have not made it a priority to release high quality glass. After some deeper looking, it felt to me like Nikon’s approach to APS-C all over again: a second-class citizen. Sony is a very popular brand these days, but I’m not willing to purchase, and lug around, huge full frame lenses. No viable options worth the hassle of changing systems.

At this point, I had more or less decided that where I was with the gear I already owned was where I had to be. There were no better options. The only other option I had was micro four thirds, and no one serious shoots that toy format.

All of this was happening toward the end of 2018, and as family events approached, and evenings spent walking around some local cities, I found myself deciding not to take my camera with me. Too much weight. Too much hassle. The resulting pictures were almost never worth the trouble of carrying around even just the D500 with a 35mm prime, let alone my full kit. I could have shot more, and better, images on my iPhone.

Ok. It’s time for drastic change. My gear is officially holding me back.

After years of being in the camp of people who looked down on micro four thirds, the attractiveness of a smaller and lighter kit of gear was enough of a draw now for me to give the system a little more consideration. The knee jerk reaction for most people, myself included, was that the tiny sensor was useless at higher ISOs and not worth considering. I did some searching around on Flickr for higher ISO images and nothing I saw was alarming, so I kept digging. While my Nikon D500 could shoot at ISO 1.6 million, even by ISO 6400 the images were pretty noisy. The images I found on Flickr for the OM-D EM-1 Mark II certainly didn’t look any worse.

Next was to look into the lens options, and how they compared to Nikon’s lenses. For weight reasons, my current Nikon lens lineup included some primes, but was mostly amateur zoom lenses. Image quality was fine, but AF speed is slower than with pro lenses, and they aren’t built to the same standard as pro gear or weather sealed. After some web searches I came across a site with a pretty complete list of available micro four thirds lenses.

I started looking at pricing for the same quality of lens as I had with Nikon and was a bit confused by the prices. They were very, very inexpensive. I did a quick look around for lens reviews and found results in line with the comparable lenses I had on Nikon. Not perfect lenses, they definitely had their faults, but workable.

Ok, so how about the higher end Olympus Pro lenses then?

This is where things got interesting. For relatively short money I could seriously upgrade my quality of lens without having to sacrifice by taking on a lot of weight. At this point, my thinking was that I could protect myself a bit from the noise of the camera sensor by shooting with wider apertures. Trading my F/3.5-5.6 zooms for constant aperture F/2.8 zooms would make a big impact on keeping my ISOs lower when shooting.

Ok, so I’m working my way toward starting to maybe consider micro four thirds as my next camera system.

The next hurdle, for me, is AF. My Nikon D500 was very sure-footed in terms of auto focus. My lenses weren’t the fastest at focusing, but it never hunted around and it locked on and held focus without issue. Always. I won’t get down into the weeds too much here, but DSLRs use a dedicated system for finding focus, called phase detection, and mirror-less cameras, generally speaking, use the image sensor to find focus with a system called contrast detection. DSLRs that have live view and video modes use this contrast detection system for focus in those modes. Both systems have their positive aspects, and negative aspects. (You can click through here to a site that breaks this down in detail.)

For tracking focus of moving objects though, phase detection is important because, using that system, the camera knows whether it needs to focus closer or further away, and by how much. Then it very quickly snaps the lens to that focus distance. With contrast detection the camera only knows it’s not in focus and has to hunt in and out to figure out how to get in focus. If a person is running toward, or away from, you, phase detect focus is what you need. It’s an area of very active development for the mirror-less vendors, but only Olympus offers it in micro four thirds, using some special, extra, sensors embedded into their image sensor. So, that narrows down my camera choices. The other micro four thirds vendor, Panasonic, doesn’t have this feature and has announced they are not pursuing it.

Ok, so I’ve identified a camera manufacturer I need to look more deeply into. And, since Olympus and Panasonic work in tandem on the micro four thirds format, there are two major manufacturers of lenses, as well as a myriad of other third party makers.

The next step in this journey is to get my hands on some of this gear and see for myself what it can, or can’t do. The only way to know for sure if this kind of change is going to work is to rent the gear, and use it like I normally use my gear.

We’ll dig into that in Part 2 of this series, and thankfully, that’ll even include some images instead of just a massive wall of boring text!


Books Read: 2018

December 31st, 2018

Using my (now aged) Kindle, I do a fair bit of reading. For some reason, whenever I finish a book I put it into a folder on my Kindle named for the current year. These folders exist only on the Kindle itself, so I thought I might start to keep track of them here on the blog.

At the end of 2017 I was reading a lot of memoirs of people who moved into the wilderness, both in recent years and in centuries past. That continued into 2018 and the first book I read was:

Winds of Skilak, by Bonnie Ward
This was an excellent book, written by Bonnie, about her and her husband’s journey leaving Ohio and moving to an isolated island on Skilak Lake in Alaska. Her section on driving their jeep across the melting lake was a real nail biter!

Next, I read a book I received as a gift, Chernobyl 01:23:40: The Incredible True Story of the World’s Worst Nuclear Disaster. I’ve always been fascinated by the story of the nuclear disaster but I had never read too deeply into it. I really enjoyed this book, which feels more honest and unbiased than much of the reading I had done to date. It was a fast read and did a nice job explaining what happened, both on a human and scientific level, without losing me in the finer points of nuclear power generation.

After that it was back to a few more ‘moving to the wilderness’ memoirs.

First was Our Life Off the Grid: An Urban Couple Goes Feral. This was a great, very pragmatic, story about a Canadian couple that left the city to live on an island in coastal western Canada. This couple chose a harder life than most, but they had far more neighbors than many of the stories I had read. It was very well written and far less stoic than some of the others I read that were written by men.

Then I read Arctic Homestead: The True Story of One Family’s Survival and Courage in the Alaskan Wilds. This one was enjoyable, though it contained a couple of sections that I found completely unbelievable. This was written by the wife and mother of the family and much of it read like she was trying to keep life happening as usual while her husband spent his time making rash or short sighted decisions that had consequences for the family down the road. Without reading both sides of the story it’s hard to be sure just where the truth lies…

Next I read a book about the Appalachian Trail called A Walk in the Woods: Rediscovering America on the Appalachian Trail. I enjoyed the book, and the diary-like approach of the writing. It was a journey in multiple parts and it was great to hear about the situations and people. Many of the themes he touched on resurfaced in other books I read on the topic.

I then wanted to do a bit more reading about other trails here in my home state, New Hampshire. I decided to read Not Without Peril, Tenth Anniversary Edition: 150 Years of Misadventure on the Presidential Range of New Hampshire. To be frank, I did not enjoy this one. For someone who isn’t a hiker, or needs a graphic warning about the dangers of being unprepared and uninformed, it would be an important read. But, for me, it simply read like an unending string of people who made bad choices reaching the end of their life on the side of a mountain, or coming very close.

I decided to then read another story of someone walking the Appalachian Trail, called Becoming Odyssa: Adventures on the Appalachian Trail. This was excellent. It’s written similarly to the previous book on the trail, but from the point of view of a young woman who chose to hike it more or less on her own. It was great to read about her journey and perseverance.

Next, it was time for a change of pace and I read Skunk Works: A Personal Memoir of My Years at Lockheed, an excellent book about the history of Lockheed Martin’s Skunk Works division by someone who was there during its heyday. Since I was very young I have loved the aircraft they designed, especially the SR-71, and I really enjoyed hearing about those years at the company.

Next up was quite a short book about life atop Mount Washington, written by a scientist living in the observatory for a year, called Among the Clouds: Work, Wit & Wild Weather at the Mount Washington Observatory. I read this very quickly and wished it had been much, much longer!

Having finished that book, touching on weather, I then read a book about a hurricane that struck Texas in 1900 called Isaac’s Storm: A Man, a Time, and the Deadliest Hurricane in History. This book was truly fascinating. It goes into meteorology at the turn of the century, and how that data was used (or not…). Beyond the actual story of the storm and those involved, it was incredible to see how people were interacting with each other. Husbands, in suicidal hubris, telling their wives to stop worrying about nothing and get back to baking, when in fact they should have fled the area. Impactful for sure.

After the previous book, I needed something a bit lighter and I dug into Agatha Christie’s The Mysterious Affair at Styles. Apparently, this was her first novel and introduced to the world Hercule Poirot. I’ve long loved reading murder mysteries and this was the first of many.

First though, I read a book given to me as a gift: To the Edges of the Earth: 1909, the Race for the Three Poles, and the Climax of the Age of Exploration. This was a real slog. It was given to me in hardcover, which is okay, though I do prefer to read on the Kindle. The book is actually pretty good, but the writing style differed greatly from my preference. I am comically bad at recalling which character is which in books (and TV shows, movies…) and this book is written interweaving the stories of three expeditions racing for the three poles. Had it been written in three parts, I’d have loved it, but it kept switching from group to group and I could never tell who was who. Very interesting material, and I’m glad I read it… but it was slow going for me. I was reading a book every 10 days or so, but this one took me months!

After that, it was surely time to enjoy reading again! We were going on a bit of a break and I wanted to fill up my Kindle with a few light stories to enjoy while away. I did a search for Agatha Christie’s books and snagged Dead Man’s Mirror: A Hercule Poirot Story, The Affair at the Bungalow: A Miss Marple Story, Problem at Sea: A Hercule Poirot Story, and The Witness for the Prosecution. I basically sorted by best reviews, filtered for $0.99 books and bought the top 6. I enjoyed them all, but there was an interloper.

The final book of the bunch, and no doubt an advertisement Amazon slipped past me, was a book by Faith Martin called MURDER ON THE OXFORD CANAL a gripping crime mystery full of twists. I really enjoyed this book, which was a bit meatier than the Christie stories, and I then went nuts and read 11 more in the series. They all follow a DS in Oxford, England as she solves murders. They are light reading and a bit predictable in form, but I enjoyed them. The actual murder investigation is always interesting and there is a good bit of procedural detail in them. Additionally, she weaves in extra story lines with the books’ primary characters to create a larger story arc that takes place across three or four books. Each book has one such story line coming to a close, while others percolate away. It was these extra details that kept me reaching for the next book. I read the following:

1. A Narrow Escape (2004)
aka Murder on the Oxford Canal
2. On the Straight and Narrow (2005)
aka Murder at the University
3. Narrow Is the Way (2006)
aka Murder of the Bride
4. By a Narrow Majority (2006)
aka Murder in the Village
5. Through a Narrow Door (2007)
aka Murder in the Family
6. With a Narrow Blade (2007)
aka Murder at Home
7. Beside a Narrow Stream (2008)
aka Murder in the Meadow
8. Down a Narrow Path (2008)
aka Murder in a Mansion
9. Across the Narrow Blue Line (2009)
aka Murder in the Garden
10. A Narrow Point of View (2010)
aka Murder by Fire
11. A Narrow Exit (2011)
aka Murder at Work
12. A Narrow Return (2012)
aka Murder Never Retires

There are a few more yet in the series, which I may get to next year, but I’ve put them to one side for the moment and have started reading some more varied murder mysteries.

I read a pretty good mix of fiction and non-fiction this past year, which was a surprise to me. Until the second half of 2017 I’m not sure I had ever sat down and read a non-fiction book for pleasure before. We shall see what 2019 holds. If I have the time, I’ll try to post monthly about my reading through the year. That being said, if I end up bingeing on one author’s collected murder mysteries, perhaps a broader digest is best…
