macOS Catalina and Beyond – Updating $PATH in zsh (2020)

February 3rd, 2020

I’ve just moved my primary work laptop from Fedora to macOS. It’s a change I’m pretty happy about, not because I had any trouble with Fedora, but because the quality of my hardware has been upgraded in a big way. This transition has been pretty smooth overall, but I have needed to sort out a few small details to make sure I’m not losing any features.

One of those has been adding the tool MTR back to my machine. This was pretty straightforward, as it’s available via Homebrew, a sort of package manager for macOS. The only issue was that once MTR was installed, its location wasn’t in my $PATH for the default shell used in macOS Catalina, called Z Shell, or zsh.

I could have put a symlink in place to work around that, but instead I added its location, ‘/usr/local/sbin’, to my path.

This was as simple as creating a .zshrc file in my home directory and making sure the folder was in the list.

Step one is to pull your current $PATH information:

echo $PATH

This should respond with the current list of folders in your default path. It looked like this on my machine:

/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin
From the terminal, I then created a new file in my home directory called ‘.zshrc’, and added in the data required. The leading period is very important: it hides the file from normal view, and zsh looks for the file under exactly that name, so it won’t be read without the dot.

vi ~/.zshrc

Once in vi, enter insert mode by pressing the letter i on the keyboard. Then, we build the command we need. On the first line of the file, create the command to update your path:

export PATH={your current PATH information from the previous command}

Then append the new path you want to add, following the same format, with a colon between each entry. In my case, that meant adding ‘:/usr/local/sbin’ to the end of the line.
Then, you can save and quit. To do this in vi, press the escape key to leave insert mode, and then enter ‘:wq’ (colon w q), which tells vi to save and quit. Then, to be sure we typed it all correctly, let’s look at the output with the cat command:

cat ~/.zshrc

You should see an output like this:

export PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/sbin

With that complete, fully quit Terminal and then relaunch it. You should see the updated information when you re-run our first command:

echo $PATH

There are lots of other items you can add to this file to customize your terminal, but that’s for another day…
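As an aside, a .zshrc doesn’t have to restate the whole list; you can also append to the value zsh already inherited. A minimal sketch of that alternative:

```shell
# ~/.zshrc — append /usr/local/sbin to whatever PATH already contains,
# instead of hard-coding the full list.
export PATH="$PATH:/usr/local/sbin"
```

Either style works; hard-coding the full list makes the result predictable, while appending picks up any future changes to the system default.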


Backup Hosted Email – OfflineIMAP with Fastmail

January 30th, 2020

A few years ago I finally let go and started using a hosted email provider. Running a secure email platform is something I’ve done for years, both professionally and personally, but I was tired of power outages at home taking my email offline, and it was far less expensive to move to hosted email than to convert my home into a redundant datacenter.

After some research, I landed on Fastmail. Pricing is very reasonable and they allow me to use all of the domains I have in my one account. To date, and it’s been a couple of years now, I’ve never lost access to my email due to their back-end going down. And, they are always pushing the boundaries with security and new features to improve the service. I’ve been very happy.

One of my favorite things to do is set up a dedicated email alias whenever I sign up for something new online. Then, if that company sells my email address to spammers, etc., I can always tell who it was and delete the alias. I’m sure other providers can do that too, but it’s a wonderful thing.

Since I’m a sysadmin at heart though, I can’t just trust my email to them and not think about it again. I could re-point my domains pretty quickly if something were to happen to them as a company, but I also want to maintain my own totally offline backup of my IMAP mailbox.

Enter OfflineIMAP.

I run a script every day on one of my Linux servers that syncs down all of my email to a local machine, which I then further backup and protect.

This process begins with a BASH script:

#!/usr/bin/env bash
# First step is to make sure we're not running already. If so, wait 2 seconds and try again.
# Nearly pointless, but good to have, especially in testing.
while pkill --signal 0 offlineimap; do
    sleep 2
done
rm -f /home/topslakr/sent # This is Mutt's cache of sent emails, which I don't maintain long term.
offlineimap -c /home/topslakr/.offlineimaprc > /home/topslakr/Email_Sync.tmp 2>&1 & # Runs the backup in the background, writing to the .tmp file.
sleep 60 # Wait 1 minute for the process to complete. It usually takes less than 10 seconds.
mutt -s "Email Sync" {Your Email Address} < /home/topslakr/Email_Sync.tmp # Emails the output of OfflineIMAP's run to me.
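I kick the script off with cron. An entry like this runs it daily at 6 AM — the script path here is illustrative, so point it at wherever you saved yours:

```
# Run the email backup script every morning at 6:00.
0 6 * * * /home/topslakr/email_sync.sh
```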

For that script to work, you’ll need to set up configuration files both for OfflineIMAP itself and for mutt, a text-based email client on Linux.

Firstly, here is my config file for OfflineIMAP. This file, on my CentOS 7 system, is located in my home directory (/home/topslakr/) and called ‘.offlineimaprc’.

[general]
accounts =

[Account ]
localrepository = Local
remoterepository = Remote
status_backend = sqlite

[Repository Local]
type = Maildir
localfolders =

[Repository Remote]
type = IMAP
remotehost =
remoteport = 993
ssl = yes
cert_fingerprint = ddac83e619367e9e5f6f0142decba6872d7319f2
holdconnectionopen = yes
remoteuser = User Name
remotepass = Single Use Password
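To make the template concrete, here’s the same file filled in with illustrative values — the account name and local folder path are arbitrary choices of mine, and imap.fastmail.com is Fastmail’s documented IMAP host:

```ini
[general]
accounts = Fastmail

[Account Fastmail]
localrepository = Local
remoterepository = Remote
status_backend = sqlite

[Repository Local]
type = Maildir
localfolders = ~/Mail/Fastmail

[Repository Remote]
type = IMAP
remotehost = imap.fastmail.com
remoteport = 993
ssl = yes
cert_fingerprint = ddac83e619367e9e5f6f0142decba6872d7319f2
holdconnectionopen = yes
remoteuser = you@fastmail.com
remotepass = your-app-password
```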

The [general] section defines the accounts OfflineIMAP is aware of. You can give an account any name you like, and you then define its properties in the matching [Account ___] section. The settings I have listed for general and Account will likely work for you as well.

Repository Local is the spot on your system where email is stored. It will build the folder structure you have in your IMAP account and put the messages within it.

Repository Remote holds the details about the remote IMAP server itself. We’ll dig into ‘cert_fingerprint’ below. Make sure you set your username and password. With Fastmail, you will need to create an app password in your account for this; your normal Fastmail password will not work.

The final piece that I use is mutt to send me a status email. This isn’t required. I like to get notifications about routine jobs that run, so I can keep an eye on them. In this case, I take the status output of OfflineIMAP and email it to myself each day so I know it ran.

This is pretty simple. You just need to put a few details into your .muttrc file. My .muttrc file looks like this:

set ssl_starttls=yes
set ssl_force_tls=yes
set smtp_url = "smtp://[fastmail Login]:[Another App Password]@smtp.fastmail.com:587"

I use a different app password for each piece, but it’s probably possible to use just one.

So, the ‘cert_fingerprint’ line. When you set up OfflineIMAP, you add to your config a unique identifier from the SSL certificate installed on the remote IMAP server. OfflineIMAP will then only sync if that certificate remains in place. If the remote side gets compromised, or someone intercepts the traffic and tries to decrypt it, OfflineIMAP will not run. Also, when Fastmail updates their SSL certificates, OfflineIMAP will fail. It’s really easy to get that identifier, though, using a tool called ‘gnutls-cli’.

Simply run the tool against the IMAP server’s address and port (for Fastmail, that’s imap.fastmail.com on port 993):

gnutls-cli --port 993 imap.fastmail.com
In the output of that command, look for the ‘SHA-1 fingerprint’ for the host you queried. Often the output will give you data for both the certificate on the server and the other certificates in the chain.

For this command, this is the relevant data, and it’s not unique to you. This is the current (01/30/2020) SHA-1 fingerprint for Fastmail’s IMAP server:

- subject `C=AU,ST=Victoria,L=Melbourne,O=FastMail Pty Ltd,CN=*', issuer `C=US,O=DigiCert Inc,CN=DigiCert SHA2 Secure Server CA', RSA key 2048 bits, signed using RSA-SHA256, activated `2020-01-22 00:00:00 UTC', expires `2021-02-24 12:00:00 UTC', SHA-1 fingerprint `ddac83e619367e9e5f6f0142decba6872d7319f2'

Good luck with your backups!


Books Read: 2019

January 17th, 2020

As I did last year, below is a listing of all the books I read in the previous year, and my thoughts on them. For the first time, all of the books I read last year were in digital form.

I began the year finishing Charles Dickens’s ‘A Christmas Carol’. I read this as part of a collection of his Christmas stories.

In keeping with my love of murder mysteries, I read Introducing Agatha Raisin: The Quiche of Death/The Vicious Vet, the first of the Agatha Raisin books by M. C. Beaton. We’ve watched the TV series and they seemed like some light reading after the holidays. I enjoyed the book, but it wasn’t my favorite.

I then read a book I’ve been meaning to read for ages, The Hidden Life of Trees. This book was fascinating! I learned a huge amount, and as each chapter ended I was always thinking ‘Surely, that must be it…’ and then was surprised with another chapter about trees that I would never have imagined in my wildest dreams. Stunning book. Worth a read for sure.

The next book ties into a major theme in my life, which started early in 2019. Having seen the documentary Minimalism: A Documentary About the Important Things, I started to make some changes in my relationship to the stuff I own, and I read Everything That Remains: A Memoir by The Minimalists as a part of that. I’m not looking to become a monk-like minimalist, but I own a lot less stuff now than I did at the beginning of 2019 and I’m much happier for it. This book, like the documentary, strikes a very relaxed and non-judgemental tone to help you look at life a different way. I liked it, and I’ll probably read it again.

I then moved toward some aviation books, beginning with The Electra Story: The Dramatic History of Aviation’s Most Controversial Airliner. This was a fun book about an aircraft that lived a very interesting life. Good book, if you’re aviation inclined.

From there I read The Long Way Home. This is a book about a flight in a Boeing Clipper flying boat in the days just after the Pearl Harbor attack. Pan Am built their name with planes like these, offering routes no other form of transport could. They were, at the time, the only real way to travel relatively quickly around the world.

They accomplished this by completing flights between islands, and sometimes landing in the middle of the ocean near a fuel boat, to bridge the long gaps. We are able to do this much more easily today with efficient jet engines, but these aircraft flew pretty slowly and spent long hours in the air to cover a distance we could now cover in a fraction of the time. Since these planes could do what no others could, the pilots were given special instructions on where to report if war broke out. When they landed at one of their stops they received the report of war and opened their instructions to find out where their aircraft, which Pan Am had prearranged to lend to the military, was to report. The story follows their journey, trying to dodge enemy territory and pushing the plane and its crew to their very limits. It’s a harrowing tale, and all the more fascinating since it really happened.

This story, which involved a Pan Am aircraft, naturally led me to wonder about Pan Am themselves, and I read SKYGODS: The Fall of Pan Am, which takes you through not just the fall, but the whole story of the airline, and their unique leader.

On the recommendation of a friend, I then read Extreme Ownership: How U.S. Navy SEALs Lead and Win. While I appreciated the content in the book, I don’t resonate with the military point of view of the authors. Nothing against the military, or the authors’ careers; it’s just not something I have any direct experience with. The book is broken into three pieces per lesson: an office/work based situation, a military operation illustrating that same lesson, and then a more direct explanation, or expansion, on the lesson with some practical items to try. I got a lot out of it, and I would recommend it, but I can’t say I have a lot in common with a Navy SEAL.

I also read Be the Master: Achieve Success & Help Others around the same time. This book was also very good, and more directly relatable to my experience. It helps to formalize in your mind the process of learning a skill, mastering it, and then passing it on, not just learning something and keeping it for yourself. I do see a lot of people deal with the fear of losing their job because others can complete the same tasks they can. Among the many points made in the book, he says that by sharing your skills you enable yourself to be promoted and take on new things, instead of just stagnating where you are today.

After all that stuff, it seems inevitable that I drifted back toward murder mysteries. I read three of Agatha Christie’s Hercule Poirot stories, Murder on the Links, The Murder of Roger Ackroyd, and The Big Four. They were, of course, excellent.

I then read one more book on leadership, Managing Humans: Biting and Humorous Tales of a Software Engineering Manager, by Michael Lopp. I’ve followed him online for years so it was good to actually read his most well known book. I like his style, and it was good to finally read it.

I then went back to Agatha Christie, and I read all four of the ‘Tommy & Tuppence’ stories. While they were all good, they were all very different. Agatha wrote these stories throughout her career, so while the stories themselves were enjoyable, it was also interesting to see how her writing style developed and changed. The first story, her second published book, was written in 1922, and the final one was published 51 years later in 1973. A few stories were published after the last Tommy and Tuppence, but were written many years earlier.

I then read a memoir by Glyn Johns, Sound Man. He is a talented sound engineer who worked with a lot of big names, like The Who, the Rolling Stones, etc., and his memoir is a pretty frank look at his life as it relates to his craft.

I then read one more book about minimalism, Minimalism: Live a Meaningful Life, which was great. It’s a continuation of the book I had read earlier in the year, and just as good. Minimalism isn’t a set of hard and fast rules; it’s more of a thought process, and they do a great job making that approachable.

By now we’re moving into autumn, and with the shorter days it was time to move toward warm quilts and mystery stories.

First up was another book by Faith Martin. I had read a dozen of her murder mystery books last year, and read A Fatal Obsession, which is book 1 of a series following a different main character. I will no doubt dip further into this series in the future.

From there, I read the complete canon of Sherlock Holmes stories. All of the stories are available in the public domain, so I was able to grab them for my Kindle from Project Gutenberg, which is a great resource.

Having seen so many adaptations of Sherlock over the years, it was great to actually read them all. I cannot help but use Jeremy Brett, however, as the model of my mind’s Holmes character. He was incredible in that role, and I’m all the more impressed now having read the descriptions of Holmes in the books.

Reading through all of those stories took around two months, and then I read a couple of books based on recommendations.

A friend of mine asked me to read The Alchemist. I enjoyed the book, and it was a big departure from the kind of thing I normally read. It’s well written, especially considering the English text is translated from the author’s native Portuguese. It’s a story about finding your purpose, and staying focused on it through a lifetime. I can understand why it resonated so strongly with the person who recommended it to me.

I then saw a TV special, with comedian and former teacher Greg Davies, called Greg Davies: Looking for Kes. I like Davies’ shows generally, so I watched it without knowing what it was about. It’s a documentary following Davies as he looks at the roots of a book called A Kestrel for a Knave, which many young British students read in school. It’s a coming-of-age story of a boy in a coal mining town who doesn’t want to end up a miner himself. I enjoyed the story, and it was interesting to read it having already watched a show that delved into its background.

With that one complete, it was time to dig back into Dickens’ ‘A Christmas Carol’ once more. I read through that story, and then pressed on to the next of the 4 stories in the collection, ‘The Chimes’. This one was horrible and I couldn’t get through it… I gave it up a week into January and will certainly skip past it next Christmas season!

Keep reading!


Switching from Nikon to Olympus: Part 3 – Trading Up.

January 15th, 2020

As I’ve been writing about over these past posts, I’ve decided to make the move from Nikon to Olympus. I spent about a week with some rental equipment, and while I didn’t love all of the gear I rented, I did love the micro four thirds system.

I spent some time evaluating my Nikon gear, to figure out how much money I would get for its sale. I used KEH for this process, since it’s a lot easier and a lot less work than trying to get it all listed for sale myself on eBay.

As I began this process I was initially considering selling off just my Nikon digital gear, and holding onto some of my well loved film cameras and lenses. But, as part of a broader push to reduce the amount of stuff I own, I chose to sell it all. I sold off every DSLR, film SLR, lens, flash, and accessory I had accumulated over more than a decade. It was bittersweet to pack up that box!

With that complete, and a rough idea of the amount of money I would get for the sale, I began the process of selecting the items I’d need to build an Olympus micro four thirds camera system.

Firstly, I chose the Olympus OM-D EM-1 Mark II as my camera body. In my testing with the EM-5 Mark II I learned that, for my shooting, I would need the more powerful AF system, and the grip built into that camera would also be most welcome. I didn’t rent the body first, but I knew I could send it back for a refund if I really didn’t like it.

For lenses, I started with the Olympus 17mm F/1.8 lens. I had rented the lens, and was very happy with it. It’s very small and light and performed great. The Olympus primes are a little more expensive than I was expecting, in relation to their Pro zoom counterparts, but I ordered this lens used and saved some money that way.

I also knew I wanted to order the Olympus 12-40mm F/2.8 lens. One of the things that drew me to micro four thirds was the much lower cost of extremely high-end lenses. In the Nikon system, to buy the Pro 24-70mm F/2.8 lens it would have cost more than $2000. This Olympus lens has a longer effective focal length, 24-80mm, the same fast aperture, less than half the weight and an $850 price tag. It’s the first time I’ve been able to consider owning top tier lenses for a camera, instead of using the slower variable aperture zoom lenses. This lens was part of my rental, and I loved it.

When I was making my purchase, in early 2019, this lens was available as a kit with the Olympus OM-D EM-1 Mark II, and the two together were less expensive than buying them used. That was a great surprise. I am very happy to buy used, but getting a good deal, and getting it new, is always nice.

From there, I needed to balance my options. In the Nikon system I had the 35mm focal length equivalent of 15mm through to 450mm covered across a variety of zoom and prime lenses. On the Olympus side, I didn’t want to spend a lot more money than my Nikon gear sale brought in, but I wanted to make sure I could still handle most of the shooting situations I find myself in. And, I wanted to do it with the higher tier of lens, wherever possible.

Olympus offers an incredible zoom lens in their Pro line, the Olympus M.Zuiko Digital ED 40-150mm F2.8 PRO Lens. This lens offers the equivalent focal range of 80-300mm with a fast F/2.8 aperture and is smaller than most of the zoom lenses I used in my Nikon days. That would get me to 300mm, but I do use the longer end of my zooms when shooting wildlife, and live music. Pairing this lens with the Olympus MC-14 1.4x Teleconverter gets me to 420mm equivalent, and at a still very good F/4.0. This was a bit more money than I was hoping to spend in my initial purchase, but a year into my Olympus system, I’ve not regretted it for a single second.

I also added a few accessories to the initial kit, including a remote shutter release cable and some Flashpoint flashes to allow for flash photography in larger spaces than the included flash could really help with. This also gives me the ability to do off camera flash, which I used with my Nikon system. I do keep the small included flash in my day bag though. It’s so small, and doesn’t require its own batteries. It’s great for a bit of fill flash, and since you can rotate the head around, it can be used for bounce in smaller rooms.

All totaled, I ended up investing (…let’s be honest, the word is SPENDING) around $1000 above the sale of my Nikon gear, and I had a lot less actual stuff to show for it. But I have the right stuff now, not just lots of stuff.

The question is, am I happy with the new system? The answer is, unequivocally YES. I haven’t had this much fun, this much enjoyment, from photography since my very first DSLR, the Nikon D40. I am so happy with the image quality. Sharpness is incredible and I’m just not seeing any issues with sensor noise. A properly exposed image, even at ISO6400 in the ‘noisy’ micro four thirds system, looks great and I couldn’t be more pleased.

Beyond just pure image quality, I’m also really impressed with the computational photography features Olympus puts into their cameras. Being able to shoot HDR images in camera is awesome. I’m not one for the ultra processed HDR style images, but having a RAW file with that extra data can be really helpful. Being able to use their live composite mode means you can shoot star images, with star trails, in camera so you can see what the final image will look like as it develops. No guesswork. No wondering if you captured what you needed. It’s great. I’ll dig more into this in some future posts, but I’m having a lot of fun trying out these features.

Having such a powerful and small camera system has been awesome this year, and I’m just as excited about the system today as I was 12 months ago when I took the plunge. I find myself going out for a hike with the camera much more often, and I’m much happier with the images I’m taking.

Look for more posts this year about how my camera kit has grown, and how I’ve been using it.


Switching from Nikon to Olympus: Part 2 – Before I make the jump, I’ll test out the system

January 9th, 2020

With a decision made to seriously consider Olympus, as I wrote about previously, it was time to rent some gear and give it a whirl. I looked over my calendar for a period of time that would give me a chance to use the system in a variety of situations and settled on a 6 day period around Christmas. This allowed me to shoot some event photography, some family gatherings, as well as some shooting while hiking.

I placed my order with a local rental company, BorrowLenses. Their website is really slow to return search results, but they had a deal going, so I ordered an Olympus OM-D E-M5, and a lens kit that would replace the lenses I used with my Nikon gear. Bear in mind that the micro four thirds sensor is smaller than full frame, so you multiply the focal length by 2 to get the equivalent framing of a full frame lens. For instance, the standard 50mm lens for full frame is a 25mm lens in micro four thirds. You get the same exposure at a given aperture, but deeper depth of field, with micro four thirds.
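That 2x conversion is simple doubling; as a quick sanity check, here’s a throwaway shell function (the function name is just mine, for illustration):

```shell
# Full-frame equivalent focal length for a micro four thirds lens (2x crop factor).
ff_equiv() { echo "$(( $1 * 2 ))mm"; }

ff_equiv 25   # the 'standard 50mm' example: prints 50mm
ff_equiv 12   # wide end of the 12-40mm zoom: prints 24mm
ff_equiv 400  # long end of the 100-400mm: prints 800mm
```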

The Olympus 12-40mm F/2.8 Zoom – This is the Olympus equivalent to the standard 24-70mm F/2.8 Pro zoom most manufacturers make. This lens works out to be 24-80mm in full frame terms. It’s part of Olympus’ Pro line of lenses and offers full weather sealing, dust protection, all metal construction, and a manual focus clutch system.

The Olympus 17mm F/1.8 Prime – My favorite way to shoot in the Nikon system, for normal non-photography day trips, family events, etc., was with the Nikon FM body and a 28mm F/2.8 lens. It’s wide enough to photograph people around a table without too much distortion, and a fast enough lens to work in low light. There are micro four thirds equivalent lenses, but not at a purchase price I’m willing to consider. This 35mm full frame equivalent would, I hope, make a good stand-in. It’s super compact, and plenty fast.

The Panasonic Leica 100-400mm f/4-6.3 – This lens is rated highly for optics and is the full frame equivalent of a 200-800mm lens: huge reach, and relatively compact size.

The equipment arrived the day before a dress rehearsal for a large choir performance I was a part of. When these events happen I usually use dress rehearsal to capture a few images of the full choir for posterity and for use in marketing the event. Normally, I struggle with my Nikon APS-C sensor and some slow zooms to try and capture the full choir. It’s a dark environment, and I didn’t own any fast wide lenses for my Nikons. I’ve never been that happy with the images. They are fine for posting on the web and printing for small materials, but they are noisy and tend to have a lot of distortion at the edges.

During the rehearsal I grabbed the E-M5, mounted the 12-40mm lens and fired a couple shots wide open to see what I could do. The camera was in auto ISO and it chose ISO6400 to get a reasonably fast shutter speed. The image looked fine. On par or better, from the onboard LCD of the camera, than what I would expect from my Nikons. Since I shot slower lenses on my Nikon I had to crank the ISO up to 12800, so the image looking better on the Olympus made sense. I was able to shoot at F/2.8, instead of F/5.6, and use a lower ISO. I’m sure my images in the Nikon system would have looked better as well with faster glass. I wanted to see some real improvement if I was going to make the switch, not just the benefits of faster glass.

I kept the lens at 12mm, F/2.8 and dropped the ISO down to 200, which is the base ISO for the E-M5. I stabilized myself, pressed the camera to my eye and used my best shooting technique to see how much I could get out of the in body stabilization of the E-M5. For proper exposure I needed a shot at 1/6th of a second. With the camera in continuous shutter, I fired off 3 or 4 shots.

When I looked at the back of the camera, every single one was sharp on the LCD screen. That was a surprise for sure. I expected maybe one shot of reasonable sharpness that I could, again, use for small items or the web. Even when I looked back on the PC later, the shots looked great. And, due to the longer shutter speed, the director’s arms are quite obviously in motion. I really liked the shot. And it’s a shot that would not have been possible on my Nikon without a tripod.

Olympus OM-D E-M5 12-40mm @ 12mm F/2.8 1/6th

That next Saturday it was time to pack the kit in my hiking bag and put a few miles in. I hike every Saturday, doing 3-6 miles, and I usually carry around two lenses and my D500. I packed the 12-40mm, and the 100-400mm and set off. It was a cool day, but no snow yet. I hiked over to an open field a mile or so from the road and found that someone had hung bird feeders on many of the trees. With all the birds flying and perching, it was a perfect opportunity to try out the long zoom lens and the AF system on the camera. The E-M5 Mark II doesn’t have as robust a system as the Olympus OM-D E-M1 II, but I wanted to see how much I really needed that system.

The big zoom on the small camera was awkward, to say the least. The form factor of the Olympus OM-D E-M5 II is very similar to, if even a bit smaller than, the Nikon FM I loved. In both cases, though, I missed having a grip to wrap my fingers around. This isn’t a huge issue for casual shooting, but trying to stay stable while supporting the camera with a large lens mounted didn’t feel good, and very quickly made my hand sore. The E-M5 Mark II has several grip options available, but I didn’t rent any.

Beyond comfort, which can be dealt with, I really didn’t like the lens. It was reasonably sharp, but it was a hassle to use. The zoom ring was just too tough to rotate. I thought perhaps my rental had seen some miles and was getting worn out, but many of the online reviews for the lens said the same thing. The lens was too big and too hard to handle. I was willing to make the trade off of an 800mm equivalent field of view at a very slow F/6.3, over something faster but not as long, but this lens was no fun to use at all.

Auto focus performance was on par with what I expected for fast moving birds. I had some shots in focus, but many shots were just barely out of focus; the AF system in the EM-5 Mark II is not designed to handle sports/action and wasn’t able to keep up with birds in flight. This answered, for me, the question about needing to step up to the EM-1 Mark II instead, for its superior AF system.

Here are a couple of quick shots made while out walking with the Olympus OM-D EM-5 Mark II, and I think the 12-40mm F/2.8 lens. I shot them in RAW, and then applied some of Olympus’ filters to them in post processing. You can, if you choose, apply these filters in camera, and they can be visible in the viewfinder as well while you shoot.

Olympus OM-D EM-5 Mark II – Pop Filter II

Olympus OM-D EM-5 Mark II – Diorama

Later that evening we headed over to the seacoast for some Christmas sight seeing and a family dinner. I grabbed my Nikon D500, with the 35mm F/1.8 lens (~50mm equivalent), and my wife packed the Olympus with the 17mm prime lens. I was immediately struck by the size, the HEFT, of the Nikon! We traded cameras back and forth a bit that evening and I was very happy with the abilities of the E-M5 in low light with AF, and the images looked pretty good as well. My only issue was down to my lack of familiarity with the controls of the E-M5, but in time I’m sure I’d adapt. I can operate my D500 by feel alone, and I’m just not there yet with the rental camera.

Then, for Christmas Eve church services, I used all three lenses to see what I could do in such low light. Below is a worst case scenario shot with the EM-5 Mark II at ISO6400 and the 100-400mm lens at F/6.3. I chose a shutter speed of 1/80th of a second, which sadly wasn’t fast enough to stop motion in the image. You can see in the image that the subject is moving, but for online viewing, and printing at smaller sizes, I think this looks great. Very usable image, and a lot of fine detail remains, in the hair, etc. (Please forgive any color casts; I’m color blind and I can’t always make sense of the colors I’m seeing to make them look natural.)

Olympus OM-D EM-5 Mark II with Panasonic Leica 100-400 @f/6.3 1/80 second

The rental also came with the tiny shoe mount flash that comes with the E-M5 II. It mounts in the hot shoe, and has an extra connector at the base that allows it to use the camera’s battery for power. Super compact. I wouldn’t want to shoot a big event with this small flash, but as fill flash, it’s excellent. I won’t share any images here, but I made good use of it Christmas morning and it worked great. I was really pleased.

So, to sum it all up: micro four thirds worked great, and the only thing I feel like I’m giving up is size! I was really impressed with the system overall. Noise wasn’t nearly as big of an issue as I was expecting, and I was very happy with the quality of several of the lenses I rented. The long Panasonic Leica zoom lens was too large, too slow, and too stiff for me. And, if I’m honest, the EM-5 was too small for me. That brings me to two solid reasons to buy the Olympus OM-D EM-1 Mark II over the EM-5: the size of the body and the AF system.

I really liked the 12-40mm F/2.8 lens. I’ve never shot much with pro level fast glass, and having the Olympus lens be so sharp, small, and light was a real eye opener for me. The 17mm F/1.8 was a great replacement for my ‘walking around’ lens and is definitely on my short list.

All that remains to do now is actually take the plunge and abandon Nikon, the only SLR or DSLR manufacturer I’ve ever used, and step into the unknown!

More on that next time…


Centos 7 – NextCloud-Client Installation Issues (September 2019)

September 16th, 2019

Quick post, which will hopefully be helpful to someone.

For the past couple of weeks I’ve been unable to update one of my Centos 7 servers. I chased it down this AM to the Nextcloud-Client software I use on that machine.

Long story short, there is a qt update in EPEL that isn’t compatible with Centos 7.6.1810. Upstream (RedHat) has already released RHEL 7.7, but Centos has not yet caught up. EPEL is tracking upstream, and this qt update will work once Centos 7.7 is released.

In the meantime, I stumbled across this page: Nextcloud-client Currently Not Installable From EPEL In CentOS7

If you already have the Nextcloud-client installed, simply exclude the qt package from updates by adding 'exclude=qt5-qtwebkit' as a new line to your /etc/yum.conf file. If you already have an exclusion line, just put a comma after the last package you're excluding, and add qt5-qtwebkit.
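For reference, the relevant section of /etc/yum.conf might look something like this afterwards (a minimal sketch; the other lines shown here are common defaults, and yours will differ):

```ini
[main]
gpgcheck=1
installonly_limit=5
# Temporary: hold back the qt5-qtwebkit build that breaks on Centos 7.6
exclude=qt5-qtwebkit
```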

If you don’t have the client already installed, you will have to grab the ‘archive’ package linked in the above article.

Now that I have the updated exclusion in my yum.conf, running updates completes without issue.

Just remember to remove the exclusion when Centos 7.7 is released!


Switching from Nikon to Olympus: Part 1 – Isn’t Micro Four Thirds a Toy format?

February 8th, 2019

Though I’ve not been a very regular blogger, I have done a decent job of documenting my various photography gear related changes here since my first DSLR, a Nikon D40, back in 2006. From that first DSLR, through my foray into 35mm film cameras, medium format, and the various other DSLRs I moved to along the way, those milestones have been written about here. And, with the exception of the medium format gear, all of that equipment has always been from Nikon.

Perhaps to my detriment, I've always been 100% loyal to Nikon. SLRs, DSLRs, lenses, accessories: I always bought Nikon gear. To a certain extent this stems from the idea that it makes sense to shoot the same brand of camera as your group of friends, which served me well; it was great to be able to share lenses and gear around as needed.

On the flip side though, Nikon has never made APS-C lenses a priority, and as they pour their attention into, to their mind, their more important full frame equipment, their DX lenses have really suffered. When Nikon released their first DSLRs, like the D1 and D2 series of cameras, they did make some inroads with professional quality DX lenses. Since those initial releases though, they have not updated or refreshed those lenses, and major features like VR and newer glass coatings are missing. As I was rebuilding my gear a couple years ago around the Nikon D500, their high-end sports and wildlife camera, I had zero on-brand options for higher end glass. Since the reason I bought the D500 was to minimize weight without sacrificing camera features, buying their full frame lenses wasn't an option. Their weight, size, and price are just ridiculous. Have you seen some of the glass they announced for their mirror-less Z Series cameras? Sure, the camera is a bit smaller, but the lenses are bigger!

Enough is enough. I want a smaller & lighter setup, access to stunning lenses, and a company committed to a system with the innovation to show for it. The hunt was on!

I did a quick pass of all the camera brands I could think of: Fuji, Canon, Sony, Pentax, Panasonic, Olympus, etc., and did a quick scan of their current range of gear to see what I could find. Fuji has a growing business in APS-C sensor cameras, but their lenses were still very large and expensive. Canon has a LOT of product lines. They do offer a smaller line of mirror-less cameras, but they have not made it a priority to release high quality glass for them. After some deeper looking, it felt to me like Nikon's approach to APS-C: a second-class citizen. Sony is a very popular brand these days, but I'm not willing to purchase, and lug around, huge full frame lenses. No viable options worth the hassle of changing systems.

At this point, I had more or less decided that where I was with the gear I already owned was where I had to be. There were no better options. The only other option was micro four thirds, and no one serious shoots that toy format.

All of this was happening toward the end of 2018, and as family events and evenings spent walking around some local cities approached, I found myself deciding not to take my D500 with me. Too much weight. Too much hassle. The resulting pictures were almost never worth the trouble of carrying around even just the D500 with a 35mm prime, let alone my full kit. I could have shot more, and better, images on my iPhone.

Ok. It’s time for drastic change. My gear is officially holding me back.

After years of being in the camp of people who looked down on micro four thirds, the attractiveness of a smaller and lighter kit was enough of a draw for me to give the system a little more consideration. The knee-jerk reaction for most people, myself included, was that the tiny sensor was useless at higher ISOs and not worth considering. I did some searching around on Flickr for higher ISO images and nothing I saw was alarming, so I kept digging. While my Nikon D500 could shoot at ISO 1.6 million, even by ISO 6400 the images were pretty noisy. The images I found on Flickr from the OM-D EM-1 Mark II certainly didn't look any worse.

Next was to look into the lens options, and how they compared to Nikon's lenses. For weight reasons, my current Nikon lineup included some primes, but was mostly amateur zoom lenses. Image quality was fine, but AF speed is slower than with pro lenses, and they aren't built to the same standard as pro gear or weather sealed. After some web searches I came across a site with a pretty complete list of available micro four thirds lenses.

I started looking at pricing for the same quality of lens as I had with Nikon and was a bit confused by the prices. They were very, very inexpensive. I did a quick look around for lens reviews and found results in line with the comparable lenses I had on Nikon. Not perfect lenses, and they definitely had their faults, but workable.

Ok, so how about the higher end Olympus Pro lenses then?

This is where things got interesting. For, relatively, short money I could seriously upgrade my quality of lens without having to sacrifice by taking on a lot of weight. At this point, my thinking is that I can protect myself a bit from the noise of the camera sensor by shooting with wider apertures. Trading my F/3.5-5.6 zooms for constant aperture F/2.8 zooms would make a big impact on keeping my ISOs lower when shooting.

Ok, so I’m working my way toward starting to maybe consider micro four thirds as my next camera system.

The next hurdle, for me, is AF. My Nikon D500 was very sure-footed in terms of auto focus. My lenses weren’t the fastest at focusing, but it never hunted around and it locked on and held focus without issue. Always. I won’t get down into the weeds too much here, but DSLRs use a dedicated system for finding focus, called phase detection, and mirror-less cameras, generally speaking, use the image sensor to find focus with a system called contrast detection. DSLRs that have live view and video modes use this contrast detection system for focus in those modes. Both systems have their positive aspects, and negative aspects. (You can click through here to a site that breaks this down in detail.)

For tracking focus of moving objects though, phase detection is important because, using that system, the camera knows if it needs to focus closer or further away, and by how much. Then it very quickly snaps the lens to that focus distance. With contrast detection the camera only knows it's not in focus and has to hunt in and out to figure out how to get in focus. If a person is running toward, or away from, you, phase detect focus is what you need. It's an area of very active development for the mirror-less vendors, and in micro four thirds only Olympus offers it, using some special, extra sensors embedded into their image sensor. So, that narrows down my camera choices. The other micro four thirds vendor, Panasonic, doesn't have this feature and has announced they are not pursuing it.

Ok, so I’ve identified a camera manufacturer I need to look more deeply into. And, since Olympus and Panasonic work in tandem on the micro four thirds format, there are two major manufacturers of lenses, as well as a myriad of other third party makers.

The next step in this journey is to get my hands on some of this gear and see for myself what it can, or can’t do. The only way to know for sure if this kind of change is going to work is to rent the gear, and use it like I normally use my gear.

We’ll dig into that in Part 2 of this series, and thankfully, that’ll even include some images instead of just a massive wall of boring text!


Books Read: 2018

December 31st, 2018

Using my (now aged) Kindle, I do a fair bit of reading. For some reason, whenever I finish a book I put it into a folder on my Kindle named for the current year. These folders exist only on the Kindle itself, so I thought I might start to keep track of them here on the blog.

At the end of 2017 I was reading a lot of memoirs of people who moved into the wilderness, both in recent years and in centuries past. That continued into 2018 and the first book I read was:

Winds of Skilak, by Bonnie Ward
This was an excellent book, written by Bonnie, about her and her husband’s journey leaving Ohio and moving to an isolated island on Skilak Lake in Alaska. Her section on driving their jeep across the melting lake was a real nail biter!

Next, I read a book I received as a gift, Chernobyl 01:23:40: The Incredible True Story of the World’s Worst Nuclear Disaster. I’ve always been fascinated by the story of the nuclear disaster, but I had never read too deeply into it. I really enjoyed this book, which feels more honest and unbiased than much of the reading I had done to date. It was a fast read and did a nice job explaining what happened, both on a human and scientific level, without losing me in the finer points of nuclear power generation.

After that it was back to a few more ‘moving to the wilderness’ memoirs.

First was Our Life Off the Grid: An Urban Couple Goes Feral. This was a great, very pragmatic, story about a Canadian couple that left the city to live on an island in coastal western Canada. This couple chose a harder life than most, but they had far more neighbors than many of the people in the stories I had read. It was very well written and far less stoic than some of the others I read that were written by men.

Then I read Arctic Homestead: The True Story of One Family’s Survival and Courage in the Alaskan Wilds. This one was enjoyable, though it contained a couple of sections that I found completely unbelievable. This was written by the wife and mother of the family, and much of it reads like she was trying to keep life happening as usual while her husband spent his time making rash or short-sighted decisions that had consequences for the family down the road. Without reading both sides of the story it’s hard to be sure just where the truth lies…

Next I read a book about the Appalachian trail called A Walk in the Woods: Rediscovering America on the Appalachian Trail. I enjoyed the book, and the more diary-like approach to his writing. It was a journey in multiple parts and it was great to hear about the situations and people. Many of the themes he touched on resurfaced in other books I read on the topic.

I then wanted to do a bit more reading about other trails here in my home state, New Hampshire. I decided to read Not Without Peril, Tenth Anniversary Edition: 150 Years of Misadventure on the Presidential Range of New Hampshire. To be frank, I did not enjoy this one. For someone who isn’t a hiker, or needs a graphic warning about the dangers of being unprepared and uninformed, it would be an important read. But, for me, it simply read like an unending string of people who made bad choices reaching the end of their life on the side of a mountain, or coming very close.

I decided to then read another story of someone walking the Appalachian trail, called Becoming Odyssa: Adventures on the Appalachian Trail. This was excellent. It’s written similarly to the previous book on the trail, but from the point of view of a young woman who chose to hike it more or less on her own. It was great to read about her journey and perseverance.

Next, it was time for a change of pace and I read Skunk Works: A Personal Memoir of My Years at Lockheed, which was an excellent book about the history of Lockheed Martin’s Skunk Works division by someone who was there during its heyday. Since I was very young I have loved the aircraft they designed, especially the SR-71, and I really enjoyed hearing about those years at the company.

Next up was quite a short book about life atop Mount Washington, written by a scientist living in the observatory for a year, called Among the Clouds: Work, Wit & Wild Weather at the Mount Washington Observatory. I read this very quickly and wished it had been much, much longer!

Having finished that book, touching on weather, I then read a book about a hurricane that struck Texas in 1900 called Isaac’s Storm: A Man, a Time, and the Deadliest Hurricane in History. This book was truly fascinating. It goes into meteorology at the turn of the century, and how that data was used (or not…). Beyond the actual story of the storm and those involved, it was incredible to see how people were interacting with each other. Husbands, in suicidal hubris, telling their wives to stop worrying about nothing and get back to baking, when in fact they should have fled the area. Impactful for sure.

After the previous book, I needed something a bit lighter and I dug into Agatha Christie’s The Mysterious Affair at Styles. Apparently, this was her first novel and introduced to the world Hercule Poirot. I’ve long loved reading murder mysteries and this was the first of many.

First though, I read a book given to me as a gift: To the Edges of the Earth: 1909, the Race for the Three Poles, and the Climax of the Age of Exploration. This was a real slog. It was given to me in hardcover, which is OK, though I do prefer to read on the Kindle. The book is actually pretty good, but the writing style differed greatly from my preference. I am comically bad at recalling which character is which in books (and TV shows, movies…) and this book is written interweaving three stories about three groups trying to reach the three poles. Had it been written in three parts, I’d have loved it, but it kept switching from group to group and I could never tell who was who. Very interesting material, and I’m glad I read it… but it was slow going for me. I was reading a book every 10 days or so, but this one took me months!

After that, it was surely time to enjoy reading again! We were going on a bit of a break and I wanted to fill up my Kindle with a few light stories to enjoy while away. I did a search for Agatha Christie’s books and snagged Dead Man’s Mirror: A Hercule Poirot Story, The Affair at the Bungalow: A Miss Marple Story, Problem at Sea: A Hercule Poirot Story, and The Witness for the Prosecution. I basically sorted by best reviews, filtered for $0.99 books and bought the top 6. I enjoyed them all, but there was an interloper.

The final book of the bunch, and no doubt an advertisement Amazon slipped past me, was a book by Faith Martin called MURDER ON THE OXFORD CANAL a gripping crime mystery full of twists. I really enjoyed this book, which was a bit meatier than the Christies, and I then went nuts and read 11 more in the series. They all follow a DS in Oxford, England as she solves murders. They are light reading and a bit predictable in form, but I enjoyed them. The actual murder investigation is always interesting and there is a good bit of procedural detail in them. Additionally, she weaves in extra story lines with the books’ primary characters to create a larger story arc that takes place across three or four books. Each book has one such story line coming to a close, while others are percolating away. It was these extra details that kept me reaching for the next book. I read the following:

1. A Narrow Escape (2004)
aka Murder on the Oxford Canal
2. On the Straight and Narrow (2005)
aka Murder at the University
3. Narrow Is the Way (2006)
aka Murder of the Bride
4. By a Narrow Majority (2006)
aka Murder in the Village
5. Through a Narrow Door (2007)
aka Murder in the Family
6. With a Narrow Blade (2007)
aka Murder at Home
7. Beside a Narrow Stream (2008)
aka Murder in the Meadow
8. Down a Narrow Path (2008)
aka Murder in a Mansion
9. Across the Narrow Blue Line (2009)
aka Murder in the Garden
10. A Narrow Point of View (2010)
aka Murder by Fire
11. A Narrow Exit (2011)
aka Murder at Work
12. A Narrow Return (2012)
aka Murder Never Retires

There are a few more yet in the series, which I may get to next year, but I’ve put them to one side for the moment and have started reading some more varied murder mysteries.

I read a pretty good mix of fiction and non-fiction this past year, which was a surprise to me. Until the second half of 2017 I’m not sure I had ever sat down and read a non-fiction book for pleasure before. We shall see what 2019 holds. If I have the time, I’ll try to post monthly about my reading through the year. That being said, if I end up bingeing on one author’s collected murder mysteries, perhaps a broader digest is best…

Setting Up Nagios – Working with Cisco’s CIMC

September 7th, 2017

This was far more challenging than it needed to be. Cisco makes some SDKs available for use with Nagios, but I was totally unable to make the system see them. They are just Python packages so I didn’t expect much trouble, but I was totally at a loss. I thought, initially, that the issue was that the installer dumped the files into /usr/lib/python/site-packages/ instead of the 64-bit path /usr/lib64/python/site-packages/, but no amount of copying and permissions changes made the system able to see the dependencies. For those interested, here is a link to the Cisco Nagios tools (from 09/2017): Nagios Plug-Ins for Cisco UCS. I am not using that package, or its dependencies.

Instead, I found a script written in Go that worked a treat and required no other dependencies to work. You can find it here on Github: check_cisco_ucs

Having never used anything written in Go before, I had a few things to learn, but it turned out to be very simple. I downloaded the ‘check_cisco_ucs.go’ script to my Nagios server. I tried to run it, and installed a few bits of software, before I understood that I needed to compile the script into an executable. That turned out to be exceedingly simple.

First, I installed the ‘golang-bin’ package, which is in EPEL.

yum install golang-bin

I then navigated to the folder containing the check_cisco_ucs.go script and ran the following command:

go build check_cisco_ucs.go

With that, I was able to run the script. The Github page has a series of example commands, which I started firing at some of the Cisco C220 M4 servers I needed to monitor. The script was last updated in 2014 and while much of it worked, not all of the items the author used as examples remain working in the updated BIOS on my systems.

This one worked fine, once I updated the IP address and user/password as required. It takes 10-15 seconds to run and then reports back accurate information about the RAID setup and status of the system. The only caveat is that, since we run CIMC over https, I needed to add the ‘-M 1.2’ flag so that it would accept TLS 1.2, which CIMC was running.
./check_cisco_ucs -H <server IP> -M 1.2 -t class -q storageVirtualDrive -a "raidLevel vdStatus health" -e Optimal -u admin -p <password>

With this, I knew the solution would work and I started to go through the steps to make the required checks appear in Nagios. I moved the check_cisco_ucs file into the folder Nagios expects to find the command files, which for me on Centos 7 is /usr/lib64/nagios/plugins.

I then created a new file, called check_cisco_ucs.cfg, and put it into the folder I have configured my Nagios install to look for commands. Within that file I’ve listed my commands, which are a bit messy:

define command {
    command_name check_cisco_ucs_storage
    command_line $USER1$/check_cisco_ucs -H $HOSTADDRESS$ -M 1.2 -t class -q storageVirtualDrive -a "raidLevel vdStatus health" -e Optimal -u admin -p yourpass
}

I then added the relevant check to the server object I wanted to check:

define service {
    use generic-service
    host_name NameInNagios
    service_description RAID Controller Status
    check_command check_cisco_ucs_storage
}

Do a quick check of your Nagios config to make sure it’s sane and working:

nagios -v /path/to/nagios.cfg

I was able to do the same for the next example check on the GitHub page, for information about local disks.

From here though, the checks I wanted didn’t work as expected. Reading light status was problematic and the power supply command needed a small tweak.

The author suggested a power supply command as:
./check_cisco_ucs -H <server IP> -t class -q equipmentPsu -a "id model operState serial" -e operable -u admin -p pls_change

But, in the 3.0 version of CIMC I’m running, the ‘operState’ option is no longer present and instead I needed to use ‘operability’. My Nagios command is this tweaked version:
$USER1$/check_cisco_ucs -H $HOSTADDRESS$ -M 1.2 -u admin -p yourpass -t class -q equipmentPsu -a "id model operability serial" -e operable

The final check I wanted was to let me know if any of the status lights were amber, instead of green. The author suggested a check to watch one LED to ensure it was green but this reported no information. I instead rewrote the command to look at all of the lights and tell me if any are amber, instead of confirming one light was green.

$USER1$/check_cisco_ucs -H $HOSTADDRESS$ -M 1.2 -u admin -p password -t class -q equipmentIndicatorLed -a "id color name operState" -z -e amber

This command looks at the lights, which gives me this output:
0,green,OVERALL_DIMM_STATUS,on (0 of 6 ok)

The key here is that the -z flag means that if the state is NOT found, i.e. there is no amber light, the check is considered Ok. If there is an amber light, the check will fail and alert me. Perfect.
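If you ever want to sanity check that inverted-match logic outside of Nagios, the same idea is easy to mimic in plain shell. This is just an illustrative sketch (the LED lines below are made-up samples in the plugin's id,color,name,operState output format, not real data from my servers):

```shell
# check_leds prints OK and returns 0 when no amber LED appears in its
# input, mirroring the plugin's -z/-e pair: matching 'amber' is a failure,
# matching nothing is success.
check_leds() {
    if printf '%s\n' "$1" | grep -q 'amber'; then
        echo "CRITICAL - amber LED found"
        return 2
    else
        echo "OK - no amber LEDs"
        return 0
    fi
}

# Sample LED lines, all green, so this reports OK:
check_leds "0,green,OVERALL_DIMM_STATUS,on
1,green,LED_PSU_STATUS,on"
```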

The final hurdle between finding the plugin and making my checks work was a couple hours’ effort with the Cisco UCS Rack-Mount Servers Cisco IMC XML API Programmer’s Guide. Using that guide, and some curl commands, I was finally able to get out the data I wanted and transform it into commands the plugin could use.

First step was to authenticate to the CIMC:
curl -d "" https://Server IP/nuova -k

That final ‘-k’ is required to make curl ignore the SSL cert on the system, which is just a basic self-signed one.

That command outputs a cookie, which you then pass in future commands to remain authenticated.

curl -d "" https://ServerIP/nuova -k

Be careful not to mix ' and ". The " is used to quote the whole command being sent via curl, and the ' is used to quote the attribute values inside that command. Mixing them will cause the commands to fail.
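Since the cookie comes back embedded in the XML response, it helps to pull it out into a shell variable for the follow-up requests. Here is a small sketch, assuming the response shape documented in the XML API guide (the cookie value itself is a made-up placeholder):

```shell
# A sample aaaLogin response (structure per the CIMC XML API guide;
# the outCookie value here is just a placeholder)
response="<aaaLogin cookie=\"\" response=\"yes\" outCookie=\"1414143476/fa7b1b09\" outRefreshPeriod=\"600\" outPriv=\"admin\"/>"

# Extract the outCookie attribute value for reuse in later curl calls
cookie=$(printf '%s' "$response" | grep -o 'outCookie="[^"]*"' | cut -d'"' -f2)
echo "$cookie"   # prints 1414143476/fa7b1b09
```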

That should be enough to get you going to find any other commands you might be interested in. I’ll try to fill in some more detail once I’m finished rolling out the Nagios install.

Setting Up Nagios – Installing on Centos 7

September 6th, 2017

Long story short, I changed jobs about 6 months ago and found myself in a new position without any appreciable monitoring of the hardware for which I’m responsible. That needed to change as soon as I started to have time to put some hours into it. I did a little testing and poking around with some other monitoring tools but kept coming back to Nagios as the right choice, despite the mountain of work involved in setting it up. There is such a large community of help and plugins available for it, plus it’s a tool I’ve used at other jobs for many years.

The actual process of standing up a Nagios server is incredibly simple, but that initial server does almost nothing. Let’s go over these initial steps today and I’ll post little updates of the settings and configs I’ve rolled into Nagios over time to monitor different hardware and software platforms.

Step 1 for me was a minimal Centos 7 install. I’m a long time user of Centos and this is the default starting point for so many of my projects. After that initial install I usually install a few tools I like to have around, and lower some of the security on the system, to prevent myself getting stuck solving an issue that isn’t within the application.

For me, I run this command on basically all of my Centos 7 installs:

yum update -y && yum install epel-release -y && yum install htop wget mlocate -y && updatedb && systemctl disable firewalld && vi /etc/selinux/config && reboot

This command does the following:
1.) Updates all packages on the system
2.) Installs the EPEL repository, for additional packages I like to have
3.) Installs several packages I like to have, some of which are in the EPEL repo, which is why this is third in the list.
4.) updatedb is the command that builds a database of files on the system for easy/fast searching. Once the package mlocate is installed you can run it, and then use the ‘locate’ command to find files.
5.) This disables the firewall, which I leave off for the initial config of the machine. Once the machine is working, I switch it back on and open only the ports required. This, for me, is faster than guessing back and forth when something isn’t working: is it the software? The firewall? I’m never sure. I would only do this on a machine not on the public internet.
6.) This command launches vi so I can edit the selinux config on the system. I change it to permissive while I set up the server, make the necessary changes to selinux policy once the system has stabilized, and then move it back to ‘enforcing’. This isn’t a step to take lightly, and it is very important.
7.) Once you save and quit vi the system will process the final command and reboot the machine.

When the system returns to service you’re ready to start installing and setting up Nagios. With EPEL already set up, it’s just a couple quick commands.

Firstly, you need to install the required packages. This can be done with one command:

yum install nagios nagios-plugins-all

Once done, you need to enable and start the two required services, httpd (Apache) and nagios.

systemctl enable nagios httpd
systemctl start nagios httpd

Once done, it’s best to set up a password for the default nagiosadmin user, and set up any other users you might want. To set the password for the nagiosadmin user:

htpasswd /etc/nagios/passwd nagiosadmin

To create a new user:

htpasswd /etc/nagios/passwd (**User Name**)

With this complete you're all set. Nagios is online and working. You can access it on your server at:

http://(**server IP Address**)/nagios

You'll see a couple standard checks running on 'localhost', one of which is in an error state due to how Centos configures httpd. Nothing to be concerned about; I actually delete that check. Next time, we'll talk more about making Nagios do something useful, like monitoring something beyond itself.
