Selecting Software Tools

After a software tool is in place, it is often extremely difficult to swap it out, so choosing a tool is something you should try to get right the first time.  However, I find that many companies and teams choose tools for all the wrong reasons.  For example:

"This one is free."

When comparing options, you will often find at least one open source offering which can be used free of charge.  To be clear, open source is awesome.  The best option for your needs might be open source, but "free" shouldn't be a criterion in most cases.

The Joel Test asks, "Do you use the best tools money can buy?"  Joel used hardware examples, but the concept applies to software tooling as well.  Developer time is typically the biggest expense in a dev shop, so selecting software tools that make devs more efficient is paramount.  Typically, the time cost of using the software will easily eclipse the licensing cost.  Heck, not just developers - salespeople, testers, etc. are typically fairly well paid, but I often see them fighting sub-par software tools.
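To see how quickly time cost eclipses licensing cost, here is a back-of-envelope sketch.  All the numbers (hourly cost, hours saved, team size, license price) are hypothetical assumptions - plug in your own:

```python
# Back-of-envelope break-even calculation.  Every constant below is an
# illustrative assumption; substitute figures from your own team.
LOADED_HOURLY_COST = 75.0            # fully loaded cost per developer-hour
HOURS_SAVED_PER_WEEK = 1.0           # time a better tool saves each dev
TEAM_SIZE = 5
LICENSE_COST_PER_SEAT_PER_YEAR = 500.0
WORK_WEEKS_PER_YEAR = 48

# Value of the time saved across the team in a year.
yearly_savings = (LOADED_HOURLY_COST * HOURS_SAVED_PER_WEEK
                  * WORK_WEEKS_PER_YEAR * TEAM_SIZE)
# What the licenses cost for the same period.
yearly_license_cost = LICENSE_COST_PER_SEAT_PER_YEAR * TEAM_SIZE

print(f"Savings: ${yearly_savings:,.0f} vs. licenses: ${yearly_license_cost:,.0f}")
```

Even with these modest assumptions, an hour saved per developer per week is worth several times the license cost.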

Open source may have hidden costs as well.  Some open source projects don't put much effort into ease of installation/upgrade.  I have lost many, many hours fighting installations of "free" tools, and of course, my time costs my employer.

There is also the issue of support.  If you are considering an open source offering, look carefully at the community surrounding the project.  Is it reasonably large?  Do people tend to be helpful to newcomers?  Remember, these are the people you will probably be asking for help if you run into problems.  Sometimes there are companies that offer paid support for open source products as well.  

"We used X at my last job"

On the surface this seems to have merit: an employee can vouch for a tool having worked well for them in the past.  However, the needs of that team or company may be very different from yours, and in the intervening time the tool may have stagnated or been surpassed by competitors.  Tools should be selected based on your particular needs.

Some salesperson sweet-talked management or IT

This is perhaps the most egregious.  In my experience, decision makers who are not detail-oriented can fall into this trap.  Being decisive is typically a good trait for people in this position, but making decisions with inadequate information can prove quite costly.  Salespeople are paid to gloss over a product's faults; I have seen companies spend tens of thousands of dollars on software they have barely seen, let alone used.

A Better Way

Over the years, I have done many software comparisons, and I've gradually honed my technique.  These days I typically build a comparison spreadsheet: candidate tools as columns, features as rows, with each cell color coded and annotated.

The steps to build it typically go something like this:
  1. Search the web for as many potential candidates as possible.  If you know one or two players in the field, search for "[Tool X] vs " and let Google fill in the rest.  That will tell you who some of their competitors are.  Make these your column headers.
  2. Make a list of all of the features that matter to you.  These will be your rows.  I often arrange these by importance with "must-have" features color coded.

    Note: As you do more research into each product, you may come across features you didn't know existed.  If you like them, add them to the list.

    A few of the categories I typically include are:
    * Backed by a solid community or company & actively developed - If open source, are there multiple active developers?  If proprietary, is the company actively developing the product and likely to remain solvent?
    * Reputation/Popularity - Popular products usually stick around.  Also, if whatever product we choose does eventually die, popularity usually determines if other companies make tools to import your data from the dead tool.
    * Intuitive UI - This is subjective but often a tie-breaker.
    * How well does this interface with our other tools (including future, planned tools)?
    * Reliability - Typically gleaned by reading user reviews.
    * Cost - If they don't list it on their website, that's a bad sign. :)
  3. Start filling in as many boxes as possible.  Actually, when I do this, there are often a large number of tools (columns), and I am really just trying to knock as many choices off the list as possible.  If I can find an area in which a tool doesn't meet a must-have criterion, I don't spend any more time researching it.

    While filling out the spreadsheet, try the following:
    * Hyperlink each cell to the webpage where you found the information.  This is helpful if you find conflicting information later or you need to share your findings with a coworker.
    * Color code each cell.  I use Red, Yellow, & Green.  Red means the product lacks that feature.  (Red in a "must-have" row is a disqualification.)  Yellow indicates something less-than-ideal.  Green means it has the feature.  (I often enter "good", "very good", or "excellent" to distinguish between greens.)
    * Add notes to each cell with additional information.

    Try some of the following tools while researching:
    * Reviews on G2Crowd, Capterra, blogs, etc. (Just search for "[Tool X] reviews")
    * A company-research site, to see if the company is a startup.  If they are early stage, there's a good chance they won't be around to support/maintain the product long term.
    * YouTube to get a look at their UI.
    * Wikipedia sometimes has comparison grids.  While these can be very helpful, they are sometimes out of date.  If it says the product has a given feature, it probably does.  If it says it does not, verify elsewhere.
    * Search for "[Tool X] vs [Tool Y]". Sometimes a blogger will do a head-to-head; just make sure to check the publish date.
    * For open source projects, try to find their contributors page on GitHub.  If they have had very little recent activity or they only have one or two primary contributors, I typically mark them off the list.

    Update: In discussing this with Jeremy Cerise, he described a more complete check that he uses: "how many stars on Github as an immediate metric of how many people may be using it or following it, followed by, when was the last commit to gauge how up to date it is, followed by a similar metric to yours, how many active committers, followed by, how many open tickets to resolved tickets, and the timeframes on both."
    * For closed source/paid products, you can of course search for reviews, but another resource is Glassdoor.  Look up the company and get an idea of how healthy it is.  For example, while researching issue management tools, Glassdoor reviews knocked one company off my list.
  4. If you have narrowed the list down to 2-3 tools and no clear winner is emerging, consider doing a demo of each tool.

    Note: Cross as many options off the list as possible before you start doing demos.  This is especially true if you are looking at self-hosted options, as there is often significant overhead in spinning up test systems.  Cloud-hosted options are less work, but you still have to spend time getting/extending licenses, fending off salespeople, etc.
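The core of the spreadsheet process above - score each feature, disqualify anything red on a must-have, then rank what's left - can be sketched in a few lines of code.  The tool names, feature names, and scores below are all hypothetical:

```python
# Scores mirror the cell colors: 0 = red (missing), 1 = yellow, 2 = green.
# Tools, features, and scores are made up for illustration.
MUST_HAVE = {"REST API", "SSO"}

candidates = {
    "Tool A": {"REST API": 2, "SSO": 2, "Intuitive UI": 1, "Reporting": 2},
    "Tool B": {"REST API": 2, "SSO": 0, "Intuitive UI": 2, "Reporting": 2},
    "Tool C": {"REST API": 2, "SSO": 1, "Intuitive UI": 2, "Reporting": 1},
}

def shortlist(candidates):
    """Drop any tool that scores red on a must-have, then rank the rest
    by total score, best first."""
    survivors = {
        name: scores for name, scores in candidates.items()
        if all(scores.get(feature, 0) > 0 for feature in MUST_HAVE)
    }
    return sorted(survivors, key=lambda n: sum(survivors[n].values()),
                  reverse=True)

print(shortlist(candidates))
```

Here "Tool B" is disqualified by the red must-have (no SSO) before any total is compared, which matches the goal of knocking tools off the list as early as possible.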


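Jeremy's open source health check (stars, recency of the last commit, active committers, open-to-resolved ticket ratio) is also easy to turn into a quick filter.  The thresholds below are my own illustrative assumptions, not hard rules:

```python
def repo_looks_healthy(stars, days_since_last_commit, active_committers,
                       open_tickets, resolved_tickets):
    """Rough open source health check using the metrics discussed above.
    All thresholds are illustrative assumptions - tune them to taste."""
    if active_committers < 3:
        return False  # one or two primary contributors is a bus-factor risk
    if days_since_last_commit > 180:
        return False  # stagnant for roughly six months
    if stars < 100:
        return False  # little evidence anyone is using or following it
    # A backlog larger than everything ever resolved suggests poor upkeep.
    if resolved_tickets and open_tickets / resolved_tickets > 1.0:
        return False
    return True
```

For example, a project with 2,500 stars, a commit two weeks ago, a dozen active committers, and 80 open tickets against 400 resolved would pass; a one-person project untouched for a year would not.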
This approach has served me well, and I hope it will be of use to you as well.  Of course, it is time-consuming, so the time spent here needs to be weighed carefully.  Be wary of paralysis by analysis: I like to give myself a deadline for the decision so I don't spend too much time.  Typically, after digging for a while, you have a pretty good gut feeling anyway.

