Choosing software your team will love

Feb 14, 2019  |  5 min read


In celebration of Valentine’s Day, today we’re focusing on the importance of evaluating usability when selecting a technology partner. In other words, how to best determine whether the intended users of a new tool will fall in love with it.

While this romantic vision may seem out of place in a business environment, listening to the needs of end users can save you time and money that would have otherwise been wasted on trial and error.

Does this situation sound familiar?

You spend months shuffling through proposals and watching impressive presentations, only to become overwhelmed by the sheer number of options available. When you finally make what seems like a sound decision and the team receives the new toy, things fall apart. It turns out to be an expensive piece of clunkiness at best; it’s difficult to manage and disliked by the users.

Now, you’re probably asking yourself, “How am I supposed to pick software my team will love?” A product demo is a good first step and can show you a lot, but it’s hard to know how user-friendly a tool really is before it’s fully rolled out.

In this post, we’ll cover leading indicators of usability, and related questions you can ask to objectively assess the likelihood users will adopt and be happy with a new tool.

Finding The Perfect Match

The ISO 9241 standard, the first part of which dates back to 1997, sets out to define usability as an umbrella term for effectiveness, efficiency, and satisfaction. Put simply, if these three criteria are met, you can feel confident users will take advantage of a new solution and appreciate you bringing it into their lives.

Building on that definition, here are some of the key characteristics you should look for when scouting the market for your next great business tool.

Effectiveness: tailored features and seamless implementation

There’s often a tradeoff between number of features and implementation complexity. Platforms with a long list of features may offer flexibility, but it takes a lot of work to customize everything according to each team’s needs, slowing down implementation and requiring complex maintenance, especially if (or more accurately, when) business requirements evolve.

Conversely, a platform with a more tailored set of features, designed around your industry and use cases, can be just as effective, if not more so, and with much more seamless setup and maintenance. Because it's purpose-built for a narrower group, you'll likely find industry best practices and processes incorporated directly into the platform for a better user experience.

For this reason, it’s important to look at effectiveness in the context of implementation, and a well-chosen set of features is often the smarter choice. For example, say you want to better harness retail data to improve performance of your consumer electronics business. You could turn to a business intelligence tool, like Tableau or Domo, but that could take months to customize with the right data integrations and retail metrics you care about. Alternatively, you could look for a purpose-built data, analytics, and forecasting tool for consumer goods companies, which could be up and running and providing insights in just a couple weeks.

Ask:

  • Who are your current customers?
  • Do you focus on specific types of customers or industries?
  • How long does implementation typically take for a company my size?

Efficiency: automation and scale

Another important aspect that will impact user adoption is the tangible efficiency the new tool brings to the table. After all, technology is meant to help reduce the time it takes to complete routine tasks and increase productivity, isn’t it?

Automation is an obvious indicator of efficiency, as automating what were manual, time-consuming tasks should make processes more efficient. Even then, one thing to watch out for is whether the new “improved” workflow simply trades many simple actions for fewer but more complex ones to achieve similar results, leaving the net gain at zero.

Another way to measure efficiency is to understand a tool’s ability to scale. In other words, the extent to which it can grow along with your team and business, whether that be to more users, more products, more sales channels, etc. If the tool you’re considering can quickly and easily take on a greater “workload,” that’s a good sign that the technology underlying it is efficient. If, on the other hand, it’s not built on a robust platform or relies heavily on manual human intervention to work, it will be much harder to scale and be inherently less efficient, even before you reach the scaling stage.

Ask:

  • Of the work that our team is doing today, what would be automated?
  • What would the new workflow look like?
  • How quickly could you scale if we doubled our team, sales, etc.?

Satisfaction: intuitive interface and a gentle learning curve

It doesn’t matter if it’s a music streaming service, a social media network, or a business analytics platform - if it has a clunky interface, superfluous options, or simply “doesn’t click,” users are going to be unsatisfied and switch to something more approachable. End of story.

At the same time, satisfaction is probably one of the hardest aspects to evaluate before you actually make the leap to implement a new tool. How do you judge whether an interface is intuitive, and what does that even mean? Look for an interface that resembles the phone apps you use every day more than a traditional business tool, that you would feel comfortable presenting to your customers, and that follows established design conventions, consistent with other tools you already use.

It’s also necessary to keep the learning curve in mind. Any interface can seem intuitive to an expert user who’s spent hours training on it, but the learning curve should be gentle enough that anyone can start effectively using it with a one-hour training session. There may be more complex features that require further training, but they shouldn’t be required for a user to start seeing the benefits. One reason we’ve seen demand planning tools get abandoned after implementation is the requirement for a super user trained on the nuances of the tool, a situation that’s best avoided.

Ultimately, though, the people in the best position to speak to satisfaction are existing users. Ask:

  • What is your customer retention rate? (unsatisfied customers will vote with their feet)
  • What percentage of your current users are active on a weekly basis?
  • Can you provide case studies and/or set up a customer reference call?

• • •

We get it. Evaluating the usability of software options can seem like a squishy exercise based on individual preferences and judgment calls, at best. But it doesn’t have to be that way. Take the time to discuss the key aspects of usability with your team; pick their brains on what current solutions lack in those departments, and incorporate those learnings into your criteria.

And of course, ask any prospective vendors the questions above, try to get quantitative answers where possible, and compare them head-to-head. Any solution provider who’s confident in their usability should happily provide these stats.

Posted by Alloy