“There is nothing so useless as doing efficiently that which should not be done at all.”
Peter Drucker
This week we’ll discuss the third of six Dimensions, Tools, as we continue to gather the building blocks of the Quality Enablement Framework.
A look through studies by Forrester Research and IDC shows impressive results for business tool implementations. Each article talks about the data and assumptions behind the numbers, but for brevity I’m just reporting the calculated return on investment (ROI) and investment payback period (payback). Here are a few examples:
- BI Apps Lowers Costs And Increases Revenue (Forrester) – ROI: 97%, payback in 20 months.
- The ROI Of Project Portfolio Management Tools (Forrester) – ROI: 255%, payback in less than 12 months.
- The ROI Of Online Customer Service Communities (Forrester) – ROI: 99%, payback in less than 12 months.
- HP Mission Critical Services: ROI Benefit Analysis (IDC) – ROI: 438%, payback in 6 months.
- Gaining Business Value and ROI with LANDesk Software: Automated Change and Configuration Management (IDC) – ROI: 698%, payback in 5.1 months.
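For readers who want to sanity-check figures like these, the underlying arithmetic is simple. The sketch below uses the common textbook definitions of ROI and payback period; the dollar figures are hypothetical illustrations, not numbers taken from the cited studies.

```python
# Simple ROI and payback calculations. All inputs are hypothetical
# illustrations; the cited studies use their own detailed models.

def roi_percent(total_benefits, total_costs):
    """ROI as a percentage: net gain relative to cost."""
    return (total_benefits - total_costs) / total_costs * 100

def payback_months(total_costs, monthly_benefit):
    """Months until cumulative benefits cover the initial cost."""
    return total_costs / monthly_benefit

# Example: a $100k tool rollout returning $197k over the study period,
# with a steady $5k/month benefit stream.
print(roi_percent(197_000, 100_000))   # 97.0 (%)
print(payback_months(100_000, 5_000))  # 20.0 (months)
```

Note that the published studies model costs and benefits in far more detail (risk adjustment, discounting, phased adoption); this is only the headline arithmetic.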
So it seems clear that implementing new business support tools is a no-brainer, right? Well, that’s only the success side of the story. There are also plenty of studies documenting failed attempts. Here are some of those references:
- Top 10 Reasons for Implementation Failure (Implementation Management Associates)
- Top 5 Mistakes When Implementing an ERP System (Software Thinktank)
- Critical Failure Factors in ERP Implementation (Case study – universities in Hong Kong & United Kingdom)
- Your BPM Implementation is Bound to Fail (BPM Institute)
We see exceptional returns for successful implementations alongside very high rates of failure. That’s the backdrop to this week’s newsletter. Since new tools are quite risky to implement, and that discussion could fill many newsletters, we’re going to focus here on the ways that existing tools either enable or inhibit a better workplace.
As we’ve done previously, we’ll start with the Guiding Principles:
- Tools support processes (not the other way) – In general, tools should support, and potentially enhance, current processes. This may mean that the initial implementation uses only a subset of the available functionality, with additional features implemented over time, but that approach will typically lead to more successful use by the teams.
- Tools increase team efficiency, not reduce it – I imagine that many of us have seen tools that reduce team efficiency because of all the “extra” data needed to keep the tool up to date, or because the new tool has a less efficient user interface. If that’s the case, it will inhibit, rather than enable, a better workplace.
- Tools adapt to changing processes – If a tool is too rigid to change as the business needs change, then it can inhibit an organization’s ability to respond to a competitive threat or opportunity.
Given those Guiding Principles, let’s move to the Tools dimension Subject Areas:
- Impact on user efficiency – This directly relates to the 2nd Guiding Principle. If a tool reduces the efficiency of its users, we should question whether it’s the right tool. Users need to feel that the tools they use are primarily for their benefit; otherwise, they will use them reluctantly, and the tool will become an inhibitor of a better workplace.
- Non value-added effort needed to “feed” the tool – This is related to the prior Subject Area. This problem usually manifests when practitioners must enter more data into the system than they need for their own use, often because some other organizational entity needs the data for aggregate metrics or process governance.
- Adaptability to process change – This ties directly to the 3rd Guiding Principle. The discussions center on each tool’s ability to adapt to changes in the business environment, and how well it can respond to new demands.
- Tool training – Without proper training, even the best tools will fail to produce the results expected, and the users will not be able to achieve the real efficiencies and effectiveness that the tools offer.
- Degree of acceptance by users – This is often the best indicator of the items discussed above. Quite simply, if a tool doesn’t help the practitioners, they will either find alternate solutions or work in frustration.
- Tool change management process – How are changes to tool functionality prioritized, selected for use, and implemented across the organization?
- New tool selection process – While we will not spend time discussing all the reasons that tool implementations fail, it is worth discussing how new tools are selected. This includes stakeholder involvement, clearly defined selection criteria, agreed measures of successful implementation, and accurate determination of ROI, all before the specific tool is selected.
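One common way to make selection criteria explicit, as the last Subject Area suggests, is a weighted scoring matrix: stakeholders agree on criteria and weights before evaluating candidates. The sketch below illustrates the idea; the criteria names, weights, and scores are all hypothetical examples, not recommendations.

```python
# A minimal weighted-scoring sketch for comparing candidate tools.
# Criteria, weights, and scores are hypothetical illustrations only.
criteria_weights = {
    "user_efficiency": 0.30,      # impact on practitioner productivity
    "process_adaptability": 0.25, # ability to follow process change
    "integration_effort": 0.20,   # fit with existing systems
    "training_burden": 0.15,      # ease of bringing users up to speed
    "license_cost": 0.10,         # total cost of ownership
}

def weighted_score(scores, weights=criteria_weights):
    """Combine per-criterion scores (1-5) into one weighted total."""
    return sum(weights[c] * scores[c] for c in weights)

tool_a = {"user_efficiency": 4, "process_adaptability": 3,
          "integration_effort": 5, "training_burden": 4, "license_cost": 2}
tool_b = {"user_efficiency": 5, "process_adaptability": 4,
          "integration_effort": 3, "training_burden": 3, "license_cost": 4}

print(round(weighted_score(tool_a), 2))  # 3.75
print(round(weighted_score(tool_b), 2))  # 3.95
```

The real value of the exercise is less the final number than the conversation it forces: stakeholders must agree on what matters, and how much, before vendor demos can sway the decision.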
Next, we’ll cover the Metrics dimension, our 4th building block.
Until Then …
“A bad system will defeat a good person every time.”
W. Edwards Deming