
Most Companies Are Looking at the Wrong Metrics

By Kyle Clubb | Posted on March 15, 2022 | Posted in Data & Analytics

The two main achievements of any analytical endeavor are solving problems and answering questions. 

Many organizations grappling with performance issues or trying to enable data-driven decisions fail to follow a few simple best practices. They operate on incorrect assumptions about how to measure organizational performance at every level.


Over the past 20 years of working with a wide range of companies on advanced analytics, FP&A operations, and organizational psychology, I have identified the most significant barriers to fully utilizing an organization's analytical talent and potential.

The first fallacy involves an unhealthy reliance on standard accounting financials to measure organizational performance. 

This is not surprising given that these perspectives are ingrained in every academic business school program and required by many compliance and regulatory bodies. After every earnings release, they are dissected and read into by every Wall Street analyst, fund manager, and retail investor. Executive compensation packages and bonuses that rely heavily on equity options and other non-salary compensation are often based on these views.


It is no wonder that the majority of the budget, effort, and mental energy is dedicated to producing these statements and triple-checking them through expensive auditing procedures. They have given rise to an entire ecosystem and economy dedicated to their reverence. Overpriced “enterprise” software, academic degree programs, industry certifications, the continuing education required to keep those certifications, auditing firms, and law firms are all dedicated to protecting this ecosystem.

Technology’s not at fault

Over the years, directors, boards, and their respective subcommittees have asked me to provide valuations of companies' data and analytical assets in preparation for merger and acquisition offers. After years of researching, interviewing, and observing how organizations improve performance with data and analytics, I have found that placing these functions in a technical business line is often ineffective. I have seen many multi-million dollar, highly scalable, sophisticated, and honestly brilliant analytics and reporting platforms that provide little to no value to the organization. Is it the technology's fault? No, it is absolutely a people and process problem.

This goes back to the first sentence of this article and the underlying problem with most performance measures. Organizations must understand what achievements they are working toward, independent of behavior. 

When I ask most CEOs and entrepreneurs this question, they will typically point to the mission and vision. This is a great starting point. The follow-up, of course, is to show me how they measure those achievements. This is where things typically start to break down. For the sake of not drawing attention to any particular client, here is a generalized example of how most client conversations go, this time with a nonprofit:

Me: How would you describe your organization's achievements at the highest level?

CEO: Create life-changing educational toys for children with disabilities.

Me: Great, so you would say your main achievement from that statement is to produce a toy-child pairing?

CEO: Yes.

Me: How do you gauge how well you are producing toy-child pairings?

CEO: We have reporting that gives us all kinds of numbers.

Me: Like what? What is most important?

CEO: We know the total toys produced. We know total toys given away. We know cost of goods. We know R&D costs. We know donation amounts.

Me: Your mission has the term “life-changing.” What does that mean exactly?

CEO: Toys should help raise the educational capacity of the children we help. 

Me: Great, how do you measure that?

CEO: We have a product group that determines the most effective toys and features for educating. 

Me: Do you know your toys’ effectiveness in elevating cognitive abilities this year compared to last year?

CEO: No.

Me: Do you even know if the toys are trending up or down in this regard?

CEO: No.

Why would a CEO not have the slightest idea of the answer to a question so essential to the organization's mission? Because the answer cannot be found in the income statement, cash flow statement, or balance sheet, and its absence reflects a considerable deficiency in the managerial accounting workflows.
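To make the gap concrete, here is a minimal sketch, in Python, of the kind of mission-level measure that no financial statement will surface. The dataset, column names, and scoring approach are entirely hypothetical; they only illustrate tracking cognitive gain per toy-child pairing from year to year:

```python
import pandas as pd

# Hypothetical assessment data: one row per toy-child pairing, with a
# pre/post cognitive assessment score. All names and numbers are illustrative.
pairings = pd.DataFrame({
    "year":       [2020, 2020, 2020, 2021, 2021, 2021],
    "toy_id":     ["T1", "T2", "T1", "T1", "T2", "T3"],
    "pre_score":  [41, 38, 45, 40, 39, 44],
    "post_score": [52, 44, 55, 54, 47, 58],
})

# Mission-level measure: average cognitive gain per toy-child pairing.
pairings["gain"] = pairings["post_score"] - pairings["pre_score"]
gain_by_year = pairings.groupby("year")["gain"].mean()

# Is the mission-level outcome trending up or down year over year?
print(gain_by_year)
print("Year-over-year change in average gain:", gain_by_year.diff().iloc[-1])
```

Nothing here is sophisticated. The point is that the measure is defined at the level of the mission (cognitive gain per pairing) rather than at the level of cost of goods or donation totals.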

To summarize, the organization's achievements must be formalized from the top down. A bottom-up approach always produces metrics and measures that are not cohesive and become myopic in scope. A useful way to determine appropriate measures is the ACORN method, a simple and powerful test of whether your measures are beneficial and valuable (a brief sketch in code follows the five tests below):

A is for accomplishment

Behavior at this stage is a distraction. Distinguishing between accomplishment and behavior is not as simple as it sounds. There are a few questions we can ask to test for accomplishments.

Can we observe this thing we have described when we are not actually observing the performer or when the performer has gone away? Descriptions like “pallet of toys” are accomplishments because we can see them even after the warehouse worker has gone home. If we are not describing behavior, then the person responsible for the accomplishment need not be present.

C is for control

If people do not control the accomplishment, it cannot logically be associated with their mission. I have seen many performance measures attached to people or departments with no control, or only partial control, of the achievement.


O is for only objective

When you can ascribe more than one goal to a role, all but one (and perhaps all of them) are subgoals. You have either failed to identify the mission altogether, or you have not distinguished it from a subgoal. 

This one can be tricky. An internal training department with the candidate measure of “people trained” seems straightforward and benign. It looks like an accomplishment but does not constitute the complete mission.

Key questions to ask would be: If the accomplishment were perfectly achieved, would performance be perfect? Would anything more be desired from the performer or department? This is referred to as the ultimate performance test. If the training department did a perfect job of maximizing “people trained,” everyone would be in training all the time and no one would be working, and that is certainly not what we would expect from a perfect training department.

R is for reconciliation

Usually, when the missions of two roles or departments within the same organization conflict, one or both of them are ill-conceived. 

For example, Amazon had a metric called Unregretted Attrition Rate. This measure required Amazon managers to fire (unregretted attrition) a certain percentage of their direct reports. Yes, you read that correctly: managers were required to fire a certain percentage of their reports regardless of performance.

This type of measure is an extreme example of the ridiculous measures I have encountered in my consulting work. What if you had a high-performing team of 10 superstar developers and, looking at your yearly bonus calculations, realized that you had not met your firing quota? What is a manager to do? Amazon managers would hire people just to fire them so as not to affect the core employees they couldn’t imagine parting with. Can you imagine the wasted time in hiring, onboarding, and offboarding absorbed by multiple departments to facilitate this wholly unnecessary and conflicting measure? Needless to say, this metric failed the reconciliation test miserably.


N is for numbers

If we cannot measure performance, especially at the level of the mission, we have not described the mission. One quality control department head once described the mission as “systems monitored.” It is difficult to say to what degree a system has been monitored without resorting to such descriptions of behavior as “10 minutes to examine each system.” Remember that the ultimate test of whether we have identified a measurable accomplishment is to determine whether something remains after the department or performer has left the building. When the performer or department goes home, no “monitoring” is left behind. But “errors detected” is an observable output, and we can read the measurements without ever watching anyone monitor the system.
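For teams that keep their measure definitions next to their reporting code, the ACORN test can also be written down as a lightweight checklist. The following is a minimal sketch in Python; the class, field, and method names are my own illustration, not part of the ACORN method itself:

```python
from dataclasses import dataclass


@dataclass
class CandidateMeasure:
    """A proposed performance measure checked against the five ACORN tests.

    The class and field names are illustrative, not part of the method itself.
    """
    name: str
    is_accomplishment: bool   # A: an observable output, not a behavior
    is_controlled: bool       # C: the measured role actually controls the result
    is_only_objective: bool   # O: the overall mission, not a subgoal
    is_reconciled: bool       # R: does not conflict with other roles' missions
    is_numeric: bool          # N: can be expressed and tracked as a number

    def failing_tests(self) -> list[str]:
        """Return the ACORN tests this measure fails (empty list = it passes)."""
        checks = {
            "A (accomplishment)": self.is_accomplishment,
            "C (control)": self.is_controlled,
            "O (only objective)": self.is_only_objective,
            "R (reconciliation)": self.is_reconciled,
            "N (numbers)": self.is_numeric,
        }
        return [label for label, ok in checks.items() if not ok]


# "Systems monitored" describes behavior and cannot be counted; it fails A and N.
monitored = CandidateMeasure("systems monitored", False, True, True, True, False)
print(monitored.failing_tests())  # ['A (accomplishment)', 'N (numbers)']
```

Encoding the checklist this way is less about automation and more about forcing every candidate measure to be recorded with its five answers in plain view.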

Building a cohesive matrix of organizational achievements that is presented and measured accurately takes proper planning and help from consultants experienced with the best practices and pitfalls. Being armed with a proper plan and exemplary measures for analytics and reporting is a breath of fresh air to the clients we partner with.

For more information on advanced analytics and business intelligence best practices, contact us today.