The inaccurate allocation of expenses to product and SKU levels is a serious business operating problem. It opens the door for competitors to drive a price wedge between your customers and your overpriced products. It leaves you underwater on pricing for loss-making products and services.
Enterprise Resource Planning (ERP) Systems
Accounting disinformation at the level of Standard Costs is rampant in the United States and many expensive Enterprise Resource Planning (ERP) accounting modules are either ‘hotwired’ to sidestep the basic functionality of building Standard Costs from the ground up…or they are incompletely configured.
The true cost of introducing and maintaining strong compliance in a new ERP system is typically underestimated. Take whatever number you were given and double it. Then add the ongoing training and education costs. Our experience is that many ERP systems are never effectively installed.
A smart competitor will leave the incompetent to bleed from thin or negative contribution margins. After all, these are unforced errors. Concurrently, they will surgically undercut overpriced products and SKUs, while managing excellent margins for themselves at lower prices.
Customers will certainly acquiesce.
Because accounting software packages and financial reporting are often a ‘black box’ to those outside the accounting and finance functions, the analyst must dig deep to expose long-standing practices that misinform management about the real costs of production by SKU and thereby endanger competitive position.
Typically, each SKU has a ‘Standard Cost’ which reflects a standard production routing, bill of materials, up to date business input prices, and the allocated overhead expenses associated with producing the good or service.
This allows a budgeting process to ensue. Costs are estimated. Prices are set (some to meet the market, some to ensure a viable margin). Revenue and profit margin are forecast. Production and Material Variances result when actual costs don’t align with Standard Costs. These variances are tools for identifying and managing problems. A problem is any event which represents a variation from the assumptions implicit in the Standard Cost.
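The mechanics above can be sketched in a few lines. This is a minimal illustration with entirely hypothetical figures (the SKU, its bill of materials, routing hours, and rates are all invented for the example), not a template for any particular ERP system:

```python
# Building a Standard Cost from the ground up, then computing variances
# when actual costs diverge. All figures are hypothetical.

# Standard Cost components for one hypothetical SKU
bill_of_materials = {
    "resin_kg": (2.0, 1.50),    # (standard qty per unit, standard price)
    "pigment_kg": (0.1, 8.00),
}
routing_hours = 0.25            # standard machine/labor hours per unit
labor_rate = 40.00              # $ per routing hour
overhead_rate = 24.00           # $ allocated overhead per routing hour

material_std = sum(qty * price for qty, price in bill_of_materials.values())
conversion_std = routing_hours * (labor_rate + overhead_rate)
standard_cost = material_std + conversion_std      # ≈ 19.80 per unit

# Actuals for a production run of 1,000 units
units = 1000
actual_material = 4_100.00      # e.g. extra scrap
actual_conversion = 17_500.00   # e.g. an unplanned changeover

material_variance = actual_material - units * material_std
conversion_variance = actual_conversion - units * conversion_std

print(f"standard cost/unit:  {standard_cost:.2f}")
print(f"material variance:   {material_variance:+.2f}")
print(f"conversion variance: {conversion_variance:+.2f}")
```

Each unfavorable variance here points at a specific operating assumption that failed, which is exactly the diagnostic value that disappears when variances are rolled into an undifferentiated ball.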
At least that’s the way it’s supposed to work. However, it’s an open secret that long-standing variances against Standard Costs aren’t well tolerated by many finance functions or senior management teams. Nobody wants to keep reporting them. They are a red flag to corporate overseers. Is it easier to remove the variance or the reason for the variance? No variances, no problems?
It’s common to see finance functions opt to simply absorb repeated variances into a revised Standard (as a dollar amount) while steering clear of any revision of the operational assumptions underlying the original Standard. Go check the age of some of that original data.
Alternatively, it’s common to see a finance function determine that “it’s too difficult to accurately allocate variances (especially overhead expenses) back to the SKU level…because an accounting fiction would result”. This typically results in leaving the variances in an undifferentiated ball which is ownerless at the level of product or SKU.
Which is the bigger fiction: leaving production and material variances ownerless, their root causes unresolved…or an imperfect allocation of variances to the SKU level that drives operational accountability and preserves the logic of pricing?
This is a watershed in the existence of any company. The point in time where financial managers decide that it’s easier to nix the variance than address its root cause.
Competitive advantage is lost when a company loses control over the accuracy of cost allocation. Standards become unreasonable. Prices are misdirected or margin is sacrificed. Some production capacities become excessive, while some are in chronic short supply.
Many businesses are a country mile away from maintaining accurate estimates for Standard Costs at the level of Product and SKU. Not surprisingly, this is often a big business malady, even though big businesses tend to be able to afford the systematic means to manage cost data effectively.
‘Mom and pop’ shops are almost always closer to an accurate understanding of their true costs…even if only intuitively. This is how they get their foot in the door.
As we work with clients to improve their Return on Invested Capital (ROIC), we find more and more examples of poor practice in developing and maintaining Standard Costs. Reporting and review disciplines relating to solving underlying operating problems are the first casualty.
Competitive pressure has pushed big companies to introduce more products and packaging variants in a bid to serve consumer preferences and lift capacity utilization. Additionally, there is pressure to allow extreme flexibility by acceding to customer requests for reduced minimum order quantities and shorter delivery service cycles.
A related problem is the common pricing view that, for incremental production volumes, any price above variable unit cost will result in a more profitable business outcome. Capacity utilization in the US is at an historic low.
Higher costs result from shorter production runs, increased downtimes (more set ups and changeovers), the additional capital costs of buying expensive finishing equipment and related maintenance regimes, as well as costly training and education programs and reduced labor skills flexibility. Original Standard Cost data are quickly superseded.
‘Throughput accounting’ and ‘lean thinking’ adherents tell us that under-utilized production capacity represents ‘sunk’ fixed costs. They reason that shorter production runs and additional set ups and changeovers are therefore ‘free’. Hence, it’s OK to embrace smaller production runs and additional downtime (which enables reduced inventories and handling) because there are no additional fixed costs associated with this production policy.
In fact, we are told that pricing a penny above variable cost will add a penny to profit. This is only true if a business decides to hold excess productive capacity, together with related fixed overheads indefinitely. If the excess capacity can be shed, these margin sacrifice justifications disappear. It’s always going to be more profitable to remove the expense and overhead associated with excess capacity, than to partially recover it with prices that result in a tiny positive Contribution Margin but a whopping Unit Loss.
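The penny-above-variable-cost arithmetic is worth making explicit. A minimal sketch with hypothetical numbers: the contribution margin is positive, but once the fixed overhead of the capacity actually consumed is allocated back, the unit sells at a loss.

```python
# Hypothetical figures: "a penny above variable cost" looks profitable
# on a contribution basis, but produces a unit loss on a full-cost basis.

variable_cost = 10.00
price = 10.01                     # one cent above variable cost
allocated_fixed_overhead = 4.00   # fixed cost per unit at this volume

contribution_margin = price - variable_cost                      # +0.01
unit_profit = price - variable_cost - allocated_fixed_overhead   # -3.99

print(f"contribution margin: {contribution_margin:+.2f}")
print(f"unit profit:         {unit_profit:+.2f}")
```

The penny of contribution is real only for as long as the $4.00 of fixed overhead per unit is treated as unavoidable; shed the excess capacity and the comparison flips.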
Yet, the irony is that after years of margin sacrifice, the additional sales volumes concocted from margin sacrifice take on the appearance of reliable demand. Then we reason that ‘demand’ justifies available capacity. Circular logic. So, we need to break that vicious circle with accurate costing data and Standard Costs. Otherwise, the entire Sales & Operations Planning Process becomes dysfunctional. We end up buying capacity to generate losses, generating losses to utilize capacity.
In short, there is an economic element to measuring capacity utilization. Loss-making products don’t ‘utilize’ production capacity, they destroy it. Hence, you can’t measure utilization without direct reference to product and SKU profitability. Again, this drives us back to accurate cost allocation and Standard Costs.
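One way to picture that economic element: compare a gross utilization figure with one that refuses to count hours spent producing loss-making SKUs as ‘utilized’. The SKUs, hours, and margins below are hypothetical, and the measure itself is a sketch of the idea rather than a standard metric.

```python
# Hypothetical data: machine hours by SKU and unit margin after full cost.
# An 'economic' utilization measure excludes loss-making production.

sku_hours = {
    "A": (1200, 2.50),    # (machine hours consumed, unit margin)
    "B": (800, 0.40),
    "C": (600, -1.10),    # loss-making SKU
}
available_hours = 4000

gross_utilization = sum(h for h, _ in sku_hours.values()) / available_hours
economic_utilization = (
    sum(h for h, m in sku_hours.values() if m > 0) / available_hours
)

print(f"gross utilization:    {gross_utilization:.0%}")
print(f"economic utilization: {economic_utilization:.0%}")
```

The gap between the two numbers is capacity that looks busy on the shop floor while destroying value in the P&L.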
We need to remind ourselves that excess production capacity should be removed in many circumstances, especially where there are multiple production centers. We’ve written about the tendency to sacrifice margin to maintain revenue in previous blogs.
Put another way, one reason why low capacity utilization is such a severe issue is that margin sacrifice is widely used to destroy excess production capacity. Capital is cheap. Capacity is cheap. Why knock yourself out to optimize an operational measure of utilization when the financial cost of low utilization is apparently inconsequential? You can destroy it by giving it away or storing it, only to write off its economic value later…maybe in another reporting period altogether.
Combine the tendency toward margin sacrifice with inaccurate cost allocation and absorption methods and you have a recipe for producing many loss-making products. Indeed, we find some companies running with up to 25% of their total productive capacity directed at the production of loss-making products and SKUs. That’s an opportunity to lift prices or rationalize product offerings.
It’s not difficult to see how this view of ‘free’ fixed costs translates into carelessness regarding whether, or in what manner, such costs are allocated to the products that generate them.
Check with the accounting department as to whether ‘Standard’ Costs for products and SKUs are derived from up to date business input prices, Production Routings, Bills of Materials and Minimum Order Quantities. Are they instead arrived at by simply allocating historical dollar costs? If derived from history, ask if/how variances against Standard are allocated back to product and SKU level.
In addition to the accuracy problem, another underestimated problem is the volatility of product mix over time. Large swings in mix can radically alter the utilization of certain cost centers and result in rising or falling unit costs for individual products and SKUs. This can result in formerly profitable units becoming unprofitable and vice versa. In addition to the cost implications, changes in product mix can also move production system constraints over time from one routing to another. This can choke throughput and lead to revenue, profit, supply and service problems.
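The mix-volatility effect on unit costs is simple arithmetic but easy to overlook. A hypothetical sketch: a cost center’s fixed pool spread over its volume, where a mix shift that pulls volume away from the center raises every remaining SKU’s allocated unit cost with no change in spending.

```python
# Hypothetical figures: how a product-mix shift moves allocated unit
# overhead even though the cost center's spending is unchanged.

fixed_pool = 120_000.00           # annual fixed cost of the center

def unit_overhead(center_volume):
    """Fixed overhead allocated per unit at a given volume through the center."""
    return fixed_pool / center_volume

before = unit_overhead(60_000)    # 2.00 per unit at the original mix
after = unit_overhead(40_000)     # 3.00 per unit after mix shifts away

print(f"unit overhead before mix shift: {before:.2f}")
print(f"unit overhead after mix shift:  {after:.2f}")
```

A 50% jump in allocated unit overhead like this can quietly turn a profitable SKU into a loss-maker, which is why Standard Costs need re-derivation, not just a dollar-amount refresh, when mix moves.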
A fish rots from the head down. Check the math used in your Standard Costs.
Productivity Step Change (PSC) is a global Management Consulting Group based in the US dedicated to maximizing Capacity Utilization, improving Return on Invested Capital (ROIC), and increasing productivity for its clients across industry. Our enterprise-level, data-led, cross-functional business analysis typically requires 4-6 calendar weeks and is designed to provide evidence of your opportunities for overall growth. Please contact us if you would like to schedule time for a discussion and presentation.