June 23–26, 2013
INFORMS Healthcare 2013
October 6–9, 2013
2013 INFORMS Annual Meeting
June 5–6, 2013
Customer Analytics Summit 2013
June 10–14, 2013
Predictive Analytics World
September 8–14, 2013
2013 ASE/IEEE International Conference on Big Data
“Data science begins with data. Nothing gets built without data. Data science continues with science. Accurate, persuasive and effective prediction requires patterns. The process of discovering that pattern is science. Any product worth building requires a reliable pattern to exist in the data.”
– Christopher Berry, co-founder and chief science officer of Authintic, in his article on recommendation engines in the current issue of Analytics.
Industry News
Smart grid analytics ROI to exceed $121.8 billion globally by 2020
Utilities worldwide must maximize efficiency for constrained energy resources. Many are realizing the smart grid vision by using SAS Analytics and SAS Data Management to discover powerful insights buried in volumes of new data. SAS enables utilities to harness data for pinpoint control and monitoring, usage and demand forecasting, rapid diagnosis and repair, as well as predicting output from renewable sources such as solar and wind. For those capabilities, business analytics leader SAS is ranked No. 1 for smart grid analytics and data management/movement in the recently released utility industry report, “The Soft Grid 2013-2020: Big Data & Utility Analytics for the Smart Grid,” by GTM Research.
Industry News
FICO analytic cloud to enable real-time customer engagement
FICO will deliver its analytic-powered customer engagement services via the new FICO Analytic Cloud, for creating, customizing and deploying analytic-driven applications and services. Application developers, FICO clients and FICO partners will be able to take advantage of these services to rapidly create, execute and manage high-volume campaigns that engage customers in real time with mass personalization across channels including brick-and-mortar, social and mobile.
Special Articles
Study: Who can best manage ‘voice of the customer’?
Over the next three years, global organizations will make understanding and interacting with the customer their top priority. So says a new study from The Economist Intelligence Unit titled, “Voice of the customer: Whose job is it, anyway?” Yet only 56 percent of respondents to the survey, sponsored by SAS, believe their companies clearly understand the customer today.
The Evolution of Pricing
From paper reports to spreadsheets to business intelligence, smarter pricing algorithms are being developed every day. So what’s next? Real science.
By Neil Biehn
The business-to-business (B2B) landscape is littered with a vast array of manufacturers, distributors and service providers. Each one, unique in its own way, has its own process for setting prices. At one company, a new salesperson makes a gut-feel estimate of how much to increase the price of a high-visibility item. At another, a staff of scientists uses analytics, data mining and science to price all aspects of the business. Every company has its own story and its own place on the evolutionary path of pricing. Each step of the way provides new insights, new ways of doing business and a clear, measurable increase in revenue and profit dollars.
The question that hounds most executives is exactly how each of these tools and ways of thinking – paper reports, data specialists, spreadsheets, databases, business intelligence software and pricing science – drives value to the bottom line. Paper reports provide the macro-level information needed to understand overall business performance. Data gurus create new tools that generate information previously unavailable. Business intelligence and analytical software packages bring data to a common, easy-to-use platform accessible to the entire organization. And finally, there is “pricing science” – the power to recognize opportunities with minimal effort, handing your team a direct set of marching orders specifying where action must be taken. In this article, we’ll examine the benefits and frustrations associated with each step along the evolutionary path.
The inspiration for this article was a presentation delivered in the summer of 2008 by a SAS executive, Jim Davis [1]. In that talk, Davis compared a company’s competitive advantage to its degree of intelligence across various analytical tools, from standard reports to science-based optimization. Many books touch upon this subject, but one in particular stands out – “Competing on Analytics: The New Science of Winning” by Davenport and Harris [2]. In that book, the authors recognize (and give credit to SAS for) the various levels of complexity analytics can take on, and they discuss pricing optimization and the power it can wield across many industries. Finally, as an implementer of pricing software, I’ve gained a detailed view of how companies actually evolve, including the reasons they start down the evolutionary path and the benefits of each step. The evolution of a pricing organization is mired in huge pieces of paper, quirky data specialists, information overload and “behind the scenes” equations – which is exactly why companies embark on the road to pricing excellence. The journey begins with paper reports.
Most people reading this article have some sort of analytical or scientific background and might think poorly of companies that rely on paper reports. But the simple fact remains: paper reports are an extremely common and valuable way to understand the performance of various markets, business units and product lines. A quirky attribute of paper reports is their names – many are christened with an ironic sense of humor. At the top of the list is the “Timber Report,” so named because company employees believe an entire tree is killed just to create it (good news for environmentalists: this company recycles). Another favorite is the “Kevin Halpin Report” – named after an innovative manager who began generating his own information that was subsequently adopted by the entire company (the name has been changed to protect the innocent). This last pattern is ubiquitous – I have seen reports named after individuals more than a dozen times.
The most common report, especially in manufacturing, is the detailed profit and loss statement (P&L). The P&L allows category managers, pricers and even salespeople to clearly understand the relationship between price and cost. As revenues and costs change, price setters use the P&L to analyze the resulting impact on margin. More importantly, the P&L offers a glimpse into what must be done to meet budget and forecast goals. Over and over again, executives ask their managers to hit targets. And where do these managers go first? The P&L. With a calculator, it’s not too difficult to determine the price increases you’ll need to achieve the corresponding goal. As most P&L reports are spread across product lines, business units and key markets, the P&L can be an effective tool for understanding how price impacts the bottom line.
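That back-of-the-envelope calculation is easy to make concrete. A minimal sketch in Python, using hypothetical P&L numbers (the $100 price, $80 unit cost and 25 percent margin target are illustrative, not taken from any real report):

```python
def price_for_target_margin(unit_cost, target_margin):
    """Price needed so that (price - cost) / price equals the target margin."""
    return unit_cost / (1.0 - target_margin)

# Hypothetical P&L line: a widget sold at $100 with an $80 unit cost
# currently earns a 20% margin; the budget calls for 25%.
current_price = 100.0
unit_cost = 80.0

new_price = price_for_target_margin(unit_cost, 0.25)
required_increase = new_price / current_price - 1.0  # roughly a 6.7% increase
```

This is exactly the arithmetic a manager with a calculator performs when an executive asks for an extra five points of margin.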
Despite their value as sources of general information, paper reports are fundamentally limited. To start, they operate at a very high level, with few details to support the underlying numbers. As a result, more and more columns or alternative reports are created, often at varying levels of granularity, in an attempt to gain further insight into the business. As more information is injected into the same report, the paper it is printed on can grow to take up the entire length of an analyst’s desk. To overcome the lack of timeliness, the burden of generality and the lack of paper real estate, many companies turn to spreadsheets and databases as the next step in using price as a key lever to drive the business.
Spreadsheets and Databases
At some point, managers want access to the data behind the paper reports. That data must reside somewhere – in a warehouse or, more typically, in disparate systems. In a quest for more information, executives begin demanding more elaborate and complex reports. Enter the data guru. In many cases, these experts hold an enormous amount of power – and often for good reason. With only paper reports to rely on, executives and managers clamor for more detail, specifically around why a product line or business unit is underperforming. “Widget sales are down this quarter and we recently had a big price increase – which customers have reduced their spending with us?” The P&L only points you to the problem; the data guru can help find out why.
The tools of the data guru are clear – spreadsheets and databases. These specialists are the only members of the business team that know how to query a database and can write Visual Basic code. To some, they seem to wield almost magical powers. Typically, a request comes from senior management to join two very different data sets in order to find an answer to an old pricing problem. If I match our internal cost tables, customer databases and everyday transactions, can I determine those customers that cost the most to serve, element by element? Which West Coast customers get the biggest discounts? What is the year-over-year volume growth for my highest selling products? With this information, managers and executives are armed with the power to actually do something targeted and extremely valuable – take immediate action. As quickly as the need for the data guru’s expertise emerges, the challenge of manual report gathering soon appears.
As the backlog of requests grows, data gurus begin to realize the enormity of their obligations. Ad-hoc reporting will not scale.
The bliss of being one of the most important people in the organization turns to outright terror as the workload grows exponentially. More and more technically skilled individuals are hired as the need for tools becomes evident. A major milestone in the evolution of pricing reaches a critical juncture – companies must empower their sales organizations, pricers and managers with the tools they need to answer their own questions around pricing. The epiphany strikes every level of management and it begins with a proliferation of spreadsheets.
Data gurus become spreadsheet manufacturers. With these new spreadsheets, non-technical analysts can use software they are familiar with in order to achieve granular reporting. Paper reports are now more interactive. Some spreadsheets are over a hundred columns wide and replace their smaller paper report cousins. Others are simplified what-if analysis tools that quickly show the profitability of a price increase. Still more spreadsheets become quoting tools, allowing managers to set hard floors of profitability. An exciting new stage in pricing has begun as the entire organization begins to wield the power of the data guru. There’s only one problem – the same problem of scale we see in the creation of paper reports applies directly to spreadsheets.
I recently worked with a very forward-thinking manufacturer with an entire team dedicated to pricing. They are an extremely intelligent group that knows their business down to the stock keeping unit (SKU) and customer ID level. After three days of intensive discussions, we finally understood the phenomenally rich language they use to describe their products, customers and pricing practices, including the tools they use to manage pricing. This manufacturer used no fewer than 34 spreadsheets to manage pricing. Needless to say, it’s not easy to maintain 34 spreadsheets. Many were rock solid. Some had data problems. Most were quite useful but lacked the key information to direct a fully informed decision. A standard workflow had the sales agent and pricer in and out of more than 10 spreadsheets. Some analysts barely used any of the tools, while others relied heavily on all but a few. The gurus could not keep up. Users began building their own customized versions, halting company-wide innovation in its tracks, and every one of those spreadsheets had to be maintained – increasing the need for information technology support.
As information technology specialists and data gurus continue to bear the increasing burden of spreadsheet and data warehouse maintenance, their inboxes swell, inhibiting their ability to create useful and timely information in a scalable way. An outward view of capabilities provided by the software marketplace quickly leads technology specialists to business intelligence tools and analytics software – the ability to view all pertinent sales and cost information in one, accessible software program.
Business Intelligence Tools
The market has spoken about the power of business intelligence (BI) tools. Some of the largest software companies in the world paid huge sums of money to acquire this technology. BI software enables users to take the next step on our evolutionary path – institutionalizing company data into day-to-day pricing decisions through a simple user interface. Moreover, these systems allow several data sources to be unified under one roof, giving users a single point of inquiry. Want to know which customers spending under a million dollars a year give you the lowest margin? No problem. Which products have seen their volumes drop by more than 6 percent over the past six months? Easy. Negotiating a new contract and need to know whether this customer has met its volume commitments? Done. The power of BI tools is obvious – if you have a specific question that needs answering, just log on and find out. No need to flip through pages of reports or spreadsheets that may or may not tell you what you need to know. You can take the data guru’s mobile number off your speed dial. The information is now at your fingertips.
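Queries like these are, at bottom, simple filters and sorts over a unified data set. A minimal sketch in Python, using a made-up customer roll-up (all names, spend figures and thresholds are illustrative, not from any real system):

```python
# Hypothetical transaction roll-up: one row per customer.
customers = [
    {"name": "Acme",    "annual_spend": 850_000,   "margin_pct": 0.12, "volume_change": -0.08},
    {"name": "Globex",  "annual_spend": 1_400_000, "margin_pct": 0.22, "volume_change": 0.03},
    {"name": "Initech", "annual_spend": 400_000,   "margin_pct": 0.05, "volume_change": -0.11},
]

# "Which customers spending under a million dollars a year give the lowest margin?"
small_low_margin = sorted(
    (c for c in customers if c["annual_spend"] < 1_000_000),
    key=lambda c: c["margin_pct"],
)

# "Which products/customers have dropped volume by more than 6 percent?"
big_drops = [c["name"] for c in customers if c["volume_change"] < -0.06]
```

The BI tool’s real contribution is not the filter logic, which is trivial, but putting a unified, current data set behind a point-and-click version of it.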
In the early stages, the BI tool is a savior. The benefits are measurable. The costs of maintenance are manageable and scalable. However, as the BI tool becomes a natural part of daily life in an organization, the low-hanging fruit gets harvested. After a while, analysts have already looked at the data every way they know how – the problem of timely, granular information no longer exists. A whole new set of concerns sets in. I hear it over and over again: “I don’t know what I don’t know.” It’s a funny refrain that rings loudly from pricing executives. “We can analyze our transactional data in almost every way possible, but I simply don’t know the right questions to ask.”
For all the immense power BI tools provide, they cannot find new pricing levers by themselves. Alerts have been set. Standard workflows are in place. Yet the power to uncover new opportunities eludes the most diligent and data-driven managers. Further, markets begin to change. “We can no longer dramatically improve pricing by simply looking at the standard workflows in the BI tool. There are too many variables in play. Every time we find an outlier in how we price, it is explained by an unforeseen wrinkle in our markets. Is there a tool in the marketplace that can mine our data, take market trends into account and eliminate the excuses for price deviation?” You bet there is.
Without a doubt, the best part of my job is seeing company after company use pricing science with amazing success. There is no better place to find the value of science than B2B distribution. What makes science so applicable in distribution? Complexity. Distributors provide a unique value proposition to their customers – everything you need to run your business can be held, managed and bought through them. But with this incredible value proposition comes massive complexity. Distributors must manage tens/hundreds of thousands of products as well as tens/hundreds of thousands of customers – each combination requiring its own unique pricing methodology, strategy and reasoning. Let’s list a fraction of concerns that distributors have:
• Customer A is highly competitive while Customer B is very loyal.
• Product No. 54 has been commoditized while No. 54A is offered only through us.
• Customers in the West are willing to pay much more than those in the Southeast.
• High cost items are highly scrutinized while lower cost items are often ignored.
• We need to sell our image items (e.g. milk in a grocery store) at a very competitive price – if I’m not competitive on those products, I’ll lose the business. Furthermore, these high visibility products vary for each customer.
These are complex issues that quickly become too onerous for an analyst and a BI tool to solve. Even more important – am I even looking at our markets the right way? We could just as easily wonder, “Do blonde salespeople get better pricing than other salespeople?” The power of science now becomes evident – with 95 percent confidence, I can tell you that hair color has no impact on price. So, what variables truly impact pricing for your company?
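The hair-color question illustrates what such a test looks like. Here is a hedged sketch using a permutation test on made-up realized prices; the prices, group split and helper function are all invented for this example and stand in for whatever testing machinery a real package uses:

```python
import random


def permutation_p_value(group_a, group_b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    n_a = len(group_a)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            hits += 1
    return hits / n_iter


# Made-up realized prices, split by a spurious attribute (salesperson hair color).
blonde_prices = [101, 98, 103, 97, 100, 102, 99, 101]
other_prices = [100, 99, 102, 98, 101, 100, 97, 102]

p = permutation_p_value(blonde_prices, other_prices)
# A p-value above 0.05 means we cannot reject "no effect" at 95% confidence.
```

On this toy data the p-value is far above 0.05: hair color shows no effect, exactly the kind of variable science lets you rule out before it wastes anyone’s time.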
Statistical algorithms (CHAID, cluster analysis and model selection, to name a few) automatically mine your data and reveal the key attributes that describe how pricing differs across customers, products and buying environments. Optimally combining these key attributes produces a pricing segmentation – distinct groupings of customers, products and environments. Pricing segmentation software utilizes the collective knowledge of your pricers, sales agents, sales managers, dealers, customers and competitors to determine the factors that drive pricing behavior. We recently worked with a distributor that had more than 200,000 distinct pricing segments. After software implementation, the distributor began pricing very differently. Large-scale price increases gave way to a much more granular process, with only those customers and products that underperformed within their own micro-segment getting the call to move in line with their peers. The result was an 80 percent jump in profit over the previous year’s price increase.
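The peer-comparison step can be sketched in a few lines. This is a toy illustration, not any vendor’s algorithm: the segments, customers and 10 percent threshold below are all made up, and a real segmentation would come from the statistical methods just described rather than hand-assigned labels:

```python
from collections import defaultdict
from statistics import median

# Hypothetical transactions already tagged with a pricing segment
# (here just region x product family; real segmentations are far finer).
transactions = [
    {"customer": "C1", "segment": ("West", "fasteners"), "realized_price": 9.80},
    {"customer": "C2", "segment": ("West", "fasteners"), "realized_price": 10.10},
    {"customer": "C3", "segment": ("West", "fasteners"), "realized_price": 8.20},
    {"customer": "C4", "segment": ("East", "fasteners"), "realized_price": 7.00},
    {"customer": "C5", "segment": ("East", "fasteners"), "realized_price": 7.10},
]

# Group realized prices by micro-segment.
by_segment = defaultdict(list)
for t in transactions:
    by_segment[t["segment"]].append(t["realized_price"])

# Flag customers priced more than 10% below their segment's median peer price.
underperformers = [
    t["customer"]
    for t in transactions
    if t["realized_price"] < 0.9 * median(by_segment[t["segment"]])
]
```

The point of the sketch is the shape of the output: not a blanket price increase, but a short list of specific customer/product combinations that lag their own peers.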
In another scenario, an aftermarket auto parts manufacturer needs to understand the effect price has on volume. It’s a classic question that every pricing executive would love to answer. If I know how volume reacts to price, then I know precisely what will happen once that 3 percent price increase hits the market next month. If I have a company directive to move more volume, I know exactly how much cheaper I must go to achieve that goal, and not a penny more. That’s powerful. There’s only one problem: If you plot your transactional data with volume on one axis and price on the other, you get complete noise. Understanding this fact is the first step to using elasticity correctly – your data is messy. It has the collective consciousness of the entire sales force in it. The hurricane that ripped through Houston, Texas, last September is in your data.
The recent banking crisis reveals itself in every report you’ve been given over the last six months. But there is good news. These external market factors can be statistically identified in your data and then surgically removed, leaving the one last major variable that affects volume – price. The transformation is quite remarkable. What once looked like wild swings in demand settles into a much more consistent and understandable pattern. What remains tells you how price changes impact volume.
Why was this so important to the auto parts manufacturer? Take a second to think about the last year of new car sales. The volume of auto parts is strongly correlated with the number of new cars sold. Consequently, volumes are down across the board (except for the maintenance parts that older cars require to keep running). To understand a price change made this year, we must remove the effect that new car sales have on volumes. When we do, uncertainty gives way to clarity as the true effect of price on volume becomes transparent. Science enters your data with a handful of external market effects in its possession; it exits with clear marching orders on where to correct poor pricing decisions and where to continue with the positive ones.
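The two-step removal can be illustrated with fabricated, noise-free numbers. A hedged sketch: the monthly series and coefficients below are invented, the data are deliberately constructed so price and the external driver are uncorrelated (real data would call for multiple regression, or residualizing both variables), and simple least squares stands in for the far richer models real pricing software uses:

```python
def ols_slope_intercept(xs, ys):
    """Ordinary least squares fit of ys = a + b * xs; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return my - b * mx, b


# Fabricated monthly data: volume falls with price AND tracks new-car sales.
price = [10.0, 12.0, 10.0, 12.0, 10.0, 12.0]
new_car_sales = [900, 900, 700, 700, 500, 500]  # external market driver
volume = [1000 + 0.5 * c - 30 * p for p, c in zip(price, new_car_sales)]

# Step 1: strip the external driver's effect out of volume.
a, b = ols_slope_intercept(new_car_sales, volume)
residual_volume = [v - (a + b * c) for v, c in zip(volume, new_car_sales)]

# Step 2: what remains responds only to price.
_, price_effect = ols_slope_intercept(price, residual_volume)
```

Plot `volume` against `price` and the driver’s swings drown out the price signal; plot `residual_volume` against `price` and the relationship snaps into focus, recovering the units-per-dollar effect the raw scatter hid.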
Science is not a panacea. Like all analytics and software, it relies heavily on data quality and assumptions. Even so, I can’t help but reiterate its importance on the evolutionary path to pricing excellence – the ability, on its own, to provide a laundry list of action items that will generate profits for your company.
Paper reports, spreadsheets, databases and business intelligence tools all have their place on the journey toward pricing excellence.
Companies evolve from one step to another for clear reasons – the pain of the previous rung and the promise of the next.
These first four steps are all improvements on the same basic concept – give someone the numbers and they can make great decisions. BI tools are the pinnacle of this slice of the evolutionary chain. However, BI tools require you to know the right questions to ask. Without an intelligent user, BI tools become useless.
Science is different. Its statistical and mathematical algorithms mine your entire data set, taking many variables into account, and give you direct marching orders: here is a list of products and corresponding customers that require a pricing action. Nobody had to be smart enough to ask the right questions; it has already been figured out.
Now that’s powerful. But it certainly isn’t the end of the evolutionary scale. Smarter pricing algorithms are being developed every day, and the most innovative companies are asking tougher questions as science becomes an ever-increasing part of how companies look at pricing.
References
1. J. Davis, 2008, “The Resilient Enterprise: Leveraging Information for Profitability and Longevity,” presented at the Revenue and Profit Management Summit, Walt Disney World, August 2008.
2. T. Davenport and J. Harris, 2007, “Competing on Analytics: The New Science of Winning,” Boston, Mass., Harvard Business School Press.