The path to predictive analytics success for distributors is not short, and your teams will go through multiple iterations of each program before you maximize your opportunities. However, following a handful of steps will get you started on your path to success.
1. Define your objectives — What are you attempting to uncover?
Although the discussion in part 1 centered on customer acquisition and retention, predictive analytics can be used to answer any number of questions:
- Which prospects are most likely to be receptive to our product/service offering?
- Which other products are customers most likely to purchase when purchasing product X?
- Which customers are most at risk of defection in the next 90 days?
- How much more is a customer likely to spend annually if we can improve their satisfaction scores?
- Which zip code will most likely experience delivery delays next year?
Keep your expectations reasonable: predictive analytics predicts what is likely to happen, but there are no guarantees. Start with objectives that touch the paths most traveled by your customers so that you focus on the improvements that will impact the most customers. Once you’ve defined the questions to be answered, it’s time to start data collection.
2. Define data sources and collect data — What data is relevant to meet my objectives?
Before you start data collection, it’s best to ensure you have enough data to give you reliable results. For example, if you’re looking to predict customer attrition, you’ll see better results from most models when the following criteria are met:
- at least 50 unique products
- over $50 million in revenue
- 10,000 or more customers
- multiple years of historical data
- order frequency of 5+ annually for at least a quarter of your customers
These are basic guidelines for one type of analysis, but they should demonstrate that the more transaction volume you have, the better you can predict the future.
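To make these rules of thumb concrete, here is a minimal sketch of a data-readiness check. The thresholds mirror the guidelines above, but the function name, field choices and two-year history floor are illustrative assumptions, not a standard; tune them to your own business.

```python
# Illustrative readiness check for an attrition-modeling dataset.
# Thresholds follow the rules of thumb in the text; adjust for your business.

def attrition_data_ready(unique_products, annual_revenue, customer_count,
                         years_of_history, orders_per_customer):
    """orders_per_customer: list of annual order counts, one per customer."""
    frequent = sum(1 for n in orders_per_customer if n >= 5)
    checks = {
        "products": unique_products >= 50,
        "revenue": annual_revenue >= 50_000_000,
        "customers": customer_count >= 10_000,
        "history": years_of_history >= 2,          # "multiple years"
        "order_frequency": frequent >= 0.25 * len(orders_per_customer),
    }
    return all(checks.values()), checks
```

Returning the individual checks alongside the overall verdict makes it easy to see which gap is holding a program back.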
Many organizations will struggle to gain access to all of the data required; focus on using what you have rather than delaying a program due to insufficient data. Data may come from spreadsheets, ERPs, CRMs, websites or data merchants; the difficulty in routinely collecting data will in large part dictate your willingness to drive predictive analytics programs. Even if the effort to gather the initial data set is tedious, it will serve as a learning experience for what to expect in the future and may lead you to take the actions necessary to ease the process.
To that end, define ways to automate data collection. Regardless of the data integration method, the easier it is to collect the data, the more likely you’ll incorporate it into the analysis.
Determining where to store the data is also important, as you want to ensure that both associates and external partners have controlled access to the data as needed. Also consider the skills required to best manage data aggregation; often teams attempt to do more than they are capable of supporting on their own and efforts stall quickly out of frustration.
3. Improve data quality — Which attributes are critical for success?
This may be the most difficult step in the process, as it requires execution by most members of the organization. Regardless of the data source, if the data was entered incorrectly, using it will skew your results and erode support among associates who know bad data is fueling the model.
Many organizations lack adequate data governance leaders or policies, so consider assigning a single individual to oversee all data activities, someone capable of defining all reliable and trusted sources of data within the organization. This step will help to ensure not only good predictive analytics programs, but also will reduce the risks associated with data mismanagement of personally identifiable information, product content, etc., that can lead to poor productivity or fines.
With or without a data governance leader, define data formats so the data is organized in a consistent structure, allowing cleansing, normalization, de-duplication and data enhancement efforts to be executed easily. Ensuring that products are assigned to the right category, that customers carry the correct SIC/NAICS code and that non-standard orders are flagged with a distinguishing attribute are just a few examples of how data quality can be improved, ultimately leading to better predictive models.
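As a small illustration of what cleansing and de-duplication can look like in practice, the sketch below normalizes customer records and drops duplicates. The field names (`name`, `zip`, `naics`) and the match key are assumptions for the example; real matching logic is usually richer.

```python
# A minimal cleansing sketch: normalize and de-duplicate customer records.
# Field names and the (name, zip) match key are illustrative assumptions.

def normalize(record):
    return {
        "name": " ".join(record["name"].upper().split()),  # trim/collapse whitespace
        "zip": record["zip"].strip()[:5],                  # keep 5-digit zip
        "naics": record.get("naics", "").strip() or "UNKNOWN",
    }

def deduplicate(records):
    seen, clean = set(), []
    for rec in map(normalize, records):
        key = (rec["name"], rec["zip"])
        if key not in seen:        # keep first occurrence of each key
            seen.add(key)
            clean.append(rec)
    return clean
```

Even this much structure surfaces problems quickly, such as how many customers are missing an industry code entirely.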
4. Buy/build a model — How can you make progress immediately?
If there is a step in the process that will create internal discussion, this is the one. Questions will arise about who makes the decision about how to move forward, who will do the analysis, who will use the analysis, etc. This highlights the importance of the first step, defining your objectives. It is critical that these answers are clear before moving out of the first step. The reality is that these programs will often overlap multiple functional teams, so it is important that these teams be involved in the decision-making process as objectives and associated commitments are defined.
Some organizations will be inclined to build their own models, something that often requires data scientists to execute effectively. If you decide to start simple or find yourself on a limited budget, internal resources may be enough until you get leadership support to expand the program and enlist external expertise. Having predictive modeling and analytics software in house will help get you off to the right start, provided you have the talent to leverage the software.
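To show how simple a first internal model can be, the sketch below flags attrition risk by comparing each customer’s order count in the last 90 days with their historical quarterly pace. The 0.5 ratio threshold and the data shape are illustrative assumptions, not a recommended methodology; a real program would tune and extend this considerably.

```python
from datetime import date

# A deliberately simple first-pass attrition flag: a customer is "at risk"
# if their last-90-day order count falls below half their quarterly average
# over the past year. The 0.5 threshold is an assumption to be tuned.

def attrition_risk(order_dates, today, lookback_days=365, recent_days=90):
    history = [d for d in order_dates if 0 <= (today - d).days < lookback_days]
    recent = [d for d in history if (today - d).days < recent_days]
    quarters = lookback_days / recent_days
    expected_per_quarter = len(history) / quarters
    if expected_per_quarter == 0:
        return False  # no baseline to compare against
    return len(recent) < 0.5 * expected_per_quarter
```

Crude as it is, a rule like this gives sales teams a concrete list to react to while a more sophisticated model is being evaluated.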
If the decision is made to bring in external resources, there are several options that will allow you to get started without significant levels of commitment while still providing results almost immediately. For many, this is the easiest path to quickly determining the viability of these programs. Often, providing just a few years of transaction data can be enough to get started with an attrition program, for example.
In some cases, the initial analysis can be presented back to you within weeks so that you can review the results and then decide whether and how you’d like to move forward with an ongoing program of analysis.
5. Begin the program and validate the model — What seems to be working ... or not?
Now you’re ready to see the results of your efforts. At this step, you’ll be testing the model using any number of methods to ensure the model works well in different circumstances. In particular, you’ll be looking for characteristics in the data that create anomalies that should be addressed so as not to create inaccurate results. In the wholesale distribution world, these could be factors like customer status (new to company, credit hold, unassigned rep, etc.) or product status (out of stock, discontinued, recalled, etc.).
Most often, you’ll use regression testing to validate the model. Over several weeks or months, the model will become more accurate as modifications are made. Be sure to include all impacted parties in the review of the results so that buy-in to the subsequent actions is not a struggle. Getting different perspectives on the analysis will also help you uncover any challenges with the model’s accuracy. Treat regression testing as your friend for documenting that accuracy; this will be critical when opponents of the program attempt to devalue the results based on prior perceptions about their customers’ behaviors.
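A simple way to document accuracy is to backtest predictions against a held-out historical period where the outcomes are already known. The sketch below computes precision and recall for a churn flag; the dictionary data shape is an assumption for the example.

```python
# A minimal backtest sketch: score churn predictions against known outcomes
# from a held-out historical period. The data shape is an illustrative
# assumption: dicts mapping customer id -> bool ("churned?").

def backtest(predictions, actuals):
    tp = sum(1 for c, p in predictions.items() if p and actuals[c])
    fp = sum(1 for c, p in predictions.items() if p and not actuals[c])
    fn = sum(1 for c, p in predictions.items() if not p and actuals[c])
    precision = tp / (tp + fp) if tp + fp else 0.0  # of flagged, how many churned
    recall = tp / (tp + fn) if tp + fn else 0.0     # of churners, how many flagged
    return {"precision": precision, "recall": recall}
```

Numbers like these, tracked over each model revision, give you something objective to show skeptics instead of debating gut feel.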
6. Establish an internal communication plan — What messages are important to disseminate?
All of this modeling is wasted effort if the output of the analysis is not used to improve performance and increase customer satisfaction. Even before the first step, begin to create an internal communication plan that encompasses all associates in the organization. Whether it’s sales reps pushing back on attrition analysis because they know their customers better than some new predictive model, or a merchant who doesn’t believe an online tool can better select the right product to pair with another, the truth is that associates will naturally push back. This is healthy and should be managed so that all parties involved reach a similar conclusion by the time the program goes live. The analysts must remember to leverage the years of experience and industry knowledge of other associates as they develop and refine the program.
A communication plan that ensures transparency throughout the process is a great step in the right direction. Consider a regularly scheduled communication, like a brief video that shares progress on the program and the feedback being received. Even after the program launches, continue to communicate the impact to the business and the assumptions that have been validated and/or disproved.
The Future is Yours to Predict
There’s a lot of noise about big data, artificial intelligence and whatever other buzzword comes up this week; the truth remains that somewhere in all that jargon lies the reality of improved performance if you harness the information appropriately. Drop the expectations that you’ll have a world-class program in place in six months and instead simply commit to starting with the basics. After you’ve been in the game for a while, build on the knowledge you’ve gained to create programs that will ensure your prospecting and retention efforts are generating record levels of sales growth.
Oh, by the way, the predictive models and associated program designs are the easy part; the cultural change is much more difficult. I’ll cover that another day.
Mark Linder is CEO of Charleston, South Carolina-based MARV LANDERS, LLC. He has more than 20 years of experience leading multi-billion-dollar distributors through rapid changes on their digital journeys. Linder has achieved digital success by relying on the people and strategies required to effectively move organizations through significant strategic and operational changes. Reach him at email@example.com.