3.13.2019
Maximizing Customer Revenue With Analytics
By Tim Connell
The key to an effective customer experience is to know your customer and individualize your interactions to best fit their needs and situation. Doing so allows you to maximize revenue per customer. With a small customer base, this can be done in person and based on ongoing relationships. With a larger customer base, it gets much more difficult. You may need to customize interactions with customers you’ve never met in person, and likely will never meet. This is where analytics, especially predictive analytics, comes into play.
Paying Attention to the Present
Analytic tools can help you segment your customer base, identifying groups of customers that are more similar to each other than to other customers. Analyzing the buying patterns of those groups gives you models that can help to predict the likely purchases of new customers. All you need to know is which group the new customer is most like.
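As a rough illustration, here is what segmentation can look like in code. This sketch uses k-means clustering from scikit-learn; the customer features (annual_spend, order_frequency) and the choice of two segments are hypothetical placeholders, not a prescription:

```python
# A minimal customer-segmentation sketch using k-means clustering.
# Feature names and values are hypothetical placeholders.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

customers = pd.DataFrame({
    "annual_spend":    [120, 95, 3400, 40, 2900, 150],
    "order_frequency": [2, 1, 24, 1, 30, 3],
})

# Put spend (dollars) and frequency (counts) on a comparable scale.
scaler = StandardScaler().fit(customers)
features = scaler.transform(customers)

# Split the customer base into k groups of similar customers.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
customers["segment"] = kmeans.labels_

# Assign a new customer to the segment they most resemble.
new_customer = pd.DataFrame({"annual_spend": [200], "order_frequency": [4]})
print(kmeans.predict(scaler.transform(new_customer)))
```

Once a new customer is assigned to a segment, the buying patterns of that segment become the starting point for predicting what they are likely to purchase.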
Similarly, you can look at the pattern of product purchases (sometimes called market basket analysis) to see which pairs of items are often purchased together. When you see interest in or purchase of one side of the pair, you can recommend the other.
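At its simplest, market basket analysis is just counting how often items appear in the same order. The sketch below shows that core idea with made-up orders; real implementations add measures like support and lift, but the counting step is the heart of it:

```python
# A bare-bones market basket sketch: count how often item pairs
# appear in the same order. The orders below are made-up examples.
from collections import Counter
from itertools import combinations

orders = [
    {"coffee", "filters", "mug"},
    {"coffee", "filters"},
    {"tea", "mug"},
    {"coffee", "mug"},
]

pair_counts = Counter()
for order in orders:
    for pair in combinations(sorted(order), 2):
        pair_counts[pair] += 1

# The most frequent pairs are natural "bought together" recommendations.
print(pair_counts.most_common(3))
```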
The more data you have and the more analytic tools you apply, the more individualized you can be. Combining customer grouping and item analysis gives you different item pairings for different groups. Adding in sales channels and other additional data helps refine these predictions even more.
Learning From the Past
In most situations, predictive tools are ‘trained’ using historical data. Each historical record is fed to an algorithm along with the ‘correct’ answer for that record. The algorithm then identifies the patterns in the data that best predict those ‘correct’ answers. When given new data, the algorithm looks for these same patterns and returns a likelihood based on the best match.
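In code, this train-then-score cycle is compact. The toy example below uses scikit-learn; the column names (visits_last_month, emails_opened, purchased) are hypothetical stand-ins for whatever historical fields you actually have:

```python
# A toy illustration of 'training' on labeled historical data,
# then scoring a new record. Column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.DataFrame({
    "visits_last_month": [1, 8, 2, 12, 0, 9],
    "emails_opened":     [0, 5, 1, 7, 0, 6],
    "purchased":         [0, 1, 0, 1, 0, 1],  # the 'correct' answer
})

X = history[["visits_last_month", "emails_opened"]]
y = history["purchased"]

model = LogisticRegression().fit(X, y)  # learn patterns from history

# For a new customer, return a likelihood rather than a hard yes/no.
new_record = pd.DataFrame({"visits_last_month": [6], "emails_opened": [4]})
print(model.predict_proba(new_record)[0, 1])  # probability of purchase
```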
Finding the Best Approach
Point-and-click analytical tools that can generate moderately good predictions are increasingly available to non-technical users. Some of these tools even use predictive models to help you build your own. However, there are a lot of ways to build a bad or mediocre model, and fewer ways to build one that really works. Accurate predictions involve a level of knowledge about data and statistics that may require the help of an experienced expert.
Avoiding the Pitfalls
The biggest pitfall to watch out for is ‘overfitting’. An overfitted model is one that relies on patterns in the training data (used in development) that don’t exist in the real-world data. Experienced modelers follow a defined methodology and bring a number of statistical concepts and tests to bear to help reduce the likelihood of overfitting.
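One simple test that experienced modelers rely on is comparing performance on the training data to performance on data the model has never seen. The sketch below illustrates the idea with synthetic data; a large gap between the two scores is the classic signature of overfitting:

```python
# A quick way to spot overfitting: compare performance on the
# training data to performance on held-out data. Data is synthetic.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained tree can memorize the training set.
deep = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print(deep.score(X_train, y_train), deep.score(X_test, y_test))
# e.g. a perfect training score but a noticeably lower held-out score

# Constraining the model typically narrows the gap.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(shallow.score(X_train, y_train), shallow.score(X_test, y_test))
```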
The old saying “Garbage In, Garbage Out” also applies. Predictive analytics tools are highly dependent on the quality of the data they use. Predictive algorithms will find whatever patterns exist in a data set, even if those patterns are based on inaccurate, mis-entered, missing, or otherwise invalid data. As a result, the quality of the underlying data set has an enormous impact on the accuracy and efficacy of the resulting tool.
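Even a few basic sanity checks can surface these problems before modeling begins. The sketch below uses pandas on a made-up order table; the columns and the checks shown are illustrative, not exhaustive:

```python
# A few basic data-quality checks before modeling.
# The order data below is a made-up illustration.
import pandas as pd

orders = pd.DataFrame({
    "order_id":    [101, 102, 102, 103],
    "order_total": [24.99, 15.00, 15.00, -7.50],
    "zip_code":    ["60601", None, None, "60605"],
})

print(orders.isna().sum())                # missing values per column
print(orders.duplicated().sum())          # exact duplicate rows
print(orders[orders["order_total"] < 0])  # impossible values to investigate
```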
Further, it is very important to ensure that the data consumed by a predictive algorithm exists prior to when the prediction needs to be made. This may seem obvious, but you’d be surprised how often this is missed. Because we use historical data to create predictive models, it is possible to build a model that relies on data that is not entered or available until after the prediction needs to be made. Such models will look great in development and testing but will fail spectacularly in the field because you are asking the algorithm to make predictions using data that doesn’t exist yet.
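One practical safeguard is to explicitly flag each field by when it becomes available, and train only on fields that exist before the prediction is needed. The sketch below shows the idea; the field names and availability labels are hypothetical:

```python
# A sketch of screening out fields that won't exist at prediction time.
# Column names and availability labels are hypothetical.
import pandas as pd

history = pd.DataFrame({
    "visits_last_month":   [1, 8, 2],
    "emails_opened":       [0, 5, 1],
    "return_processed":    [0, 0, 1],   # only recorded after the sale
    "shipping_delay_days": [2, 1, 5],   # not known until after shipment
    "purchased":           [0, 1, 1],
})

# Flag each field by whether it exists before the prediction is made.
available_at_prediction = {
    "visits_last_month": True,
    "emails_opened": True,
    "return_processed": False,
    "shipping_delay_days": False,
}

feature_columns = [c for c, ok in available_at_prediction.items() if ok]
X = history[feature_columns]  # train only on data that exists in time
y = history["purchased"]
```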
Effective but Always in Flux
Predictive analytics is not 100% accurate, and never will be. Ultimately, it just tells us when something has a higher likelihood of happening. A big part of implementing a predictive tool is deciding what to do when it is right and how to respond when it is wrong. For example, an online retailer wouldn’t automatically add something to a customer’s cart just because the algorithm said they are more likely to purchase it. It would, however, add it to a “you might also like…” screen that makes it easier for the customer to add it themselves.
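That decision often comes down to a simple threshold: items whose predicted likelihood clears a business-chosen cutoff get suggested, and nothing is acted on automatically. The likelihoods and cutoff below are made-up illustrations of that pattern:

```python
# A sketch of acting on likelihoods rather than certainties: surface
# likely items as suggestions instead of adding them automatically.
# The likelihoods and the threshold are made-up illustrations.
predicted_likelihoods = {
    "coffee filters":   0.72,
    "travel mug":       0.41,
    "espresso machine": 0.08,
}

SUGGESTION_THRESHOLD = 0.30  # hypothetical business-chosen cutoff

you_might_also_like = sorted(
    (item for item, p in predicted_likelihoods.items()
     if p >= SUGGESTION_THRESHOLD),
    key=lambda item: -predicted_likelihoods[item],
)
print(you_might_also_like)  # ['coffee filters', 'travel mug']
```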
Similarly, predictive analytics is not a “one-and-done” effort. The context and data patterns that models rely on are subject to change over time, especially as new products are offered, new markets are entered, and customers churn. Models should be evaluated regularly to ensure predictive accuracy and should be updated if they don’t meet established standards.
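In practice, that evaluation can be as simple as tracking accuracy on recent outcomes and flagging the model when it drifts below an agreed standard. The scores and the 0.70 standard in this sketch are hypothetical:

```python
# A sketch of routine model monitoring: track accuracy on recent
# outcomes and flag the model for retraining when it drifts.
# The monthly scores and the 0.70 standard are hypothetical.
recent_accuracy = [0.81, 0.79, 0.76, 0.71, 0.66]  # e.g. monthly scores

ACCURACY_STANDARD = 0.70

for month, accuracy in enumerate(recent_accuracy, start=1):
    if accuracy < ACCURACY_STANDARD:
        print(f"Month {month}: accuracy {accuracy:.2f} below standard; "
              "schedule model review and retraining.")
```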
About RedMane
RedMane provides software solutions and systems integration services that address complex, real-world challenges in human services, healthcare, and the commercial sector. We are a problem-solving company. Technology is just one of our tools.
RedMane’s analytics practice helps organizations use data to improve outcomes, reduce costs, and drive new revenue. Our experts help clients collect, aggregate, visualize, predict, model, and manage their data to enable real insight.
To speak with a RedMane expert about your needs, email info@redmane.com or call 773-331-0001 today.
About Tim Connell
Tim Connell is a data scientist and solution architect at RedMane Technology. He’s been performing data analysis and implementing solutions for over 25 years. Tim’s work spans commercial firms and the public sector. He holds a Ph.D. from the University of Wisconsin.