I recall my first week on the job at Lowe’s in September of 1992. I had worked with Lowe’s as a consultant (Ernst & Young) for nearly six months before joining as an employee, so I was well aware of Lowe’s ambitious growth plans. I had a meeting with the newly appointed head of technology (at that time, we called it Management Information Systems) to discuss a recommendation I had made as a consultant. I proposed building a data warehouse to support Lowe’s transformation from a chain of small-footprint hardware stores to big-box retailing.
At the time, my recommendation was considered “out there” by the leaders of the business. Needless to say, those business leaders viewed the proposal with a good bit of skepticism. Luckily, the new head of IT, Bill Irons, saw the potential in data warehousing and gave me the green light to start the project. I immediately attended a seminar hosted by the father of data warehousing, Bill Inmon, to get more information on the proper way to build a data warehouse. After the seminar I sat down with Bill at a bar and had a great conversation on different ways to approach data warehousing in retail. By mid-October of 1992, we had assembled a small team to begin the effort. We tried to convince the business to invest in Teradata, but honestly, it was too early in our evolution. Without Teradata, we built our initial data warehouse on IBM DB2. The team also built a custom graphical query builder and BI tool that launched with the data warehouse in June of 1993.
The Lowe’s data warehouse became an integral part of the technology used to support a massive business transformation. From 1992 to 2000, Lowe’s revenue grew more than 5X to nearly $20B as the company completed a successful transformation into a national chain of DIY superstores. The original Lowe’s data warehouse project would be the first of many in my career focused on analytics. I have had the good fortune of working with incredibly talented teams that built sophisticated business intelligence (BI) solutions in retail. From data marts, data lakes, and big data to complex tactical reports, data mining, graphical dashboards, data visualization, alert-based exceptions, refined forecasting models, and embedded analytics, our teams tackled it all.
Throughout all of these efforts, building trust and managing organizational change were consistent challenges. Building trust in analytics is not a trivial task. Mark Twain popularized the old saying, “there are three kinds of lies: lies, damned lies, and statistics.” The sentiment behind that saying has always been alive and well in business. I could write multiple articles on the challenges of building trust in data, but suffice it to say that in any form of BI application, it is job number one. Without a foundation of trust, BI applications simply have no chance of success.
Other retail leaders I spoke with repeated this theme when we discussed BI. The leaders often cited culture as an inhibitor to adoption, especially in those organizations that were traditionally paper based. One leader noted users trust “their” data but are often highly suspicious of new or unknown data sources.
Change management is often just as difficult. Incorporating analytics into an established decisioning process is not as straightforward as it might seem. Using new tools, changing processes, and stoking the fires of curiosity in business users sometimes requires innovative approaches. Over the years, we developed clever acronyms and names for new tools, employed a variety of training methods, and partnered with business leaders to move projects “over the hump” and into production. However, the one method that consistently produced results we called “embrace the incumbent.” In other words, we used what was familiar to the user as a means of getting to a new way of operating.
Our approach to rolling out a new graphical BI dashboard to Lowe’s store managers is a great example of this approach. Initially, these dashboards went over like a lead balloon with the store managers. The store managers were very accustomed to using an old report we called “Vital Statistics.” Vital Stats (for short) was definitely one of the more convoluted reports I have ever seen. However, it contained a good amount of the information store managers wanted to see on a daily basis, hence the name. Our BI team felt the dashboards were a huge step forward but getting the store managers to change behavior was more difficult than envisioned.
We decided the best approach was to “embrace the incumbent.” Our team painstakingly recreated the Vital Stats report in our MicroStrategy BI tool. The team developed “hot spots” in the body of the report where the store managers could click to see more information (which, coincidentally, happened to be our new dashboards). We made the newly developed Vital Stats report the launching pad for the store managers. When a store manager clicked on the icon to view daily performance, they saw their familiar Vital Statistics report in a new digital format. They soon learned they could click on various parts of the report and see much more targeted information about their performance in an easier-to-digest manner. At the next year’s national sales meeting, a large number of store managers went out of their way to find me and tell me how valuable the new dashboards were and how intuitive they were to use. Had we not embraced a familiar interface as the launching point, I am not sure we would have achieved the same outcome.
From my discussions with other retail and technology executives, it is obvious the behavior I observed at Lowe’s is not isolated. Every executive I spoke with said user adoption was the biggest obstacle to obtaining a positive return on their BI investments. We all know that the user experience with interactive dashboards and alerts is vastly superior to crunching Excel spreadsheets or riffling through hundreds of pages of printed reports. Yet, many business users cling to what they consider “known and comfortable.”
When confronted with business users that were reluctant to change, many executives worked to create a sense of urgency or need. As one executive put it, “necessity often drives change.”
Let’s fast-forward to our current state of business intelligence. Slowly but surely, digital reporting and dashboards have replaced the paper report. While there are still some holdouts, most executives have concluded that digital and interactive trumps paper and static. Today, business leaders can open interactive dashboards on their desktops, laptops, tablets, or smartphones and receive far more targeted information than was ever conceivable with paper-based reports. In addition, these platforms offer progressive drill-down capabilities to zero in on anomalies.
However, for all these advances, the vast majority of business intelligence platforms still focus on two types of analytics: descriptive (what happened) and diagnostic (why it happened). Accurate history and diagnostics aid the decision process, but for complex multivariate decisions with no obvious correct choice, they are woefully inadequate. Decision-makers often rely on their experience (past choices) and “gut feel” to make these complex decisions. The results are generally quite mixed, especially in today’s digital economy, where customer preferences and behaviors are changing more rapidly than ever.
Artificial intelligence is a natural fit for these types of decisions. Computer algorithms can be tuned to bring together millions of pieces of information, understand relevant trends, and simulate multiple approaches to arrive at a “best” decision. Consider a few examples from the retail industry. Which products should be promoted, and when? How much product should be made or purchased? How much inventory should be allocated to specific stores versus retained in the distribution network? Which product prices should be reduced (markdowns), when, and where? How much labor does a specific store need to achieve the best sales and service outcomes?
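To make the “simulate multiple approaches” idea concrete, here is a deliberately simplified sketch of one of those decisions, the markdown question. Everything in it is an illustrative assumption: the constant-elasticity demand model, the elasticity value, and the prices are toy numbers, not a real retail pricing algorithm.

```python
# Toy markdown optimizer: simulate several candidate price reductions
# against an assumed demand model and pick the one that maximizes
# projected revenue. All numbers and the model itself are hypothetical.

def projected_revenue(base_price, base_units, markdown, elasticity=1.5):
    """Estimate revenue after a markdown, assuming a crude linear
    price-elasticity model (a 10% cut lifts unit sales ~15%)."""
    new_price = base_price * (1 - markdown)
    new_units = base_units * (1 + elasticity * markdown)
    return new_price * new_units

def best_markdown(base_price, base_units, candidates=(0.0, 0.1, 0.2, 0.3)):
    """Simulate each candidate markdown and return the best-performing one."""
    return max(candidates,
               key=lambda m: projected_revenue(base_price, base_units, m))

if __name__ == "__main__":
    choice = best_markdown(base_price=25.0, base_units=100)
    print(f"Best simulated markdown: {choice:.0%}")
```

A production system would replace the one-line demand model with a learned forecast and add constraints (inventory on hand, margin floors, competitor prices), but the shape is the same: enumerate or search candidate decisions, simulate each, and rank the outcomes.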
All of these decisions require insights across multiple parts of the business, in addition to external data such as market conditions and consumer trends. Yet, in retail today, the vast majority of these decisions are made primarily by relying on past performance as the sole predictor of future outcomes. This phenomenon isn’t for lack of technology. As I walked the NRF Expo 2020 floor, I observed many solutions that leveraged the power of AI to make these types of decisions.
With continued drops in computing and storage prices, the infrastructure for AI is more affordable than ever. AI-powered algorithms are finding their way into many packaged software solutions. Also, AI and machine learning (ML) platforms from the “big 3” providers (AWS, Azure, Google) are more powerful and more accessible than ever before. However, even with high-quality AI solutions available at more affordable rates, we are not yet seeing widespread adoption of AI in the critical areas of retail decisioning. Why? I believe part of the answer lies in applying the lessons we have learned along the BI journey.
Any seasoned retailer will tell you, “retail is part science and part intuition (or feel).” Many decision-makers see AI as a “black box” that spits out a decision. They don’t believe a machine can have the same level of intuition or insight into complex retail problems. In other words, many retail decision-makers are not ready to turn over critical decisions to computers and algorithms.
As we learned with BI, sometimes we have to “embrace the incumbent” to enact change in an organization. Hence the emergence of the intelligent digital assistant. An intelligent digital assistant (IDA) leverages predictive analytics and AI to build a set of recommendations for review by a decision-maker. The decision-maker can interact with the IDA to refine the rules and parameters governing a decision and execute simulations or “what-if” scenarios to optimize decisions.
In other words, instead of making a decision, the IDA makes recommendations that can be reviewed and enacted by a human decision-maker. The difference is subtle but significant.
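That recommend-review-enact split can be sketched in a few lines of code. This is a hypothetical illustration of the IDA workflow described above, not any vendor’s product: the SKU data, the stand-in confidence score, and the review thresholds are all invented for the example.

```python
# Hypothetical IDA workflow: the assistant proposes actions with a
# confidence score; a human decision-maker (modeled here as a simple
# approval rule) reviews and enacts them. All data and thresholds are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Recommendation:
    sku: str
    action: str
    confidence: float

def recommend_reorders(inventory, threshold=20, confidence_floor=0.6):
    """Propose reorders for SKUs below a stock threshold. The confidence
    score is a stand-in for a real predictive model's output."""
    recs = []
    for sku, on_hand in inventory.items():
        if on_hand < threshold:
            confidence = 1 - on_hand / threshold  # lower stock -> higher urgency
            if confidence >= confidence_floor:
                recs.append(Recommendation(sku, "reorder", round(confidence, 2)))
    return recs

def human_review(recs, approve_above=0.7):
    """The decision-maker, not the algorithm, enacts the final decision.
    An interactive review session is reduced here to one adjustable rule."""
    return [r for r in recs if r.confidence >= approve_above]

if __name__ == "__main__":
    inventory = {"hammer": 5, "drill": 18, "saw": 40}
    proposed = recommend_reorders(inventory)
    for r in human_review(proposed):
        print(f"Enact: {r.action} {r.sku} (confidence {r.confidence})")
```

The key design point is that `human_review` sits between the algorithm and the action: the decision-maker can tighten or loosen `approve_above` (or the parameters feeding the recommendations) and rerun the simulation, which is exactly the “what-if” interaction the IDA is meant to support.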
In part II of this article, I will describe the characteristics of the IDA and how it serves as the perfect bridge between business intelligence and artificial intelligence.