It takes gritty work to make an AI application succeed—but it can be worth the effort
After the flurry of hype, artificial intelligence is on the rise, enabling smarter, more efficient and more effective businesses. Once-scrappy experiments have now matured into high-impact enterprise applications.
Many organizations, however, are still unclear about how to build AI solutions that can deliver business value. AI success stories often focus on futuristic themes with machines aiding humans and complex-sounding concepts such as deep learning and evolutionary algorithms, but they offer little insight into the foundational work required to bring these concepts to life. In this article, we’ll take you through one such glamorous AI story but then delve into the non-glamorous (yet essential) underbelly of the same solution. Our intention is to arm you with information that can help set realistic expectations across your organization. And while our story is mostly about a biotech company, the concepts are applicable across all industries.
AI success story: Orchestrating promotions to maximize customer engagement
One of the areas feeling the greatest impact of artificial intelligence is the coordination of customer engagement: channels, content and timing.
Consider this example: A biotech marketer saw that customers’ affinity for engaging with the sales force, an expensive channel, was decreasing. At the same time, engagement rates for alternative channels such as email and mobile push notifications were low and declining. The marketer’s task was to orchestrate personal and non-personal touches, leveraging the available content, to improve overall engagement. What the marketer needed was a series of recommendations for each customer so that effort across channels could be coordinated effectively and efficiently, with a view to increasing customer engagement.
The glamorous side
To make this move from complex and uncoordinated to streamlined and orchestrated, the biotech marketer needed an elaborate machine learning pipeline that 1) quantified customer channel and message preferences, 2) translated the promotional sequence into engagement value, and 3) optimized the recommended sequence. This required algorithms like those that drive Netflix’s and Amazon’s product recommendations, run on integrated data that is intricately stitched together and hosted on a data lake.
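To make those three stages concrete, here is a minimal Python sketch of the pipeline’s shape. The channel names, the smoothed affinity estimate, the fatigue discount and the greedy search are all simplifying assumptions for illustration, not the production system.

```python
import numpy as np

# Hypothetical channels and illustrative numbers, not the real system.
CHANNELS = ["rep_visit", "email", "push"]

def quantify_preferences(engagements, exposures):
    """Stage 1: estimate per-channel affinity as a smoothed
    engagement rate (engagements per exposure)."""
    return (engagements + 1.0) / (exposures + 2.0)  # Laplace smoothing

def predict_engagement(affinity, sequence, fatigue=0.9):
    """Stage 2: translate a promotional sequence into an expected
    engagement value, discounting repeat touches on a channel."""
    value, uses = 0.0, {c: 0 for c in CHANNELS}
    for channel in sequence:
        value += affinity[CHANNELS.index(channel)] * fatigue ** uses[channel]
        uses[channel] += 1
    return value

def optimize_sequence(affinity, length=4):
    """Stage 3: greedily build the highest-value sequence; a production
    system might use an evolutionary algorithm instead."""
    sequence = []
    for _ in range(length):
        best = max(CHANNELS,
                   key=lambda c: predict_engagement(affinity, sequence + [c]))
        sequence.append(best)
    return sequence

affinity = quantify_preferences(np.array([2.0, 5.0, 1.0]),    # engagements
                                np.array([10.0, 40.0, 8.0]))  # exposures
print(optimize_sequence(affinity))
# -> ['rep_visit', 'rep_visit', 'rep_visit', 'push']
```

The greedy search here is only a stand-in: it illustrates how a predicted engagement value turns into a recommended sequence of touches per customer.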
But has it worked? Did the overall engagement increase? How exactly would we measure this?
The solution has been in place for more than a year. It was first deployed in situations with historically poor customer engagement, and it did well: customer engagement scores improved by nearly 25%. With these promising results and a new infrastructure in place, the marketer is now looking to expand the scope. As for ROI, measured by the incremental sales resulting from improved engagement, we see significant returns even in this limited-scope deployment.
AI success has not been limited to this application alone; several others are in the works. For example, AI is now being used to predict different patient events ahead of time, including disease incidence, treatment progression and churn, so that appropriate proactive interventions can be planned. In the clinical trials space, brands can now use AI to predict a trial’s progress and the timing of its eventual completion, so that the necessary corrective or preventive actions can be taken to ensure timely commercialization. In manufacturing, streaming data from devices can be used to predict batch non-compliance, which, if addressed in a timely manner, would prevent significant product waste.
The non-glamorous side
There’s a lot that AI and machine learning can help your business with, but to unleash their power, you first must put in place some essential components that are, well, not as glamorous. These three components form the foundation on which the glamorous side rests.
1. The necessary data: Data is at the heart of any successful AI application. If you’re a tech giant like Google, you already have vast data sets about your customers at your disposal. But in a typical enterprise, customer data is much harder to coalesce. It’s scattered across different vendors, each of them holding a piece of the customer puzzle in their databases. This necessitates procuring data from every vendor and signing contracts with each that include agreements on frequency of delivery and quality levels. To make them analytics-ready, all these different data sets must then be stitched together by applying several business rules to clean up the “noise,” or the inevitable data-integrity issues. If some of these data sets are unstructured (free text, audio, video, etc.), then they require advanced techniques to extract relevant information to power analytics. All of these efforts can mean huge investments and significant effort from key personnel.

[Figure: Common data sources for promotional channels]
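As a simplified illustration of what this stitching looks like in practice, the pandas sketch below merges two hypothetical vendor feeds on a shared customer key and applies a couple of representative cleanup rules. The feeds, column names and rules are invented for illustration.

```python
import pandas as pd

# Two hypothetical vendor feeds; column names are invented for illustration.
emails = pd.DataFrame({
    "cust_id": ["C1", "C2", "C2"],
    "sent_at": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-01-06"]),
    "opened": [True, False, False],
})
visits = pd.DataFrame({
    "CUSTOMER": ["c1", "C3"],  # this vendor uses a different key format
    "visit_date": pd.to_datetime(["2024-01-04", "2024-01-07"]),
})

# Business rule 1: normalize the customer key across vendors.
visits["cust_id"] = visits["CUSTOMER"].str.upper()

# Business rule 2: drop exact duplicate email records ("noise").
emails = emails.drop_duplicates()

# Stitch the feeds into one analytics-ready table per customer.
merged = emails.merge(visits[["cust_id", "visit_date"]],
                      on="cust_id", how="outer")
print(merged)
```

Real integrations involve many more feeds and far messier rules, but the pattern, normalize keys, remove noise, then join, is the same.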
In the example we shared earlier, data from 10 different vendors covering various promotional channels was integrated and configured so that it could be refreshed weekly. It took more than a year to negotiate the data contracts with the vendors and to align them all on a shared delivery schedule. The increased frequency of data updates also increased data costs.
Data governance was another issue: elaborate business and analytical processes had to be designed and developed to stitch together all of the data sets from disparate sources, and AI demanded that this be a continuous rather than a one-time process.
2. The algorithms: There is no master algorithm—at least not yet. Algorithm modeling is a time-consuming process with no shortcuts. Data scientists consider a broad set of available algorithms and pick the one best suited to the problem at hand. This can be a tedious process: often, 10 or more potential approaches are modeled, considered and evaluated for fit and accuracy before a final choice is made. Ideally, algorithms are set up to self-learn from incoming data.
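To give a flavor of that evaluation loop, here is a minimal scikit-learn sketch that compares a few candidate models by cross-validated AUC. The synthetic data and the shortened candidate pool are stand-ins for the real engagement data and the 10-plus approaches a team would actually try.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the real engagement data.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# A shortened candidate pool; in practice 10 or more approaches are compared.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Evaluate each candidate for fit and accuracy via 5-fold cross-validation.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {scores.mean():.3f}")
```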
A next-best-action/engagement-prediction problem like the one we described earlier will draw on a range of algorithms (see the Glossary below): recommender systems that help predict a user’s preferences (collaborative filtering, mixed effects models), sequence pattern mining methods that hunt for patterns in data (SPADE, RNN/LSTM, convolutional neural networks) and, finally, execution optimization (evolutionary algorithms, dynamic programming).
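As one concrete instance of the recommender-system piece, the sketch below uses a truncated SVD, a simple stand-in for collaborative filtering, to predict a customer’s preference for channels with little observed engagement. The matrix values are illustrative, not real data.

```python
import numpy as np

# Rows = customers, columns = channels (rep visit, email, push, webinar).
# Entries are observed engagement rates; values are illustrative.
R = np.array([
    [0.8, 0.1, 0.0, 0.3],
    [0.7, 0.2, 0.1, 0.2],
    [0.1, 0.6, 0.5, 0.0],
])

# Low-rank factorization via truncated SVD: customers with similar
# observed behavior share latent "taste" factors, which fills in
# preferences for channels they have barely been exposed to.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2  # number of latent factors to keep
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Predicted preference for every customer on every channel.
print(np.round(R_hat, 2))
```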
As this solution matures, new metrics to better understand engagement must be designed and implemented.
3. The technology infrastructure: Because data is increasingly digital—and bulkier, less structured and generated at faster rates—you need to upgrade your infrastructure to manage it. This means storing the data more cheaply, processing large volumes and a variety of data quickly, and having the flexibility to run AI algorithms.
Given the current state of technology, there is no single, unified tool that supports both data processing and algorithms. This is primarily because the optimization techniques that make databases fast are not the same ones that make algorithms run efficiently, which has so far prevented the creation of a unified platform. The AI technology community has recognized this problem and is working toward solving it, but a solution is a few years away. Until then, today’s technology platform needs the components necessary to support data processing and algorithms, plus the seamless transfer of data between these two engines.
Organizations are increasingly building data lakes to bring all of their enterprise data into a single, unified platform that can support their analytics needs. For an AI engine to keep generating recommendations on a frequent basis, data needs to be refreshed with increasing frequency. Building automated ETL (Extract, Transform and Load) processes avoids the need for humans to process the data manually. Building a data lake is a multi-year journey that involves significant investment from enterprise IT.
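To show the shape of such an automated refresh, here is a minimal extract-transform-load sketch in plain Python. The file path, table name and cleanup rules are hypothetical, and in production a scheduler such as Airflow would typically orchestrate the job.

```python
import sqlite3
import pandas as pd

def extract(path):
    """Extract: read the latest vendor drop (hypothetical CSV path)."""
    return pd.read_csv(path, parse_dates=["event_date"])

def transform(df):
    """Transform: apply cleanup rules before loading."""
    df = df.drop_duplicates()
    df["cust_id"] = df["cust_id"].str.upper()  # normalize keys
    return df[df["event_date"].notna()]

def load(df, conn):
    """Load: append the cleaned feed to a warehouse table."""
    df.to_sql("engagement_events", conn, if_exists="append", index=False)

def run_weekly_refresh():
    # In production, a scheduler would trigger this on the agreed cadence.
    conn = sqlite3.connect("warehouse.db")
    load(transform(extract("vendor_feed_latest.csv")), conn)
    conn.close()
```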
If AI could solve every problem with five times the return on investment, buy-in would be a snap, but that often isn’t the reality. The first step is usually a proof of concept to firmly establish the potential ROI. You’ll need multiple discussions with various channel owners to get their buy-in for an orchestrated solution. And finally, implementing such solutions broadly is an art unto itself.
Despite the difficulty, many problems are worth solving with AI. But don’t underestimate the time and effort required to accomplish large-scale AI initiatives. There are no simple and easy solutions. Be willing to experiment. Even more importantly, be willing to fail. Be impatient when it comes to quality and demand the level of effort the organization deserves. But be patient about results, too, because there’s much to be gained for those who persist in building a true capability.
About the authors
Arun Shastri leads ZS’s Analytics practice for multiple market sectors. Arun’s work on a broad range of commercial effectiveness issues spans multiple industries, including healthcare. He is an expert in analytics organization design, data science and advanced analytics.
Srinivas Chilukuri is an associate principal in ZS’s Evanston, Ill., office. He is a leader in advanced data science with a focus on helping clients on their AI journeys.
Glossary of AI terms