Social Media Analytics, a Blessing in Disguise for Traditional Market Research

Anurag Sachdev
Businesses across the globe are waking up to the need to put their customers at the heart of every decision. The need to channel the innovation budget toward what customers really want has never been more acute.

A PC giant with over $30BN in revenues has been looking to embed customer-centricity deep into its product development and planning cycles. Traditionally very strong in the enterprise PC segment, it wants to replicate and augment that success in the consumer PC segments – hence the need for customer-centricity and the call to Core Compete.

Disclaimer: We don’t have Sherlock Holmes and Dr. Watson on our payroll…yet!

Dr. Watson: Sherlock, how do we get to know what customers want?
Sherlock Holmes: Well, the old school would say: ask them. The traditional way has been to run market research campaigns that interview relevant candidates and categorize their responses.

In a typical campaign, you interview existing or potential customers, trying to understand their preferences, likes, and dislikes in order to map out potential innovation areas for the next product cycle. Toward the end of the survey, you identify the behavioral type of the interviewee so you can slot their responses into the relevant customer segments. A good example is Conjoint Analysis, where the questions are designed to evoke customer responses to potential product profiles and feature trade-offs.
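To make the mechanics concrete, here is a minimal sketch, not any firm's actual methodology, of how conjoint responses are typically turned into part-worth utilities: dummy-code the product profiles and regress one respondent's preference ratings on them. The attributes, levels, and ratings below are hypothetical.

```python
import numpy as np

# Hypothetical conjoint profiles for a laptop: (RAM tier, price tier),
# dummy-coded against a base profile of 8GB RAM at $700.
# Columns: [ram_16gb, price_1000]
profiles = np.array([
    [0, 0],   # 8GB,  $700
    [1, 0],   # 16GB, $700
    [0, 1],   # 8GB,  $1000
    [1, 1],   # 16GB, $1000
])
ratings = np.array([5.1, 7.8, 3.2, 6.0])  # one respondent's preference scores

# Add an intercept and solve ordinary least squares for part-worth utilities.
X = np.column_stack([np.ones(len(profiles)), profiles])
utilities, *_ = np.linalg.lstsq(X, ratings, rcond=None)

print(f"base utility:        {utilities[0]:.2f}")
print(f"part-worth 16GB RAM: {utilities[1]:+.2f}")  # value of the RAM upgrade
print(f"part-worth $1000:    {utilities[2]:+.2f}")  # penalty of the higher price
```

The ratio of the two part-worths quantifies the feature trade-off: how much of a price premium the RAM upgrade is worth to this respondent.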

These campaigns also involve “tracking studies”: shorter surveys conducted periodically. They are good for maintaining the preference models created earlier, much as you would service your car.

Dr. Watson: So, what’s the catch?
Sherlock Holmes: Nothing, it’s awesome. But don’t you want to be more awesome? It has a few glaring limitations that have only intensified in today’s fast-paced world:

Dependence on sample quality: Preference scores generated from survey responses inherently carry the noise introduced by a low-quality sample
Lack of volume: In a world where even an abacus seems to generate gigabytes of data, a market research sample can rarely be large enough to be conclusive on every question you have
Expense: Scaling up the sample size of the research effort is costly, with costs increasing roughly linearly for every additional response

Dr. Watson: So is this “old school” passé, then?
Sherlock Holmes: Not at all. I am rather old school myself, remember? This technique has endured for a reason, and iterations over time have only improved it statistically and economically. Perhaps there is a way to make the best of its most powerful offering: exact responses to a limited set of targeted questions. Hold on to that thought; we will come back to it.

Dr. Watson: OK. So what’s next?
Sherlock Holmes: The new age, my friend: the art of listening to customers. To the dreamier folks, the mention of “Big Data” instantly conjures an image of bits and bytes zooming around at supersonic speeds. To the realists, it comprises high volumes of data, tapped at ever higher frequencies and from a wide variety of sources. Whichever side you fall on, one bitter truth still holds with big data: the more you think you have it under control, the closer you inch towards a data fusion bomb, ready to explode. Only a few analytical initiatives have successfully harnessed its power so far, and Social Media Analytics is among them. By the way, this is the cue to link back to the thought you so patiently held on to a couple of minutes back. So how about making the best of both worlds collaborate?

Dr. Watson: But how do we do that?
Sherlock Holmes: Convergence! Traditional market research categorizes customers into meaningful segments and thereby provides the precious dimensions you need to understand the DNA of each segment. This is where you bring in social media analytics to scale up massively: calibrate those dimensions into models, built on big data, that categorize the voice of current and potential customers based on what they express online across social channels. Tap into what they tweet, post on Facebook, write on blogs, or say in product reviews on e-tailer websites. The mission is convergence: knowing what each customer persona has been expressing online, everywhere. Scanning everything a customer segment writes about starts yielding an unprecedented level of granular insight. You get an ocean of knowledge because you are no longer restricted to the limited set of questions in a conventional market research survey. Even the “tracking studies” conducted for model maintenance can become less frequent, since you are already validating on the fly.

To sum up: use a short but crisp Market Research study to discover and categorize, and Social Media Analytics to amplify (a minimal sketch of this pairing follows the dialogue below).

Dr. Watson: Excellent!
Sherlock Holmes: Elementary, my dear Watson.
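Here is a minimal sketch of that pairing, assuming you have open-ended survey text labeled with the segments your market research produced. Train a simple classifier on those labeled responses, then apply it to unlabeled social posts at scale. The segment names, texts, and posts below are invented for illustration; a production system would use far richer features and much more data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: open-ended survey answers labeled with the
# customer segment that traditional market research assigned each respondent.
survey_texts = [
    "I need long battery life for travel and light weight",
    "I play the latest games and want a fast graphics card",
    "I just browse and email, price matters most to me",
    "portable, thin, good battery, I work from airports",
    "high refresh rate screen and top frame rates",
    "cheapest machine that runs a browser is fine",
]
segments = ["road_warrior", "gamer", "value_seeker",
            "road_warrior", "gamer", "value_seeker"]

# Calibrate: learn the vocabulary of each segment from survey responses.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(survey_texts, segments)

# Scale up: categorize the voice of customers across social channels.
social_posts = [
    "this laptop dies after two hours on a plane, useless for trips",
    "fps drops hard on new releases, the gpu feels underpowered",
]
for post, segment in zip(social_posts, model.predict(social_posts)):
    print(f"[{segment}] {post}")
```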

Insights

Chief Analytics Officer: Unbounded

You are a new Chief Analytics Officer with a mandate to create business impact using advanced analytics and to build an analytics-driven culture. You are then faced with the realities of budgets, timelines, competing IT priorities, and a lack of resources to get things done fast. What do you do next?

In 2013, Anthony Volpe joined Lenovo as the Corporate Chief Analytics Officer with a similar mandate. He said, “I had the challenge of delivering measurable results within the first 12 months. I could not afford to spend 12 months getting my infrastructure and team figured out; we needed to move fast, and we needed an approach that would allow us to make some quick decisions and revisit them as we gathered more information. However, we were not looking for a quick fix or a PoC environment; we needed something that could scale to our global needs if the pilots succeeded. After evaluating various alternatives, we concluded that Core Compete’s A3 service really offered us the best option.”

Leveraging Amazon Web Services (AWS), Core Compete created an elastic analytical environment. Core Compete is unique in that its team has skills not just in AWS but also in enterprise analytical tools such as SAS, SPSS, Tableau, and Hadoop, combined with experience helping enterprises capture financial value from deploying predictive analytics.

In 30 days, Core Compete delivered an infrastructure and team that allowed Lenovo to confidently execute on a range of initiatives. Anthony says, “In my first six months, we were able to pilot initiatives on a big scale that involved understanding quality using social data, optimizing inventory using internal supply chain data, and doing what-if planning on our sourcing networks. We could not have achieved this without the agility and flexibility we got through AWS and Core Compete.”

Do you want to be a Chief Analytics Officer that is Unbounded?

Insights

Using Cloud for Analytics Driven Innovation

Kumar Majety
In an increasingly big-data-intensive environment, most organizations want to crack the code on analytics-driven innovation. They are eagerly trying to get big data projects off the ground but often run into hurdles, according to the eWEEK Enterprise 2014 Big Data Outlook. The key challenges include supporting large data volumes and new types of data within existing corporate IT infrastructures and budgets (53%), the complexity of software/data integration (54%), and making analytics easier for business users (53%).

So, what is required to deliver big value from analytics to your business? A well-defined use case is a good first step, but not enough. Setting high expectations for your data analytics team and asking them to quickly generate conclusions by working with the IT department seldom translates into success. As described by Martha Bennett of Forrester Research, a clean-sheet approach to the underlying technology infrastructure (including database, compute, and storage) and to enterprise analytics/visualization software tools, combined with deployment agility and flexibility, is essential for successful innovation. However, enterprise-grade analytics platforms are often anything but agile and carry a hefty price tag of complexity and inflexibility. This leads to a Catch-22: to get investment dollars, you need to show proof of value; but to have an air-tight business case, you need to pilot the initiative and see whether the idea can really generate value.

How does cloud analytics fit into this equation? Cloud analytics provides an ideal way to get your big data initiatives up and running quickly and inexpensively. These solutions offer high scalability, reliability, and the flexibility to run any vendor’s analytics tools without lock-in or the headaches of managing hardware and software. They let you pay only for what you need and easily scale up or down. Software-as-a-Service offerings from analytics or database software vendors are not ideal for innovation projects, as they lack the flexibility to support your specific needs or tools of choice. Cloud analytics solutions, on the other hand, are a better fit because they address end-to-end analytics lifecycle needs on a pay-as-you-go basis: specific advice on turning your data into business insight, hardware and software architecture designed for your analytics, data, and business requirements, pre-built templates for agile deployment, end-to-end administration of analytics infrastructure and applications, data management and integration of different data sources in the cloud, and enterprise security and compliance. A great way to start is to kick the tires with a partner that offers a try-and-buy of the complete cloud analytics platform and service at no cost to you.

Lenovo is running its Analytics Innovation Studio on cloud analytics. A global manufacturer with more than $30 billion in revenues, Lenovo is using SAS’s advanced analytics, text mining, and visual analytics tools deployed on Core Compete’s Agile Analytics Platform. In two months, Core Compete delivered a cloud analytics solution (on AWS) that enabled the client to pilot several big data solutions integrating internal and external data in a cloud environment and to prove out innovations in Quality Analytics, Channel Analytics, Sourcing Analytics, and Social Media Analytics. Lenovo has since leveraged the analytics innovation platform to successfully scale these for enterprise-wide usage.

We hope these examples gave you some ideas for advancing big data innovation projects in your organization. E-mail us to see how we might be able to help you get started.

Analytics Data Science Insights

Generating Operational Insights from Social Media Analytics

Shiva Kommareddi
Willie Sutton, the famous bank robber, when asked why he robbed banks, answered, “That’s where the money is.” Business executives who want to learn more about their customers have to listen closely to what those customers are saying in social channels, because that is where they share unbiased feedback on their product and service experiences.

In most organizations, Social Media Engagement is confused with Social Media Analytics, and it is worth distinguishing the two to truly benefit from the power of these ideas. Social Media Engagement is active participation in social communities to shape the dialogue and to use them as a customer service channel. Social Media Analytics, on the other hand, is about analyzing data available through social channels (or created using social channels) to improve the business, including but not limited to marketing.

The first generation of Social Media Analytics initiatives was squarely focused on understanding trends in customer advocacy, i.e., creating variants of the Net Promoter Score (NPS) using social data instead of survey data. Back in 2009, Dell started to tout the benefits of this approach. Our own conversations with current and former Dell employees suggest that while the promise was there, the idea of running the organization on NPS was itself still new, and it was not practical to muddy the measurement with other metrics that might distract from the real goal (which is improved advocacy, not innovation in measurement).
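For readers curious about what an NPS variant built on social rather than survey data can look like, here is a minimal sketch. It assumes each social mention has already been sentiment-scored in [-1, 1] by whatever engine you use, and simply maps those scores onto promoter/detractor buckets; the scores and thresholds are illustrative, not an industry standard.

```python
# Hypothetical sentiment scores in [-1, 1] for social mentions of a brand,
# produced upstream by any sentiment-scoring engine.
scores = [0.9, 0.7, 0.1, -0.4, 0.8, -0.9, 0.2, 0.6, -0.1, 0.75]

# Illustrative thresholds mapping sentiment onto NPS-style buckets.
promoters  = sum(1 for s in scores if s >= 0.6)
detractors = sum(1 for s in scores if s <= -0.3)

social_nps = 100 * (promoters - detractors) / len(scores)
print(f"social NPS: {social_nps:+.0f}")  # classic NPS ranges from -100 to +100
```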

Since then, however, Social Media Analytics has moved beyond being a poor augmenter of NPS into areas it can truly and uniquely address, with a richness and timeliness that other data sources do not provide. We have provided a few examples to spur your own answer to the question “How can we apply analysis of social data to improve our own business operations?”

Lenovo asked: How can we improve product quality? For a global manufacturer with more than $30 billion in revenues, early detection of quality problems can save millions of dollars annually. Lenovo’s tech-savvy customers are more likely to report an issue first in an online forum than through the official support organization, so Lenovo decided to aggregate the feedback across these global channels into consolidated views of emerging quality problems. These highly operational dashboards enable internal teams to identify and manage issues in a timely manner and significantly reduce quality concerns and warranty costs. Lenovo leveraged SAS’s powerful text mining and visual analytics tools, deployed on Core Compete’s Agile Analytics Platform.
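Lenovo's dashboards are proprietary, but the core detection idea can be sketched simply: count how often defect-related terms appear in forum posts each week and flag terms whose latest count spikes well above their trailing baseline. The terms, counts, and threshold below are all hypothetical.

```python
from statistics import mean, stdev

# Hypothetical weekly counts of defect-term mentions harvested from forums.
weekly_mentions = {
    "hinge crack":    [3, 4, 2, 5, 3, 4, 21],  # sudden spike in latest week
    "battery drain":  [8, 9, 7, 8, 10, 9, 8],  # steady background chatter
    "screen flicker": [1, 0, 2, 1, 1, 0, 9],   # emerging issue
}

THRESHOLD = 3.0  # flag if the latest week is 3+ standard deviations above baseline

for term, counts in weekly_mentions.items():
    baseline, latest = counts[:-1], counts[-1]
    z = (latest - mean(baseline)) / (stdev(baseline) or 1.0)
    if z > THRESHOLD:
        print(f"ALERT  {term}: {latest} mentions this week (z = {z:.1f})")
```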

McDonald’s asked: How can we improve customer experience? It might not be legal to text while driving, but it may be OK to post on Twitter while going through a drive-thru. McDonald’s tracks vast amounts of data from social media in order to improve operations and boost customer experience. The company uses these insights to design its drive-thru outlets, decide which items to list on menus, and plan for order sizes and ordering patterns, customizing its offerings to match the expectations of different micro-markets.

Toyota asked: How do we improve our sales forecasts? CIO Magazine reports a number of interesting use cases for how Toyota is thinking about social media analytics for operational improvements. One interesting example is improving sales forecasts based on trends in sentiment; Toyota can further analyze those sentiments to see which demographic characteristics are driving rising or falling sentiment.
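The CIO Magazine piece does not publish Toyota's model, but the basic mechanism is easy to sketch: add last month's average social sentiment as a regressor alongside a baseline time trend and check whether it improves the fit. The monthly figures below are invented for illustration.

```python
import numpy as np

# Hypothetical monthly unit sales and the prior month's average social
# sentiment (scored in [-1, 1]) for one vehicle model.
sales = np.array([980.0, 1010, 1005, 1100, 1180, 1150, 1260, 1300])
lagged_sentiment = np.array([0.05, 0.10, 0.02, 0.30, 0.45, 0.35, 0.55, 0.60])

months = np.arange(len(sales))
X_trend = np.column_stack([np.ones_like(months), months])  # trend only
X_aug = np.column_stack([X_trend, lagged_sentiment])       # trend + sentiment

for name, X in [("trend only", X_trend), ("trend + sentiment", X_aug)]:
    coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
    resid = sales - X @ coef
    print(f"{name:>18}: in-sample RMSE = {np.sqrt(np.mean(resid**2)):.1f}")
```

A lower error for the augmented model suggests sentiment carries signal; a proper evaluation would of course use out-of-sample testing.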

We hope these examples made you think of ideas relevant to your own organization. It does not have to be hard or expensive to get started and add value. E-mail us to see how we might be able to help you get started.

Insights

Big Data Analytics Paves Way for a New Era in Healthcare

Medical professionals, hospitals, and related healthcare organizations face challenges to reduce costs, provide coordinated patient care, standardize healthcare quality, and deliver effective patient outcomes. Standard medical practice is moving from relatively ad-hoc and subjective decision making to evidence-based healthcare. A McKinsey study identifies a set of converging trends pushing the healthcare industry to a tipping point where Big Data and Analytics will play a major role. The trends are:

Demand for better data owing to cost-reduction pressures
Availability of relevant data at scale, including clinical data (EMRs and health information exchanges) and non-healthcare consumer data
Technical capability
Government enabling and catalyzing market change

US healthcare providers have dramatically increased their use of Electronic Health Record systems. According to data shared by the US Department of Health and Human Services, “More than half of eligible professionals and 80 percent of eligible hospitals have adopted these systems, which are critical to modernizing our health care system.” This digitization drive is generating large volumes of data. Additional sources adding to healthcare data include:

New capture technologies such as devices, sensors, and mobile applications
Genomic information, whose extraction has become much cheaper and quicker
Patients’ growing activity on social media channels
Increasing digital interactions with various healthcare organizations
Enormous amounts of accumulating medical knowledge and discoveries

In 2012, worldwide digital healthcare data was estimated at 500 petabytes and is expected to reach 25,000 petabytes by 2020: a 50-fold increase over eight years, or roughly 63 percent compound annual growth.

Building analytics competency can help healthcare organizations harness “big data” to create actionable insights, set their future vision, improve outcomes, and reduce time to value. Leading healthcare organizations use analytics to differentiate, see the future, and drive revenue growth. In a series of interviews conducted by the IBM Institute for Business Value, healthcare executives identified three business objectives that analytics can address in the industry:

Improve clinical effectiveness and patient satisfaction
Improve operational effectiveness
Improve financial and administrative performance

How are Health Care Organizations Realizing these Benefits?

Improved Quality of Patient Care: Since quality is one of the primary concerns in healthcare, analytics can help maintain a high quality of patient care by analyzing health-outcomes data across services to pinpoint where effective treatments are not being delivered consistently across a patient population or geographic area.

GE Healthcare leverages SAS technology to analyze patient health datasets and look for specific patterns and trends that help hospitals prevent adverse medical events. The SAS software mines patient-related data and provides critical insights, best practices, and benchmarking to enable clinicians to make informed decisions aimed at reducing medical errors and improving the quality of care. GE’s Patient Safety Organization provides its members a single common medical event-reporting platform, with comprehensive data analytics and advisory support to identify the root causes of risk, and help hospitals make lasting safety improvements.

Baptist Health’s CHF reduction initiative is a successful example of combining data analysis with new approaches to care delivery to improve quality and reduce costs. Data was merged from multiple sources across Baptist Health to present a full picture of the causes of CHF (Congestive Heart Failure) re-admissions. This data was analyzed to identify at-risk patients, determine resource utilization rates, and assess progress on a set of quality benchmarks. Dashboards offered providers the tools needed to use this data at the point of care, and education affirmed new roles and responsibilities. As a result, Baptist Health was able to achieve its goals and reduce re-admissions for CHF patients at relatively low cost.

Improved Revenue Cycle Management: Revenue cycle management (RCM) refers to the process of managing claims, payments, and revenue generation and relies heavily on a combination of claims data, clinical data, and analytics technology. Analytic tools can help healthcare organizations determine patient eligibility, validate coverage, authorize services, assess payment risk, manage submissions, and track performance.

In a recent SAS podcast, Graham Hughes, the chief medical officer of analytics provider SAS, talks about the difference analytics can make in examining the cost and quality of health care in the United States. Hughes discusses a new analytics program SAS has launched to collect and compare health care payer claims data. By collecting this information in a data warehouse, the software allows users, including state agencies, health care providers, researchers, and eventually consumers, to analyze and compare health care cost data within a county or state. This solution can go a long way toward eradicating opaque medical bills and establishing transparent pricing in healthcare.

Better Resource Utilization: Analytics can help build effective processes that remove system bottlenecks and reduce waste. By accurately estimating patient volumes, lengths of stay, and waiting times, inventory control systems and supply chain management processes can be effectively redesigned. Real-time data interpretation helps tremendously in developing efficient, optimized workflows.

Fraud and Abuse Prevention: Fraud refers to a calculated misrepresentation of facts aimed at convincing payors to process a false claim for financial gain, while abuse refers to neglect of accepted business or medical practices resulting in higher reimbursements. Cost trending and forecasting, care utilization analysis, and actuarial and financial analysis are common analytic applications for preventing such cases.

Population Health Management: Healthcare providers carry the responsibility to educate people and spread awareness about lifestyle changes that support disease prevention as well as treatment. Timely action in reaching people before, during, and after they need specific medical attention can save many lives. Analytics plays a key role here by helping healthcare organizations recognize the populations consuming the most resources or at greatest risk of hospital readmission, enabling them to target high-risk groups to reduce costs and improve outcomes. Real-time data insights can help identify trends in disease prevalence, compare the effectiveness of different treatment options, and derive best practices.
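As a minimal sketch of the risk-stratification idea, not any specific vendor's model, a logistic regression over a few claims-derived features can rank patients by readmission risk so that outreach targets the highest-risk group first. The features and records below are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical patient features: [age, prior admissions in the last year,
# number of chronic conditions], plus a 30-day readmission outcome label.
X = np.array([
    [72, 3, 4], [45, 0, 1], [80, 2, 5], [55, 1, 2],
    [67, 4, 3], [38, 0, 0], [76, 1, 4], [59, 2, 2],
])
readmitted = np.array([1, 0, 1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, readmitted)

# Score a new cohort and target outreach at the highest-risk patients first.
cohort = np.array([[70, 2, 3], [50, 0, 1], [82, 3, 5]])
risk = model.predict_proba(cohort)[:, 1]
for features, p in sorted(zip(cohort.tolist(), risk), key=lambda t: -t[1]):
    print(f"risk {p:.2f}  features {features}")
```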

The Louisiana Department of Health and Hospitals, for example, recently teamed up with geographic information systems (GIS) software vendor ESRI to map epidemiological issues, such as babies with low birth weights. Using ESRI’s GIS mapping software, the LDHH plugged in 354 points of data for every live birth in Louisiana. Sophisticated algorithms identified clusters among locations and then generated a map based on this data. For example, by crunching low birth weight records with geospatial data, the LDHH discovered correlations between low birth weight rates and crime-riddled neighborhoods. By flagging neighborhoods with low birth weights, preventative healthcare measures can reduce the number of high-risk births, thereby cutting healthcare costs.
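The exact algorithms LDHH used are not described, but density-based clustering is a standard way to surface such geographic pockets. Here is a minimal sketch with scikit-learn's DBSCAN on hypothetical coordinates:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical (lat, lon) points for low-birth-weight births; a real analysis
# would use projected coordinates and many more attributes per record.
births = np.array([
    [30.45, -91.15], [30.46, -91.14], [30.45, -91.16],  # dense pocket A
    [29.95, -90.07], [29.96, -90.08],                    # dense pocket B
    [31.30, -92.45],                                     # isolated case
])

# eps is in degrees here (roughly 2 km); metric distances would be better.
labels = DBSCAN(eps=0.02, min_samples=2).fit_predict(births)
for cluster in sorted(set(labels)):
    members = births[labels == cluster]
    tag = "noise" if cluster == -1 else f"cluster {cluster}"
    print(f"{tag}: {len(members)} births around {members.mean(axis=0).round(2)}")
```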

According to the CDC, 26 million Americans currently have asthma, which costs $3,300 per person annually to treat. Asthmapolis, one of a new generation of digital health startups, has designed snap-on, Bluetooth-enabled sensors that track how often people use their inhalers (along with location and time of day), plus analytics and mobile apps that help them visualize and understand their triggers and trends while receiving personalized feedback. In turn, the data collected by the solution enables doctors to identify patients who are at risk or need more help controlling their symptoms. This allows them to potentially prevent attacks before they happen, saving the cost of hospitalization or a trip to the emergency room. In fact, Asthmapolis’ early studies found that this access to real-time data reduced the number of people with uncontrolled asthma (or those not regularly using inhalers) by 50 percent.

Although hospitals already have a lot of data (through their EHR systems), important pieces are still missing: they do not know whether a patient is a smoker (unless the patient fills that in on a form), the size of the patient’s household, their consumption of various luxury, health, and vice goods, and so on. HealthVue, a Raleigh, NC based start-up, is bringing together data from consumer sources, geo-spatial data, and CMS data, and combining it with EHR data to deliver a comprehensive and timely view of patient populations.

Big Data initiatives can transform the healthcare industry, making care more coordinated, streamlined, and readily available at the right time, saving millions of lives and making high-quality patient care the new norm.

Analytics Forecasting Healthcare Insights

Cloud Analytics Empowers Big and Small Businesses without Bias

Traditionally, large businesses have chosen their directions and planned their future strategies based on insights derived from limited structured data created by or stored in large enterprise systems, such as enterprise resource planning (ERP) or CRM systems. Today, however, the vast majority of data is generated by the Web, smartphones, and the Internet of Things, and it is available to large and growing enterprises alike, without bias. Thinly staffed IT organizations are unable to cope with the demands placed on them by analytics organizations; the traditional IT approach to analytics is neither feasible nor desirable. A more flexible and agile approach is required to deliver analytical innovation for large and growing enterprises.

This is where cloud-based Big Data Analytics technologies come into play. There are vast sources of untapped data, such as data from distribution channels, social media, client surveys, contracts, emails, white papers published by research firms, and government studies, that might explain gross patterns and root causes. Big Data exploitation and monetization cannot happen through a single technology, technique, or initiative. Rather, it is an amalgamation of technologies and initiatives that help in creating, storing, retrieving, and analyzing data that is remarkable in terms of volume, velocity, and variety.

As Terence Ngai describes in his blog post, “By storing data and performing analytics in the cloud, you can take advantage of distributed processing to greatly increase the speed of your operations. Because you don’t have to move data over the network, you reduce traffic requirements and lower latency rates. The massive scalability available on an as-needed basis can save you the cost of building your own data centers and provide the flexibility you need to meet changing business conditions. The net result? Faster business decisions.”

Traditional Business Intelligence and Data Warehousing applications involved high infrastructure requirements and high development and maintenance costs, went through a long provisioning route, and depended heavily on the internal IT team. Cloud-based data management and analytics applications, on the other hand, can become operational in a few days with negligible initial investment and overhead costs, which makes them a lucrative opportunity for growing businesses to harness the power of advanced analytics without having to invest in any on-premise solutions.

The four main advantages the cloud delivers for storing, synchronizing, and analyzing data are:

Distributed processing to provide agility
Elasticity
Scalability
Cost savings

Agility is about being flexible and responding quickly to change. It has been difficult to increase the agility of BI, analytics, and data warehousing systems through traditional development cycles, where procurement, budgeting, planning, and prioritization act as bottlenecks. Analytics applications on the cloud therefore provide the much-needed on-demand IT to support internal as well as external projects. The cloud enables businesses, not just IT operations, to add or provision computing resources at the moment they are needed.

Here is an example of how cloud data analytics solutions deliver on that dimension. Royal Dutch Shell wanted not only to reduce energy costs but also to become more agile in deploying IT services and planning for user demand. To reach those goals, Shell began using Amazon Web Services in 2010. Shell leverages sensors to find oil in wells formerly thought to have run dry, or in places where previous exploration indicated there was no oil. These sensors create massive amounts of geological data, and Shell’s IT shop has to manage the giant files and deploy the sensors quickly and profitably. Project teams provision compute capacity themselves, run their models on the AWS cloud platform, and then return the capacity, getting charged only for what they used. Shell says two to three hundred project teams can be up and running in a day, versus the weeks it took prior to AWS.
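The Shell workflow above (provision, compute, release, pay only for usage) maps directly onto the AWS API. Here is a minimal boto3 sketch; the AMI ID and instance type are placeholders, and a real pipeline would also handle waiters, tagging, and error paths.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Provision compute capacity on demand (the AMI ID is a placeholder).
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder: your model-runner image
    InstanceType="c5.4xlarge",
    MinCount=1,
    MaxCount=10,  # scale out for the heavy geological model runs
)
instance_ids = [i["InstanceId"] for i in response["Instances"]]

# ... run the models on the fleet ...

# Return the capacity when done; billing stops, so you pay only for usage.
ec2.terminate_instances(InstanceIds=instance_ids)
```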

Customers today are more informed, less loyal, and expect to be treated the same way across different channels, so companies need an effective omni-channel strategy. This has been a trend for a while, but cloud-based solutions make it much easier and faster to integrate and scale. As Dr. Fern Halper discusses in her article, “Many companies process the external data sources in the public cloud and then bring the reduced (analyzed) data set on-premises to make it part of a bigger analysis. These are specific, targeted horizontal or vertical applications that can be called upon from the cloud when needed. For example, a credit card fraud application might be run in the cloud and then the results are pulled back into the rest of the analysis on premises (say, in a private cloud). Another example: a campaign management system or an analytics service that does retention analysis. One way to think about it is almost as a skills-as-a-service model, given the lack of skills in the advanced analytics space.”

The cloud provides access to infrastructure resources on a pay-as-you-go basis, enabling organizations of all sizes to explore and use large volumes of information without a heavy upfront capital investment in infrastructure. This elasticity helps businesses efficiently innovate and test-run projects that could not be as easily accommodated by an on-premise deployment. As shared in the Economist-IBM survey, 31% of company executives said they like the cloud’s “pay-as-you-go” cost structure, as it removes the need to fund building hardware, installing software, or paying dedicated software license fees. This was the appeal for Etsy, an online marketplace for handmade goods that brings buyers and sellers together and provides recommendations for buyers. “Using cloud-based capabilities, the company is able to cost-effectively analyze data from the approximately one billion monthly views of its Web site and use the information to create product recommendations,” the report notes. “The cost flexibility afforded through cloud provides Etsy access to tools and computing power that might typically only be affordable for larger retailers.”

Cloud analytics is the key that can unlock the treasure trove of data for businesses of different sizes, in various sectors. Those who are able to drive above-market growth, though, are the ones who can effectively mine that gold.

Core Compete is a provider of Cloud Analytics. Click here to learn more about our service.

Insights

It’s Official: The Data Scientist is Dead

Shiva Kommareddi
At the recently concluded Wharton Customer Analytics Initiative annual conference, Tom Davenport suggested it is time to move on from the title. Peter Fader, Wharton marketing professor, echoed the thought and asked everyone to pay attention. Hailed as the sexiest job of the 21st century, is it really time to write the obituary of this breed?

Let’s follow the evolution.

Birth: Around 2006, the term “data scientist” was coined to mean a truth seeker, a storyteller, and an unabashed data geek. If you could actually find such a person, they would be astonishing (and probably sexy too), but they were just so hard to find. This scarcity spawned an industry around predicting the shortage and proposing solutions to the problem.

2012. Youthful Dreams: This phase was marked by general excitement in the market: a significant uptick in research reports highlighting the huge shortage of talent, stories about the large salaries such professionals commanded, and, of course, articles calling this the sexiest job of the century. Did I hear you mutter… the peak of inflated expectations…

2013. Midlife Crisis: On top of the shortage, it was not actually clear where this person fit in or how to fund the role. So people started using it as a fancy title for all kinds of traditional jobs (business analysts, statisticians, database architects, and so on), causing further confusion about what exactly this person was supposed to do.

2014. The Family Man: Things seem to be finally settling down. The latest thinking suggests that the data scientist was never really a person; it was a team. It is only now dawning on people that maybe we were expecting too much from one person and were really describing a department.

The story follows the oft-repeated hype cycle in technology (we can all rest easy that it finally makes sense).

Before we move on, let’s pause and ask: were we really wrong (and stupid), or why did this happen?

I believe that the Data Scientist is a real person: more than anything else, someone with technology and business savvy who is curious and capable. The problem is that this person is in genuinely short supply, hard to train (it’s more of an attitude than a skill), and not many organizations can actually nurture this personality. I would guess that a survey of data scientists in large organizations would reveal that they are increasingly disappointed with their lack of business impact.

The lack of business impact is mainly due to the lack of consistent executive engagement with these efforts to drive value. What we need more of is business leaders who are willing to engage and drive value (not just demand it). Data scientists and data science teams will follow in due course.

Insights

Big Data, Big Opportunities for Communication Service Providers

You may have read our case study on how we helped create a data integration foundation for a mobile operator to enable hyper-local marketing; this video (from IBM) does a nice job of telling the story.

Insights

A Single Demand Forecast (Part 4): What Should Retailers Ask For?

After reading the preceding posts, you probably agree that a single forecast for all retail business processes is not the right thing to ask for.

But then how do you address the original issues that prompted the discussion:

Each business function is working with a different forecast in mind. Can’t all of us work off a single number?

The technology and personnel investments required to address these differing forecasting needs are mind-boggling. Can we consolidate them?

These are valid issues that should be addressed. Our recommendations:

Establish a Sales & Operations Planning process to achieve alignment

Create a central forecasting methodology group that serves different organizations

Seek a data and technology infrastructure that serves multiple forecasting needs

Forecasting Insights Retail Supply Chain

Bottom-Up Forecasting: Be Wary of the Sophists

It is common for some retail gurus and practitioners to proclaim, as a universal truth, that bottom-up forecasting is the best way to get a good forecast. Let’s examine the mathematical fallacies of this argument with an example.

Scenario: You are interested in creating a forecast for how much a store is going to sell in a given month, to be used as one of the inputs for the district management team. Your arch-nemesis (Dr. Z) and you are given the task of creating the forecast; the person with the best forecast will be promoted. The store sells only three items, and you have 12 months of data (shown below). The race is on.

Dr. Z, a believer in bottom-up forecasting, goes ahead and creates forecast models using the finest software he has. Although the item sales figures are rather volatile, he is confident he can find enough causal factors to explain much of the variation, so he sends his ace data folks to collect all the causal data they can to figure out what is driving the ups and downs. He’s off to a flying start. Dr. Z successfully develops a model with a forecast error of 10% for each of the three items and is thrilled with the accuracy of his bottom-up forecasts (results below).

You just came back from vacation, and you learn about the fancy work Dr. Z and his team have been doing. You have only one day left before you are expected to make the presentation. What do you do? You add up the sales for the items to look at the pattern of sales at the store level, as that is what you were asked to forecast (shown below):

It’s evident that forecasting at the total store level for this store is relatively simple, and you may not need to do a lot of work. So you submit that the forecast for this store is $100/month.

From the example above, it’s evident that forecasting at the lowest level of the data (that you have) will not always yield the best result. It may yield a good result, but it is always a good idea to try multiple methods (top-down, bottom-up, or middle-out) to determine which method, or combination of methods, yields the best result.

This is not just a clever example; it reflects the nature of most processes in business (and in nature). The lower level is always noisier and harder to predict than the higher level, because noise across different time series cancels out, as in the example above and the simulation below.
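You can verify the cancellation claim in a few lines: simulate three item-level series with independent noise that is large relative to each item's mean, then compare the relative variability at the item level and at the store total. The numbers mirror the story's $100/month store but are simulated, not real data.

```python
import numpy as np

rng = np.random.default_rng(7)
months, n_items = 12, 3

# Three items averaging about $33/month each, with independent monthly noise.
item_sales = 33.3 + rng.normal(0, 8, size=(months, n_items))
store_sales = item_sales.sum(axis=1)  # the level you were asked to forecast

# Coefficient of variation: noise relative to the mean at each level.
item_cv = item_sales.std(axis=0).mean() / item_sales.mean()
store_cv = store_sales.std() / store_sales.mean()

print(f"item-level CV:  {item_cv:.1%}")   # volatile, hard to forecast
print(f"store-level CV: {store_cv:.1%}")  # independent noise partly cancels
```

With independent noise across three items, the store-level relative error shrinks by roughly a factor of the square root of three, which is exactly why the simple store-level forecast wins here.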

Forecasting Insights Retail Supply Chain

A Single Demand Forecast (Part 3): Be Careful What You Ask For!

A point I made earlier is worth repeating. The premise I am challenging is a single forecast for all business processes, NOT a single forecast for a single purpose across the supply chain (which I think is a good idea, worth pursuing). It is this distinction that I want to make clear here.

There are two problems with a single forecast that supports all business processes:
a) A single demand forecast for all business processes is sub-optimal
b) It is not possible to create such a single demand forecast (note that (a) trumps (b))

A single forecast for all business processes is sub-optimal
When a business process requires a forecast, it has a certain expectation of forecast granularity and forecast horizon.

Forecast granularity: At what level of the business do you need the forecasts (to make the operational decision)? For most purposes in retail, granularity is expressed in terms of Merchandise – Location – Time.
Forecast horizon: How far out, and for how long, do you need the forecast? This is determined by the time it takes to act on a forecast; e.g., if you order an item from a supplier who has to make it, you generally need to send them a forecast much further in advance than for an item the supplier keeps in stock.

Some examples in the table below:
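(The original table is not reproduced here; based on the discussion that follows, an illustrative version might look like this:)

Business process       Typical granularity           Typical horizon
Strategic planning     Total chain / year            3-5 years
Merchandise planning   Category / region / month     6-18 months
Replenishment          Item / store / day or week    1-12 weeks
Labor scheduling       Store / hour                  1-2 weeks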

Let’s consider the extreme examples here: strategic planning and labor scheduling. Managers intuitively recognize that it is better to forecast at a higher level for strategic planning purposes than to roll up hourly-item-store level forecasts to arrive at the forecast for the next four years. Intuitively we know that, when forecasting the next four years, you need to consider the trends in your overall business, competition, and macro-economic factors, as opposed to your current merchandise mix or last month’s hourly sales patterns. On the other hand, when you are scheduling labor for the next two weeks, it is critical to understand hourly sales patterns, not only of similar time periods from past years but also of last week.

Beyond the fact that it intuitively does not make sense, it is mathematically sub-optimal. For more on this topic, read my previous post on this topic: Bottom-Up Forecasting: Be Wary of the Sophists.

It is not possible to create such a single demand forecast
Returning to the strategic planning and labor scheduling example: if the approach to a single demand forecast is to roll up the bottom-up forecasts, you need to know the merchandise mix at the store level for the next four years. That is not possible; in many cases, you don’t even know the precise merchandise mix that will be in the stores six months from now. So you cannot create a strategic plan based on a roll-up of hourly forecasts. As a more practical example, you cannot create a merchandise financial forecast for the next six months based on hourly forecasts for the next six months.

In summary, the idea that a single forecast at a store-SKU-week level will serve all business planning purposes is a misconception. It does not make intuitive sense, it is mathematically sub-optimal, and it is ultimately impractical.

Now that I have described what you should not ask for, I am sure you are thirsting to read what the solution is. We cover this in Part 4: What Should Retailers Ask For?

Forecasting Insights Retail Supply Chain

A Single Demand Forecast (Part 2): Why do Retailers ask for it?

Forecasting is a necessary evil in the retail industry (or, for that matter, in any industry). In the past two decades, retailers have realized that there is significant value in making science an integral part of their decision making.

Starting with Computer Assisted Ordering (CAO), retailers have continued to gain benefits from applying scientific approaches to processes such as replenishment, allocation, pricing, promotions, markdowns, assortments, size mix, labor scheduling, and financial planning. Forecasting is a critical component of all these areas.

Without exception, in all the areas mentioned above, a good forecast is at least 60% of the answer. Hence, all these solutions produce a forecast and recommend how to control the operational levers (e.g., staffing) to meet that demand forecast.

In many cases, making individual processes smarter has provided rewards, and the money is in the bank. Now comes the next series of questions:

Why do we need so many forecasts? If all of these processes are creating forecasts, isn’t there one answer that is better than all the others?
Can we consolidate the efforts into something more meaningful (like a single forecast)? Forecasting requires skilled individuals who are quantitatively adept and have a solid understanding of the business. So, the inevitable question:
Should we buy one forecasting system and put all these business processes on top of that technology? Good forecasting requires a lot of infrastructure: good granular sales and inventory data, capture of causal information, good master data for products and stores, and so on. In addition, large volumes of data need to be crunched in short weekend windows. All of this requires large (and ongoing) investments in hardware.

It is easy to be led to the conclusion that, of course, we need a single forecasting system that produces one answer on which the entire business runs. We all wish for a simple world, but retail is not meant to be that easy; it wouldn’t be fun, would it?

Read my point of view on why these are the right questions, but why a single forecast for all processes is not the answer, in the next installment of this blog: A Single Demand Forecast (Part 3): Be Careful What You Ask For!

Forecasting Insights Retail Supply Chain