PART 1 – Getting from Data to Actionable Insight
This is the first white paper in a two-part series exploring how data is revolutionizing digital marketing. It examines the difficulties businesses face in meeting the data challenge and the steps they must take to effectively leverage data-centric methods and gain actionable insights from the huge volumes of data now at our disposal.
PART 2 – Using Data-Centric Insights to Drive Effective Media Strategy builds on this document and examines how businesses can successfully leverage the insights gained from their analysis to power and optimize marketing operations in the quest for greater ROI in digital media.
Why read this paper?
Big data is big business these days. Research firm IDC forecasts that big data services and technology will grow at a 27% compound annual growth rate (CAGR) to $32.4 billion through 2017 – about six times the growth rate of the overall information and communication technology market. And the performance improvements, increased revenue, and operational savings promised by effectively leveraging big data are expected to dwarf that number. As companies scramble to implement big data strategies, marketing has been one business function where the potential benefits of increased efficiency loom particularly large. And much is at stake here. Industry experts like eMarketer and Forrester predict global spending on digital advertising will grow about 15% annually and top $200 billion in the next five years. Achieving even a small increase in efficiency could therefore deliver huge ROI.
Software vendors have flooded the market with countless products, all promising to help meet the big data challenge. Many are capable tools that perform well in their respective areas but address only part of the challenge. They can also be difficult to implement, hard to integrate, and often require deep technical and analytical skills to configure and operate. If they are to deliver on the big data promise, today’s marketers need solutions that are complete, integrated, easily implemented, and intuitive to operate, with a minimal learning curve.
But most organizations will require more than tools and technology to succeed. Many lack a data-centric culture and will struggle to attract and retain talent with the deep analytical skills needed. And beyond mere talent, companies will need to change operational procedures and restructure workflows to optimize the use of data-driven insights.
Big Data Is About Data, Not Size
But first, let’s take a closer look at why the big data revolution is so significant and why, for all its promise, it also presents some unique and vexing challenges.
The challenge with big data isn’t just its size, but also its lack of structure, its diversity, and its fluid nature. These factors, more than sheer volume, make it hard to aggregate, integrate, manage and analyze.
While storage and processing capacity have increased greatly and costs have plummeted, other factors still limit our ability to manage, index and cross-reference large data sets. It used to be that companies collected data primarily as a part of their daily transactions and stored it in structured databases. This data was used mainly to track operations or forecast sales, inventory and the like. It was structured, well understood and relatively easy to manage and analyze. With the explosion of data sources, not only has the volume of available data increased drastically, but the very nature of the data itself has changed. With more and more business processes, operations and stakeholder interactions digitized, business now has data on just about every customer interaction, including click-streams from website visits, search activity, mobile apps, digital advertisements, social media content streams, videos and countless more. And as the “Internet of Things” becomes a reality and digital technology invades our homes, our cars and even our bodies (Google recently announced a contact lens with an embedded microchip and sensor that monitors glucose levels for diabetes patients in real time and sends the data to an application in the cloud), virtually everything we do will generate data.
Analysts estimate that fifty billion sensors will be connected to the Internet by 2025, each one adding to the data pool available for analysis. While our refrigerators may not be networked (yet), our televisions, home security systems and thermostats are (note Google’s recent $3.2 billion acquisition of Nest), and all are generating data points. Companies will soon not only be able to collect information about every conversation people are having about their brand, but also monitor interactions with their products anytime and any place they occur.
But that alone will not make them better marketers. In fact, it could do just the opposite. Few executives cite a shortage of data as what is holding them back. Instead, like modern versions of Coleridge’s Ancient Mariner, they are awash in data yet thirsting for true insight. Adding even more data – without corresponding improvements in the ability to organize, analyze and interpret it – could well prove counterproductive, leading to analysis paralysis or providing false cover for bad decisions. What harried business leaders seek is simplicity, inspired guidance and clear, actionable insight. And no amount of data can provide this on its own.
Getting from Data to Actionable Insight
Henry Ford once famously proclaimed that if he had asked people what they wanted, they would have said faster horses. And there would have been ample data to support this request. More skilled at engineering than equine breeding, Ford didn’t take this raw data at face value, though. Instead, he analyzed it carefully, interpreted it creatively, asked deeper questions and innovated based on the insights he gained from his analysis. And herein lies the crux of the matter: it’s not the data itself, nor its size, that provides the value. Instead, it’s the insight we derive from it and the actions we take based on this insight. Data is just a tool, a resource, nothing more than an inert ingredient until we add the catalyst. As Hamlet observed, “there is nothing either good or bad, but thinking makes it so”. Whilst he was musing on things unrelated to data analysis, his ideas are just as valid and applicable here: without human reflection, analysis and thinking, we will drown in data while actionable insight remains beyond our grasp.
So how do we get from raw data to actionable insight? How do we harness this flood of information to understand our customers better and take our business to the next level? The late Dr. Russell Ackoff, once described as the “Einstein of modern problem solving”, shows us the way in his famous article, “From Data to Wisdom”.
In it, Professor Ackoff states: “Data is raw. It simply exists and has no significance beyond its existence.” We have to enrich it by providing context, relevance and relationships, at which point it becomes information. Take the statement “we sold 1,000 units last year” – a classic example of raw data that tells us very little. Are sales of 1,000 units good or bad? Is it more than the year prior, indicating an upward trend, or are we falling behind? Were the sales profitable, indicating a healthy business? Are we out-performing our competitors, indicating a comparative advantage? These contextual aspects are what give the data meaning, and thus render it into more useful information.
But information alone provides limited benefit, as it lacks purpose. While it may be interesting to note that your company doubled profits last year while sales increased by 20% to 1,000 units – more than twice your closest competitor – that still doesn’t tell us how or why this happened. Only when we aggregate enough information to discern patterns and draw conclusions does it become knowledge. Acquiring this knowledge is valuable, for it gives us insight into how things work, but it is still not actionable. It tells us that something works, and how, but not yet why – or, more importantly, how we can take action to influence it in a desired way.
For knowledge to become actionable insight, we also need an understanding of causality – the why. Achieving true understanding requires us to distill data, aggregate it into information and then knowledge, and finally synthesize new insight from it. To build on our example: if the analysis of our data shows that the regions that performed best – the ones that contributed not only the most to our overall sales growth but also the most profitable sales – all had salespeople who had completed the new sales training program, then it stands to reason that the training was a major contributor. We can then reasonably assume that if we trained the salespeople in our other regions, they too might see an increase in overall sales as well as profit. With this new insight we understand not only how certain regions perform better but also why: their salespeople are better trained. This enables us to take action and train the rest of our salespeople. The analysis of subsequent data should then, presumably, show an increase in performance in those regions as well. This is a classic example of data-driven optimization: gathering data, aggregating and analyzing it to gain actionable insight, acting on that insight, and then measuring the results to validate the effectiveness of our actions.
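The regional analysis described above can be sketched in a few lines of code. The regions, figures and field names below are entirely hypothetical, invented for illustration only; a real analysis would of course draw on actual sales and training records.

```python
# A minimal sketch of the region/training comparison described above.
# All data here is hypothetical, for illustration only.

regions = [
    {"region": "North", "trained": True,  "sales": 1200, "profit": 300},
    {"region": "South", "trained": True,  "sales": 1100, "profit": 280},
    {"region": "East",  "trained": False, "sales": 700,  "profit": 120},
    {"region": "West",  "trained": False, "sales": 650,  "profit": 100},
]

def average(rows, key):
    """Average a numeric field across a list of region records."""
    return sum(r[key] for r in rows) / len(rows)

# Segment the raw data by whether the region's salespeople completed training.
trained = [r for r in regions if r["trained"]]
untrained = [r for r in regions if not r["trained"]]

# Aggregate data into information: average sales and profit per segment.
print("avg sales  (trained / untrained):",
      average(trained, "sales"), "/", average(untrained, "sales"))
print("avg profit (trained / untrained):",
      average(trained, "profit"), "/", average(untrained, "profit"))
```

If the trained segment consistently outperforms the untrained one across periods and regions, that pattern is the knowledge from which the actionable insight – train the remaining regions, then measure again – is drawn. Correlation alone does not prove causality, which is why the subsequent measurement step matters.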
And once we do this often enough, with enough data from across the enterprise, and marry these new insights with our industry experience, professional judgment and understanding of business principles, we arrive at what Professor Ackoff referred to as wisdom. Wisdom, in his use of the word, is the ability to anticipate outcomes – to predict certain results based on historical data. It also enables experimentation, providing the basis for new hypotheses that can drive new business ventures, product extensions and new approaches to be defined, developed, executed and then tested and measured.
Meeting Today’s Challenges
Professor Ackoff developed this approach back in the 1980s and published his insights in 1989. Back then our data sources consisted mainly of the transactional systems commonly employed in business at the time – systems that managed sales, supply chains, finances and the like. Reflecting this, there was far less data; it was clearly delineated by source, well structured, and stored mostly in relational databases. All of this made it relatively easy to collect, aggregate, refine and analyze. The biggest challenge businesses faced at the time was the lack of integrated systems, which meant that data had to be collected from numerous separate systems, converted into common formats, normalized, harmonized, and prepared for analysis.
We still face all of these same challenges today. With the explosion of data sources and the corresponding increase in volume and complexity, however, they have grown exponentially.
Technology Only Goes So Far
Luckily, new technology has emerged to help us with our efforts. We have seen vast increases in our ability to store, manipulate and process data thanks to tremendous increases in computing power over the last decade. These have been further enhanced by developments like NoSQL databases, Hadoop and other distributed data storage and processing frameworks, new standards like PMML, advanced analytics platforms, and many others. Combined, these factors give us an unprecedented ability to collect, store, manipulate and process data electronically. The one ingredient that has not increased significantly, though, remains as vital as ever: skilled, inspired human analysis.
Without careful analysis and inspiration we simply cannot advance from knowledge to actionable insight and beyond, no matter how big our data pools are or how powerful and sophisticated our technology might become. Professional analysts, systems designers and programmers are also required to design and develop the tools, platforms and programs that make data volumes manageable, devise the logic by which we process and analyze them, and present the results in a clear and actionable manner.
As companies and brands embark on big data projects, the shortage of skilled analytical talent is frequently cited as one of the biggest impediments they face, along with a lack of integrated technology platforms that are easy to use yet flexible and powerful enough to meet the needs of business. Developing talent and changing mindsets to become more data-savvy will take time, of course. Meanwhile, the AdTech market is scrambling to answer the call for better technology and provide integrated platforms able to meet the needs of digital marketers, freeing them up to focus on what they do best: developing effective, targeted campaigns.
The Market Responds
With so much at stake and the potential rewards so large, the advertising technology sector has seen a flood of investment in recent years. The result is a digital marketing LUMAscape littered with dozens of startups, most less than five years old. And, as always happens after huge waves of innovation, necessary consolidation has begun. Weaker entities are being acquired and integrated by players that are stronger or have deeper pockets. And signs of promise are emerging, with several vendors now offering complete, integrated solution platforms able to target, execute, monitor, analyze and optimize efforts across various media, channels and touch points.
And, recognizing the need for consulting services, campaign strategy and integration, a number of companies and agencies now offer targeted services, often coupled with platform solutions that combine several third-party components into a bundled offering. These service providers promise to get marketers up and running quickly, providing external staff where needed and helping to shorten the learning curve.
The next few years will be an exciting time in digital marketing, as brands and agencies alike embrace data-centric approaches for greater accuracy and efficiency. And, presumably, better business results.