Disney Marketing ROI Case Study
DMA Conference
Presented by Disney & SAS
October 2012
Defining A Marketing ROI Solution
Maximize return on investment for marketing spend:
• Reach the right audience
• Through the right channel
• At the right time
• With the right frequency
• At the right price
Stand-alone studies often fail to achieve long-term
success—trying to implement a project instead of a process!
Presentation Agenda
• Introduction
– Disney Management Science & Integration
– SAS
– The Science Behind Marketing ROI
• Case Study Overview
– Project Goals & Organization
– Data Management
– Science Integration
– Tool Development
• Lessons Learned
• Questions & Answers
Disney Management Science and Integration
Grown from 4 employees in 2008 to 30 employees in 2012
• Consulting support for analytics, data and reporting needs
• Technology integration for reporting and data tools
• Development and management of decision science tools
SAS® Company Overview
SAS® is the largest independent software vendor in the world
[Chart: SAS annual revenue, 1976–2011]
• 2011 & 2010 Fortune Magazine: #1 place to work
• 2011 revenue: $2.73 billion
• SAS® reinvests ~25% of annual revenue into R&D
• 90 of the top 100 companies on the FORTUNE Global 500® use SAS®
Science Behind Marketing ROI – Modeling
Build a model for each channel: marketing effort measurement (spend, impressions, etc.) vs. response variable (sales, leads, etc.)
[Chart: sales vs. marketing spend response curves – TV (more effective) and radio (less effective)]
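As a minimal sketch of this per-channel modeling idea (illustrative Python, not the models from the case study), the snippet below fits a simple concave response curve of sales versus spend for two channels and compares their fitted effectiveness; the channels, data points, and curve form are assumptions.
```python
# Minimal sketch: fit an assumed concave response curve (sales vs. spend) per channel
# and compare effectiveness. All data points and parameters are made up for illustration.
import numpy as np
from scipy.optimize import curve_fit

def response(spend, scale, half_saturation):
    # Diminishing-returns curve: sales approach `scale` as spend grows
    return scale * spend / (spend + half_saturation)

spend = np.array([10, 25, 50, 100, 200, 400], dtype=float)
sales = {
    "tv": np.array([12, 26, 44, 66, 88, 104], dtype=float),   # "more effective" channel
    "radio": np.array([4, 9, 15, 24, 33, 40], dtype=float),   # "less effective" channel
}

for channel, y in sales.items():
    (scale, half), _ = curve_fit(response, spend, y, p0=[100.0, 100.0])
    print(f"{channel}: max incremental sales ≈ {scale:.0f}, half-saturation spend ≈ {half:.0f}")
```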
Science Behind Marketing ROI – The “Right” Model
Selecting the right modeling approach is critical for success!
Regression / Time Series Model (R² = 97%):
Sales(t) = … + 0.7 * Sales(t-1) - 0.2 * Price + 0.06 * TV - 0.005 * Online + …
Heavy weight on lagged sales; sales not responsive to price & media changes → better for FORECASTING
Econometric / Panel Model (R² = 67%):
Sales(t) = … + 0.2 * Sales(t-1) - 1.0 * Price + 0.1 * TV + 0.02 * Online + …
Less weight on lagged sales; price & media elasticities more reasonable → better for MEASUREMENT
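To make the specification issue concrete, here is a minimal sketch in Python on synthetic weekly data (not the case-study model, and it does not reproduce the panel estimator): it fits the same sales equation with and without the lagged-sales term to show how much of the explanatory power the lag can absorb. Variable names and the data-generating coefficients are assumptions.
```python
# Minimal sketch on synthetic data: compare a specification with a lagged-sales term
# against one without it, looking at R-squared and the price/media coefficients.
# Variable names (sales, price, tv, online) and generating values are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 104  # two years of weekly observations, simulated
df = pd.DataFrame({
    "price": 10 + rng.normal(0, 1, n),
    "tv": rng.uniform(0, 100, n),
    "online": rng.uniform(0, 50, n),
})
sales = np.zeros(n)
for t in range(1, n):
    sales[t] = (50 + 0.2 * sales[t - 1] - 1.0 * df.loc[t, "price"]
                + 0.1 * df.loc[t, "tv"] + 0.02 * df.loc[t, "online"]
                + rng.normal(0, 2))
df["sales"] = sales
df["sales_lag1"] = df["sales"].shift(1)
df = df.dropna()

# Specification with lagged sales vs. a purely contemporaneous specification
with_lag = sm.OLS(df["sales"], sm.add_constant(df[["sales_lag1", "price", "tv", "online"]])).fit()
no_lag = sm.OLS(df["sales"], sm.add_constant(df[["price", "tv", "online"]])).fit()

for name, fit in [("with lagged sales", with_lag), ("without lagged sales", no_lag)]:
    print(name, "R2 =", round(fit.rsquared, 2))
    print(fit.params.round(3))
```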
Science Behind Marketing ROI – Measurement
Analysts pay careful attention to data considerations and
choice of models to robustly fit the data for measurement
[Diagram: model inputs – impressions by media type (cable, print, radio) over time – pass through saturation curves (ratings vs. spend) in the model, and the model output is a goodwill (carryover) curve of ratings over time]
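One common way to implement the goodwill and saturation pieces of this diagram is a geometric carryover ("adstock") transform followed by a concave saturation curve. The sketch below is illustrative only; the decay and half-saturation parameters are assumptions, not values from the case study.
```python
# Minimal sketch of the input transformations described above: a goodwill/carryover
# ("adstock") term plus a concave saturation curve. Parameter values are illustrative.
import numpy as np

def adstock(impressions, decay=0.5):
    """Geometric carryover: this week's goodwill = impressions + decay * last week's goodwill."""
    goodwill = np.zeros_like(impressions, dtype=float)
    for t, x in enumerate(impressions):
        goodwill[t] = x + (decay * goodwill[t - 1] if t > 0 else 0.0)
    return goodwill

def saturate(x, half_saturation=50.0):
    """Diminishing returns: incremental response flattens as impressions grow."""
    return x / (x + half_saturation)

weekly_impressions = np.array([0, 80, 120, 40, 0, 0, 60], dtype=float)
model_input = saturate(adstock(weekly_impressions))
print(model_input.round(3))
```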
Science Behind Marketing ROI – Optimization
Planners leverage model output and their insights to adjust
and optimize marketing plans per business constraints
[Diagram: the same measurement model – impressions by media type, saturation curves, and goodwill – is used to recommend an optimal media mix (spend by cable, print, radio) and optimal flighting (impressions and ratings over time by channel)]
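As a rough illustration of the mix-optimization idea (a greedy heuristic sketch, not the optimization model built in this project), the snippet below allocates a fixed budget in small increments, always to the channel with the highest marginal return under assumed concave response curves; channel names, curve parameters, and the budget are made up.
```python
# Minimal sketch of mix optimization under diminishing returns: allocate a fixed
# budget in small increments, always to the channel with the highest marginal return.
# Channels and response-curve parameters are made up for illustration.
import numpy as np

curves = {  # (scale, half-saturation spend in $K) -- illustrative only
    "cable": (3.0, 400.0),
    "radio": (1.0, 150.0),
    "print": (1.5, 250.0),
    "outdoor": (1.2, 300.0),
}

def response(channel, spend):
    scale, half = curves[channel]
    return scale * spend / (spend + half)

def allocate(total_budget, step=10.0):
    spend = {ch: 0.0 for ch in curves}
    budget = total_budget
    while budget >= step:
        # Marginal gain of the next increment for each channel
        gains = {ch: response(ch, spend[ch] + step) - response(ch, spend[ch])
                 for ch in curves}
        best = max(gains, key=gains.get)
        spend[best] += step
        budget -= step
    return spend

print(allocate(total_budget=1000.0))
```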
Case Study Overview
A television network is seeking decision science
support to improve return on investment for the
marketing of primetime television shows
• How effective is our current marketing spend?
• Which shows should get more marketing dollars?
• Which channels are the most effective? Most efficient?
• Based on current practices, where are we over-saturated?
Case Study Challenges
Previous attempts to answer these questions have yielded
valuable insights, but have not created sustained changes
• Avoid the temptation to answer all questions with a single model
• Ensure inputs into the solution are readily available and cost effective
• Avoid bundling decisions that are controlled by separate teams
Limited data availability prevents the network from getting
accurate measures of performance for marketing efforts
• Data is warehoused in multiple systems, with few connection points
• Impression-level data is extremely difficult to capture, with actualized
data existing in combinations of spreadsheets, e-mails, and faxes
• Given the state of the data, common reports can take days to generate
Disney and SAS® Partnership
Share of effort by work stream (Disney / SAS):
• Project Management: 15% / 15%
• Data Management: 30% / 15%
• Science Integration: 30% / 30%
• Tool Development: 25% / 40%
Project Timeline
Established a separate timeline for each work stream, inclusive of milestones and report-outs to key stakeholders
Data Collection Overview
Data collection ultimately took four times longer than
originally planned, due in large part to data quality issues
• Identified over 30 potential data sources and almost 250 variables
• Data sources ranged from databases, spreadsheets, e-mails, and faxes
• Established weekly meetings with key stakeholders and implemented
dashboards to review data collection progress
• Placed an analyst in the media agency office for four weeks to speed
data collection and improve understanding of the data
Data collection is never really over: we continued to find errors and missed opportunities even months later!
Data Collection Challenges
Model database changed 17 times during a 1-year span,
most often due to missing data or data collection errors
Example issues:
• Bad circulation estimate for Entertainment Weekly
• Magazine cume based on all publications instead of those purchased
• Nielsen P3 vs. C3
• Duplication from SQL errors
• Misclassified OOH support as Events
• “Week 53” issue
Data Visualization
Showing clients the relationship between impressions and costs helped
to identify likely errors in the data (e.g., misclassification of spending)
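A minimal sketch of that impressions-versus-cost check, assuming a hypothetical line-item table: flag rows whose cost per thousand impressions (CPM) is far from the channel median; the column names and tolerance are illustrative.
```python
# Minimal sketch of the impressions-vs-cost check described above: flag line items
# whose cost per thousand impressions (CPM) is far from the channel median.
# DataFrame columns (channel, cost, impressions) are hypothetical.
import pandas as pd

def flag_cpm_outliers(df, tolerance=3.0):
    df = df.copy()
    df["cpm"] = df["cost"] / (df["impressions"] / 1000.0)
    median_cpm = df.groupby("channel")["cpm"].transform("median")
    df["suspect"] = (df["cpm"] > tolerance * median_cpm) | (df["cpm"] < median_cpm / tolerance)
    return df[df["suspect"]]

lines = pd.DataFrame({
    "channel": ["cable", "cable", "cable", "print", "print"],
    "cost": [50_000, 52_000, 4_000, 30_000, 31_000],
    "impressions": [2_000_000, 2_100_000, 2_000_000, 500_000, 510_000],
})
print(flag_cpm_outliers(lines))  # the $4,000 cable line stands out as a likely misclassification
```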
Data Visualization (cont.)
Exploring flights enabled us to recognize the need to model
certain media types differently than others
[Chart: example flight pattern with a 15% / 70% / 15% split]
Data Transformation
It is often necessary to transform the measurement variables in our models to avoid creating misleading insights or recommendations
[Table: for an episode airing on each day of the week, the days of promo activity captured by a calendar-week aggregation vs. by the 7 days ending on the air date]
Transform to a full week of promos ending on the air date
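A minimal sketch of this transformation, assuming hypothetical promo and episode tables: sum promo activity over the 7 days ending on each episode's air date instead of over the calendar week.
```python
# Minimal sketch of the "full week" transformation: sum promo activity over the
# 7 days ending on each episode's air date rather than over the calendar week.
# Column names (promo_date, spots, air_date) are hypothetical.
import pandas as pd

promos = pd.DataFrame({
    "promo_date": pd.to_datetime(
        ["2012-09-24", "2012-09-26", "2012-09-28", "2012-09-30", "2012-10-02"]),
    "spots": [10, 15, 20, 25, 30],
})
episodes = pd.DataFrame({
    "show": ["Show A", "Show B"],
    "air_date": pd.to_datetime(["2012-09-30", "2012-10-03"]),  # Sunday and Wednesday airings
})

def promos_past_7_days(air_date):
    window_start = air_date - pd.Timedelta(days=6)
    in_window = promos["promo_date"].between(window_start, air_date)
    return promos.loc[in_window, "spots"].sum()

episodes["promos_past_7_days"] = episodes["air_date"].apply(promos_past_7_days)
print(episodes)
```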
Data Handoff to Science
Key milestone was the go/no-go decision on beginning
the development of the measurement model
NIELSEN: Program Name, Air Date, Start Time, Duration, Program Type, Program Rating, Lead-in Rating, Competition
PROMOS & MARKETING:
• On-Air Promos: TRPs, Seconds, # of Spots
• Digital: Impressions & Clicks
• Cinema: Impressions, Seconds Per Spot
• National Cable: TRPs
• Newspaper: Impressions & Circulation
• Magazine: Total & Weekly Impressions
• Spot Cable: TRPs & Impressions
• Spot Radio: TRPs
• OOH: Impressions
AWARENESS:
• Survey Respondents
• Aware Respondents
• % Aware: Unaided & Aided
• Intent to View: Top Box, Top 2 Box, Non Committed, Bottom Box
Data Handoff to Science (cont.)
Future iterations of the model will incorporate new data that is either
unavailable right now or represents a higher level of complexity
MISSING DATA: Network Radio, Synergy Cable, Synergy Online, Emails & Newsletters, Public Relations, Affiliate Promotions, Social Media (Facebook, Twitter, blog mentions)
MISSING COMPLEXITY:
• On-Air Promos: Day-of-Week, Promo Length
• Nielsen: Reach, Share, HUT, PUT
• Print: Size, Placement, Inserts
• National Cable: Channel, # of Spots, Promo Length
• Spot Cable & Radio: # of Spots, Seconds of Promo
• OOH: # of Units, Size, Media Form
• Digital: Size, Placement
DATA RECONCILIATION: On-Air Promos, Digital Impressions
MODEL EXPANSION:
• Geo-Panel Data: Local Market Ratings and Marketing
• On-Air Promo Precision: Minute-by-Minute Ratings
• Pillar Efficiency: Costs for Marketing & Promotions
Science Integration
Integration between the teams managing data collection and model development (Science ↔ Data) is critical to the success of the project
When it doesn’t work well, each revision of the data model can delay the science timeline by 3 weeks!
It is also critical to integrate the science team with the tool developers (Science ↔ Tool) to ensure alignment with the expected inputs and outputs of the models
Overview of Planning & Optimization Tool
The tool is designed to become self-sustaining to support updates
to the measurement model and to allow media plan comparisons
[Diagram: historical data (one time) feeds the data model and the measurement model, with ongoing model adjustments; actualized media plans plus goals & constraints feed the optimization model; agency and approved media plans are compared against the recommended media plans the optimization produces]
Optimization Goals
Objective is to maximize total ratings for the premiere episodes
of all shows within a marketing campaign portfolio
• Provide recommended spending by channel for each show/week
combination
• Allow users to input constraints on total spending by show/channel/week
• Define spend thresholds that reflect minimum purchase amounts for
each channel
• Compare optimal recommendations against manually created plans
Critical to understand relationship between spend and
impressions; some channels have a significant delay between
purchase and delivery!
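A simplified sketch of how such an optimization could be set up (illustrative Python, not the production optimization model): maximize an assumed concave ratings response over show-by-channel spend subject to a total budget and per-cell caps, with minimum purchase amounts applied as a post-processing step. The shows, channels, response parameters, budget, and thresholds are all assumptions.
```python
# Simplified sketch (illustrative, not the production model): maximize a concave
# ratings response over show x channel spend, subject to a total budget and per-cell
# spend caps; minimum purchase amounts are enforced only as post-processing.
import numpy as np
from scipy.optimize import minimize

shows = ["Premiere A", "Premiere B"]
channels = ["cable", "radio", "print"]
scale = np.array([[3.0, 1.0, 1.5],       # assumed ratings potential by show/channel
                  [2.0, 0.8, 1.2]])
half = np.array([[400.0, 150.0, 250.0],  # assumed half-saturation spend ($K)
                 [300.0, 100.0, 200.0]])
total_budget = 1200.0                    # $K across all shows/channels (assumed)
min_buy = np.array([50.0, 25.0, 40.0])   # assumed minimum purchase per channel ($K)

def neg_ratings(x):
    spend = x.reshape(len(shows), len(channels))
    return -np.sum(scale * spend / (spend + half))

n_cells = len(shows) * len(channels)
result = minimize(
    neg_ratings,
    x0=np.full(n_cells, 100.0),
    bounds=[(0.0, 600.0)] * n_cells,  # per-cell spend cap (assumed)
    constraints=[{"type": "ineq", "fun": lambda x: total_budget - x.sum()}],
    method="SLSQP",
)

plan = result.x.reshape(len(shows), len(channels))
plan[plan < min_buy] = 0.0  # drop allocations below the channel's minimum purchase
for show, row in zip(shows, plan):
    print(show, dict(zip(channels, np.round(row, 1))))
```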
Evaluating Media Plans
Ability to compare different plans by measuring the number of
new households generated for each incremental unit of spend
Recommended Plan (balanced by optimization):
Week     Cable   Radio   Print   Outdoor   Cinema
t = -5     20     N/A      20       20        20
t = -4     20     N/A      20       20        20
t = -3     20     N/A      20       20        20
t = -2     20      20      20       20        20
t = -1     20      20      20       20        20
t =  0     20      20      20       20        20
Media Agency Plan (incremental opportunities):
Week     Cable   Radio   Print   Outdoor   Cinema
t = -5     70     N/A      80      110        10
t = -4    105     N/A       5      170        15
t = -3    160     N/A       5        5        30
t = -2    240       5       5       10        80
t = -1    355      75      25       30       125
t =  0     25      10      50       50       150
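A minimal sketch of the comparison metric, assuming a hypothetical concave response curve: estimate the new households generated per incremental unit of spend for each week of two example plans. The curve and the weekly spend figures are illustrative, not the plans shown above.
```python
# Minimal sketch of the plan comparison described above: new households generated by
# each incremental unit of spend under an assumed concave response curve.
# All numbers are illustrative.
import numpy as np

def households(spend, scale=1000.0, half=100.0):
    """Assumed response curve: households reached as spend grows (diminishing returns)."""
    return scale * spend / (spend + half)

def incremental_households_per_unit(spend, step=1.0):
    return (households(spend + step) - households(spend)) / step

balanced_plan = np.array([20.0, 20.0, 20.0, 20.0, 20.0])       # even weekly spend
frontloaded_plan = np.array([70.0, 105.0, 160.0, 240.0, 25.0])  # uneven weekly spend

for name, plan in [("balanced", balanced_plan), ("front-loaded", frontloaded_plan)]:
    marginal = incremental_households_per_unit(plan)
    print(name, "marginal households per unit of spend by week:", marginal.round(1))
```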
Key Lessons Learned
• Creating Clear Requirements
• Designing a Structured QA Process & Team
• Having a Test Environment
• “Shadow” Implementation
Questions and Answers