What Your Company Needs to Know About Potential Online Privacy Regulation

[Part 1 originally published in Colorado Biz on 01.18.11.
Blog post updated on 03.06.11 to include Part 2 of the article.]

The internet is a wonderfully measurable place. Businesses are able to use online data to drive strategy and measure return on investment. However, the wealth of data that makes the online world a prime space for analysis is also leading to consumer concerns over privacy. Facebook privacy controls, data capture, ad targeting and mobile applications have all been the subject of privacy discussions in the media.

A recent survey by USA Today and Gallup suggests that only 35% of respondents believe that “the invasion of privacy involved [in behaviorally targeted online ads] is worth it to allow free access to websites”, with younger respondents (40%) being more willing to accept this than older respondents (31%). However, while only 14% would allow all companies to target ads to them, another 47% would be willing to allow the advertisers they choose to target ads.

With so many concerns out there about data capture and privacy, what is a company to do to ensure its behavior and data practices are not called into question – or made front-page news?

What kind of data does your company use?

First, your company needs to differentiate between types of data capture, understand what you are leveraging, and know the current climate around different types of data use.

Understanding the current landscape

Recently, the Federal Trade Commission released its draft report on consumer privacy. The report distinguished between first-party and third-party data capture, with different views as to the consent and regulation that should be required for each.

First party data includes web analysis done through tools such as Google Analytics, Webtrends and Adobe Omniture, for the purpose of improving consumers’ online experience and a company’s profitability online. First party data use also includes first-party marketing: a company recommending products or services based on a consumer’s previous purchases. The FTC recommended that this type of data capture not require specific consent, as these are considered commonly accepted business practices.

Third-party data capture, however, is considered separately. This includes companies that deal in the buying and selling of information: for example, ad networks that buy and sell data to deliver highly targeted advertising. The FTC’s main concern regarding third-party tracking is not banning the practice, but rather allowing for informed consumer choice. While the FTC declined to declare opt-in or opt-out the appropriate method for expressing consumer choice, it did call for a Do Not Track mechanism, enforced through either legislation or industry self-regulation.

Legislative vs. Self-Regulatory approaches

The FTC’s recommendations open the door for potential legislation of online privacy and data capture. However, the Commerce Department has recently recommended self-regulation.

The Commerce Department disfavored prescriptive rules, noting the need for an approach that allows for rapid evolution and innovation, while enhancing trust. The Department called for voluntary but enforceable codes of conduct that promote transparency in data collection and use, and recommended enlisting the “expertise and knowledge of the private sector” in this regard.

The web analytics industry in fact recommended this very thing back in September 2010. A voluntary code of ethics for web analytics practitioners was proposed and drafted by Eric T. Peterson and John Lovett of Web Analytics Demystified, in conjunction with the Web Analytics Association, and a second initiative has begun regarding consumer education.

New medium, same challenges

While online data and privacy may seem like new and uncharted territory, this is simply a new medium for challenges long faced offline. For example, consumer acceptance of tracking and targeted advertising in exchange for free online content is not too different from accepting grocery store data capture via loyalty cards in exchange for discounts. The difference is that online data capture is newer, without well-established procedures for privacy safeguards, and with little consumer education about what the benefits or exchanges for tracking may be.

What is required in the industry is two-fold:

  1. Finding the appropriate way (not necessarily legislatively) to establish and regulate those safeguards online; and
  2. Educating consumers about the types of data capture and use, and potential benefits, to allow for informed consent.

How can a company protect itself?

So how can a company protect itself, in light of the current uncertainty around online privacy?

Safeguard consumer privacy as if legislation already required it. This has two benefits:

  1. If enough companies voluntarily safeguard consumer privacy, legislation may not be needed, leaving companies the flexibility to find the right way to protect privacy within their own business models; and
  2. If legislation does occur in the future, your company should not require major changes to its privacy model to comply with the requirements.

Follow FTC recommendations and integrate privacy considerations into your company’s daily business practice, by:

  1. Taking reasonable precautions to protect the safety and ensure the accuracy of data;
  2. Only collecting data required for a specific, legitimate business need (rather than capturing data “in case” it can later be monetized); and
  3. Ensuring your data retention periods are reasonable.

For first-party data capture and marketing, ensure that you have a plain-language, non-legalese privacy policy that lets consumers understand what data you’re capturing and how you use it, and that clearly distinguishes your first-party data use from third-party data use.

For any third-party data capture, make your practices transparent (this means not burying information behind legal jargon!) and educate your consumers. Advise what data is being captured, the benefits to the consumer, and provide an easy way to opt out. (A hint: if the only benefits are for your company, and not the consumer, you should expect a high opt-out rate.)

Additional considerations

For companies doing business overseas, keep in mind that privacy laws may differ between countries. For example, Europe’s privacy laws are already stricter than those of the United States, and will potentially receive a further overhaul in 2011 to modernize the 1995 Data Protection Directive.

Prepare for the future

The online privacy debate is not likely to quiet down in the coming months. However, by being proactive and considering consumer privacy in your daily and long-term business strategy, your company can set itself up on the right side of any eventual legislation or self-regulation.

Measuring a successful visit to a content site

So you have a content-based site, and you want to know whether your visitors’ time on your site was successful.

You have two options:

  1. Attempt to measure this via their on-site behaviour; or
  2. Ask them, via one of the many “voice of customer” solutions.

This post will deal only with #1.

Measuring the success of a visit can be challenging for content sites, simply because there’s not necessarily one conversion path. Rather, revenue is often generated via advertising, where page views = ad impressions = revenue.

If you are trying to measure the success of your content site, there are a few ways you can go about this.

  • Page Views per Visit: Seeing a large number of PVs/Visit could indicate a visitor has found information that is useful to them and has had a successful visit. However, a lost or confused visitor would also generate a large number of page views. How do you distinguish the two?
  • Time on Site: This too could indicate a successful visit. However, it could also indicate that someone is spending time searching for (and not finding) what they want.

So how could you better measure success?

  • Focus on valuable pages. A high number of page views to actual content suggests a more successful visit than a high number of page views that might include, say, site searches. Therefore, focus on PVs/Visit (or Time Spent) to a subset of pages. This can be more valuable than site wide PVs/Visit or Time Spent.

But you can do better. First, you need to assess why your content site exists. What behaviour can a visitor perform that would indicate they successfully found what they were looking for?

  • For example, your site exists to provide information X – that’s the goal and purpose of your site. Therefore, a visitor seeing content X achieves that goal, and suggests they had a successful visit.
  • If your site exists for reasons X, Y and Z, a successful visit could be one that saw one or more of X, Y or Z.
  • Setting up goals or segments around these behaviours can help you measure over time whether your visitors are performing these behaviours. Can better navigation drive up the percentage of visitors successfully completing this task? Which tasks are more popular? Are you even doing a good job of communicating what your site exists for? (If very few actually complete that main task or tasks, I’d suggest probably not!)
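The success measure described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the page names, the goal-page set, and the visit data are all invented for the example, not taken from any particular tool:

```python
# Hedged sketch: compute a "successful visit" rate from raw pageview logs.
# The goal pages and visit data below are hypothetical examples.
GOAL_PAGES = {"/content/x", "/content/y", "/content/z"}  # pages that fulfil the site's purpose

# Each visit is an ordered list of pages viewed.
visits = [
    ["/home", "/search", "/content/x"],          # found goal content -> success
    ["/home", "/search", "/search", "/search"],  # searched, never found -> not a success
    ["/content/y"],                              # landed directly on goal content -> success
]

def is_successful(visit):
    """A visit counts as successful if it saw at least one goal page."""
    return any(page in GOAL_PAGES for page in visit)

successes = sum(is_successful(v) for v in visits)
success_rate = successes / len(visits)
print(f"Successful visits: {successes}/{len(visits)} ({success_rate:.0%})")
```

Tracked over time, a rate like this can show whether navigation changes actually help more visitors complete the site’s main tasks.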

A final note: the intention of measuring a successful visit to your site is to measure this success from the point of view of the visitor. Is your site doing a good job of providing what visitors want?

This “success” doesn’t necessarily tie to short-term revenue for a content site. After all, a successful visit might be one where the visitor comes in, finds what they’re looking for immediately, and leaves. However, that visitor might generate more ad impressions by getting completely lost on your site. Good for you … in the short term. But it doesn’t mean they had a successful visit to your site, nor does it bode well for your long-term revenue.

Therefore, measurement of visit success should be analysed alongside measures of revenue success, while carefully weighing the long-term benefits of successful visits (and happy visitors) against the short-term revenue generated by “lots and lots of page views”.

Simplifying vs Oversimplifying vs Acknowledging Complexity

As analysts, we need to work with complexity, while simplifying for end-users, yet avoid oversimplifying. Naturally, this is easier said than done …

Simplifying for others: This is incredibly important. If you can’t explain the problem and findings to someone in 25 seconds or less, you 1) will likely lose their attention, and 2) possibly don’t understand it well enough yourself to explain it yet. That’s our job. We work with the details and bring others in on the conclusions.

Oversimplifying: The balance required is to simplify the final conclusions without oversimplifying the problem, the data, or your analysis. The struggle, however, is that our brains are hard wired to simplify information.

Think about the amount of stimuli your brain receives every day. For example, you are crossing the street. I am outside. I see a long stretch of gray. That is a road. There is a red thing coming towards me. I hear a noise. The red thing is making the noise. That red thing is a car. Cars can hit me. It is going at 45mph. I am stationary. It will reach my location in 4 seconds. I will take 10 seconds to cross the street. I should not walk yet. And of course, I’m completely understating all that goes through our brains for even simple tasks. If our brains didn’t find a way to make sense of a high volume of inputs, we simply wouldn’t function.

Acknowledging Complexity: The challenge for analysts is to try to simplify the answer, without oversimplifying the questions along the way. If you make erroneous assumptions because it (over)simplifies your analysis, you could end up drawing the wrong conclusions. You will probably make your analysis easier, but render it less valuable.

We need to acknowledge, work with (and enjoy) complexity. (And we had better get used to it, because the digital measurement space is not getting simpler.) However, we need to avoid simplifying more than is necessary to sift the signal from the noise. We need to question what we know, evaluate what we assume, and separate fact from opinion. And if in doubt, invite someone else to question you or poke holes in your analysis. Chances are, they’ll spot something you didn’t.

What analysts can learn from group fitness instructors


I am an analyst and a certified Les Mills group fitness instructor for BodyPump (weight training), RPM (indoor cycling), BodyCombat (mixed martial arts based group fitness) and BodyJam (dance based group fitness.)

While analyst and group fitness instructor seem very different, there’s actually a lot that analysts can learn from instructors.

When we are trained as instructors, we spend a lot of time thinking about how different people learn, and how to teach to all of them.

Visual learners need to see it to understand. In group fitness, these participants need you to demonstrate a move, not explain it. In analytics, this may mean visually displaying data, using diagrams, graphs and flow charts instead of data tables – and perhaps even hitting up the whiteboard from time to time.

Auditory learners need to hear it. In group fitness, they rely on verbal cues from the instructor. In analytics, you may have a thousand beautiful visual displays or PowerPoint slides, but it’s your commentary and explanation that will help these people understand.

Kinesthetic learners need to feel it to understand, to experience what you’re talking about. In group fitness, you can show them and tell them, but what they need is to feel the difference between “the right way” and “the wrong way” (for example, “Oh, now I can feel how muscle x engages when I turn my heel!”) This is the same group that tends to need repetition to perfect what they’re doing. In analytics, these are often the people that need to be led through your logic. It’s not enough to show them your findings and display the final results. They need to see the steps along the way that you used to answer your questions.

Now here’s where it gets trickier. When you are presenting to a group, they won’t all be the same type of learner. Which means that a good group fitness instructor and a good analyst both need to explain the same thing in different ways to ensure that everyone understands. For an analyst, this may mean using visual displays of information on your slides, talking through the explanation, and giving a step-by-step example to put everyone on the same page.

Keep in mind that you too have your own learning style. Your analysis and presentation style will likely match your learning style. (If you are a visual learner, a visual presentation will come easily to you.) It may take a more conscious effort to make sure you incorporate the learning styles you do not share. However, by tailoring your message to ensure you hit all learning styles, you stand the best chance of getting everyone to the same understanding.

Are you an expert?

Guru. Ninja. Rockstar. Expert. These descriptions are all over the place. (*cough* Twitter bios *cough cough*)

Done learning. This is what I hear.

Here’s the deal. Calling yourself an expert sounds like you think you’ve got nothing left to learn. How can you be an expert in web analytics, or social media? These fields have been around for all of about forty-five seconds. (And they’ve changed twenty-seven times since then!)

My $0.015:  Don’t ever call yourself an expert, a guru, a rockstar. (And don’t just replace it with samurai or swami. You get my point.) Someone else may call you that, but let’s be honest, even then you should shrug it off.

The most appealing trait is a desire to learn, to improve, to continue honing your skills. Focus on that. Let your work and development speak for you, not a self-appointed noun.

A month of fun with ClickTale

As a self-confessed geek, when I hear about a new tool with a free option I can play with, I naturally implement it right away. After hearing ClickTale discussed on the Beyond Web Analytics podcast, I took the opportunity to implement it on my blog for a little “new analytics tool fun”.

After a month playing around with this tool, here are some of my impressions:

Unique value

It does not merely repeat what your Google Analytics, Omniture, Webtrends, etc. tracking tells you. There is definite unique value. Think of the standard web analytics tools as telling you visitors’ behaviour between pages (e.g. visitors go from page X to page Y, exit at page Z.) ClickTale tells you what they do within a page (how far down do they scroll? Do they start filling in your form? What do they click? Where does their mouse move?)

Recordings of a site visit

One of the original offerings of ClickTale was simply recordings of your visitors’ behaviour on your site. You literally watch users scroll up and down, click a link, go to another page, etc. It’s like looking over their shoulder, or user testing without the option to ask them why they did something. However, no one can possibly watch all the videos, especially on a large site. The real value of the tool is what ClickTale added next: the aggregation of all of those videos into heat maps.

Aggregated heat maps

You can view an aggregated heat map of:

  • Mouse moves
  • Clicks
  • Scroll reach
  • Attention (via mouse attention)

The scroll reach is pretty interesting, especially on a blog, since normally the main page is a six-mile-long history of previous blog posts, and it’s interesting to see how far down people scroll.

ClickTale uses mouse-over attention to estimate eye tracking (due to the correlation between the two) but is pretty clear that it doesn’t intend this as a perfect replacement for eye tracking, merely an affordable way to get close to that kind of information.
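The aggregation behind a scroll-reach map can be illustrated with a small sketch. The numbers below are hypothetical (one maximum scroll depth per visitor, as a fraction of page height), and this is the general idea rather than any vendor’s actual implementation:

```python
# Hedged sketch of scroll-reach aggregation. Depths are hypothetical,
# expressed as a fraction of total page height (1.0 = bottom of the page).
max_depths = [0.3, 0.5, 0.5, 0.8, 1.0, 0.2, 0.9, 0.4]  # one value per visitor

def reach_at(depth, depths):
    """Share of visitors who scrolled at least this far down the page."""
    return sum(d >= depth for d in depths) / len(depths)

# Build a simple reach curve: what fraction of visitors saw each part of the page?
for depth in (0.25, 0.5, 0.75, 1.0):
    print(f"{depth:.0%} of page: reached by {reach_at(depth, max_depths):.0%} of visitors")
```

On a blog with a six-mile-long front page, a curve like this shows exactly where readers give up scrolling.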

Example heat maps:

So as you’ll see, the value of the heat maps is:

  1. Not having to trawl through multiple videos for insights. That’s just not possible for a site with millions of visitors.
  2. In-page information that definitely complements what is provided by your standard web analytics tool.

For a blog specifically, the heatmaps can also be a way to see which of your content people are actually reading. If a visitor clicks to read a specific post, obviously you know they took this action even in a standard web analytics tool. However, where they read a blog post on the main page, ClickTale can fill in the gaps of what they read via scrolling and attention. This is a great insight missing from a standard tool.

A concern for frequently updated sites (such as blogs) might be the impacts of a site changing throughout the day, via adding new posts. Never fear, that has been taken care of: you can choose which version of a page you want to view the heatmaps for, if there are multiple versions throughout a timeframe.

Some other nifty features:

Form analytics

I only saw a demo of this, as my site isn’t really form-heavy, but to be honest, this thing rocks. Who starts filling in a field then stops? Who has to refill a form? How much time do they take to fill in each field? What is form engagement vs form submission? This information is much richer than “X visitors saw the page with the form, then Y% saw the thank-you page.” It’s pretty awesome. Check out the demo on ClickTale’s site:  http://www.clicktale.com/product/form_analytics

Page and Site Analytics

ClickTale will also tell you which of your pages are the:

  • Most engaging
  • Most clicked
  • Most errored
  • Least scrolling
  • Slowest loading

The engagement time is pretty sweet also. In a world of tabbed browsing, a visitor may come to your site, read your post, but not close the page. (“Tick, tick, tick” goes Google Analytics time on site.) I myself tend to have multiple browser tabs open with links I’ve clicked from Twitter. ClickTale measures the time visitors actually spend engaged with the page, via mouse moves etc. It’s a richer metric than time on site.
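The general idea behind an engaged-time metric can be sketched as follows: take the timestamps of interaction events (mouse moves, scrolls, clicks) and only count the time between events that fall within an idle cutoff. This is a hypothetical reconstruction of the technique, not ClickTale’s actual algorithm, and the cutoff value is an assumption:

```python
# Hedged sketch: estimate engaged time from interaction-event timestamps.
# Illustrates the general idea only, not any vendor's actual algorithm.
IDLE_CUTOFF = 30  # seconds; an assumed threshold - longer gaps count as "away"

def engaged_seconds(event_times, idle_cutoff=IDLE_CUTOFF):
    """Sum the gaps between consecutive interaction events, ignoring idle gaps."""
    times = sorted(event_times)
    total = 0
    for prev, curr in zip(times, times[1:]):
        gap = curr - prev
        if gap <= idle_cutoff:
            total += gap
    return total

# A visitor interacts for a while, leaves the tab open for ten minutes, comes back.
events = [0, 5, 12, 20, 620, 628, 635]
print(engaged_seconds(events))  # the 600-second idle gap is excluded from the total
```

Compare that with naive time on site, which would report the full span from first to last event and count the ten idle minutes as “engagement”.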


I’ve only mentioned some of the functionality of ClickTale that I enjoyed. There are also options to search, find and watch videos matching certain criteria (e.g. videos from visitors coming in through search, or seeing a certain page – great for watching playback of site errors.) There is also an option for Omniture integration, which I didn’t try (as my blog doesn’t use Omniture) but it’s nice knowing it’s there for enterprise use.

All in all, my conclusion: ClickTale doesn’t replace a standard web analytics tool (nor do they claim it does, or should) but it’s a great supplement to give you more in-depth information about what people do on a page. I believe a clear competitor for this product is TeaLeaf, which I have seen a demo of, but not been in a position to use. The main thing that sways me towards ClickTale, even on an enterprise basis, is the price tag. TeaLeaf definitely seems a more costly solution. Now, it’s completely possible that TeaLeaf’s price tag is justified by the functionality; I haven’t used it so I can’t speak to the differences. But unfortunately, the reason I haven’t used TeaLeaf is because I couldn’t get past the price tag …

The best part?

Very easy to implement: Two lines of code (I could implement it. ‘Nuff said.) And seriously – all of this with only two lines of code. No special click tracking or form tracking. It’s as easy as implementing Google Analytics.

FREE! They have a free version that you can use on small volume sites. Plenty for us web analytics geeks to play around with! Now, there’s some functionality you miss out on with the free version, but it’s plenty to get you into the tool and allow you to evaluate whether the insights may be worth a small investment.


I was fortunate enough to be given the opportunity to use ClickTale on my blog with enterprise-level access (thank you, ClickTale and Shmuli Goldberg!) Some of the features may only be available via paid subscriptions, and not in the free version, but the free version is definitely of value and worth checking out.

However, also keep in mind I used it on my blog, not on a larger corporate website. There may be some functionality that I’d like to have on a larger site that I didn’t notice was missing, just because of the size of the site I was looking at. I don’t claim this tool is the magic cure for any of your analytics ills, but it’s definitely worth looking at, to see if it might help answer some of the questions left open by your standard analytics tool.

Plus? It’s fun!

The Web Analytics Community

Like many web analysts, I pretty much “fell” into this field. Coming from Law and Psychology degrees, it certainly wasn’t a clear path. At the time, I happened to be an assistant, and the web analytics team happened to need “temporary” help. Five years later, here I am: incredibly grateful to have fallen into a field that is fascinating, in high demand, and full of smart, engaged people.

Here’s a great example: the Analysis Exchange. This program brings together “students” of web analytics, mentors to support them, and non-profit organisations needing analytics assistance. (Note: I say “student” because you don’t need to be an actual student.) There are currently 1,000 members, and no open projects. As many needs as there are out there, the analytics community is rushing to fill them.

The fascinating thing about the Analysis Exchange is the skill level of some (probably most!) of the “students”. My first project (and currently only project – see above, no open projects!) went a little like this: Wow, my student is smart. I hope there’s even anything I can help him with! (He later said there was. I hope he wasn’t humouring me…!)

In fact, as Analysis Exchange mentor Jason Thompson commented on Twitter recently, many of the people we look up to in the analytics community are offering their services as students! When I myself first signed up, I was torn between student and mentor. Yes, I currently manage a team of analysts. However, would I be knowledgeable enough in a completely different setting and business model? Not to mention,  any good analyst can always use more hands-on experience! So I won’t lie – going in as a student definitely crossed my mind. The basis for my eventual decision to be a mentor was that I wanted to leave the full-on student experience for those interested in web analytics, who may not have access to data or real life examples.

So here’s how our community works. You have a group of people who are in high demand, flat-out at their “real jobs”, with a real life outside of work. Despite this, they volunteer their time for 1) their own continued education and development and 2) to assist others in their growth. Most of these people never really think they’re ready to be a mentor, because no one considers themselves to be an “expert”. We think there’s always more to know, more to learn.

Web Analytics is developing fast. But with these people in our community, there is no worry that we won’t keep up. In fact, we’ll keep pushing, and moving our field even further forward.

PS. You know you’re addicted to Twitter when you have to consciously write “the web analytics community” rather than the “#measure community”.

Analytics must-read list

Having just spent approximately $150 on books on Amazon.com, I got to thinking what else could fill up that list for my next order. (My husband is cringing as I write this.)

So here are my read and “up next” lists. I would love to hear your recommendations, both for myself and others to add to their reading lists.

Been there, read that:

  • “The Big Book of Key Performance Indicators” by Eric T. Peterson (FYI, available for free! at Web Analytics Demystified)
  • “Web Analytics Demystified” by Eric T. Peterson (also available for free! at Web Analytics Demystified)
  • “Web Analytics: An Hour A Day” by Avinash Kaushik
  • “Web Analytics 2.0: The Art of Online Accountability and Science of Customer Centricity” by Avinash Kaushik
  • “Competing on Analytics: The New Science of Winning” by Thomas H. Davenport and Jeanne G. Harris
  • “Tipping Point” by Malcolm Gladwell
  • “Buyology” by Martin Lindstrom
  • “Five Dysfunctions of a Team” by Patrick Lencioni
  • And for fun, “Freakonomics” by Steven D. Levitt and Stephen J. Dubner

Looking forward to my Amazon delivery for:

  • “Analytics at Work: Smarter Decisions, Better Results” by Thomas H. Davenport, et al
  • “Being Wrong: Adventures in the Margin of Error” by Kathryn Schulz
  • “Social Media Metrics: How to Measure and Optimize Your Marketing Investment” by Jim Sterne and David Meerman Scott
  • “How to Think Like An Economist” by Roger A. Arnold
  • “Why Can’t You Just Give Me The Number? An Executive’s Guide to Using Probabilistic Thinking to Manage Risk and to Make Better Decisions” by Patrick Leach
  • “Blink” by Malcolm Gladwell
  • And for even more fun, “Superfreakonomics” by Steven D. Levitt and Stephen J. Dubner

What are your must reads? I’d love to hear them.

UPDATE: Thanks for the comments and recommendations! Here is a summary of the recommendations:

  • “Only the Paranoid Survive” by Andy Grove (from Rudi Shumpert)
  • “Cult of Analytics” by Steven Jackson (from Jennifer Day and Emer Kirrane)
  • “Cartoon Guide to Statistics” by Larry Gonick (from Jennifer Day)
  • “The Book of Think” by Marilyn Burns (from Jennifer Day)
  • “Reading Virtual Minds” by Joseph Carrabis (from Jennifer Day)
  • “Mostly Harmless Econometrics: An Empiricist’s Companion” by Angrist and Pischke (from Michael Healy)
  • “The Myth of the Rational Voter: Why Democracies Choose Bad Policies” by Caplan (from Michael Healy)
  • “The Richness of Life: The Essential Stephen Jay Gould” By SJG (from Michael Healy)
  • “Zero: The Biography of a Dangerous Idea” by Seife (from Michael Healy)
  • “Yahoo! Web Analytics: Tracking, Reporting, and Analyzing for Data-Driven Insights” by Dennis R. Mortensen (from Emer Kirrane)
  • “Made to Stick: Why Some Ideas Survive and Others Die” by Chip and Dan Heath (from Meng Goh)

OMMA Metrics #2: The full story

For those who were not as fortunate to attend OMMA Metrics in San Francisco, here are my key takeaways from each of the sessions. For those who did, I would love to hear yours.

Yes, it’s long. Feel free to skim what is of interest to you. There is no pop quiz!

Note: below are two fun, random facts that I enjoyed learning!

Evolving Analytics: Measuring and Analyzing the Digital Ecosystem at Lightspeed
Judah Phillips, Sr Director Global Site Analytics, Monster Worldwide

  • Everything in analytics is evolving: the skills needed, the size of teams within a company, the importance in an organisation and exposure to executive management, the technology and tools, and the scope of what we’re analysing (e.g. social, mobile, video, and tying traditional media back to the site.)
  • Competing on analytics requires investment in people and technology.

Digital Measurement: A Retrospective and Predictions for the Future
Eric T. Peterson, Web Analytics Demystified

  • 50:50 rule: invest half your analytics budget on technology, half on people.
  • At scale, a centralised analytics group is all that ever works. Analytics should be in the center of marketing, operations, management, etc, but have “superusers” within each of the other departments. Decentralized analytics does not work.
  • We need to develop faith and trust from stakeholders by having more answers than questions.
  • Analysts are a service organisation. We have forgotten this! We need to serve to provide business intelligence: deliver incredible value to drive revenue; this will build trust.
  • We need to move beyond first generation tools and reporting and need to generate insights and recommendations.
  • Commit to creating a measurable impact! “If you give me a testing tool, I will deliver a 5% lift in X.” At worst? If you fail, they’ll fire you and you’ll move on to another position (probably with a salary bump!)

Measuring Social Media: From Listening to Engagement to Value Generation
with Jascha Kaykas-Wolff (Involver), Anil Batra (POP), Jonathan Corbin (Wunderman), Taddy Hall (Meteor Solutions), Rand Schulman (Eightfold Logic and Schulman + Thorogood Group)

  • Think about 1) Reach, 2) Engagement, 3) Impact of social
  • To strategically enter into social, need to identify your objective, pick the appropriate channel (e.g. are the users you’re trying to reach on Facebook? Twitter? YouTube? etc), then find the right KPIs that take into account objective and channel.
  • Scalable measurement and monetisation is what is currently missing from social media.
  • Companies need to identify who the influencers are, and who they influence.
  • Integrate social with other channels, and understand it in the context of all your marketing.
  • Don’t be afraid to do something different!
  • To measure success, ensure you take a baseline.

Analysing Across Multiple Channels: What Works and What Doesn’t for Multichannel Measurement
Akin Arikan (Unica), Roger Barnette (SearchIgnite), Casey Carey (Webtrends), Kevin Cavanaugh (Allant Group), Terry Cohen (Digitas), Andy Fisher (Starcom MediaVest Group)

  • Some channels are more involved with certain areas of the lifecycle. E.g. Mass media to attract attention, online to engage consumers and persuade, offline to grow and retain.
  • There are forty years of multi channel experience, but digital breaks all those rules. How do you mix it in?
  • Maturity of multiple channel measurement is mixed – some companies are doing a lot, but many are not, for a variety of reasons (e.g. silo nature of the organisation, perhaps using multiple agencies, etc.)
  • Financial services is ahead of the multi channel game, because they have statisticians, data, tools, etc.
  • Traditional media measurement has 30-40 years experience. In comparison, the techniques in digital are laughable. Digital needs to learn from this. However, in the digital space we embrace change, are fearless, and figure out how to benefit from the change. Need to combine these two: increase mathematical rigor in digital, and embrace change in traditional media. [Aka “Andy’s grumpy rant”]

A Measurement Manifesto
Josh Chasin (comScore)

  • Future of digital is not in selling clicks and click throughs.
  • Digital has a seeming ability to measure everything, but in some ways this hurts us. We’ll never be the simplest medium. The landscape is not simple, and it’s not getting simpler. However, we have the opportunity to do great and groundbreaking things with metrics.
  • Strengths of digital: portable, affinity (consumers cluster around content of interest, and even create that content!)
  • Targetability makes audiences small. Affinity makes audiences relevant.
  • The 20th century was the era of the shouting brand. The 21st century will be the era of the listening brand.
  • Digital order of operations: Ready, aim, fire, measure, aim, fire, measure, aim, fire, measure …
  • We need to measure: audience size, ad effectiveness (across platforms), attribution, engagement, voice of the customer and brand robustness.

Engaging the Mobile Experience: Effective Mobile Measurement Strategies
Raj Aggarwal (Localytics), June Dershewitz (Semphonic), Joy Liuzzo (InsightExpress), Evan Neufeld (Ground Truth), Virgil Waters (Acceleration), Jamie Wells (Microsoft Mobile Advertising)

  • Mobile often requires different methods of data collection, as the common JavaScript tags may not work.
  • Some mobile metrics are the same as site analytics, but some are different. Mobile web is similar to desktop web, but applications can be unique.
  • Benefits of mobile analytics: you can have a more accurate reach metric, since there is a device ID, and people rarely share devices. Location is more granular and valuable.
  • Mobile is unstable right now – we are trying to figure out mobile analytics in a shifting environment. E.g. What will succeed: mobile web or apps? Will one supersede the other? Will there even remain a distinction between them?
  • Tools for mobile analytics: 1) Traditional web analytics tools and 2) Niche vendors (or a combination of both). The benefit of traditional tools is the integration with your site analytics. The benefit of niche tools may be higher-value, mobile-specific data.
  • Third party measurement and Apple: feeling from the panel is that Apple will be forced to play by the market, and likely change its policies over time.
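The reach point above – that device IDs give mobile a more accurate unique-audience count than desktop cookies – can be sketched in a few lines. This is a minimal, hypothetical illustration (the event structure and field names are assumptions, not any vendor’s actual schema):

```python
# Hypothetical sketch: approximating unique reach from mobile event logs.
# Device IDs are assumed stable per user (people rarely share devices),
# unlike desktop cookies, which are deleted and duplicated across browsers.

def unique_reach(events):
    """Count distinct device IDs across a list of event records."""
    return len({e["device_id"] for e in events})

events = [
    {"device_id": "A1", "event": "app_open"},
    {"device_id": "A1", "event": "purchase"},   # repeat visit, same device
    {"device_id": "B2", "event": "app_open"},
]

print(unique_reach(events))  # 2 – two distinct devices, despite three events
```

The same three events counted by cookie on desktop could easily report three "uniques" if the user cleared cookies between visits, which is the accuracy gap the panel was describing.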

Metrics and Measurement at eBay
Bob Page (eBay)

  • eBay has a huge range (and volume!) of data (e.g. marketing, finance, customer service, user behaviour, web analytics, etc.)
  • There is no silver bullet. No one product will solve all your needs.
  • They have a huge data warehouse that contains virtual data marts for different groups (e.g. marketing vs. finance) rather than silos.
  • They also have an internal web analytics community, building a type of “Facebook for analysts”: an internal social network where analysts can subscribe to each other’s feeds, look at the latest videos, discuss issues in forums, share PPTs etc.
  • Have a centralised technical team under the CTO, who is responsible for infrastructure, support etc.
  • Centralised business analytics team under the CFO, responsible for common, standard “north star” metrics.
  • Distributed product analysts in each business.
  • Note: the technical teams needed to support this are similar in size to the core analyst teams.

Managing Analytics: An Executive’s Perspective on What Works, What Doesn’t, Best Practices and Lessons Learned
Judah Phillips (Monster Worldwide), Matt Booher (Bridge Worldwide), Yaakov Kimfeld (MediaVest), Dylan Lewis (Intuit), Jodi McDermott (comScore), David L. Smith (Mediasmith)

  • Executive sponsorship of analytics is changing: “We used to have a megaphone, now we have a seat at the table.”
  • Centralisation enables standardisation, and helps with the evolution of analytics.
  • Where Analytics lives in the organisation: sometimes with the CFO, sometimes within marketing – differs within different companies. Analytics needs to own the technology and the data, though technical teams may actually implement.
  • Challenge: Lack of standards, lack of an organisational body.
  • Challenge: Executive distrust in the data or its validity. Jodi at comScore spoke of 4-6 months of having to explain data capture and constantly evangelizing before executives would place faith in data over gut.
  • Challenge: Hiring/recruiting. Companies want to find everything in one person: a technologist, a marketer, a statistician. Region can make hiring even more difficult. General sense is to find the right individuals/hire for instinct. You can always teach people the finer points of being an analyst (e.g. how to use a particular tool.)
  • Project management: Ticket-type system, scrum process. But no matter the project management approach used, it requires ruthless prioritisation.

Online Measurement: The Good, the Bad and the Complicated
Joe Laszlo (IAB)

  • The good: Online measurement is competitive, and we have many vendor options. Vendors have integrity and are continually innovating. We can measure nearly anything.
  • The bad: Contradictory metrics from vendor to vendor, and changes in methodology can produce dramatic fluctuations in measurement.
  • Online is managing to capture direct-response dollars, but not branding dollars. This is because brand marketers want to understand what their spend did for brand awareness, purchase intent, etc. What they get is “engagement”: view-throughs, time spent, etc. There is a disconnect between the measures of success digital offers them and what they want.
  • Traditional media measurement allows calculation of reach and frequency. Also has years of experience of what matters, and has well-accepted metrics.
  • Lack of online measurement standards makes accurate data comparisons impossible. This cannot be solved by any individual company, so the IAB is tackling it through a cross-industry task force.

Modeling Attribution: Practitioner Perspectives on the Media Mix
Cesar Brea (Force Five Partners), Gary Angel (Semphonic), Jason Harper (Organic), Drew Lipner (InsightExpress), Manu Mathew (VisualIQ), Kelly Olson (Red Bricks Media)

  • Attribution: What campaign/medium is responsible for the sale?
  • But there are more questions now: Is it better for someone to touch campaigns A and B? What about B first, then A? It’s not just the attribution – do the two campaigns contribute together, in what order, or are two overkill? E.g. There is evidence that display adds value to search: someone searches after seeing a banner ad.
  • You can get a lot of benefit from evaluating click attribution, but even more from optimising on impressions.
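To make the attribution question above concrete, here is a minimal sketch contrasting two of the simplest attribution rules – last-touch (all credit to the final touchpoint) and linear (credit split evenly across the path). The channel names and path are illustrative assumptions, not data from any panelist:

```python
# Hypothetical sketch of two basic attribution models over a single
# converting user's touchpoint path (ordered list of channels touched).

from collections import defaultdict

def last_touch(path):
    """Assign all credit for the conversion to the final touchpoint."""
    return {path[-1]: 1.0}

def linear(path):
    """Split credit equally across every touchpoint in the path."""
    credit = defaultdict(float)
    for channel in path:
        credit[channel] += 1.0 / len(path)
    return dict(credit)

# A user sees a display banner, then searches twice before converting –
# the "display assists search" pattern the panel described.
path = ["display", "search", "search"]

print(last_touch(path))  # {'search': 1.0} – display gets no credit
print(linear(path))      # display gets 1/3, search gets 2/3
```

The gap between the two outputs is exactly the panel’s point: under last-touch, the display impression that triggered the search is invisible, which is why optimising on impressions (not just clicks) can surface additional value.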

Understanding the Multi-Screen Consumer: What’s on their Screens, What’s on their Minds
Alison Lange-Engel (Microsoft Advertising)

  • We now access online content via a variety of screens: computer, mobile device, TV, gaming consoles. We are always on and always connected.
  • Microsoft Advertising conducted survey to answer these questions.
  • The most active segment is 24-35 year olds.
  • Consumers are rapidly adopting technology and want control of the experiences.
  • Online gamers are the “game changers”. They do more of everything, all the time, and are social influencers. They spend the most time blogging, viewing and texting links. They view their game console as a communication device.
  • A linear funnel is not relevant anymore, as all screens impact purchase and allow an impactful story to be told.
  • Computers and smartphones are the key points of purchase.
  • The younger segments are accepting of advertising across multiple screens, and actually want information and entertainment. They find ads helpful when they are targeted to their preferences and interests. They want a consistent experience across screens, and like the ability to access content across multiple screens – it actually improves their opinion of the content provider.
  • The key to success is: Consistent messaging + connected to other mediums + relevant = engagement and results.
  • Full report at advertising.microsoft.com/multiscreen.

And fun facts for the day:

  • The birthplace of web analytics is Hawaii!
  • Web Analytics is still small. All the web analytics companies sold for less than DoubleClick!

For those who did not get to attend this event, I highly recommend checking it out next year. It was interesting and informative, with a great choice of speakers and a nice mix of presentations and panel discussions. Learning has never been so fun!

OMMA Metrics #1: Fun with word clouds

I’m back from a fantastic experience at MediaPost’s OMMA Metrics, organised by Judah Phillips.

I am putting together a full write-up on the takeaways from the amazing sessions, as well as a review of the event as a whole (all good, I promise!) However, because I’m a nerd and wanted to take in as much info as possible, I took copious notes at each of the sessions (5,001 words in total over the one-day conference – and no, I’m not kidding) and put them together into a delightful word cloud using Wordle.

Enjoy, and stay tuned!
