The Year of the Analyst

[Originally published in Colorado Biz]

The advent of digital brought with it the incredible measurability of the online channel. When coupled with a recession, where every dollar counts and the profitability of every move is questioned, data-informed decisions have become critical to many companies. Analytics is no longer reserved just for companies at the top; it is becoming a cost of successfully doing business.

It’s not just about the collection of copious amounts of data, but about the integration and use of that data. Ultimately, companies need the right resources in place to analyze, interpret and recommend new courses of action. A heavy investment in tools, without investment in people, is seldom successful.

Welcome to the new breed of analysts. Whether companies are hiring a “web”, “digital”, “cross-channel”, “marketing” or “business” analyst, there’s no doubt that it’s a great time to be a data geek.

Increased demand

The demand for analysts is directly related to the growth of online business, aided by the proliferation of mobile devices such as smart phones and tablets. The responsibility of the “Web Analyst”, whose role was initially focused just on behavior on a company’s website, has already expanded in scope, evolving to include online, mobile, social and traditional channels, as well as the integration of online and offline.

Those already working in digital analytics can attest to the barrage of recruiter calls to coax experienced analysts over to a new company, as the demand exceeds the number of analytics professionals available in the market.

Greater awareness at the college level

Growing awareness of the profession at the college level will help to (slowly) fill some of this demand. Educational institutions are starting to introduce courses tailored specifically to this new field. The University of British Columbia began offering an award program in web analytics in 2005, and other schools are following suit with certificate programs or course work within marketing that focuses on digital analytics.

The existence of these programs can help make students aware of a career in digital analytics. Programs such as Marketing, IT, Business, Economics, Mathematics and Statistics continue to lay a great foundation for a career in digital analytics, and still represent the majority of the entry points into the field, but these new dedicated courses allow students to learn enough to hit the ground running in a junior role.

An abundance of resources

For someone interested in joining this growing field, there are a number of ways to get involved. The Digital Analytics Association (DAA) has chapters across the U.S. and provides local events, education, conference discounts, research, standards, training, awards, certification and great professional networking opportunities.

The Analysis Exchange is a program that provides a “student” of analytics with hands-on experience tackling the analytics challenges of a non-profit, supported by an experienced mentor. Another option to gain experience is to volunteer your services to a local charity or small business. The availability of free tools means anyone can get their feet wet in this industry.

For professional networking and to talk to those already in the industry, attend a Web Analytics Wednesday or a local DAA symposium, or get involved via social media. There are digital analytics groups on Twitter (via the #measure hashtag), Facebook, LinkedIn and Yahoo. Ask questions, and you’ll be surprised at who will take the time to answer them.

In addition, companies are creating more opportunities for those looking to break into the field. For example, Red Door Interactive created an internship program that helps students get hands-on experience in a variety of areas, including analytics. The interns not only help collect and analyse data, but also learn how an agency works and how to be part of a cross-functional team.

For companies needing to hire analytics professionals, it can be tough, and it’s not likely to change soon. Good analysts are typically happily employed and frequently recruited, so companies need to be open to developing entry-level or junior analysts on the job, consider internal candidates with compatible skill sets, allow flexible working arrangements (like remote employees) or make an offer too good to refuse. As long as demand continues to exceed the availability of resources, it will continue to be an analyst’s market.

Career Development for Digital Analysts: WAA Resource Available

Are you a new analyst looking for a way into the digital analytics industry? An analyst looking for advice on how to grow in your role or be promoted? Looking for a new opportunity?

For the past year, the WAA Membership Committee has been working on an initiative to provide WAA members with information regarding careers in the digital measurement industry.

Now available is the WAA’s Career Guide for Digital Analysts, an overview of careers in the digital measurement industry, including:

  • The types of companies analysts can work for;
  • Typical hierarchy and responsibilities for each role;
  • Educational and skill set requirements, including the importance of emerging skill sets; and
  • Advice for those looking to break into the field, be promoted or find a new opportunity.

This guide was compiled from the insights of a variety of industry members who were kind enough to spend some time giving an overview of careers in their area – client side, vendor, agency and consulting.

If you are a WAA member, please use this link to download the Career Guide:
Download the WAA Career Guide for Digital Analysts

You will need your WAA website username and password.

If you have forgotten your WAA website username or password, click here to reset your password.

Not a member yet? Join the WAA here.

If you’d like to read more about the WAA initiatives around career development and encouraging careers in Analytics, please read more at the WAA Blog.

Business Analysis and Technical/Implementation skills

There has been some discussion in the web analytics industry of late about “analyst” skills versus (or alongside) “implementor” skills. Given that I am far from an expert myself, I’ve enlisted the point of view of some clever folks, who were kind enough to throw in their five cents about the need for, or benefits of, implementation and technical skills for web analysts.

Thomas Bosilevac: I love it all
Mashable Metrics
@Bosilytics on Twitter

To paraphrase the Cluetrain Manifesto: “I am not a developer, or a programmer, or a code monkey. I am an analyst.” That said, I am a hell of a geeky analyst, one that isn’t afraid of digging into some JS code, scraping page variables and utilizing server-side scripting. However, I would be quite depressed if that were all I did. Fortunately, the wonderful world of web analytics engages both my left and right brain, creating the dream correlation.

Web analytics tagging (i.e. implementation) is a fine art: ensuring that data is passed to the reporting platform effectively, but in a manner that will not shoot you in the foot later. I have worked with some of the most talented developers out there; however, explaining that the event tag needs to fire only on the INITIAL hit seems to pass clear over their heads. The impact and significance of the matter might as well be the final Space Shuttle already in orbit. Making a page work well is much different from ensuring data is collected well.

For that reason I love my trade. I get to discuss process management, KPI development and marketing scenarios with top brass at Fortune 100 companies during the day, and stay up late ensuring the landing page is using the correct eVars or doing server-side scripting to push initial cookie values out of a hit stream. It is with this duality that each day is different and usually more interesting than the one before it.

Jenn Kunz: Know enough to work with each other

http://blog.implementalytics.com/
@Jenn_Kunz on Twitter

I do believe, like I suspect most people do, that ideally any analyst will have a good understanding of implementation and any implementer will have a good understanding of analysis. Do I expect an analyst to know the code? No. But a good analyst will know how cookies work and how they affect the data; what the difference is between the different types of variables and how to use them; what kind of configuration settings are available and when to change them. Since they are the ones IN the data, once they have an understanding of how things are set up, they are the most likely to come up with the kind of questions that make an implementation evolve into something better.

As for implementors – you can easily find implementors who have never done analysis. And it’s a shame, but that’s all the industry often expects of them: take a list of business requirements, turn it into a solution, see the code deployed, then wash your hands of it. But what the world needs is some sort of “uber-implementor”: a “tech guy” (or gal, as the case may be) who can tell you the best practices as well as the technology limitations as you map out requirements, and who is involved beyond just the deployment of the code, all the way until after the end-users have done their first deep-dives in the reports. No one is going to know how to use those reports better than the person who created them.

Bryan Cristina:  The knowledge will benefit you

http://bryanalytics.blogspot.com/
@BigBryC on Twitter

I tend to avoid absolutes such as “need” when referring to someone’s skills. I think everyone is different and has their own strengths and weaknesses, so saying someone needs to have implementation skills to be hired might mean you miss out on an Excel wizard or just an overall brilliant analyst.

I think web analytics involves a full process that begins with measurement planning and moves on to tagging strategy, implementation, report building and configuration of the analytics tool, data gathering, analysis, reporting, presentation, and finally helping facilitate a discussion on the recommendations or changes that need to happen.

Implementation is just one part of the analytics process, but it is still an important, crucial part of the overall whole that I think someone would benefit from knowing. At times there are issues with the data, or a conclusion that doesn’t make sense, that can only be explained by an implementation issue (or a problem at any other point along the process). Are you willing to do a lot of unnecessary investigation with just the data you have, wasting precious time, and then finally have to ask someone who knows more than you about how the tags are implemented and how the data is being captured? Or would it be better to be able to identify the issue yourself, know how to fix it, and get it moving towards a resolution immediately? The more you know about the overall process, the more you’ll know where the issues can be and how to fix them, and ultimately the more time you can spend doing the right kind of analysis.

Michele Hinojosa:  Focus on your strengths

http://www.michelehinojosa.com
@michelehinojosa on Twitter

I won’t deceive anyone. My experience and skills rest far more on the “business analytics” side than on the “technical” side. Would I love to have the mad tech and implementation skills that some folks have? Sure. Do I fare okay without them? Most of the time.

I do agree with Tim Wilson about Analyst/Implementor skills being more of a spectrum. And yes, having skills that span both can be very valuable. (It’s safe to say that the more skills you can have in your back pocket, the better.) However, that doesn’t mean that if you are more “business analyst” or more “implementor” (aka focus more on one than the other) that you can’t have a successful career.

If you do tend to skew more to one side or the other, you likely will have good opportunities in a company with a larger analytics team, where you’re able to focus on your strengths. Someone who falls more in the middle, with a balance of both skill sets, might suit a “jack of all trades” role, where both skill sets are needed in the one resource.

As our field grows, and teams get bigger, I think we’ll see more specialisation – people focused on more specific skills. (After all, doctors aren’t surgeons and anesthetists and cardiologists and general practitioners. They specialise, but they do all have the same basic knowledge at the core.) My overall advice would be to do what you’re interested in. Doing what interests you means you’re more likely to excel at it, and add value to your company. If you love the business side, focus there. If you’re a code geek, focus on the technical side. That doesn’t let you off the hook entirely – you should still be learning what you can. You’ll want to know enough to work with your implementation or business-focused folks, understand what they’re doing and make the (informed) decisions they need from you. But ultimately, your time is well spent honing your strengths.

Lee Isensee: Titles don’t matter – the team is responsible together for success

http://www.omlee.net
@OMLee on Twitter

When I first started my career as an early practitioner, there really wasn’t the idea of implementation, and it wasn’t until a couple of years later that I understood why – I was consuming what I had in front of me and thought that it was the sky. Woah, information!

Since moving from a practitioner into the vendor realm I have changed my opinion substantially and believe that the solution is not as black and white as “analyst” or “implementer” but rather a unique combination of skills to meet the business requirements, technical requirements and on-going strategy of the customer.

Not once have I been in a situation that resulted in true success where the “implementer” did not have some level of engagement with the needs of the “analyst”, and never have I seen the “analyst” truly believe that the “implementer” was not, at some level, invested in helping them get stuff done.

By creating isolated roles you are setting up your team for a lot of finger pointing. Ultimately, your success will not be defined by whom you staff on your team, but rather by the building-out of your initial strategy, business requirements, technical requirements and phased expectations. It doesn’t matter what the titles, roles, experience, etc. are; rather, it is the responsibility of the entire group to take ownership.

Statisticians + Web Analysts = Awesomeness

One thing I have found can work very successfully is a hybrid team of web analysts and statisticians. When you combine the business and website knowledge that the analyst has with the “mad stats skills” that the statistician brings, you can create some truly powerful work.

There are a lot of different things that a web analytics team can leverage a statistician’s help for. This is by no means an exhaustive list, merely a place to get started.

1. Significance Testing

So you’ve run an A/B or multivariate test. While your testing tool will likely advise you of the statistical significance of your results, a statistician can dive deeper and help you measure significance outside of your tool. Perhaps you noticed shifts in site areas that weren’t one of your test success measures – a statistician can help you decide whether these are merely interesting, or statistically significant.

Or perhaps you’ve tested in more of a time-series fashion. A statistician can try to tease out whether the change had an impact, or whether changes are due to seasonality. (This relates closely to the idea of an Impact Analysis.)
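
For the simple two-variant case, a quick sanity check doesn’t require anything exotic. Below is a minimal sketch (not a substitute for a statistician; the counts are hypothetical) using Python and statsmodels to run a two-proportion z-test on A/B conversion data:

```python
# A minimal sketch, not a full analysis: a two-proportion z-test on
# hypothetical A/B conversion counts, using statsmodels.
from statsmodels.stats.proportion import proportions_ztest

conversions = [310, 362]      # variant A, variant B (hypothetical counts)
visitors = [10000, 10050]     # visitors exposed to each variant (hypothetical)

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# One common convention: treat p < 0.05 as statistically significant,
# i.e. unlikely to be explained by random noise alone.
if p_value < 0.05:
    print("The difference between variants looks statistically significant.")
else:
    print("The difference could plausibly be noise; keep the test running.")
```

The time-series case is trickier – separating the effect of a change from seasonality usually calls for proper modelling (exactly the Impact Analysis idea below) rather than a one-line test.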

2. Impact Analysis

You make a site change, and you notice an increase in visits to a site area, or some key metric. You’re tempted to attribute this entire shift to the site change. (“Woo hoo! We’re up 5%!”) However, what about changes in marketing spend? Seasonality of your site traffic? Social initiatives? Are you taking those into account before reaching your conclusion?

A statistician’s analysis can attempt to tease out those additional variables to estimate the impact of the actual site change, vs. these confounding variables.

This same approach can be used to measure the impact of industry events or company changes (outside of the website) – anything, really. The benefit here is a better understanding of the actual impact of events or initiatives, and a nice perk is being able to present your findings to the business without freezing like a deer in headlights when someone says, “Yes, but we spent another million dollars in paid search last week – did you factor that in?”
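
As a rough illustration of the idea (my own hypothetical example, not a prescribed method), here is a minimal sketch of a regression that estimates the lift from a site change while controlling for paid-search spend and day-of-week seasonality; the file and column names are placeholders:

```python
# A minimal sketch of an impact analysis via regression: estimate the lift
# from a site change while controlling for confounders. The CSV file,
# column names and launch date below are all hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

daily = pd.read_csv("daily_metrics.csv", parse_dates=["date"])
daily["post_change"] = (daily["date"] >= "2011-06-01").astype(int)  # 1 after the site change
daily["day_of_week"] = daily["date"].dt.dayofweek                   # crude seasonality control

# Regress the KPI on the change indicator plus confounding variables.
# The coefficient on post_change is the estimated daily lift attributable
# to the change, after accounting for spend and day-of-week effects.
model = smf.ols("visits ~ post_change + paid_search_spend + C(day_of_week)",
                data=daily).fit()
print(model.summary())
print("Estimated daily lift from the site change:", round(model.params["post_change"], 1))
```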

3. Standard reporting automation

Statisticians can use tools such as SAS to fetch data from FTP, combine and compute it, and deliver outputs to your system of choice (for example, Excel, if that’s somewhere you’re comfortable working). This allows you to schedule FTP delivery of SiteCatalyst reports, Discover reports, ad server reports and so on – basically data from multiple sources – and have SAS do the work of fetching the multiple data sets, combining them and outputting to Excel.
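
Purely as an illustration of the workflow described above (which the post frames in SAS), here is a minimal fetch-combine-output sketch in Python with pandas; the FTP host, login and report file names are hypothetical placeholders:

```python
# A minimal sketch of the fetch-combine-output pattern in Python/pandas
# (the post describes the same workflow in SAS). The FTP host, login and
# report file names are hypothetical placeholders.
from ftplib import FTP
import pandas as pd

def fetch_report(ftp: FTP, filename: str) -> pd.DataFrame:
    """Download one scheduled report extract and load it into a DataFrame."""
    local_path = f"/tmp/{filename}"
    with open(local_path, "wb") as f:
        ftp.retrbinary(f"RETR {filename}", f.write)
    return pd.read_csv(local_path)

ftp = FTP("ftp.example.com")
ftp.login(user="reports", passwd="********")

site_data = fetch_report(ftp, "sitecatalyst_daily.csv")  # site analytics extract
ad_data = fetch_report(ftp, "adserver_daily.csv")        # ad server extract
ftp.quit()

# Join the sources on date and write to the sheet that a formula-driven,
# user-friendly Excel view can sit on top of.
combined = site_data.merge(ad_data, on="date", how="outer")
combined.to_excel("weekly_report_data.xlsx", sheet_name="raw_data", index=False)
```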

That, however, doesn’t mean you need to deliver a huge, scary data sheet to the business. On top of the data, you can build a more user-friendly view in Excel (preferably formula-driven, so that you’re not manually updating!) to present the data.

This allows you to take a lot of the manual part (copy-paste, copy-paste) of standard reporting out of the equation, and focus your time on explaining the shifts you might be seeing in the report. For example, perhaps traffic to a specific content area is down – start digging in. What traffic sources are driving it? Are there particular pages experiencing a more dramatic shift?

In addition, once the business sees the value of this work (the time it frees up for analysts to actually analyse!), it may actually help you argue for further automation and investment in additional tools. So make sure you provide those insights, and use this work to prove why you shouldn’t spend your time copy-pasting.

4. Forecasting

Statisticians can build forecasting models to predict your site traffic, sales, ad impression volume – pretty much anything. You can go short-range, or long-range. Perhaps a simple “forecast through end of month” will suffice to start, or maybe you want to start forecasting three or six or twelve months in advance.

So why would you do this? Well, good analysts know that data needs context. That’s why we have KPIs, or compare month over month and year over year – to understand whether “2.6%” is “good”. Comparing to a forecast can be another way to get context for your data. If you’re diverging from your forecast, you can start digging in to see why. This divergence might be good – perhaps you saw a better-than-expected response to your marketing initiatives. But on the flip side, you might also need to frantically search for why you’re suddenly 10% below forecast …

Even a through-end-of-month forecast can be helpful here. An EOM forecast will tell you where you’ll likely end the month, based on current performance – even though you’re only on day 9. This will allow you to course correct throughout the month, rather than waiting till end of month to realise you didn’t match your forecast.
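
A through-end-of-month projection can start as nothing more than a run-rate calculation. Here is a minimal sketch with hypothetical daily figures; a statistician would refine it with day-of-week and seasonal adjustments:

```python
# A minimal run-rate sketch of a through-end-of-month forecast.
# The month-to-date daily visit counts are hypothetical.
import calendar
from datetime import date

today = date(2011, 6, 9)                 # "only on day 9"
mtd_visits = [5200, 5400, 4900, 5100, 5600, 6100, 5800, 5300, 5500]

days_elapsed = len(mtd_visits)
days_in_month = calendar.monthrange(today.year, today.month)[1]

daily_run_rate = sum(mtd_visits) / days_elapsed
eom_forecast = sum(mtd_visits) + daily_run_rate * (days_in_month - days_elapsed)

print(f"Month to date: {sum(mtd_visits):,.0f} visits")
print(f"Projected end of month: {eom_forecast:,.0f} visits")
# Compare the projection to the monthly goal now, while there is still
# time in the month to course correct.
```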

If your business sets site goals, forecasts can be the first step. First, forecast where your business will be for the next twelve months without any major initiatives. Simply assume the status quo. Then, look at the initiatives you want to add on top of that, and assess how much of an impact they may have. Forecast + specific initiatives = your goal. A statistician can also help you look back over time at previous initiatives and analyse their impact, to make sure that you’re not overstating how big an impact something new may have. (How many times have you heard “This is a game changer!” and found it barely moved the needle?)

There are still things you need to keep in mind when forecasting, but even starting small can bring value to your business.

Group Hug!

Still, analysts and statisticians may sometimes face some hurdles. Analysts need to learn the language of statisticians, and statisticians need to either learn the business or be guided by the analysts. A statistician exploring data with no understanding of the business, the website, or what any of it means normally doesn’t reveal great insights. On the flip side, analysts really need to start learning, and at least dabbling in, the world of statistics, and be able to translate complex concepts for the business users they support.

However, a cohesive team that learns to work together and leverage each other’s strengths can do amazing things.

Don’t have access to a statistician? Students often need real-life data for school projects. Consider seeking one out! (Who knows – you might find yourself a great future employee.)

Want to grow the analytics workforce? Go out and get ’em

Yesterday, Red Door Interactive held an intern event. We have four paid internship openings for this summer, in four fields (marketing communications, email marketing, SEO and web/digital analytics.) The purpose of the event was two-fold:

1. Tell the interns a little bit about the different positions – What is web analytics? What is SEO? – and give them a chance to learn, ask questions, and figure out what they’re interested in.
2. Get to know the interns, to start the interview process.

When interns came to the event, they had previously applied and indicated which of the positions they thought they might be interested in. Not surprisingly, a large percentage chose Marketing Communications – because who knows what web analytics or email marketing is if you don’t already work in the field?

What was amazing was how many spoke to me after the event or emailed to say, “I put down Marketing Communications, but now that I know more, Web Analytics sounds really interesting to me!” or furiously wrote down my web analytics book recommendations.

So here’s where I got to thinking. Those of us in the industry, especially those of us at a manager, director or VP level, lament the lack of smart, qualified people, and how hard it is to hire. There are too many positions and not enough good people to fill them.

Well you know what? We need to do something about that.

So here’s what I am going to do. I am going to reach out to local universities and colleges, and see how I can start getting in front of students. I am going to tell them about our field, what we do, what the work is like, what a “day in the life” involves. Some students will shrug, but I guarantee that some students’ eyes will light up, just like they did yesterday.

We’re not going to encourage more smart young people into our field by hoping they stumble upon analytics. We’re not going to grow our industry by random chance. We need to go out and get them – and there’s no time like now.

Who’s in?

Simplifying vs Oversimplifying vs Acknowledging Complexity

As analysts, we need to work with complexity while simplifying for end-users, yet avoid oversimplifying. Naturally, this is easier said than done …

Simplifying for others: This is incredibly important. If you can’t explain the problem and findings to someone in 25 seconds or less, you 1) will likely lose their attention, and 2) possibly don’t understand it well enough yourself to explain it yet. That’s our job. We work with the details and bring others in on the conclusions.

Oversimplifying: The balance required is to simplify the final conclusions without oversimplifying the problem, the data, or your analysis. The struggle, however, is that our brains are hard wired to simplify information.

Think about the amount of stimuli your brain receives every day. For example, crossing the street goes something like this: I am outside. I see a long stretch of gray. That is a road. There is a red thing coming towards me. I hear a noise. The red thing is making the noise. That red thing is a car. Cars can hit me. It is going at 45mph. I am stationary. It will reach my location in 4 seconds. I will take 10 seconds to cross the street. I should not walk yet. And of course, I’m completely understating all that goes through our brains for even simple tasks. If our brains didn’t find a way to make sense of a high volume of inputs, we simply wouldn’t function.

Acknowledging Complexity: The challenge for analysts is to simplify the answer without oversimplifying the questions along the way. If you make erroneous assumptions because they (over)simplify your analysis, you could end up drawing the wrong conclusions. You will probably make your analysis easier, but render it less valuable.

We need to acknowledge, work with (and enjoy) complexity. (And we had better get used to it, because the digital measurement space is not getting simpler.) However, we need to avoid oversimplifying more than is necessary to sift signal from noise. We need to question what we know, evaluate what we assume, and separate fact from opinion. And if in doubt, invite someone else to question you or poke holes in your analysis. Chances are, they’ll spot something you didn’t.

What analysts can learn from group fitness instructors

Les Mills RPM

I am an analyst and a certified Les Mills group fitness instructor for BodyPump (weight training), RPM (indoor cycling), BodyCombat (mixed martial arts based group fitness) and BodyJam (dance based group fitness.)

While analyst and group fitness instructor seem very different, there’s actually a lot that analysts can learn from instructors.

When we are trained as instructors, we spend a lot of time thinking about how different people learn, and how to teach to all of them.

Visual learners need to see it to understand. In group fitness, these participants need you to demonstrate a move, not explain it. In analytics, this may mean visually displaying data, using diagrams, graphs and flow charts instead of data tables – and perhaps even hitting up the whiteboard from time to time.

Auditory learners need to hear it. In group fitness, they rely on verbal cues from the instructor. In analytics, you may have a thousand beautiful visual displays or PowerPoint slides, but it’s your commentary and explanation that will help these people understand.

Kinesthetic learners need to feel it to understand, to experience what you’re talking about. In group fitness, you can show them and tell them, but what they need is to feel the difference between “the right way” and “the wrong way” (for example, “Oh, now I can feel how muscle x engages when I turn my heel!”) This is the same group that tends to need repetition to perfect what they’re doing. In analytics, these are often the people who need to be led through your logic. It’s not enough to show them your findings and to display the final results. They need to see the steps along the way that you used to answer your questions.

Now here’s where it gets trickier. When you are presenting to a group, they won’t all be the same type of learner. Which means that a good group fitness instructor, like a good analyst, needs to explain the same thing in different ways to ensure that everyone understands. For an analyst, this may mean using visual displays of information on your slides, talking through the explanation, and giving a step-by-step example to put everyone on the same page.

Keep in mind that you too have your own learning style. Your analysis and presentation style will likely match your learning style. (If you are a visual learner, a visual presentation will come easy to you.) It may take a more conscious effort to make sure you incorporate the learning styles you do not share. However, by tailoring your message to ensure you hit all learning styles, you stand the best chance of getting everyone to the same understanding.

Are you an expert?

Guru. Ninja. Rockstar. Expert. These descriptions are all over the place. (*cough* Twitter bios *cough cough*)

Done learning. This is what I hear.

Here’s the deal. Calling yourself an expert sounds like you think you’ve got nothing left to learn. How can you be an expert in web analytics, or social media? These fields have been around for all of about forty-five seconds. (And they’ve changed twenty-seven times since then!)

My $0.015:  Don’t ever call yourself an expert, a guru, a rockstar. (And don’t just replace it with samurai or swami. You get my point.) Someone else may call you that, but let’s be honest, even then you should shrug it off.

The most appealing trait is a desire to learn, improve, to continue honing your skills. Focus on that. Let your work and development prove yourself. Not a self-appointed noun.

OMMA Metrics #2: The full story

For those who were not as fortunate to attend OMMA Metrics in San Francisco, here are my key takeaways from each of the sessions. For those who did, I would love to hear yours.

Yes, it’s long. Feel free to skim what is of interest to you. There is no pop quiz!

Note: below are two fun, random facts that I enjoyed learning!

Evolving Analytics: Measuring and Analyzing the Digital Ecosystem at Lightspeed
Judah Phillips, Sr Director Global Site Analytics, Monster Worldwide

  • Everything in analytics is evolving: the skills needed, the size of teams within a company, the importance within an organisation and exposure to executive management, the technology and tools, and the scope of what we’re analysing (e.g. social, mobile, video, and tying traditional media back to the site).
  • To compete on analytics requires investment in people and technology.

Digital Measurement: A Retrospective and Predictions for the Future
Eric T. Peterson, Web Analytics Demystified

  • 50:50 rule: invest half your analytics budget on technology, half on people.
  • At scale, a centralised analytics group is all that ever works. Analytics should be in the center of marketing, operations, management, etc, but have “superusers” within each of the other departments. Decentralized analytics does not work.
  • We need to develop faith and trust from stakeholders by having more answers than questions.
  • Analysts are a service organisation. We have forgotten this! We need to serve to provide business intelligence: deliver incredible value to drive revenue; this will build trust.
  • We need to move beyond first generation tools and reporting and need to generate insights and recommendations.
  • Commit to creating a measurable impact! “If you give me a testing tool, I will deliver a 5% lift in X.” At worst? If you fail, they’ll fire you and you’ll move on to another position (probably with a salary bump!)

Measuring Social Media: From Listening to Engagement to Value Generation
with Jascha Kaykas-Wolff (Involver), Anil Batra (POP), Jonathan Corbin (Wunderman), Taddy Hall (Meteor Solutions), Rand Schulman (Eightfold Logic and Schulman + Thorogood Group)

  • Think about 1) Reach, 2) Engagement, 3) Impact of social
  • To strategically enter into social, need to identify your objective, pick the appropriate channel (e.g. are the users you’re trying to reach on Facebook? Twitter? YouTube? etc), then find the right KPIs that take into account objective and channel.
  • Scalable measurement and monetisation is what is currently missing from social media.
  • Companies need to identify who the influencers are, and who they influence.
  • Integrate social with other channels, and understand it in the context of all your marketing.
  • Don’t be afraid to do something different!
  • To measure success, ensure you take a baseline.

Analysing Across Multiple Channels: What Works and What Doesn’t for Multichannel Measurement
Akin Arikan (Unica), Roger Barnette (SearchIgnite), Casey Carey (Webtrends), Kevin Cavanaugh (Allant Group), Terry Cohen (Digitas), Andy Fisher (Starcom MediaVest Group)

  • Some channels are more involved with certain areas of the lifecycle. E.g. Mass media to attract attention, online to engage consumers and persuade, offline to grow and retain.
  • There are forty years of multi channel experience, but digital breaks all those rules. How do you mix it in?
  • Maturity of multiple channel measurement is mixed – some companies are doing a lot, but many are not, for a variety of reasons (e.g. silo nature of the organisation, perhaps using multiple agencies, etc.)
  • Financial services is ahead of the multi channel game, because they have statisticians, data, tools, etc.
  • Traditional media measurement has 30-40 years experience. In comparison, the techniques in digital are laughable. Digital needs to learn from this. However, in the digital space we embrace change, are fearless, and figure out how to benefit from the change. Need to combine these two: increase mathematical rigor in digital, and embrace change in traditional media. [Aka “Andy’s grumpy rant”]

A Measurement Manifesto
Josh Chasin (comScore)

  • Future of digital is not in selling clicks and click throughs.
  • Digital has a seeming ability to measure everything, but in some ways this hurts us. We’ll never be the simplest medium. The landscape is not simple, and it’s not getting simpler. However, we have the opportunity to do great and groundbreaking things with metrics.
  • Strengths of digital: portable, affinity (consumers cluster around content of interest, and even create that content!)
  • Targetability makes audiences small. Affinity makes audiences relevant.
  • 20th century was the generation of the shouting brand. 21st century will be the listening brand.
  • Digital order of operations: Ready, aim, fire, measure, aim, fire, measure, aim, fire, measure …
  • We need to measure: audience size, ad effectiveness (across platforms), attribution, engagement, voice of the customer and brand robustness.

Engaging the Mobile Experience: Effective Mobile Measurement Strategies
Raj Aggarwal (Localytics) , June Dershewitz (Semphonic), Joy Liuzzo (InsightExpress), Evan Neufeld (Ground Truth), Virgil Waters (Acceleration), Jamie Wells (Microsoft Mobile Advertising)

  • Mobile often has different methods of data collection, as the common JavaScript tags may not work.
  • Some mobile metrics are the same as site analytics, but some are different. Mobile web is similar to desktop web, but applications can be unique.
  • Benefits of mobile analytics: you can have a more accurate reach metric, since there is a device ID, and people rarely share devices. Location is more granular and valuable.
  • Mobile is unstable right now – we are trying to figure out mobile analytics in a shifting environment. E.g. What will succeed: mobile web or apps? Will one supersede the other? Will there even remain a distinction between them?
  • Tools for mobile analytics: 1) Traditional web analytics tools and 2) Niche vendors (or a combination of both.) The benefit of traditional tools is the integration with your site analytics. The benefit of niche tools may be higher-value, mobile-specific data.
  • Third party measurement and Apple: feeling from the panel is that Apple will be forced to play by the market, and likely change its policies over time.

Metrics and Measurement at eBay
Bob Page (eBay)

  • eBay has a huge range (and volume!) of data (e.g. marketing, finance, customer service, user behaviour, web analytics, etc.)
  • There is no silver bullet. No one product will solve all your needs.
  • They have a huge datawarehouse that contains virtual data marts for different groups (e.g. marketing vs. finance) rather than silos.
  • They also have an internal web analytics community, building a type of “Facebook for analysts”: an internal social network where analysts can subscribe to each other’s feeds, look at the latest videos, discuss issues in forums, share PPTs etc.
  • Have a centralised technical team under the CTO, who is responsible for infrastructure, support etc.
  • Centralised business analytics team under the CFO, responsible for common, standard “north star” metrics.
  • Distributed product analysts in each business.
  • Note: size of the technical teams to support this is similar to the size of the core analysts.

Managing Analytics: An Executive’s Perspective on What Works, What Doesn’t, Best Practices and Lessons Learned
Judah Phillips (Monster Worldwide), Matt Booher (Bridge Worldwide), Yaakov Kimfeld (MediaVest), Dylan Lewis (Intuit), Jodi McDermott (comScore), David L. Smith (Mediasmith)

  • Executive sponsorship of analytics is changing: “We used to have a megaphone, now we have a seat at the table.”
  • Centralisation enables standardisation, and helps with the evolution of analytics.
  • Where Analytics lives in the organisation: sometimes with the CFO, sometimes within marketing – differs within different companies. Analytics needs to own the technology and the data, though technical teams may actually implement.
  • Challenge: Lack of standards, lack of an organisational body.
  • Challenge: Executive distrust in the data or its validity. Jodi at comScore spoke of 4-6 months of having to explain data capture and constantly evangelizing before executives would place faith in data over gut.
  • Challenge: Hiring/recruiting. Companies want to find everything in one person: a technologist, a marketer, a statistician. Region can make hiring even more difficult. General sense is to find the right individuals/hire for instinct. You can always teach people the finer points of being an analyst (e.g. how to use a particular tool.)
  • Project management: ticket-type systems and a scrum process. But no matter the project management approach used, it requires ruthless prioritisation.

Online Measurement: The Good, the Bad and the Complicated
Joe Laszlo (IAB)

  • The good: Online measurement is competitive; we have many vendor options. Vendors have integrity and are continually innovating. We can measure nearly anything.
  • The bad: Contradictory metrics from vendor to vendor, and changes in methodology can render dramatic fluctuations in measurement.
  • Online is managing to capture direct-response dollars, but not branding dollars. This is because brand marketers want to understand what their spend did for brand awareness, purchase intent, etc. What they get is “engagement”: view throughs, time spent, etc. There is a disconnect between what measures of success digital offers them and what they want.
  • Traditional media measurement allows calculation of reach and frequency. Also has years of experience of what matters, and has well-accepted metrics.
  • Lack of online measurement standards makes accurate data comparisons impossible. This cannot be solved by any individual company, therefore the IAB is tackling it through a cross-industry task force.

Modeling Attribution: Practitioner Perspectives on the Media Mix
Cesar Brea (Force Five Partners), Gary Angel (Semphonic), Jason Harper (Organic), Drew Lipner (InsightExpress), Manu Mathew (VisualIQ), Kelly Olson (Red Bricks Media.)

  • Attribution: What campaign/medium is responsible for the sale?
  • But there are more questions now: Is it better for someone to touch campaigns A and B? What about B first, then A? It’s not just the attribution – do the two campaigns contribute together, in what order, or are two overkill? E.g. there is evidence that display alongside search adds value to search: someone searches after seeing a banner ad.
  • Can get a lot of benefit from evaluating click attribution, but even more from impressions optimisation.

Understanding the Multi-Screen Consumer: What’s on their Screens, What’s on their Minds
Alison Lange-Engel (Microsoft Advertising)

  • We now access online content via a variety of screens: computer, mobile device, TV, gaming consoles. We are always on and always connected.
  • Microsoft Advertising conducted a survey to answer these questions.
  • The most active segment is 24-35 year olds.
  • Consumers are rapidly adopting technology and want control of the experiences.
  • Online gamers are the “game changers”. They do more of everything, all the time, and are social influencers. They spend the most time blogging, viewing and texting links. They view their game console as a communication device.
  • A linear funnel is not relevant anymore, as all screens impact purchase and allow an impactful story to be told.
  • Computers and smartphones are the key points of purchase.
  • The younger segments are accepting of advertising across multiple screens, and actually want information and entertainment. They find ads helpful when they are targeted to their preferences and interests. They want a consistent experience across screens, and like the ability to access content across multiple screens – it actually improves their opinion of the content provider.
  • The key to success is: Consistent messaging + connected to other mediums + relevant = engagement and results.
  • Full report at advertising.microsoft.com/multiscreen.

And fun facts for the day:

  • The birthplace of web analytics is Hawaii!
  • Web Analytics is still small. All the web analytics companies sold for less than DoubleClick!

For those who did not get to attend this event, I highly recommend checking it out next year. It was interesting and informative, with a great choice of speakers and a nice mix of presentations and panel discussions. Learning has never been so fun!

How to grow your web analytics skills (within your current role)

If you are an analyst looking to further develop your skills, what can you do (within your current role) to further grow and develop? Here are a few of my thoughts, though I am certain there are many others.

In no particular order …

1. Interact with others in the industry

  • Join Twitter, follow your web analytics peers. Twitter can be an amazing educational resource if you use it for something other than “I ate a ham sandwich today.” You get to hear about the challenges that analysts working with different business models or analytics tools face, what is going on in the industry, what the vendors are saying and perhaps new functionality they’re releasing.
  • But more important than reading what others say on Twitter: contribute. Voicing your views will force you to think them through. And when everyone disagrees with you (it will happen one day!), it will be a great learning experience to see those other viewpoints.
  • Go to Web Analytics Wednesdays
  • Take the time to go to lunch/happy hour/etc with your peers within your company and “geek out”. While you may work in the same company, your responsibilities and experiences may still differ, and you can learn from the experiences, thoughts and views of others.

2. Take advantage of free learning opportunities

  • Attend free webinars. There are so many out there (you’ll find out about them through Twitter, blogs, etc.) and they can be a great resource.
  • Attend free trainings (yes, they do exist. I can’t tell you how many emails I get from MicroStrategy about free one-day trainings.)

3. Attend conferences

  • This one can be tougher if your employer doesn’t support this. However, make an argument for why it is of benefit to the business. Trust me, the vendors give you plenty of information about how to sell their conference to your company!
  • If you can swing the cost, you do have the option to pay for it without your company’s support (or “financial assistance”) …!

4. Volunteer

  • Join the Analysis Exchange, a program that brings web analytics students, mentors and non-profit organisations together, to give more web analytics experience to the student and analytics assistance to the organisation.
  • Know a friend/family member/co-worker with their own site? Blog? Small business site? Volunteer your time to help them set up a free web analytics solution, and take time out of your schedule to analyse their site on a regular basis. Don’t know anyone? Why not start your own site? It doesn’t have to be big. It also doesn’t have to be about web analytics. But it will certainly give you a taste of analysing a different type of site, as well as some of the challenges of getting traffic!
  • Volunteer to work on things outside the scope of your standard role within your company. Is there a project out there that you think analytics could help with, but no one is asking for help? Volunteer it!

5. Read
6. Read
7. And then read some more

  • There are a lot of great books out there. Start with one. (A hint: If this sounds completely dull to you, and you can’t imagine anything worse than reading about analytics in your spare time, really take a look at whether you are in the right field …)
  • Read both corporate blogs (e.g. web analytics vendors: Omniture, Google Analytics, etc) and those of your peers
  • Ask your peers for their recommendations of books, blogs, journals, magazines, articles, etc
  • But don’t stop just at web analytics books. Start reading about related fields. Product development. Design. Usability. Marketing. Social media. Statistics. Even cognitive psychology!

8. Keep your eyes open to what employers are hiring for

  • Sure, maybe you’re happy where you are at your current company. Maybe you don’t feel you’ve extracted all the learnings you can from your current role. (That’s a great position to be in!) But keep your eyes open for what positions are out there.
  • Why? Seeing what employers want will allow you to keep a mental checklist of what skills you need to improve on, prior to your next promotion or job change. Better yet, think about what you want your next move to be, and monitor the companies that are hiring for that type of role. What are the requirements and responsibilities they have for it? This ensures you’re working towards filling those requirements in the future. You can’t grow into a position if you don’t even understand what it involves!

I would love to hear others’ thoughts on this. Please comment if you can think of any further advice.