How to grow your web analytics skills (within your current role)

If you are an analyst looking to develop your skills further, what can you do (within your current role) to grow? Here are a few of my thoughts, though I am certain there are many others.

In no particular order …

1. Interact with others in the industry

  • Join Twitter and follow your web analytics peers. Twitter can be an amazing educational resource if you use it for something other than “I ate a ham sandwich today.” You get to hear about the challenges that analysts working with different business models or analytics tools face, what is going on in the industry, what the vendors are saying and perhaps new functionality they’re releasing.
  • But more important than reading what others say on Twitter: contribute. Voicing your views will force you to think them through. And having everyone disagree with you (it will happen one day!) will be a great learning experience that shows you those other viewpoints.
  • Go to Web Analytics Wednesdays.
  • Take the time to go to lunch/happy hour/etc with your peers within your company and “geek out”. While you may work in the same company, your responsibilities and experiences may still differ, and you can learn from the experiences, thoughts and views of others.

2. Take advantage of free learning opportunities

  • Attend free webinars. There are so many out there (you’ll find out about them through Twitter, blogs, etc.) and they can be a great resource.
  • Attend free trainings (yes, they do exist. I can’t tell you how many emails I get from MicroStrategy about free one-day trainings.)

3. Attend conferences

  • This one can be tougher if your employer doesn’t support this. However, make an argument for why it is of benefit to the business. Trust me, the vendors give you plenty of information about how to sell their conference to your company!
  • If you can swing the cost, you do have the option to pay for it without your company’s support (or “financial assistance”) …!

4. Volunteer

  • Join the Analysis Exchange, a program that brings web analytics students, mentors and non-profit organisations together, to give more web analytics experience to the student and analytics assistance to the organisation.
  • Know a friend/family member/co-worker with their own site? Blog? Small business site? Volunteer your time to help them set up a free web analytics solution, and take time out of your schedule to analyse their site on a regular basis. Don’t know anyone? Why not start your own site? It doesn’t have to be big. It also doesn’t have to be about web analytics. But it will certainly give you a taste of analysing a different type of site, as well as some of the challenges of getting traffic!
  • Volunteer to work on things outside the scope of your standard role within your company. Is there a project out there that you think analytics could help with, but no one is asking for help? Volunteer it!

5. Read
6. Read
7. And then read some more

  • There are a lot of great books out there. Start with one. (A hint: If this sounds completely dull to you, and you can’t imagine anything worse than reading about analytics in your spare time, really take a look at whether you are in the right field …)
  • Read both corporate blogs (e.g. web analytics vendors: Omniture, Google Analytics, etc) and those of your peers
  • Ask your peers for their recommendations of books, blogs, journals, magazines, articles, etc
  • But don’t stop just at web analytics books. Start reading about related fields. Product development. Design. Usability. Marketing. Social media. Statistics. Even cognitive psychology!

8. Keep your eyes open to what employers are hiring for

  • Sure, maybe you’re happy where you are at your current company. Maybe you don’t feel you’ve extracted all the learnings you can from your current role. (That’s a great position to be in!) But keep your eyes open for what positions are out there.
  • Why? Seeing what employers want will allow you to keep a mental checklist of what skills you need to improve on, prior to your next promotion or job change. Better yet, think about what you want your next move to be, and monitor the companies that are hiring for that type of role. What are the requirements and responsibilities they have for it? This ensures you’re working towards filling those requirements in the future. You can’t grow into a position if you don’t even understand what it involves!

I would love to hear others’ thoughts on this. Please comment if you can think of any further advice.

What time of day do web analysts tweet?

Thanks to @menggoh (and Virgin America’s wi-fi) I have been thoroughly entertained on a red eye between LA and Boston with Twitter analytics from The Archivist. Naturally, that made me curious: based on when the web analytics community is active and tweeting (judging by tweets to the #measure hashtag), when is a good time of day for me to post new blog posts?

The cool thing about The Archivist is that they give you tons of fancy charts about who the top users are, our daily tweet volume over the past few months, topics discussed, etc. (Seriously, check it out – so much fun!) However, what I couldn’t find was a breakdown of total tweets by time of day. The even cooler thing about The Archivist is that they allow you to download an Excel file of the data. (Insert cry of geek joy here!) So naturally, I took the data and DIY’ed it.
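For the similarly curious, here is roughly what that DIY step can look like – a minimal sketch in Python (with pandas), assuming the export contains a tweet timestamp column. The file name and the “Date” column are my assumptions; adjust them to match the actual export.

```python
import pandas as pd

# Load the Excel export from The Archivist.
# (The file name and the "Date" column are assumptions -- adjust to match the real export.)
tweets = pd.read_excel("measure_tweets.xlsx")

# Parse the timestamps, convert to US Eastern time, and count tweets by hour of day.
timestamps = pd.to_datetime(tweets["Date"], utc=True).dt.tz_convert("US/Eastern")
tweets_by_hour = timestamps.dt.hour.value_counts().sort_index()

print(tweets_by_hour)  # if the data matches the chart below, hour 18 (6-7 pm) shows the peak
```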

So as it turns out, the peak of #measure tweets is between 6 and 7 pm. We’re primarily an evening/night owl community (insert HootSuite joke, anyone? Get it? Owl? Hoot? My apologies – I blame the red eye flight …) with much lower activity in the morning. Possibly we work on being a peaceful community by waiting for the coffee to kick in before communicating too much!

Here is total tweet volume to the #measure hashtag for 5/11 through 7/2/10.
[Ideally, I’d like to overlay total Twitter tweet volume to see whether we follow the overall Twitter trends. I’ll update when I have further data.]

*Note: Time of day is based on United States EST

So what does this tell me? I theorize that I probably don’t want to post my tweets at 6pm, right when a huge volume of tweets are flooding in (too easy to be missed in the midst.) However, perhaps noon/1pm would be a good time, so my tweet is recently posted as we start getting more active on Twitter for the day.

Now, I’m off to test some time-of-day theories …

[This adventure in geekdom was proudly brought to you by The Archivist and Virgin America’s wi-fi service.]

How often should you revisit your KPIs?

I have been thinking lately about the right intervals for revisiting and changing your site and business KPIs. I won’t bore you with all the back and forth, but merely share a few thoughts.

The stability of your business plays a role. If you are an established business, with established goals for your site, revisiting your KPIs too often suggests to me that you didn’t have the right ones in the first place. If, however, you are a newer, (somewhat) flying-by-the-seat-of-your-pants business, perhaps still perfecting your model, I can understand a much more fluid view of what constitutes “success” and a need to evolve your KPIs more continuously.

My overall thoughts are that KPIs can’t and shouldn’t change every month, even for the latter business example. (Would it be unprofessional of me to say, “duh”?) You should be consistently measuring against the same yardstick. However, I do think it’s good practice to take a look every three to six months and make sure your KPIs are 1) useful and 2) complete. Do you actually need all of them? Or, on the flip side, is there something new that should be included? Perhaps there are new capabilities you have developed that would allow a new KPI to be measured? After all, a year can allow for a lot of development in the analytics industry. Take advantage of new measurement options.

Recently, I have been involved in the re-evaluation of our KPIs. At the beginning of this effort, the website product team and the analytics team were involved in brainstorming new ways to evaluate the success of the site. Once we decided 1) what we should measure and 2) what we could measure, figured out the overlap of the two and selected from those, analytics began publishing the information. Now, a few months down the track, we’re at a point where our product managers are somewhat comfortable with the information, and the time has come to revisit. (After all, they can’t give feedback on something they don’t even understand or use yet. You have to give the information some time to allow for informed feedback.) Are they using the information? What’s missing? What’s overkill? I expect to do this every 3-12 months from here on out. Just like our site, I expect our measurement of it to be iteratively developed over time.

Parting hint: If you’re not sure if something is helpful or not, try removing it for a month. If no one complains, you have your answer. (But I never do this. Never. No, Really.)

The (most?) valuable trait of analysts (that you can’t teach)

I have been thinking a lot about the type of analyst I enjoy working with, and what I think the critical elements of being a good web analyst are. In the course of doing so, I had an interesting realisation that I look forward to putting into practice next time I’m searching for an analyst.

We’ve all read a thousand job descriptions, and we know the drill. Attention to detail, analytic skills (of course), able to synthesize large amounts of data to extract meaningful insights, deliver a concise message to stakeholders. Etc. Etc. Etc.

But the trait I’ve not seen (often) on job descriptions (or heard in people’s conversations about what they’re looking for) is curiosity.

I want to both be and work with analysts who are curious. Who are forever asking “Why? Why? Why?” Who look at the redesign of their favourite site and think, “Oh man, I wish I could get my hands on their data, I wonder what they’re seeing …” (And who perhaps try to hack at it via Compete.com or another competitive intelligence source.) I want the analyst who takes the initiative, and may even get a little side-tracked every now and then, because their curiosity takes them down an investigation path that no stakeholder or boss has asked them to go down, or even thought to go down. (The gems that can come of this …!)

If you’re lucky, you’ve worked with these kinds of analysts. If you’re very lucky, you are one yourself. (FYI, the fact that you’re reading a blog about web analytics pretty much suggests you are that curious person interested in the field. The 9-5 analysts don’t do this …) But me? I want to search for this, to hire and retain for it, and not just as a “nice to have”. This is top of my list. I can teach you how to use Omniture. I can’t teach you how to be interested in what we do, or to be curious to learn and grow.

Deciphering business user requests

In a previous post, I discussed the role of web analysts as “information architects”, responsible for distilling complex data and findings into easy-to-understand information. However, there is another hat we analysts wear, which is to serve as an interpreter, or translator, between business users and analytics information.

Analysts are often charged with responding to business users’ requests for information. Whether it be a report of simple metrics or a more complicated analysis, this is the reactive side of our role. (Hopefully you also have a proactive aspect to your role, but that’s a topic for another time.)

Now, with no disrespect intended to business users (who know many things we don’t, and are good at many things we’re not) the simple truth is that business users don’t always know what they need.

Remember, this business user might be the person to whom you’ve explained the difference between a visitor and a visit, or a page view and an ad impression, four times already … this week. That’s okay – it’s our job to explain the subtle nuances of data and metrics. But you need to keep that conversation in mind when they later come back to you for information. Just because they ask for visitor information doesn’t mean that’s the information they actually need. Perhaps a visits metric would be more appropriate in this case, because of XYZ reason that they’re not aware of, or because they’ve simply mixed up terminology.

I see this most with more junior analysts. Especially as the business person requesting the information becomes more senior, there is often an eagerness to provide exactly what they asked for, as quickly as possible. However, an essential developmental step for an analyst is to question what is being asked of you.

It is less important to provide the information the user wants. What is crucial is to provide the information they need.

This is where an analyst needs to stop, consider the request, and ask: “What question are you trying to answer?” or “What problem are you trying to solve?” Once you understand what they need the information for, you’re then in a position to evaluate the request and ensure that the information you provide helps them with their business problem. After all, how can you respond to something you don’t even understand? Perhaps the user thinks they want visitors, but they actually want visits. Or perhaps they think they want last month’s information, but you know that it will help them to see last month, which was abnormally high or low, in the context of the last 13 rolling months. Perhaps there is even additional information available (other metrics, segmentation, etc) that they aren’t even aware of, that could help them solve this problem!
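To make that last point a little more concrete, here is a small, hypothetical sketch (in Python with pandas, using made-up numbers) of presenting last month in the context of a rolling 13-month window rather than in isolation:

```python
import pandas as pd

# Hypothetical example: 13 rolling months of visits (in thousands),
# ending with the month the business user asked about.
visits = pd.Series(
    [450, 460, 455, 470, 480, 475, 490, 500, 495, 505, 510, 515, 620],
    index=pd.period_range("2009-06", periods=13, freq="M"),
)

last_month = visits.iloc[-1]
trailing_avg = visits.iloc[:-1].mean()
pct_vs_avg = (last_month / trailing_avg - 1) * 100

# One line of context can change how an "abnormally high" month is read.
print(f"Last month: {last_month}k visits "
      f"({pct_vs_avg:+.0f}% vs the trailing 12-month average of {trailing_avg:.0f}k)")
```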

If done correctly (I don’t recommend rolling your eyes and saying, “Don’t be stupid, you don’t want that metric, duh!” – though not from personal experience! I just suspect it wouldn’t go over so well) business users really appreciate this assistance! They actually rely on it, whether they realise it or not. As subject matter experts, analysts are invaluable in helping the business figure out how to best solve problems, with the right information. We should be sharing our knowledge, and working collaboratively to solve business problems. After all, what’s the point of handing over what they want, only for them to find it useless – or worse, for it to guide a poor business decision?

This interpretation of requests never goes away, but it’s worth noting that it does lessen over time. As your business users and analysts work more closely together, and at least somewhat speak the same language, business users get better at knowing what to ask for and what is available, and analysts get better at understanding the business, and can proactively provide information that can help to solve problems, before it’s even asked of them. Makes you smile, doesn’t it?

The oh-so-elusive engagement metric

I was fortunate enough today to catch Eric T. Peterson’s webinar about engagement, held by the International Institute for Analytics. The presentation was informative and, in some ways, reassuring. Why? Because even one of the leading experts in web analytics essentially agrees that engagement is not an objective, clearly defined metric, nor an easily measurable one. (What a sad day it would be to find out that measuring engagement is clear and simple, and I was just missing the point!)

While Peterson spoke of various definitions, as well as a “formula” by which he has measured engagement, he was very clear that this wasn’t the only possible formulation, and that there isn’t even one agreed-upon industry definition.

What it came down to essentially was that:

  • Engagement truly doesn’t have a clear definition, at least not in the sense of “it comprises X + Y + Z metrics”. We all agree on the concept generally, but not necessarily on what elements go into measuring it.
  • Sites really need to evaluate what engagement means in relation to their experience.

Perhaps this should be disheartening. Perhaps I should want a clear, defined notion of “engagement”.

But here’s why I don’t …

  • Web analytics is not simple (and anyone who thinks it is simple probably isn’t doing much with it – if you are doing a lot with it and still think it’s simple, please send me your resume!) Therefore I can’t believe a concept as powerful as engagement can actually be simple. Few (if any) web metrics are useful in and of themselves. We need context for our data to be meaningful. Engagement should be the same – it needs to be defined in the context of the site in question. (And its definition should be continually repeated and reinforced within an organisation, so everyone understands. A metric loses its meaning as we start forgetting what it actually represents and how it is defined.)
  • This fuzzy lack-of-definition of engagement allows flexibility. It allows the concept of an engagement metric to be truly tailored to the site it is measuring. No cookie-cutter solutions, or square pegs shoved into round holes, but something that is thought out, keeping in mind the goals of the business and the site’s visitor behaviour.
  • But most importantly – sites that work to define engagement as it pertains to their experience, to capture the data, and to process, analyse and segment by engagement level will not let it turn into another useless metric touted at executive meetings. It will have meaning because of its specificity to the site in question, because it is truly helpful in understanding the site and its visitors’ behaviour. (For a rough illustration of what a site-specific definition might look like, see the sketch below.)
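Purely as an illustration of that last point – this is not Peterson’s formula, nor a recommendation, and the components, caps and weights are invented for the example – a site-specific engagement index might be nothing more than a weighted blend of the behaviours that particular site actually cares about:

```python
# Purely illustrative: one way a site *might* compose its own engagement index.
# The components, caps and weights here are invented -- each site should choose its own.

def engagement_index(days_since_last_visit, visits_last_30_days, avg_pages_per_visit,
                     weights=(0.4, 0.4, 0.2)):
    """Score a visitor between 0 and 1 from three example components."""
    recency = max(0.0, 1.0 - days_since_last_visit / 30.0)   # 1 = visited today, 0 = 30+ days ago
    frequency = min(visits_last_30_days / 10.0, 1.0)          # caps at 10 visits per month
    depth = min(avg_pages_per_visit / 5.0, 1.0)               # caps at 5 pages per visit
    w_recency, w_frequency, w_depth = weights
    return w_recency * recency + w_frequency * frequency + w_depth * depth

# Example: last visit 3 days ago, 4 visits this month, ~2.5 pages per visit -> ~0.62
print(round(engagement_index(3, 4, 2.5), 2))
```

The point is not this particular formula. It is that the components and weights are chosen deliberately, in the context of the site’s own goals and visitor behaviour, and then repeated and reinforced so everyone understands what the number means.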

Web Analyst = information architect

Everyone knows that analysts are constantly elbow-deep in numbers, math and data sources. However, having watched my own skills develop since my very first analysis, and having trained and worked with other analysts, it has become very clear to me that an analyst’s ability to translate large volumes of data into a concise, clear summary separates the mediocre from the invaluable.

Yes, web analytics can be granular. Yes, you are looking at a thousand different things, pulling together multiple data sources, segmenting and diving into detail, even investigating theories that did not pan out. But if you can’t tell others what you found, even just the one most important point, in a sentence or two, and in plain language, then you are missing an essential skill that we need as analysts. Your role is not just to dive into the data and draw out insights. It’s to share them with others in a way that they can understand.

Why?

Executives are busy. If you can’t fill them in on your findings in one or two sentences, they don’t have the time. You are not always going to be given an hour to present your findings in a lengthy PowerPoint. At times, the closest you may get to “presenting” your analysis may involve bumping into your President in the lunch room and having exactly 15 seconds to respond to a “So how’s that home page redesign performing?”

Same thing goes for email. Eyes glaze over upon opening a lengthy email (normally followed by “Mark as Unread” and an “I’ll come back to it later …”). Ask yourself this: “If someone only reads the first paragraph, have I given them everything they need to know (even if it’s abridged)?” By all means, provide more information below, or in an attachment, for those who are more involved and need additional detail. But understand that many just want (or only have the time for) the CliffsNotes version.

The simple truth is that executives are also not as close to the project as you are. You need to be able to pull yourself out of the trenches and find a way to summarise your insights to someone who is not as close to the business and the details of it as you are.

What it comes down to is that your job is not just to analyse data, you are also an information architect. Your role involves taking what is complicated, and making it feel easy to understand and digestible to a less-analytically inclined audience. This involves perfecting two crucial skills:

1. Summarising. (Then summarise your summary. Chances are, it’s still too long!)

2. Presenting your findings in the right way to the right audience. A twenty-five-words-or-less approach may work with your executive team. However, further details with visuals may be appropriate for your data-driven business user. Know the difference. So much of our online business is about trying to present the right offer/ad/site experience/etc to the users of our site. We need to do the same for the users of our insights.

A few thoughts on forecasting

It seems fitting that my first post should involve something that carries a tremendous amount of importance (and, potentially, debate) within an organisation (and certainly costs me a lot of sleep personally!)

Forecasting.

If you are new, or fairly new, to forecasting on your website, I’ll share some hard-learned truths.

You’re always going to be wrong. Always. The very nature of a forecast means you will always be wrong … and that’s okay. (Don’t get me wrong. When it actually happens, it’s thoroughly depressing, and often has you chasing your tail to find out why, but it’s still okay.) Your aim is just to be wrong as little as you can – and to be able to identify why your resulting actuals are off from your forecast. I would argue that divergence from your forecast today will make tomorrow’s re-forecast better, but only if you can explain it and learn from it.

Your forecast is only as good as the information you have. As you look back over your site’s history, if there are events you can’t explain, or inputs into your traffic that you fail to identify, your forecast will be further off than you would like it to be.

Don’t have all the information you need? Get it. Gather anyone, everyone, and any information or inputs you need. If the company’s eyes are on your forecast, their gaze will stray if you frequently prove too far off. If they’re not yet looking at your forecast, they won’t ever start if you can’t demonstrate a history of accuracy.

There is a difference between a forecast and a goal. As you start forecasting site traffic, ad space, conversion rates, lead generation (etc), you’ll need to explain this. Many times. Your forecast will be based on your site’s history and its inherent trends. But as an analyst/statistician/forecasting guru, you alone can’t identify everything that will happen in the future. (And if you can, give me your number – I have a few questions I would love answered.)

This is where your business/development/product teams come in. Your forecast can estimate where your site will be in the future, based on your current trajectory. But you need inputs from others to anticipate future planned growth that isn’t yet evident in the data.

Let’s say your forecast suggests your site will be up 5% year-over-year. If your executive team want your site to be up 20%, you need your business/development/product team to either a) temper this expectation, if it’s not reasonable, or b) advise how they will achieve it. If your fine-tuned, well-informed forecast suggests you’ll be up 5% year-over-year, 20% is not a forecast, it’s a goal. You can’t reach 20% YOY without at least a basic idea of how you’ll get there, and any attempt to bake it into your forecast without even a skeleton of a plan will reflect poorly on your forecast, rather than on the failed execution of the growth plan.

The moral of the story? Forecasting can’t occur in a silo. Analysts, business and executives must all get their “feet wet” to produce something that all are comfortable with and can rely on.

Not forecasting (yet)? Even if it’s rudimentary (forecasting your basic web traffic metrics only, e.g. Visits, Unique Visitors, Page Views), get cracking. It’s wonderful to be able to analyse, segment and test your history, but your business and executive teams will really appreciate even a lightly dotted line of where they’re headed.
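If it helps anyone get started, here is a minimal sketch of the kind of rudimentary, trajectory-based forecast described above: a simple linear trend fitted to monthly visits. The numbers are made up, and a real forecast would also need to account for seasonality and the business inputs discussed earlier.

```python
import numpy as np

# Twelve months of (made-up) monthly visits -- substitute your own history.
visits = np.array([102_000, 98_500, 110_200, 115_800, 112_300, 119_900,
                   124_500, 121_700, 129_300, 133_100, 130_800, 138_400])
months = np.arange(len(visits))

# Fit a straight-line trend to the history and project the next three months.
slope, intercept = np.polyfit(months, visits, deg=1)
future_months = np.arange(len(visits), len(visits) + 3)
forecast = slope * future_months + intercept

for m, f in zip(future_months, forecast):
    print(f"Month {m + 1}: ~{f:,.0f} visits")
```

Even a line this naive gives the business a dotted line to react to – and, just as importantly, something concrete for the product and development teams to adjust with their planned initiatives.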