
Happy Black Friday

So I pop up after a year and a half and wish you a late Thanksgiving? Yeah, that is about it. There are a few things on my mind today though:

    A blast from the past, very relevant to the day: Weebl Jazz (warning: there is sound and references to Pie)
    My new company: Numerical Truth. I’ve switched from Corporate back to Consulting. I actually made the jump a few months ago, and business is going very well so far.

Frankly, this post is mostly to “break the ice” of a dormant blog. Now that I’ve got it out of the way I can get back at it. See you soon!

Know What You Don’t Know

Today, a little web analytics advice from Donald Rumsfeld:

The Unknown
As we know, 
There are known knowns. 
There are things we know we know. 
We also know 
There are known unknowns. 
That is to say 
We know there are some things 
We do not know. 
But there are also unknown unknowns, 
The ones we don’t know 
We don’t know.

—Feb. 12, 2002, Donald Rumsfeld, Department of Defense news briefing
(From Slate)

That Donald sure had a way with words, and I think they apply pretty well to our little world. In web analytics it is important to be keenly aware of what you don’t know. Why? Because we use the data we have – the things we know – to make decisions. Every day we’re trying to optimize the things we’re measuring, but often what we’re measuring is just a proxy for actual success, because there are things we don’t know, things our tools can’t tell us. For example, if you run a lead gen site you probably don’t know whether the leads are actually converting. If you’re looking at an ecommerce site, what is the return rate on the sales you’re measuring? Are they repeat or one-time buyers?

There are tons of critical connections that most of us haven’t made yet. And if we don’t take that uncertainty into account we can make some really bone-headed recommendations. We can go to great effort and expense to optimize the part of the process we can see and measure, while our success gets lost in the critically important part of the process we can’t see. Luckily for us, all is not lost. These are things we know we don’t know, so we just have to remember that, and factor it into the decisions we make from the things we do know.

One thing I know is that I haven’t posted here in almost exactly a year. The year has seen me move from a medium sized retail company to a medium sized unit of a giant company. I’ve also come to realize why there are so few WA blogs from practitioners — we can’t really write about work. Consultants can write obliquely about what they’re thinking about all day, but that doesn’t pay for those of us working on a single site. That said, I’m going to try to make the effort. Will the next post come sooner than next April? I hope so, but that is a known unknown.

The sorry state of usability

We just finished up a round of usability testing on our site at work. It was incredibly frustrating in many ways. Watching person after person unable to complete the simplest tasks … I wanted to jump through the one-way mirror and shake them: JUST CLICK ON THE LEFT NAV, dammit! Frustrating because although I wanted to blame the test participants, I know that it’s really our problem.

It was also very illuminating. I saw many things that clickpath analysis (even super-detailed clickpath analysis like Tealeaf’s) just wouldn’t and couldn’t show, and many things that really explain what it does show.

But what really struck me as odd was that after struggling with our site for an hour every person said “I like that site, it was pretty easy to use” or something similar. If those were good experiences, I shudder to think what a bad experience is like.

Tealeaf vs. Web Analytics

There has been a ton of discussion on the Yahoo group about Tealeaf, particularly why and how it is different from traditional Web Analytics apps. Since I’m familiar with both, I thought I’d give my perspective.

Tealeaf collects everything. I mean everything. Every cookie, every image request, every bit of information entered in forms, everything. Their bread and butter is allowing you to replay single sessions to see exactly what happened. It is an amazing tool in this regard. It has extremely powerful search capabilities that can sift through millions of requests and turn up the few that have the exact characteristics you’re looking for. Do you want to see all the sessions for people with customer numbers between 55550 and 55562? Tealeaf is your tool. Do you want to see what happened at 6:10 last night? No problem.

It also has very nice monitoring capabilities. You can fire an “event” if a particular sequence of actions takes place on your site. We use it to monitor for a few rare but serious errors that crop up from time to time. Some use it to monitor for fraudulent activity or hacking attempts. The event can send an email or just flag the session so it is easy to find. I use events to find pages that do not have analytics tags on them. Tealeaf has fairly rudimentary reports that show how many events occur and when.
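
A generic sketch of that pattern in Python (this is not Tealeaf’s actual event engine, which is configured through its own interface; the session ids and action names here are made up for illustration):

    from collections import defaultdict

    # Hypothetical ordered sequence of actions we want to flag.
    TARGET = ["add_to_cart", "checkout", "error_500"]

    def flag_sessions(hits):
        """Return ids of sessions whose actions contain TARGET in order."""
        progress = defaultdict(int)  # session_id -> how much of TARGET matched so far
        flagged = set()
        for session_id, action in hits:
            need = progress[session_id]
            if need < len(TARGET) and action == TARGET[need]:
                progress[session_id] += 1
                if progress[session_id] == len(TARGET):
                    flagged.add(session_id)
        return flagged

    hits = [
        ("s1", "add_to_cart"), ("s1", "checkout"), ("s1", "error_500"),
        ("s2", "add_to_cart"), ("s2", "checkout"), ("s2", "thank_you"),
    ]
    print(flag_sessions(hits))  # {'s1'}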

We also use it for ad hoc web analytics requests that would otherwise require a tag change. It’s not made for operational reporting, but when we want to answer a one-off question without going through the whole IT rigamarole, Tealeaf is often ideally suited to the task.

Tealeaf also has a module called cxConnect which allows you to export all this information to a database. Every day or every hour or whatever, it will spit it all out into a SQL Server database. At first glance this seems like a killer web analytics application — just imagine, every little detail of all activity on your site, all organized nicely for you in a database!

That is great until you give it about 30 seconds of thought. Sifting through this data is a herculean task. There is no way you have enough disk space to keep it all, so right away you need to write some sort of ETL to pick out what you need and compact the rest. Then you need to replicate all the reports that a web analytics platform has out of the box. Will you end up with something better than you’d get from Omniture or WebTrends? Yes, you probably will, if you have smart data warehouse architects and good technical resources. If you don’t have both, you’ll end up with a large pile of useless crap. Will you end up spending way more money and time creating and maintaining this system than you would by buying and implementing Omniture or WebTrends? Absolutely, probably many times more. Will it be worth it? Who knows. If it works it would be awesome. If it doesn’t… well, there are a lot of web analytics jobs out there, right?
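
To make that concrete, here is a tiny sketch of the kind of ETL compaction step I mean, with sqlite3 standing in for SQL Server. The table and column names (raw_hits, daily_page_summary, and so on) are invented for illustration; cxConnect’s real schema will look different:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Stand-in for a raw hit-level table like the one cxConnect would fill.
    cur.execute("CREATE TABLE raw_hits (hit_date TEXT, page TEXT, visitor_id TEXT)")
    cur.executemany(
        "INSERT INTO raw_hits VALUES (?, ?, ?)",
        [
            ("2007-05-01", "/home", "v1"),
            ("2007-05-01", "/home", "v2"),
            ("2007-05-01", "/checkout", "v1"),
        ],
    )

    # Compact: keep only daily page-level counts instead of every request.
    cur.execute(
        """
        CREATE TABLE daily_page_summary AS
        SELECT hit_date,
               page,
               COUNT(*) AS page_views,
               COUNT(DISTINCT visitor_id) AS unique_visitors
        FROM raw_hits
        GROUP BY hit_date, page
        """
    )

    # The raw detail is what eats your disk; drop it once it's summarized.
    cur.execute("DROP TABLE raw_hits")

    for row in cur.execute("SELECT * FROM daily_page_summary ORDER BY page"):
        print(row)

Multiply that by every dimension you care about and you start to see why the data-warehouse route gets expensive.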

The strength of the traditional web analytics applications is what they don’t show you. They sift out all the junk that you don’t need to see and they’ve created a bunch of reports that present the information that is useful. Someone else worries about the data storage/retention problem. It’s all built into your very predictable monthly fee. They may not be perfect, but they are pretty good, and buying one of these is much safer than embarking on a huge data warehousing project. As we all know there is a ton of detail in web data that is totally useless 99.9% of the time. Tealeaf lets you efficiently find that 0.1% – and that alone is more than enough to allow it to pay for itself.

Brand feud?

I thought it odd when Avinash titled one of his posts “Web Analytics Demystified”, since that’s kinda sorta an established web analytics brand. It seemed an especially poor choice of title because it was a post about web analytics basics, which were covered in Web Analytics Demystified (the book), written by Eric Peterson. Now Eric Peterson (of www.webanalyticsdemystified.com) has returned the favor by titling his latest post “Web Analytics: an Hour a Day”, which just happens to be the title of the book Avinash wrote. Neither one references the other in his post. This cannot be good for either party’s search engine rankings.

What is going on here? Here are my votes:

  • An interesting and cooperative strategy to get their links on each other’s branded search. It’s already worked for Avinash.
  • Preparation for a CAGE MATCH at the next eMetrics.

Only time will tell, but I’m registering for eMetrics now just in case.

WA Detector Bookmarklet Update

It’s been a long time since I posted here, but one of my New Year’s resolutions is to post more often. Plus, what with new GA and Microsoft code being released, the cry for an update to my WA Detector script has been deafening. Well, actually only Jacques Warren of http://www.waomarketing.com/ asked for an update, and since I kinda wanted one myself I went ahead and updated it.

It now checks for MS Analytics and Google Analytics v2 code. For MS Analytics it outputs what would be sent if you called “TrackPage” on that page. I’m still not smart enough to decode the GA JavaScript to output what it sends, so the bookmarklet just lets you know that the code is present on the page. I also updated it to detect the latest version of WebTrends’ tag, and I removed the VS check. I’ve learned a lot more about that product over the last year, and I now know it was pretty ridiculous to make any sort of claim of reliably detecting it.
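
For the curious, the core idea is just signature matching. Here’s a rough sketch in Python rather than bookmarklet JavaScript: fetch a page and look for the footprints of a few common tags. The patterns and the detect() helper are mine and purely illustrative; the real bookmarklet has to inspect the live page and handle many more variants:

    import re
    import urllib.request

    # Illustrative signatures only; real tags come in many more variants.
    SIGNATURES = {
        "Google Analytics (urchin.js)": re.compile(r"urchin\.js|urchinTracker", re.I),
        "Google Analytics v2 (ga.js)": re.compile(r"ga\.js|_gat\b", re.I),
        "WebTrends": re.compile(r"webtrends|dcsMultiTrack", re.I),
    }

    def detect(url):
        """Fetch a page and report which tag signatures appear in its HTML."""
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
        return [name for name, pattern in SIGNATURES.items() if pattern.search(html)]

    if __name__ == "__main__":
        print(detect("http://www.example.com/"))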

Due to WordPress’s very unhelpful smart-quote feature I couldn’t put the bookmarklets in this post. You can find them in the right margin of my blog. Again, there are two versions. The first one works for IE7 and Firefox: in Firefox, just drag the link to your Links toolbar; in IE7, right-click it, choose “Add to Favorites”, and save it in the Links folder.

This second one works for IE6. Right-click, choose “Add to Favorites”, and save it in your Links folder. The downside to this one is that it calls back to blog.keyes.us each time you use it, and it gives an error on https pages.

These updates will not overwrite the old version, so if they don’t work as well then a) let me know why and/or b) keep using the old version. They have been only lightly tested, so please let me know of any problems you encounter when using them.

Finally, if you’re interested in this you may like Stéphane Hamel’s WASP tool. It’s much more fully featured than this one, but it works differently. Also Eric Peterson has a hosted vendor discovery tool that you may find useful.

PS — somehow my comments table got corrupted, so all the comments on my previous posts were lost. Sorry! If your comment was deleted it’s not that I hate you, it’s that my WordPress database hates you!

Engagement Metrics

I’m back! I’m attending a WebTrends conference today (worth it if you’re into WebTrends), and conferences get me in the posting mood.

Today Ian Thomas points out Live 3d Stats, which IMO suffers from a very bad name but might actually be worth the money. One problem with working in web analytics is that it is really easy to lose context. At an eMetrics conference a few years ago someone suggested that you should “grep your log files” occasionally. By that they meant watch the log files as they are written, in real time. Seeing visitors use your site in real time really brings the whole web analytics thing to life.
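
The classic way to do that is tail -f piped through grep. Just as a sketch, a bare-bones Python equivalent (assuming a web server log named access.log) might look like this:

    import time

    def follow(path):
        """Yield new lines as they are appended to the file, tail -f style."""
        with open(path) as f:
            f.seek(0, 2)  # start at the end of the file
            while True:
                line = f.readline()
                if not line:
                    time.sleep(0.5)  # nothing new yet; wait and try again
                    continue
                yield line

    for hit in follow("access.log"):
        print(hit, end="")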

Of course grepping log files isn’t for everyone, but Live 3d Stats may be. I like the idea of having this displayed on a screen in a public place in the company, just to bring home the idea that people are coming into the site from everywhere, at all hours, via all sorts of zany keywords. Of course, in reality the display would fade into the background after a few days. Sadly, no one would “see” it after they got used to it….

Even so, realtime is cool in limited doses so this might be worth looking into.

More Changes

Big changes, to tell you the truth. I’ve gone and switched jobs. Previously I was at a small consulting company focused on many smallish lead gen and very small e-commerce sites. As of this week I’m working at a largish online retailer as their lead web analyst. Life is good: I realized today that my new job is a blast! Also, if you’re an aspiring web analyst in the Twin Cities, give me a call or send me an email; I’d like to talk to you….

I’m not sure my posts could get any farther apart, but posting may be scarce around here while I assimilate into my new job.

Goofy Idea

As luck would have it, I’m responsible for web analytics on several lead generation sites. I’ve been working diligently for many moons to “close the loop” on these leads so I can optimize for leads that close, instead of just optimizing for leads in general. I’ve had pretty good success with this approach, but it brought to light a nasty little surprise: for most clients each lead has a high average dollar value, but the total number of leads is small enough that sales from leads are a very small percentage of total company revenues. Not good if you want management to take the site seriously.

No one would argue that leads are the only value a lead generation site provides. The 95+% of people who don’t convert must be doing something useful! There has been a good conversation about measuring “engagement” recently. Eric Peterson outlined a very robust measurement methodology that produces an engagement index. An abstract number like that is good for optimization, but I’ve found that a good, old-fashioned dollar amount is much better at getting people’s attention. So I’m wondering about the value, in dollars, of all the people who don’t submit a lead form.

This seems like an almost impossible task given the available information, but maybe it isn’t…. I’ve been assuming all along that submitting a lead form is an excellent indicator of purchase intent — that someone who submits a form is much closer to closing a sale than someone who doesn’t. It seems very obvious that this should be so, but I recently had kind of a goofy idea. What if this assumption is bunk and lead form submitters are just a sample of average site visitors? What if they are exactly like all the other site visitors except for their propensity to submit lead forms?

If the submitters are a sample of the average, then we can use their data to calculate the influence of the website on total revenue. Here are some (completely made up but reality-based) numbers:

  • Annual unique website visitors: 500,000
  • Average sale value: $10,000
  • Web lead close percentage: 10%
  • Calculated closed sales from web visitors: 50,000
  • Web-influenced revenue: $500,000,000
  • Average visitor value: $1,000

In the calculation above I’m assuming that the web lead close percentage applies to ALL site visitors, not just those who submit lead forms. The web-influenced revenue that results is a much more satisfying and believably large percentage of total revenue. For real lead gen sites that I’ve seen, the web-influenced revenue number as calculated above is typically between 50% and 80% of total revenue. This makes sense; I’ve seen research that the web influences around 50% of high-consideration purchases.
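
Spelled out in code, just to make the arithmetic explicit (all figures are the made-up numbers from the list above):

    # The back-of-the-envelope math above, made explicit. All figures are
    # the made-up-but-reality-based numbers from the list.
    annual_unique_visitors = 500_000
    average_sale_value = 10_000   # dollars
    web_lead_close_pct = 0.10     # assumed to apply to ALL visitors

    closed_sales = annual_unique_visitors * web_lead_close_pct        # 50,000
    web_influenced_revenue = closed_sales * average_sale_value        # $500,000,000
    average_visitor_value = web_influenced_revenue / annual_unique_visitors  # $1,000

    print(f"Closed sales:           {closed_sales:,.0f}")
    print(f"Web-influenced revenue: ${web_influenced_revenue:,.0f}")
    print(f"Average visitor value:  ${average_visitor_value:,.0f}")
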
The fact that the calculated revenue numbers line up so nicely with reality makes me think that my hypothesis isn’t so goofy. Maybe the lead form submitters aren’t really that special. I think the only way to know for sure is to survey customers and ask how much influence the website had on their purchase.

If we do assume that lead form submitters are a sample of the total site population, here are a few things we can do with this info:
  • Use the sample of closed leads to learn about other site behaviors — i.e. what does Eric’s engagement profile look like for this sample? How many other visitors have a similar profile?
  • De-emphasize lead form optimization – most lead gen sites optimize for lead form conversion, but since relatively few customers actually submit a lead form, it is really a low-value activity in terms of overall sales.
  • Use the calculated $$ amount to increase influence and power (buwaa hahaha).

Do you think I’m cracked? Let me know in the comments.

Updates and upgrades

Well, instead of spending my free time writing interesting blog posts, I’ve been fooling around with my software and hosting. The blog looks almost exactly the same on my new WordPress site, but I’ve managed to break all my permalinks. Eh, I don’t have that many posts anyway so I guess it’s not that big a deal. If I want to avoid posting in the future maybe I’ll redirect some of the old URLs to the new URLs.

I also upgraded to IE7 and found that the self-contained Firefox WA detector bookmarklet works fine. The IE one works as well, but the Firefox version is faster and doesn’t give an annoying security warning on SSL sites. If you’re using IE7 (or Firefox) you can use this version:

WA Detector (right click and choose “Add to Favorites” in IE, or drag to your Links toolbar in Firefox)

If you’re reading this via RSS you may have to click through to the site to get it to install correctly.

Enjoy!