Tuesday, May 13, 2008

Do Searchers Search More Over Time?

A paper from 2004 recently fell into my hands. It's from the journal Management Science, so you'll have to go there to get it (you'll need an account). As is usual in journal papers nowadays, it has five authors: Johnson, Moe, Fader, Bellman, and Lohse.

They did several studies, partly focused on whether people use search more often as they get more experienced on the Web. They also looked at how much people searched for sites when they wanted to buy something.

The results might be surprising to some - even in the age of search, users don't like to check out a lot of e-commerce stores. The majority prefer to settle on a few and return to them rather than constantly trying new ones. And even as they get more experienced with search, they don't use it much more. In short, users simply don't search as much as you'd think.

None of it surprises me, although they didn't account for some factors, like age. Marketers have long known that past a certain age, the willingness to try new brands drops like a stone. The authors didn't break out sessions by individuals, but by households, so there's likely to be a lot of slop to the data.

The question that occurs to me is whether, if such brand loyalty online is real, it's due to actual loyalty or to reluctance to tangle with a new interface. E-commerce isn't so much like brick-and-mortar shopping as it is like operating software, and few users enjoy mastering new software. Is it the label, or the comfort level?

There's also the fact that many users dislike searching unless they know a strong keyword. Look for "toilet seat" and you're likely to find an online hardware store. Search for "pens" and you'll end up with specialty stores, stationers, and collector sites, all of which have to be sifted. Google is good, but it's not clairvoyant.

Monday, April 28, 2008

Third-Party Perils

A lot of client sites that I evaluate have tagging problems that aren't really of their own making. We have clients "tag" their sites for analytics purposes, sending data back to our mothership, which then returns it to the client as reports. As you undoubtedly know, it's become commonplace to "farm out" certain parts of a site to third-party suppliers. Many clients, for example, now outsource their employment pages, with just enough matching page elements to make the visitor think they're still somewhere in the same site. Same thing with newsletters and emails - other people handle them for you. The problem is that those pages usually aren't tagged, so you can't track them. No tracking, no evaluation. Again, small sites aren't deeply affected, but bigger ones are. If you can, work it out with your vendor to let you tag their pages, or have them do the tagging themselves. It's not a new request for most of them. Don't ignore such vital functions as recruitment and marketing contacts.
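For what it's worth, here's roughly what a page tag boils down to. This is a deliberately generic sketch - the collector URL, parameter names, and function are all hypothetical, not any particular vendor's actual tag - but it shows why the outsourced careers or newsletter pages have to carry the same snippet as the rest of the site if you want them in your reports.

```typescript
// A minimal, hypothetical page tag. The collector endpoint and parameter
// names are invented for illustration; real analytics vendors each have
// their own snippet, but the shape is the same.
function sendPageTag(accountId: string, pageName: string): void {
  const beacon = new URL("https://analytics.example.com/collect"); // hypothetical collector
  beacon.searchParams.set("account", accountId);   // ties the hit to the client's reports
  beacon.searchParams.set("page", pageName);       // e.g. "careers/job-listing"
  beacon.searchParams.set("referrer", document.referrer);
  // The classic 1x1 image request carries the data back to the mothership.
  new Image().src = beacon.toString();
}

// The vendor only has to drop a call like this onto each hosted page.
sendPageTag("CLIENT-1234", "careers/job-listing");
```

If the vendor's pages never fire that beacon, those visits simply don't exist as far as your reports are concerned.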

Saturday, April 19, 2008

Engage This!

I'm afraid that I have to take exception to yet another Web buzzword. This time it's "engagement". It's hot right now. Just ask Eric Peterson, who's making a little cottage industry out of his own "Engagement Index". Please. Make it stop. "Engagement" is no better defined than "intelligence", "happiness", or "it sucks".

I'm really a numbers kinda guy, with the heart of a researcher. That means I resist sloppy thinking. And "engagement" is just that: sloppy thinking. Naming something and believing you've driven to the heart of it. Peterson's various components may have merit, but he's going about it the wrong way. Ideally, you study a big group of things and derive patterns from them using standard statistical techniques; you don't just wish the patterns into being, no matter how sure you are that they exist. Then you validate your model against a known situation and see whether it holds up. If it wavers, fragments, or veers wide of the mark, your model is faulty.

So far as I can tell, Peterson has never subjected his model to rigorous validation. The components in his index aren't weighted, so as one rises another can fall, leaving you with the same EI but an entirely different situation. I think anybody who relies on a single-measure EI to make expensive business decisions is playing with a loaded gun with the barrel plugged.
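A toy example makes the masking problem concrete. The component names and numbers below are invented, not Peterson's actual metrics; the point is only that an unweighted average lets very different visitor behaviors collapse into the same score.

```typescript
// Illustration only: made-up engagement components, normalized to 0..1.
type EngagementComponents = {
  recency: number;
  frequency: number;
  duration: number;
  interaction: number;
};

// An unweighted index: just the average of the components.
const unweightedIndex = (c: EngagementComponents): number =>
  (c.recency + c.frequency + c.duration + c.interaction) / 4;

// Loyal-but-shallow visitors: come back often, do little while they're there.
const periodA: EngagementComponents = { recency: 0.9, frequency: 0.9, duration: 0.2, interaction: 0.2 };
// Rare-but-deep visitors: show up seldom, dig in when they do.
const periodB: EngagementComponents = { recency: 0.2, frequency: 0.2, duration: 0.9, interaction: 0.9 };

console.log(unweightedIndex(periodA)); // 0.55
console.log(unweightedIndex(periodB)); // 0.55 - same "engagement", entirely different situation
```

Two periods that score identically could call for opposite responses, which is exactly the decision the index is supposed to inform.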

That's not to say that "engagement" could never be defined. It can. But it can be defined only as a series of KPIs that shouldn't be arbitrarily added together. A simple radar chart could show them all. So could time series charts. And it should be defined anew for each site. The quest for a standardized index will go on, but in the end I think it's futile. Adopt a Deming approach and keep working on your own special site. I don't think there are any shortcuts.

Tuesday, April 8, 2008

More Unintended Consequences

I love all aspects of how humans interact with technology, so I was particularly interested in seeing how well San Francisco's new crimecams would work out. Turns out they're very effective at reducing crime - within range of the cameras. Were the designers of this system not parents? Even toddlers catch on that if you want to misbehave and not get caught, you move out of sight. Mayor Gavin Newsom voiced the paradigm of a generation when he said that the cameras at least made people feel safer. This would seem absurd if it weren't followed by the next quote, which, paraphrased, says that citizens felt safer because crime moved a block or two away, so their neighbors would have to deal with it instead. The ultimate NIMBY. Newsom even says he anticipated some kind of felon shuffle when he had the cameras put up, but that voters generally liked them. The fact that the cameras might be able to zoom in on their bedtime activities doesn't seem to faze them.

Friday, March 28, 2008

Old Houses and Portals

I live in an older home (1920s) in a historic neighborhood. It's not a particularly wealthy neighborhood. Most historic ones aren't. Money flees its breeding ground. But the neighborhood is comfortable and reasonably vibrant. I always wondered why I loved older homes, and finally one of my gurus, Stewart Brand, might have explained it in his book How Buildings Learn: What Happens After They're Built. He says that older buildings exude what we call "charm" or "character" because they've been altered over time to suit both changing infrastructure needs (the arrival of central heating, air conditioning, indoor plumbing, electricity) and the changing life needs of their occupants (bigger kitchens, more light, more entertainment at home). They grow, morph, and gradually conform more closely to actual human life, like an old pair of jeans. New homes are raw despite their designers' best efforts to "design for life". Brand points out that buildings can't be designed up-front for our lifestyles, because no designer can get it right the first time. That's why a home needs so much time to find its proper shape.

What's the lesson for Web designers? Alas, probably not much, despite my most earnest desires to bring the analogy across. The missing element is time. Websites don't give you time. Portals were supposed to let users modify their views quickly, compressing the decades of home conformity into minutes online. Never worked. The vast majority of visitors never knew about customization or took the time to mess with it. Personalization works to an extent, but not completely. Web users are now used to their comfortable sites changing regularly, and although they may not approve, they rarely boycott on that basis.

That said, for years I've been fascinated with the idea of a personalization engine that would track Web user behavior and subtly shift the interface to suit. I've never bothered to fully flesh out the concept, but in general it would work much like Microsoft's failed personalization functionality in Office, the one that gave you chevrons instead of full menus. It was a good idea, but possibly the wrong place to use it. Office users are almost all repeat visitors. Website visitors aren't. Amazon does a good job with personalization, but I'd extend it from "you might also like this stuff" to actually shifting controls and navigational paths. A pipe dream, certainly, but given a huge pile of cash, it's something I'd be interested in researching.
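If I ever got that pile of cash, the heart of the engine might not be terribly exotic. Here's a back-of-the-napkin sketch with made-up names - emphatically not how Office or Amazon actually do it - that captures the idea of letting observed behavior nudge the navigation rather than redesigning it up front.

```typescript
// Hypothetical sketch: count how often a returning visitor uses each
// navigation item and gently promote the ones they actually use.
class AdaptiveNav {
  private clicks = new Map<string, number>();

  constructor(private readonly defaultOrder: string[]) {}

  recordClick(item: string): void {
    this.clicks.set(item, (this.clicks.get(item) ?? 0) + 1);
  }

  // Stable sort by usage: heavily used items drift toward the top, unused
  // ones keep their default positions, so the interface shifts subtly
  // instead of jumping around.
  currentOrder(): string[] {
    return [...this.defaultOrder].sort(
      (a, b) => (this.clicks.get(b) ?? 0) - (this.clicks.get(a) ?? 0)
    );
  }
}

const nav = new AdaptiveNav(["Home", "Products", "Support", "Order Status"]);
nav.recordClick("Order Status");
nav.recordClick("Order Status");
nav.recordClick("Support");
console.log(nav.currentOrder()); // ["Order Status", "Support", "Home", "Products"]
```

The hard part, of course, isn't the counting; it's shifting things slowly enough that the user feels the site fitting them rather than squirming under their feet.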

Saturday, March 15, 2008

Reading TeaLeaf

Sorry to have been away so long. Complications of various kinds. But now I'm back, and with tea.

Have you seen TeaLeaf? It's a snazzy app that sits athwart your Web traffic, sniffing and recording every user's session. A bit disconcerting, that. But its benefits are undeniable. It stores thirty days (or more, at your discretion) of user transactions, at the user level. It aggregates them too. I've long been a proponent of continual usability checking. Our profession seems to put all its emphasis on initial design and testing, while utterly neglecting Web analytics and other red-flag functionality that can signal usability leaks. Traditional Web analytics is good, but it isn't always granular, meaning that its results are en masse, not at the level of the individual user. It's great for marketing departments, but not as good for usability concerns. TeaLeaf shows the actual user transactions - where people go, what they click, what choices they make, and whether their conversions are successful.

For example, you can lose users at any turn in the road, but especially during checkout. Many visitors drop off when money becomes an issue, and understandably so, since they had no intention of paying anyway; they're just there for the experience, or the knowledge. But others hit technical problems or usability pitfalls. TeaLeaf generates a report on who converted and who didn't, and then you can track down why the failures happened, following every user's trail.
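To be clear about what that granularity buys you: the snippet below is not TeaLeaf's API, just a rough sketch of the session-level question it lets you ask - of the sessions that reached checkout, which converted, and where did the rest give up?

```typescript
// Hypothetical sketch of session-level drop-off analysis. Page names and
// the shape of the data are invented for illustration.
type PageView = { sessionId: string; page: string };

function checkoutDropOffs(views: PageView[]): Map<string, string> {
  // Group page views by session, preserving order.
  const sessions = new Map<string, string[]>();
  for (const v of views) {
    const pages = sessions.get(v.sessionId) ?? [];
    pages.push(v.page);
    sessions.set(v.sessionId, pages);
  }

  // For every session that entered checkout but never reached confirmation,
  // report the last page it saw - the likely leak.
  const dropOffs = new Map<string, string>();
  for (const [id, pages] of sessions) {
    const reachedCheckout = pages.includes("/checkout");
    const converted = pages.includes("/order-confirmed");
    if (reachedCheckout && !converted) {
      dropOffs.set(id, pages[pages.length - 1]);
    }
  }
  return dropOffs;
}

// A result like { "sess-42" => "/checkout/payment" } tells you exactly
// whose trail to replay and which page to scrutinize.
```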