Sunday, September 23, 2007

Did My Grandparents' Brains Ever Explode?

I live in a historic neighborhood, about three blocks from where my paternal grandparents set up housekeeping in the very early years of the twentieth century, certainly before the 1920s and quite likely before WWI, which my grandfather did not participate in. I sometimes try to imagine what life was like during that period. We like to think of today as the epitome of rapid progress, but I think it's nothing next to what they experienced. Today, semiconductor technology has made only tiny incremental advances since the breakthroughs of the transistor and the integrated circuit back in the 50s, 60s, and 70s. But consider what my grandparents lived through -

The telephone. During the period before my father's birth (1928), the telephone went from being an oddity in hotels and banks to an everyday object in the home.

The car. When they moved to our town, my grandparents could have seen horses still pulling wagonloads of coal and ice around town. By the time my father was born, horses had vanished from the streets, and internal combustion engines ruled.

Central heating. Most of the homes around here still have coal scuttles or blocked-off holes where the scuttles were. By the 1940s, coal was a dead business, supplanted by gas heat in a central furnace.

Radio. By 1930, most of the homes hereabouts were within hearing distance of a radio, and by the late 1930s every home had at least one.

Television. Sets appeared in abundance in the 1950s, when my grandparents were only just thinking about retirement.

Air conditioning. They never had it. Neither did most of the homes around here until perhaps the 1970s, and many still don't. But they saw it arrive in homes.

Aircraft. They lived from the beginning of aviation until the Jet Age. Although neither ever traveled on a jet, they could watch the planes pass overhead.

Their world both shrank and expanded at an astonishing pace. When they moved here, most transportation was by horse (expensive), on foot (limited), or by trolley (good for longer distances). Most people lived near their jobs and walked to work. Communication was by word of mouth, or, rarely and expensively, by telegram. Perhaps by newspaper for larger news. By the time they died, they could have phoned any place in the world, heard war news from Europe, watched live broadcasts from Africa, flown to Greenland. My grandfather's jobs changed accordingly. The plant where he worked started out making farm implements and ended up making trucks. The coal and ice supplier where he worked part-time finally shut down when demand shriveled away to nothing.

Yet they never seemed to be overawed by what happened around them. Today we're terribly self-conscious about our new things, but they seemed to take them in stride. When they answered the phone, they never put the receiver down reverently. The refrigerator, gas stove, and furnace were just ordinary devices by the time I knew them in the 1960s and 1970s. I suspect my grandmother, had I asked, would have expressed joy that she no longer had to kindle a coal fire first thing in the morning, but she never brought up the subject. They saw revolutions over and over again, yet never seemed stunned by any of it. Perhaps it was a lack of self-awareness, but it may also have been simple acceptance of it all, much as they seemed to accept all the deaths in the family (two of four children dead in early adulthood) and the antics of the grandchildren. Come what may, they did their best without any public notice of it. Maybe in their world, revolutions didn't need to be understood, just tolerated, like everything else.

Wednesday, September 19, 2007

Bookmarks as IA

I once did a project for a client that involved talking with users about their browser bookmarks. The project was a redesign of an intranet that was built like a Wild West town, with a wacky combination of independent little plots stitched together only by virtue of being under the same corporate umbrella and having links on the central page of the intranet. Every department had its own navigation and design. It turned out that employees coped by using bookmarks to provide dependable paths back to the information they had so painstakingly located. Nothing new in that, of course; Web users still use that strategy. But interestingly, the names employees gave to the bookmarked pages were indicative of their own quirky needs. In effect, each set of bookmarks was an individualized navigation scheme, a personal information architecture (IA). By studying the bookmarks, we got a pretty fair idea of users' mental models for information.

Whenever users create informational structures, it's worth studying them to discover what's core, and what's transitory. That's the problem with today's tag or link clouds: they can't distinguish between fad and eternity. Any given cloud today may have "Britney" as its biggest member, but that probably won't be the case next year. Clouds are intrinsically time-bounded. But it would be interesting to do some multivariate work like cluster analysis on several clouds over time to see what drops out and what stays.
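As a rough sketch of what that longitudinal comparison might look like, the snippet below contrasts two tag clouds sampled a year apart. The tags, counts, and snapshots are made up for illustration; a real analysis would use clustering over many snapshots.

```python
# Hypothetical tag clouds sampled a year apart (tag -> frequency).
cloud_2006 = {"britney": 120, "ajax": 80, "usability": 45, "web2.0": 70}
cloud_2007 = {"iphone": 150, "ajax": 60, "usability": 50, "facebook": 90}

# Tags present in every snapshot are candidates for the stable "core";
# tags that appear once and vanish look like fads.
core = set(cloud_2006) & set(cloud_2007)
faded = set(cloud_2006) - set(cloud_2007)

# Jaccard similarity gives a crude measure of how much the cloud churned
# between snapshots (1.0 = identical tag sets, 0.0 = complete turnover).
jaccard = len(core) / len(set(cloud_2006) | set(cloud_2007))
```

With more snapshots, the same core/faded split feeds naturally into cluster analysis: tags with similar persistence profiles group together, separating the enduring vocabulary from the fashionable one.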

Wednesday, September 12, 2007

Ice Cream by the Blues

Here's an ice cream machine that dispenses an amount based on how unhappy the customer is perceived to be. The vending machine does a voice analysis to determine your level of the blues. Some days I'd qualify for a whole week's worth of ice cream production.

Friday, September 7, 2007

What Have I Forgotten?

I'm teaching a graduate course in HCI, and I've learned one thing so far - how awfully much I've forgotten already. We're getting into patterns and pattern languages, and I'm alarmed to say that although I have a vague recollection of the subject, I haven't been able to use patterns much in the field, so they've receded into my mental archives. The same is true of other things. Names for rarely-used prototype techniques. Details about the types of conceptual models. The difference between categorization and classification. I know this kind of forgetting happens, but it's not supposed to happen to me.

Still, teaching these courses keeps the information fresher, and that's comforting to me. It's one reason why I keep teaching, even though the cognitive load of work, home, and two or three simultaneous classes can sometimes get stressful.

Why Am I Not a Programmer?

I was recently in a meeting where I was enumerating what I did well and what I didn't. Although I have a lot of skills, there are some things I just don't do well. In user experience terms, those are primarily programming and graphical design. I can do both, but haltingly, and others do them much better, so I have a tendency to avoid them. At least with graphical design I tend to compare my meager skills to those I see on display on the finest pages, so perhaps my bar there is unreasonably high. I've never had much coursework in graphical design, at least in the past two decades, so I have little basis for comparison.

I have taken courses recently in programming, though, so I think I'm more realistic there. Java drove me crazy. I'd be looking for a method in the documentation so I could find its arguments, and to my annoyance I'd have to trace it up the inheritance tree through several classes, because it had been inherited over and over. The classes weren't hard, just tedious, frustrating, and boring. I've had the same problem in other programming classes. I was probably the only English major in history to sign up for an assembly language programming class. It went OK, but I felt no affinity for the subject. No spark. No gift. I concluded I would never be much good at it, and that was that.

But after my meeting, I started to reconsider my position, especially after I talked with a programmer and looked up some "how to think like a programmer" pages. To my amazement, it appears programmers don't like to code much more than I do. It's just that to solve their problems, which they love to do, they have to code.

It reminds me of my mathophobic days. Although I teach statistics now, I was once afflicted with math anxiety. That anxiety cracked away after I abruptly realized what math is: a modeling language with enormous lossless compression. You can model reality with it, sort of. Once you learn the language, the rest is just fiddling until any given equation makes sense. Even mathematicians diddle and doodle. There are no born mathematicians, only people who have messed around with the symbols long enough that manipulating them comes easily. There may be a math aptitude, but there's no math gene. I could do it too.

Maybe programming is like that and I'm being too critical of myself. I have yet to meet anyone who enjoyed programming courses and having to memorize languages, any more than I've met mathematicians who liked algebra classes. The essence of math is modeling; the essence of programming is problem-solving. Indeed, a lot of programmer-pundits say that formal college training in programming is counterproductive, because it confuses syntax with thinking. Many advocate starting programming careers by learning calculus, linear algebra, or even physics, just to get into the swing of thinking through complicated problems. Maybe I don't want to earn my living inside a compiler, but maybe I'm also too hard on myself when I sneer at my programming skills. Maybe I'm not much worse than anybody else.

Tuesday, September 4, 2007

Boeing Turns to Psychology to Design 787

The September issue of Air and Space Magazine has an article about how Boeing designed the interior of the new 787 using psychologists, focus groups, and other user-centered design techniques. They hired renowned marketing expert Clotaire Rapaille to help. Apparently together they did a load of research about how fliers like to see aircraft interiors. And they're not publishing what they learned, either. Usability as trade secret.

Friday, August 31, 2007

New Data Visualizations

One of my enduring interests is data visualization. It hits so many user experience hot buttons: cognition, potential for confusion, Gestalt principles, and so forth. Research in the past few years seems to have slowed considerably in this area, perhaps because much of the breakthrough work has already been done. We're not seeing new methods of visualization now, but refinements of old ones. Fisheye views (PDF) have been around for a very long time now. So have heat maps, tree maps, network maps, and so on. They're just getting new treatments and makeovers. If you want to see how a lot of them have been retooled with modern computing power and pretty colors, check out this article in the online zine Smashing.

Most of the applications are intriguing and professionally done, but I'm not seeing anything that makes me sit forward in my chair. Many old standbys have been dusted off, like the radar chart, but everything here has been done elsewhere. I don't suppose there are many more visualization methods to be discovered. But the flip side of this is that these techniques are getting more common and less expensive, and therefore more accessible to us. I've wanted to do tree maps forever, but no client has ever warmed to the idea. If you want to see how the principle can be applied well, if a bit understated, look at the daily stock market data here.

Wednesday, August 22, 2007

Control is Everything

Scott Adams got me thinking. In his blog entry for August 17, he mentions that one of our strongest needs is to feel like we're in control. He used an old example: A genie offers you two choices. In the first choice, "You can eat at the finest restaurants in the world for free, twice a week. The only catch is that the genie picks the day, when you are not already booked, and he picks the specific restaurant." In the second choice, "You can eat at “good” restaurants, again for free, twice a week. But this time you can schedule it whenever you want, up to two places per week, and pick whatever “good” restaurant you want."

He goes on to develop the theme that the first choice probably wouldn't make many people happy, because they would eventually feel the keen sense of loss of control. The second choice, while gastronomically less appealing, is probably a better one for most of us.

It reminded me that one thing users dearly love is control, or at least the illusion of it. This is something that subconsciously irks me about lots of software and websites, I think. It's why I'm irritated with Flash so often. It just takes off and does things without asking me. The same thing annoys me about flashing ads, shifting menus, and other things that don't help me do things, but invade my locus of control. We humans don't seem to resent losing control if we don't expect it. We accept that the good guy may die at the end of the movie, but we'll shriek in fury if we can't change the channel to another movie. And we accept a loss of control when it benefits us. My car's engine does hundreds of things that I don't need to approve as they're happening. But there are some places where humans just won't accept interference. I wouldn't buy a car at any discount if it decided by itself when it would start. The same thing is true for software and websites, I think.

Decluttering

A group of scientists at MIT headed by Ruth Rosenholtz, a long-time researcher into vision and technology, has developed a prototype application in MATLAB that measures the amount of clutter on-screen (Link). The HCI profession has long needed something that could separate figure from ground reliably. The program is only a prototype, but apparently it's rather promising.

The problem of separating the vital few from the trivial many isn't trivial itself. Nuclear facility control rooms are a case in point: rows of lights can go from background hum to extremely important in an instant. How much do you expose to an operator, or to a website user? Hick's Law was an early attempt at measuring how much choice was too much, but the sophistication of today's control schemes needs a better way of knowing when you've overstuffed the interface.
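For scale, Hick's Law models decision time as growing with the logarithm of the number of equally likely choices. A tiny sketch, where the coefficient b is a placeholder rather than an empirical value:

```python
import math

def hick_decision_time(n_choices, b=0.2):
    """Hick's Law: T = b * log2(n + 1) for n equally likely choices.
    b (seconds per bit) must be measured empirically; 0.2 here is
    just an illustrative placeholder."""
    return b * math.log2(n_choices + 1)

# Doubling the options adds roughly a constant increment of decision
# time rather than doubling it -- which is partly why interfaces can
# accumulate clutter gradually before anyone notices the cost.
t8 = hick_decision_time(8)
t16 = hick_decision_time(16)
```

The logarithmic growth is exactly why the law undersells the clutter problem: it assumes equally likely, well-organized choices, which an overstuffed control panel or webpage rarely offers.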

Friday, August 17, 2007

Are We The Way of the Future?

Computerworld recently published an article listing twelve job skills that no employer can refuse. The usual suspects hopped onto the list - whatever's hot, that is. Wireless, for example. But one of the twelve was usability. That surprised me, for several reasons.

First, usability isn't a universally needed skill, at least not at the level a specialist brings. It kicks in only when the stakes are high and failure is all too expensive. For websites, that means primarily ecommerce and other high-end sites. And those are designed and built on the coasts, not in the flyover zone where I live. Check out monster.com, dice.com, UPA's career page, or careerbuilder.com, and you'll see what I mean. Jobs are plentiful in Massachusetts, Washington State, California, New York, New Jersey, and Virginia. There aren't many in Iowa, Montana, Arizona, Indiana, Alabama, and most of the other flyovers. So how do we qualify as owners of a "can't miss" skillset?