User Experience, User Assistance: an Interview with Joe Welinske

Product content, a.k.a. merchandising, a.k.a. enhanced content, a.k.a. our specialty, provides critical information to customers about what they want to buy at the point of sale. After the customer buys the product, the content they receive is called user assistance content (which we’ll call UA from here on out). UA eases customers into new products or systems, helping them get the most out of the product and through sticky spots.

If you want to talk UA, you want to talk to Joe Welinske. After an early career in software engineering, Joe became deeply interested in technical writing. Today, he is the founder of WritersUA and has more than 30 years of experience in UA creation and, in recent years, instruction. He also founded ConveyUX, a Seattle-based conference for user experience professionals. The Seattle user research firm Blink UX now owns the event, and Joe serves as its program manager.

We sat down to discuss user testing with content, how to make UA and content a priority in resistant organizations, and the ways mobile remains uncharted territory for many companies’ UA.

“UA Is a Subdomain of UX”

Breanne: How would you differentiate user assistance from online help or manuals?

Joe: UA is like a subdomain of UX, but UA is the idea that we’re putting together information to help people. [UA] comes in all kinds of different formats, whereas in the more traditional, narrow view, we only talked about manuals. A more progressive view of UA considers all of the text in a user interface to be the frontline in effective Help. UA also includes wizards, tutorials, eLearning, knowledge bases, and content management systems. UA professionals are also involved with usability testing, localization, quality assurance, and branding.

My part of UA has always been making sure that what we call “content” is something that’s studied by a company, like what you do [at content26].

Breanne: How is the introduction of testing content effectiveness usually received?

Joe: It depends on who you’re talking to. I find that a lot of UX design firms have a content strategy practice as part of the rest of the things that they do, and then you have people at companies that talk specifically about content strategy. I come from a technical communication background, and I get frustrated that hardly anybody has any kind of data about whether their work is useful.

Breanne: Is it difficult to gather data on usefulness of technical content?

Joe: In my experience, I’ve found that it’s relatively easy to identify gaps with content because it hasn’t been tested.

When I reflect on a lot of the things that I’ve done, I have no idea whether it helped anybody, because there was no data to record what we did and the effects it had. Now, we have the tools available, so that’s not an adequate endpoint anymore. It’s like, if you want to know, you can find out, but I think a lot of people in the tech comm space don’t want to find out.

Breanne: Why do you think that is?

Joe: Why do you think that is?

Breanne: I would guess it would just be a fear of finding that what you’ve done for 10 or 20 years was ineffective, and that’s more than some people might care to get into.

Joe: I talk to people about testing word choices, and most look at me like I’m nuts. We’ll test button shapes and colors and orientations, but testing four or five synonyms for the button text? The answer is usually no.


Anecdotally, if you look at how modern software’s put together, there’s almost no decent software development process in any company that doesn’t have some type of quality assurance testing. But then you look at the user assistance side, and it’s completely the reverse in percentages. All I can think is that the value isn’t placed there, and/or there’s no pull-through from those departments to make that happen.

Specs for Programmers, but No Style Guides for Content Creators

Breanne: At Facebook, people in the position they call “content strategist” focus on microcopy. They’re essentially UX writers whose jobs are just to focus on button text, but the scale has to be that big and that profit-driven to create that kind of focus. Otherwise, no one is going to pay such close attention to button text or 50 shades of blue.

Joe: I’ve talked to people at large companies, and I never get the impression that the actual words get vetted appropriately. I find that there are specs written for the programmers, and that’s what they actually code to. But no one ever creates the style guide for whoever the writer might be at that organization.

The Quandary of Mobile

Breanne: How is mobile UA usually addressed? Just the usual content stuck on the smaller screen?

Joe: Yeah. I’ll use an example from SeaTac’s website. There is a page that leads you to the light rail connection to go downtown. There’s a whole bunch of information about it, and on a desktop, you can process it fine. When you get onto the small screen, it does scale itself for that space. But if you actually put yourself in the situation of coming out of baggage claim, pulling your bag, getting to the light rail… most of the information on that webpage isn’t relevant.

You’re walking through the airport, you have visual aids, you have signs. Instead of a copy of a map you might find at a help desk, maybe the map should link to GPS on your phone. And then what do you need? Do I pay cash in the terminal, or are there kiosks over by where the train is? Scaling the website down using a technique like responsive design doesn’t take that into account at all.

Breanne: What are some common issues that come up when creating content especially for mobile?

Joe: UA for mobile has to be really concise. If you have space for just four or eight words, you want them to be the best possible four or eight words, and most people don’t make sure that happens.

On the programming side, I don’t think enough is done in terms of presenting information when you expect the user is likely to need it. For example, if the user has tried a certain feature four times but the app can recognize that they haven’t completed the sequence, that can be an optimal time to reveal the information. That kind of adjustment makes your app and your information smarter.
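The contextual-help trigger Joe describes could be sketched as a small piece of state tracking: count a user’s attempts at a feature, note whether they ever complete it, and surface help once attempts pile up without success. This is a minimal illustration, not anything from Joe’s own toolchain; the class and method names are hypothetical.

```python
class ContextualHelp:
    """Hypothetical sketch: reveal help for a feature after repeated
    unsuccessful attempts, as in Joe's four-tries example."""

    def __init__(self, attempt_threshold=4):
        self.attempt_threshold = attempt_threshold
        self.attempts = {}       # feature name -> number of attempts
        self.completed = set()   # features the user has finished at least once

    def record_attempt(self, feature):
        # Called whenever the user starts the feature's flow.
        self.attempts[feature] = self.attempts.get(feature, 0) + 1

    def record_completion(self, feature):
        # Called when the user successfully finishes the flow.
        self.completed.add(feature)

    def should_show_help(self, feature):
        # Show help once the user has tried the feature several times
        # without ever completing it.
        return (feature not in self.completed
                and self.attempts.get(feature, 0) >= self.attempt_threshold)
```

In use, the app would call `record_attempt` and `record_completion` from its own event handlers and check `should_show_help` before deciding whether to surface a tip, so the help appears at the moment of struggle rather than on every screen.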

Where to Catch Joe Next

Joe Welinske has a robust slate of activities and events throughout the year, which you can keep up with here. He also regularly teaches courses at Bellevue College and the University of Washington.