Using Quick Surveys for Website Task Analysis

End-user task analysis data are critical in designing or redesigning websites. Site sponsors need to know what tasks and goals users have in mind when they visit the site. Numerous methods are available to gather user task data, including online surveys, web metrics, in-person interviews, field studies, and usability testing.

[Figure: A quick survey with eight questions, for the Office of Cancer Complementary and Alternative Medicine.]

We have had experience with clients who needed to take the “economy class” approach, at least at first. For these clients, quick online surveys were the most practical way to collect task data from users. In this article, we describe how you can create high-quality quick online surveys.

We use a number of examples: one is a recent quick poll by the first author for a UK university library. The others are quick surveys, some of which involved the second author’s work for the National Institutes of Health (NIH).

We use the term “quick poll” rather than “quick survey” when there are only two or three questions.

When to Use Quick Surveys

Quick online surveys serve as a task analysis tool for websites. They can be very useful:

  • In the analysis and design phases, to better understand users’ tasks and goals.
  • To evaluate an existing website or to measure improvements after a redesign.

By task analysis we mean getting an insight into the tasks the user wants to carry out using the website. These tasks can be interactions such as renewing a library book or ordering a publication, or they can be information-seeking tasks.

Analysis and Design

Quick surveys are a low-cost, low-barrier way of gathering a large amount of data on users’ tasks. At Glasgow University, we used a quick poll on the home page with just two questions, and gathered data over several months.
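
As a rough sketch of the mechanics (not the actual Glasgow implementation), a two-question poll like this can log responses server-side with very little code. Flask, the route, and the field names below are our own assumptions for illustration:

```python
# A minimal sketch of server-side logging for a two-question quick poll.
# Flask, the route, and the field names are illustrative assumptions,
# not the implementation used in the case study.
import csv
from datetime import datetime, timezone

from flask import Flask, request

app = Flask(__name__)
LOG_FILE = "quick_poll_responses.csv"  # hypothetical log location

@app.route("/quick-poll", methods=["POST"])
def record_response():
    """Append one anonymous poll response to a CSV log."""
    task = request.form.get("task", "")            # open task question
    user_type = request.form.get("user_type", "")  # optional closed question
    with open(LOG_FILE, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), task.strip(), user_type]
        )
    return "Thank you for your feedback!", 200
```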

We used an eight-question quick survey as input for the redesign of the Office of Cancer Complementary and Alternative Medicine website. Users were asked the reason for visiting the site that day as well as why they normally used the site.

Evaluation and Monitoring

Quick surveys can also be useful to evaluate the success of a redesign, or to monitor quality on an ongoing basis.

A quick survey before a site redesign allows you to gather data for benchmarking the new site against the old site so you can measure improvements. Users’ responses are more likely to be reliable when they are asked about their current experiences than when they are asked to rate improvement over a certain time span.
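
As a sketch of how such benchmark data might be compared, the counts below are hypothetical responses to the same usefulness rating asked before and after a redesign; a chi-square test checks whether the distribution of ratings has shifted:

```python
# Sketch: compare a usefulness rating before and after a redesign.
# The counts are hypothetical; a chi-square test indicates whether
# the rating distribution differs between the two survey rounds.
from scipy.stats import chi2_contingency

# Rows: before/after redesign; columns: Not / Somewhat / Extremely useful
ratings = [
    [40, 120, 60],   # before redesign (hypothetical counts)
    [20, 100, 110],  # after redesign (hypothetical counts)
]

chi2, p_value, dof, _ = chi2_contingency(ratings)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4f}")  # a small p suggests a real shift
```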

Many of the examples for the National Institutes of Health are user satisfaction surveys. Examples of questions probing how well the current site is performing are:

  • What difficulties or frustrations did you have in trying to find this information?
  • What, if anything, did you find confusing about the site?

Quick Poll Example

A quick poll was used to obtain task information from a wide range of people using the Glasgow University Library website. Combined with information from other sources, such as interviews and user feedback on the existing site, the quick poll data helped develop a more task-oriented site.

[Figure: The quick poll for the Glasgow University Library website.]

The quick poll collected information on users’ tasks (“Why are you visiting the library website today?”) and on user type (“Are you staff, student, or visitor?”).

Responses were anonymous and the user type question was optional. The quick poll ran for a number of months, and data from key weeks (for example, exam time, vacation, and term-time weeks) were sampled and analyzed.
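
A minimal sketch of that sampling step, assuming responses were logged to a CSV file as in the earlier sketch; the week boundaries below are placeholders, not the dates actually used:

```python
# Sketch: pull poll responses from key weeks (exam time, vacation,
# term time) out of the full log for separate analysis.
import pandas as pd

df = pd.read_csv(
    "quick_poll_responses.csv",
    names=["timestamp", "task", "user_type"],
)
# Parse ISO timestamps; drop the timezone so plain date strings compare cleanly
df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True).dt.tz_localize(None)

key_weeks = {  # placeholder week boundaries: [start, end) as dates
    "exam time": ("2006-05-08", "2006-05-15"),
    "vacation": ("2006-07-10", "2006-07-17"),
    "term time": ("2006-10-09", "2006-10-16"),
}

for label, (start, end) in key_weeks.items():
    week = df[(df["timestamp"] >= start) & (df["timestamp"] < end)]
    print(f"{label}: {len(week)} responses")
```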

The open question gave us insight into the way users think about their tasks, and the terminology they use.

The results from the quick poll data convinced management to provide a budget to perform usability tests on the new design. The insights from the task question were the foundation for the scenarios used in the usability tests.

What to Ask About

There are many possibilities to obtain a better insight into users’ tasks and goals. Below we discuss three different approaches which were used in the Glasgow University case study and the NIH quick surveys.

Users’ Tasks and Goals

One option is to ask directly what task the user is carrying out, for example: “Why are you visiting the library website today?”

When asking about users’ tasks, it is helpful to collect additional information so that the task can be considered in context. This can be:

  • The user’s main activity. For example, “Which of the following best describes you when you visit the website? Researcher, Health practitioner, Patient…”.
  • Whether the respondent is a frequent site user.
  • The environment the user works in (“What type of organization do you work for?”).

These types of questions about users’ tasks provide the most direct input for task analysis.

Level of Support for Users’ Tasks and Goals   

Another option is to assess how well the current website information supports the users in their tasks and goals. This can take several forms:

  • An overall rating. For example, “Please rate how useful you found the information on the site: Not useful/Somewhat useful/Extremely useful”. Answers to this type of question provide you with overall impressions.
  • Missing information or tasks not well supported. For example, “Using the OCCAM website, I wish I could…”, and “Is any information missing on the website that would be helpful to you?” This strategic approach can identify which issues are most on users’ minds and should be addressed first.
  • Support for a specific task. Survey questions can also be used to evaluate how well the website supports a specific task—either the task the user is carrying out or a task already identified as key. For example, “How much did the website help you to find a mentor with similar research interests?”

Usability Dimensions Relating to the Task

You can also assess specific usability dimensions of the website. Whitney Quesenbery (www.wqusability.com/articles/more-than-ease-of-use.html) defines the five dimensions of usability as the 5Es: Effective, Efficient, Engaging, Error Tolerant, and Easy to Learn.

A general open-ended satisfaction question can be used to assess how well the site is doing in terms of the 5Es. Examples are:

  • “What do you dislike about the site?”
  • “What difficulties or frustrations did you have in trying to find this information?”

The answers can be categorized according to the 5Es and used to measure how well the site performs for the different usability dimensions. It is useful to consider the results in light of the key user tasks.
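
A first pass at that categorization can be sketched in code, though in practice open answers are coded by a researcher; the keyword lists below are purely illustrative and would need tuning to the actual responses:

```python
# Sketch: keyword-based first-pass tagging of open answers into the 5Es.
# Keyword lists are illustrative; they can speed up, not replace, manual coding.
FIVE_E_KEYWORDS = {
    "effective": ["wrong", "missing", "incomplete", "outdated"],
    "efficient": ["slow", "too many clicks", "hard to find", "navigate"],
    "engaging": ["ugly", "cluttered", "boring", "appearance"],
    "error tolerant": ["error message", "broken link", "crash", "screen reader"],
    "easy to learn": ["confusing", "unclear", "jargon", "didn't understand"],
}

def tag_response(text: str) -> list[str]:
    """Return the 5E dimensions whose keywords appear in a free-text answer."""
    lowered = text.lower()
    return [
        dimension
        for dimension, keywords in FIVE_E_KEYWORDS.items()
        if any(keyword in lowered for keyword in keywords)
    ]

print(tag_response("The search was slow and the results page was confusing."))
# -> ['efficient', 'easy to learn']
```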

Alternatively, you can focus on a specific usability dimension, for example:

  • Effective: “Please rate how useful you found the information on the site.” “How complete was the information?” “The Search feature was helpful and accurate.”
  • Efficient: “Please rate how easy it was for you to find information on the site.” “It was easy to navigate through the site.” “The CARDS website is quick to search.”
  • Error-tolerant: “If you use a screen reader or have other accessibility requirements, did you have any difficulty using the website?”
  • Engaging: “I like the website’s appearance.”

[Figure: A closed question (with “Other” option) for the Office of Science Planning and Assessment.]

How to Ask

When designing surveys, we find it useful to follow the steps outlined below.

Step 1. Define the objectives and the audience

What information do you need and how will it help you to make decisions? We recommend prioritizing the objectives into essential, useful, and nice-to-have. Sometimes at this stage it becomes clear that it is far better to concentrate on the first two priorities and leave out the wish list.

You also need to think carefully about the audience of the survey. It is useful to identify and document the different audience segments, and to decide whether you want to target only a sub-population or include all segments.

An important decision is whether or not to gather anonymous responses. Surveys which don’t ask for respondents’ contact details have a lower barrier. For the quick poll at Glasgow University Library, we opted for anonymous responses, as we wanted to encourage users to fill out the poll at every visit. Non-anonymous responses have other advantages, though:

  • You can contact respondents to clarify their answers.
  • You can compare responses from the same user over time.
  • If you are working with a mailing list of named contacts, you can follow up any non-response.

Whether or not to opt for anonymous responses depends on the situation. Do you need more responses? Do you want to be able to follow up to solicit detailed information? Does your organization have a solid reputation in respecting users’ privacy?

Step 2. Write the questions and the answer choices

Starting from the objectives, write down each question. Decide in each case whether an open question is more appropriate, or a closed question such as a rating scale or a set of predefined choices.

Open questions are a good way to get an insight into how users think about their tasks and what terminology they use. At Glasgow University Library, we discovered that some of the terms in use on the existing site did not correspond to users’ terminology.

In contrast, a quick survey for the redesign of the Office of Science Planning and Assessment website used a closed question to probe users’ tasks; although there is an “Other” option, users are basically presented with a set of predefined choices.

Analyzing the results from closed questions is much quicker than from open questions. However:

  • Respondents may be biased by the pre-defined categories.
  • Only information at category-level is collected; finer-grained information is lost.
  • Respondents may put their task into the “wrong” category.
  • No information on users’ terminology is gathered.
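
To make the contrast concrete: tallying a closed question is a one-line frequency count, whereas each open answer must be read and coded by hand. A minimal sketch with hypothetical responses:

```python
# Sketch: why closed questions are quick to analyze. A frequency count
# takes one line; open answers would each need reading and manual coding.
from collections import Counter

# Hypothetical responses to a closed task question with an "Other" option
responses = [
    "Find a publication", "Renew a book", "Find a publication",
    "Opening hours", "Other", "Renew a book", "Find a publication",
]

for choice, count in Counter(responses).most_common():
    print(f"{choice}: {count}")
```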

When all questions have been drafted, go through the survey and imagine you already have the answers: will these answers give you the information you need? Will the objectives set out in Step 1 be met? It is worth spending some time thinking about possible answers to open questions. Personas can be a useful tool for this.

Step 3. Test and launch the survey

Before launching the survey, find a few representative users to test the survey: are all questions clear and unambiguous? Is the wording clear and neutral so as to not bias the responses? Using the test responses, verify that the survey will collect the information you need.

While quick surveys may not yield the best quality or most comprehensive data, they are attractive when you have a limited budget. Surveys are one of a number of guerrilla or discount usability methods (see www.useit.com/papers/guerrilla_hci.html for more information). Keep in mind that any input from end-users is usually better than nothing!

Positioning Quick Surveys

Because quick surveys allow large numbers of responses, they can play an important role in convincing skeptics who may dismiss small-sample techniques such as in-depth interviews or usability testing. Quick survey results can be helpful in getting management buy-in to invest in more expensive or time-consuming usability methods. Finally, they can complement other data sources or inform other usability methods such as in-person interviews, field studies, and usability testing. Quick surveys can be an excellent way to introduce usability and a more user-centered way of thinking into an organization.

Online Resources for Quick Surveys      

The National Institutes of Health (NIH) resource offers practitioners in the public and private sectors examples of best practice surveys. See www.nih.gov/icd/od/ocpl/resources/OMBClearance/ClearedSurveys.htm.

In particular, the quick survey for the redesign of the Office of Cancer Complementary and Alternative Medicine collects information on users’ tasks, level of support for tasks, and usability dimensions: www.nih.gov/icd/od/ocpl/resources/OMBClearance/NCIOCCAMwebSurvey.pdf.

Two useful examples of user satisfaction surveys are:

www.nih.gov/icd/od/ocpl/resources/OMBClearance/NIHaboutSurvey.pdf
www.nih.gov/icd/od/ocpl/resources/OMBClearance/NIDCDwebSurvey.pdf

Brys, C., & Gaddy, C. (2007). Using Quick Surveys for Website Task Analysis. User Experience Magazine, 6(3). Retrieved from https://oldmagazine.uxpa.org/surveys_website_task_analysis/
