
Remote Evaluation

Any usability testing method where the evaluator and user participant are not in the same location. Remote evaluation may be moderated, with the evaluator observing the participant in real time, or may be automated or unmoderated, with the participant working without direct observation or interaction.

The term "remote evaluation" covers a wide range of methods that collect many kinds of data. At one extreme, there is little difference from in-person, task-based lab testing except that the moderator and participant are not in the same place. At the other, there are no user tasks at all, and the data collected is aggregated analytics.

Bolt and Tulathimutte diagrammed the methods on two axes: qualitative (moderated) vs. quantitative (unmoderated), and concrete vs. conceptual (how closely the method reveals actual behavior on a completed interface).

They group together all qualitative (moderated) methods, which use remote screen sharing and audio: a participant and moderator work together in real time. Tools include Adobe Connect, GoToMeeting, NetMeeting, LiveLook, UserVue, Skype, WebEx, Glance, and Yuguu.
They organize quantitative (unmoderated) methods for collecting task data on a range from concrete to conceptual methods:
  • Testing on live sites/apps. Tools include UserZoom, RelevantView, WebEffective, Webnographer
  • Testing wireframes. Tools include Chalkmark, Usabilla
  • Testing conceptual artifacts. Tools include online card sorting, OptimalSort, WebSort
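Online card-sorting tools such as these typically report how often participants grouped two cards together. A minimal sketch of that co-occurrence calculation (the function name and data layout are illustrative, not any tool's actual API):

```python
from collections import Counter
from itertools import combinations

def card_similarity(sorts):
    """Count how often each pair of cards was placed in the same group.

    sorts: one entry per participant; each entry is a list of groups,
           and each group is a list of card names.
    """
    pair_counts = Counter()
    for groups in sorts:
        for group in groups:
            # Sort so ("Login", "Register") and ("Register", "Login")
            # count as the same pair.
            for a, b in combinations(sorted(group), 2):
                pair_counts[(a, b)] += 1
    return pair_counts

# Two participants sorting four cards
sorts = [
    [["Login", "Register"], ["Search", "Browse"]],
    [["Login", "Register", "Search"], ["Browse"]],
]
sim = card_similarity(sorts)
print(sim[("Login", "Register")])  # both participants paired these cards
```

High pair counts suggest cards that belong together in the information architecture; the same matrix feeds the cluster diagrams these tools produce.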
Quantitative methods without tasks include:
  • User analytics on live sites. Tools include ClickTale, ClickHeat
  • A/B/C testing on live sites
  • Surveys
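A/B/C testing on a live site depends on assigning each visitor to one variant and keeping that assignment stable across visits. One common approach, sketched here with illustrative names, is to hash a visitor identifier:

```python
import hashlib

def assign_variant(user_id, variants=("A", "B", "C")):
    """Deterministically assign a visitor to a test variant.

    Hashing the visitor's ID means repeat visits see the same
    version, without storing any per-visitor state server-side.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("visitor-42"))
```

Because the assignment is a pure function of the ID, any server in a cluster gives the same answer, and analytics can later be segmented by recomputing each visitor's variant.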


Related Links

Formal Publications

Bolt, N. and Tulathimutte, T. (2010) Remote Research: Real Users, Real Time, Real Research. Rosenfeld Media.

Web Resources

Resources from UPA

Markel, Joanna; Rosehan, Serena. Making Method Work for You: How Remote Contextual Inquiry Got Us Up-Close with Users. UPA 2008 Conference.

Mitchell, Peter P. An Inventory and Critique of Online Usability Testing Packages. UPA 2002 Conference.

Nuez, Alfonso de la; Tedesco, Donna; Aseron, Rob; Tullis, Tom; Albert, Bill. Unmoderated Usability Testing: Experiences from the Field. UPA 2009 Conference.

Pressman, Eric. Usability TV: Techniques and Tips for Broadcasting Usability Tests to Remote Observers on a Budget. UPA 2002 Conference.

Sapienza, Filipp, PhD. Working with Immigrant and Trans-National Users in Usability Evaluation. UPA 2008 Conference.

Semen, Timothy S.; McCann, Tom. The Trials, Tribulations and Triumphs of Online Usability Testing. UPA 2001 Conference.

Tullis, Tom; Fleischman, Stan; McNulty, Michelle; Cianchette, Carrie; Bergel, Marguerite. An Empirical Comparison of Lab and Remote Usability Testing of Web Sites. UPA 2002 Conference.

Wei, Carolyn; Barrick, Jennifer; Cuddihy, Elisabeth; Spyridakis, Jan. Conducting Usability Research through the Internet: Testing Users via the WWW. UPA 2005 Conference.

Related Topics

Detailed description

Moderated Remote Testing

Benefits, Advantages and Disadvantages

For either form of remote testing, the primary benefit is the ability to work with participants without some of the difficulties of bringing them to a lab.


Advantages:

  • You can work with participants in any geographical location
  • Participants are working in their own familiar environment
  • Observers can watch sessions from their own location


Disadvantages:

  • You (usually) cannot see the participant, so cannot gauge body language or other indirect communication.
  • Technical difficulties can interfere with the session.
  • The material being tested must be on-screen.

An additional advantage of moderated remote testing is that it lets people with a complex or unusual technical setup work in their own, familiar environment. This includes people with disabilities who use assistive technology, or who find travel difficult.


The primary cost benefits of moderated remote testing come from eliminating the need for a lab, and the need for either the moderator or the participants to travel. How much this saves depends on whether you are replacing a rented facility, how much travel would have been required, and so on.

Appropriate Uses

Moderated remote testing is most effective when:

  • It is difficult to bring participants into a lab, either because of geographical distance, or because they are too busy to recruit easily.
  • It is valuable for the participants to work in their own environment.
  • You want to intercept visitors to your site, rather than recruit in advance.

Unmoderated Remote Testing

Benefits, Advantages and Disadvantages

The primary benefit of unmoderated remote testing is the ease with which large numbers of people can participate in the test.


Advantages:

  • Quantitative data
  • Participants work in their own environment
  • Participants set their own schedule
  • Lower cost per participant


Disadvantages:

  • You cannot watch the participants in real time
  • Tasks are fixed in advance, and cannot be adjusted for each participant, or be interview-based
  • Task success may rely on self-reporting or web analytics
  • Qualitative feedback is limited
  • Some participants may be interested only in earning the honorarium


The costs of unmoderated remote testing include:

  • Time to set up and pilot test the test script
  • The cost of the tool. This varies from free to a high cost, depending in part on the features of the tool.
  • The time to analyze the data collected
  • Honoraria, which are typically lower than for moderated testing, or may not be paid at all

The biggest cost savings of unmoderated remote testing come from not needing a moderator.

Appropriate Uses

Unmoderated remote testing is most effective when:

  • You need data from large numbers of participants
  • You have very specific questions
  • You have clear tasks for participants
  • You are more interested in what participants do than in exploring why or how they do it.
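Because unmoderated tools report what participants did rather than why, the core analysis is usually a per-task success rate computed over many sessions. A minimal sketch, assuming a simple exported format (the field names and function are illustrative, not any tool's output):

```python
def task_success_rates(results):
    """Compute the fraction of participants who completed each task.

    results: list of (participant_id, task_id, succeeded) tuples, as
    might be exported from an unmoderated remote-testing tool.
    """
    totals, successes = {}, {}
    for _, task, ok in results:
        totals[task] = totals.get(task, 0) + 1
        successes[task] = successes.get(task, 0) + (1 if ok else 0)
    return {task: successes[task] / totals[task] for task in totals}

results = [
    ("p1", "find-price", True),
    ("p2", "find-price", False),
    ("p1", "checkout", True),
    ("p2", "checkout", True),
]
print(task_success_rates(results))  # find-price: 0.5, checkout: 1.0
```

With the large sample sizes unmoderated testing makes possible, these rates can be compared across tasks or design variants with ordinary statistical methods.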

How To

Moderated testing tools

These tools all allow remote screen sharing and some form of voice connection (either VoIP or via conference call).

Unmoderated testing tools

These tools all allow you to construct a set of tasks and invite users to complete them. They vary widely in the complexity of the tasks they allow and the data they collect.


Lifecycle: Evaluation
Sources and contributors: 
Whitney Quesenbery
Released: 2012-04
© 2010 Usability Professionals Association