Crowdsourcing for Usability Testing

Computer Science – Human-Computer Interaction

Scientific paper


Details

10 pages


While usability evaluation is critical to designing usable websites, traditional usability testing can be both expensive and time-consuming. The advent of crowdsourcing platforms such as Amazon Mechanical Turk and CrowdFlower offers an intriguing new avenue for performing remote usability testing with potentially many users, quick turnaround, and significant cost savings. To investigate the potential of such crowdsourced usability testing, we conducted two similar (though not completely parallel) usability studies that evaluated a graduate school's website: one in a traditional usability lab setting, and the other using crowdsourcing. While we find that crowdsourcing exhibits some notable limitations in comparison to the traditional lab environment, its applicability and value for usability testing are clearly evidenced. We discuss both the methodological differences of crowdsourced usability testing and the empirical contrasts with results from more traditional, face-to-face usability testing.
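
For readers unfamiliar with how a crowdsourced usability study of this kind is typically deployed, the sketch below shows how a single website-evaluation task could be posted to Amazon Mechanical Turk as a HIT using the boto3 SDK. It is a minimal illustration under assumed settings: the task URL, reward, timing values, and worker count are hypothetical examples, not parameters reported in the paper.

    # Minimal sketch: posting one remote usability task ("HIT") to the
    # Amazon Mechanical Turk sandbox with boto3. All concrete values
    # (URL, reward, timing, worker count) are illustrative assumptions.
    import boto3

    # Sandbox endpoint so no real payments are made while experimenting.
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # An ExternalQuestion points workers at a page hosting the usability tasks,
    # e.g. a form that walks them through navigating the website under evaluation.
    external_question = """
    <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://example.org/usability-task</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>
    """

    response = mturk.create_hit(
        Title="Evaluate a graduate school website (about 15 minutes)",
        Description="Complete a set of navigation tasks and answer short questions.",
        Keywords="usability, website, survey",
        Reward="1.00",                      # USD per completed assignment
        MaxAssignments=30,                  # number of distinct workers
        AssignmentDurationInSeconds=30 * 60,
        LifetimeInSeconds=3 * 24 * 3600,    # how long the HIT stays available
        Question=external_question,
    )

    print("HIT created:", response["HIT"]["HITId"])

Results would then be collected with the corresponding list_assignments_for_hit call and analyzed offline; the specifics of task design and compensation are exactly the kinds of methodological choices the paper contrasts with lab-based testing.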

