Nina McHale Assistant Professor, Web Librarian Auraria Library November 10th, 2006
Our Agenda
Why usability?
Three phases: planning, testing, results
Testing example: Auraria Library home page Fall 2006 usability testing
Create your own test plan!
Why Usability?
Continuous assessment and improvement
The Google problem
"The user isn't broken." -Karen Schneider
Learning to think like your user: the reverse teachable moment
Testing Phases
Planning:
Testing:
Card sorting/affinity mapping
Focus groups
Surveys and interviews
Task-based testing
Prototyping
Results:
Collate data
Identify problems
Prioritize problems
Recommend/implement fixes
Write report
For a more complete list of methods, see James Hom's Usability Methods Toolbox
Identify staff willing to participate in the various phases of testing (planning, testing process, evaluating results)
Include staff from multiple areas of the library
Consider recruiting non-library staff to test:
Reduced anxiety for test subjects
More critical/honest feedback from subjects
May be required in some settings
Suggestions: friends groups, volunteers, students
Strive for a representative sample of your patrons
How many subjects is enough?
Provide an incentive valuable to them
Federal law requires that any institution receiving funding from the Department of Health and Human Services formally review any study using human subjects
Institutional Review Board (IRB)
Plan ahead:
Online course (2-4 hours)
Submit written proposal and paperwork (sample)
Allow time for the IRB to review your project
Allow time for revision, if required by the IRB
Exempt status versus full review
Phase 2: Testing
Materials:
Space: a separate room/classroom
Office supplies (pens, paper, flip chart, scissors, Post-Its, etc.)
Computer or video camera setup (w/software such as Camtasia if desired)
Do a dry run that approximates the real test situation as closely as possible
Revise your test procedure and documents as necessary
Subjects can be videotaped, or software such as Camtasia can be used to record sessions
Useful for revisiting test sessions or as evidence in the results phase
Note: academic IRBs generally will NOT extend exempt status to any project that involves videotaping your subjects; allow extra time for full IRB review if you feel strongly about recording
Collate data collected from all subjects into one document
Identify problems common among subjects
Prioritize problems: two ranking systems
Recommend fixes
Implement fixes
Written report
Type of document will depend upon the type of data collected:
Survey/interview responses
Mock-ups of a proposed home page
Recorded sessions of patrons performing tasks
Usually a spreadsheet with an accompanying chart
Georgetown example: task-based testing with 4 users
Identify problem areas common among subjects (the single collated document simplifies this process)
Prioritize problems with a pre-established ranking system
Two examples (Source: Barnum, p. 270):
Rubin: Unusable / Severe / Moderate / Irritant
Dumas and Redish:
Level 1: prevents completion of a task
Level 2: creates significant delay and frustration
Level 3: has a minor effect on usability
Level 4: subtle problem; points to a future enhancement
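The collate-then-prioritize step above can be sketched in a few lines of code. This is a hypothetical illustration, not part of the original study: the problems, counts, and severity labels are invented, and the Dumas and Redish levels (1 = prevents task completion, 4 = subtle problem) are used for ranking.

```python
# Hypothetical sketch: tally problems logged across test subjects, then
# rank them by Dumas and Redish severity level (lower = worse), breaking
# ties by how many subjects encountered the problem. All data is invented.
from collections import Counter

# (problem description, severity level) pairs logged per subject session
observations = [
    ("could not find 'Databases' link", 1),
    ("could not find 'Databases' link", 1),
    ("confused by the term 'OPAC'", 2),
    ("could not find 'Databases' link", 1),
    ("confused by the term 'OPAC'", 2),
    ("small font on the hours page", 3),
]

counts = Counter(observations)
# Sort by severity first, then by frequency (most subjects affected first)
prioritized = sorted(counts.items(), key=lambda kv: (kv[0][1], -kv[1]))

for (problem, level), n in prioritized:
    print(f"Level {level} ({n} subjects): {problem}")
```

In practice the same tally usually lives in the collation spreadsheet; the point is that a fixed, pre-agreed ranking key makes the prioritization mechanical rather than a matter of debate.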
Recommending fixes:
Involve staff from multiple areas of the library in the discussion of how to resolve problems
If you have a pre-established web advisory body, you may wish to begin with them
Distribute the data collation document prior to discussion sessions, if possible
Delegate tasks as appropriate
For complex problems, plan steps accordingly
Implementing fixes:
Write a report
Communicates your findings
Documents an extensive process
Record everything from your initial goal through methodology and results
Report can be written before all changes are made
Barnum: document positive findings (Good news!) to ensure that things that aren't broken don't get fixed in the future
Use of jargon: how to make links meaningful?
Too many links (30+)
Outdated look and feel
Based closely on a study conducted at the University of Central Florida
Terms and phrases on the library's home page are chopped up into cards for users to group and then arrange on a flip chart
Web Librarian
Two Metropolitan State College of Denver students enrolled in the course COM 3625 Usability Testing
Standing library Web Advisory Committee
Undergraduate and graduate students from all three Auraria institutions
Goal is 3-4 students from each institution (per Nielsen's recommendation)
Incentive: 128 MB USB flash drive, customized with the library's logo
Advertising: library signage and note on library home page
Flip chart
Copies of library terms and phrases on small slips of paper
Scotch tape
Pens/markers
Blue dot stickers
Room 130
Web Librarian and students did a dry run
One student completed the process while the other facilitated
Procedure improvements:
Scotch tape versus glue sticks
Allowing users blank cards to create their own terms/wording
Phase 3: Results
Flip chart home page mock-ups
Data will be collated à la the University of Central Florida method
Web Advisory Committee will provide recommendations to the Web Librarian about the revised home page
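One common way to collate card-sort results like those described above is a co-occurrence tally: count how often each pair of terms was placed in the same pile across subjects. This sketch is purely illustrative, with invented terms and groupings; it is not a reproduction of the University of Central Florida method itself.

```python
# Hypothetical sketch of collating card-sort data: for each subject, every
# pair of cards grouped into the same pile scores one co-occurrence.
# Pairs grouped together by many subjects suggest candidate home-page
# categories. Terms and groupings below are invented.
from itertools import combinations
from collections import Counter

# Each subject's result: a list of piles, each pile a list of card terms
subjects = [
    [["Catalog", "Databases"], ["Hours", "Directions"]],
    [["Catalog", "Databases", "E-Journals"], ["Hours"]],
    [["Catalog", "E-Journals"], ["Hours", "Directions"]],
]

together = Counter()
for piles in subjects:
    for pile in piles:
        # sort so each pair is counted under one canonical ordering
        for a, b in combinations(sorted(pile), 2):
            together[(a, b)] += 1

for (a, b), n in together.most_common():
    print(f"{a} + {b}: grouped together by {n} subject(s)")
```

With real data, the resulting matrix can feed directly into the mock-up discussion: high-count pairs belong under one heading, and terms that rarely co-occur with anything may need rewording.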
Be specific! NOT just "the library web site"
Survey/interview, card sort, task-based, or focus group?
Given the goal and method, who should be involved?
Ensure that multiple departments are represented
Whom and how many will you use to test?
How will you recruit a representative sample?
Academic librarians: do you need/have IRB approval?
Staff:
Subjects:
Where will you conduct the test?
Will you recruit non-library staff to help?
What kinds of information will you include in your script?
Will you need office supplies?
Describe your test procedure.
What type(s) of data will you have after the testing sessions?
How will you analyze/present the data, and to whom?
Once problems are identified, how will you proceed?
Who will implement the necessary fixes to finalize the project?
Web Resources
Books
Carol Barnum, Usability Testing and Research. New York: Longman, 2002.
Elaina Norlin and CM! Winters, Usability Testing for Library Web Sites: A Hands-on Guide. Chicago: ALA, 2002.
Articles
All articles are from Computers in Libraries, October 2005, Volume 25, Issue 9
Frank Cervone, "What We've Learned From Doing Usability Testing on OpenURL Resolvers and Federated Search Engines," 10-24.
Janet Ballas, "Does Your Library Pass the Web Site Usability Test?" 36-39.
Heather Cunningham, "Designing a Web Site for One Imaginary Persona That Meets the Needs of Many," 15-19.
Questions? Comments?
Nina McHale nina.mchale@cudenver.edu Presentation slides, handouts, and supporting materials are available online: http://carbon.cudenver.edu/~nmchale/usability/