Assessment and Evaluation of Public Library Websites in Australia, Canada and the U.S.

Diane Velasquez, University of South Australia

Concurrent session 5
Tuesday 30 August 2016, 2:30pm - 2:55pm


Evaluation of library websites began as early as 1994, when the internet and World Wide Web were in their infancy. Empirical studies of websites outside the library domain are also informative about what questions to ask and what frameworks are needed to move forward with the current study (Cao, Zhang & Seydel, 2005; Chua & Goh, 2010; Teo et al., 2003). However, the majority of the available library literature concerns academic libraries, and while what academic libraries evaluate on their websites has merit for public libraries, their stakeholders have a different focus: the academic library focuses on the curriculum, while the public library focuses on lifelong learning and leisure.

Evaluating a website and assessing its quality is in large part a matter of evaluating its usability. According to Poll (2007), the “contents, language, structure, design, navigation, and accessibility” of websites are key areas of focus for libraries (p. 1). What counts as usable can differ for each member of the community. Past research on users’ acceptance of particular systems has shown that perceived usability depends on the way a site functions (Goodwin, 1987; Wang & Senecal, 2007).

Usability has been defined as ‘how well and how easily a visitor, without formal training, can interact with an information system of a website’ (Benbunan-Fich, 2001, p. 151). Nielsen (2012) defines usability as a ‘quality attribute that assesses how easy user interfaces are to use’ (para. 3), noting that the term also refers to methods for improving ease of use during the design process. He identifies five quality components: ‘learnability, efficiency, memorability, errors, and satisfaction’ (Nielsen, 2012, para. 4). Implementing Nielsen’s usability heuristics can make almost any website better.

Jakob Nielsen (1996) proposed simple user-centered design guidelines that he called “usability heuristics” (p. 34). His suggestions were to “make screens simple and natural, speak the users’ language, be consistent, provide feedback, use plain language for error messages, prevent errors, and provide clearly marked exits” (Nielsen, 1996, p. 34). None of these suggestions is difficult to envision within a library framework, and they can move a library website from a hard-to-navigate site to an easy one that users will want to revisit. If a website is difficult to navigate, users will leave and not return (Nielsen, 2012). For a public library this is to be avoided, as return visitors indicate that the e-branch of the library is also a valued branch.

One aspect of good user-centered design is finding out what the community wants and expects from the website. It is important to remember that the website is for the users, not the library staff (Goodman & Schofield, 2015). The library literature underscores the importance of usability testing and of an iterative process during the design phase (Becker & Yannotta, 2013; Dominguez, Hammill, & Brillat, 2015). A library redesigning its website can choose from many different methods. Becker and Yannotta (2013), in their project at the Hunter College Libraries, used an iterative usability testing process to improve their website. Dominguez et al. (2015) used a think-aloud protocol during usability testing, focus groups, and card sorting across the many phases of website design from 2001 through 2012. Both studies employed many of the processes Nielsen (1996, 2012) suggests for carrying out user-centered design in order to achieve the best accessibility for the website.

The project methodology partially replicates Powers’ (2011) Pennsylvania study, in which a random sample of public library websites was evaluated and assessed against 18 specific criteria. Ten basic elements came from an earlier Idaho study that determined what basic elements a public library website should have (Persichini, Samuelson & Zeiter, 2008; Powers, 2011). The remaining eight criteria were drawn from an article Brian Matthews (2009) wrote for Library Journal describing features considered desirable for library websites. Powers (2011) combined the two sets of criteria into a spreadsheet protocol, classifying each criterion as basic or desirable and recording a yes or no answer for each criterion on every Pennsylvania website in the sample.
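Powers’ yes/no spreadsheet protocol can be pictured as a simple checklist tally. The sketch below is a hypothetical Python illustration of that structure; the criterion names are placeholders, not Powers’ actual 18 items, which the present summary does not enumerate.

```python
from dataclasses import dataclass

# Criteria grouped as "basic" (after Persichini et al., 2008) or
# "desirable" (after Matthews, 2009). Names are illustrative only.
CRITERIA = {
    "basic": ["hours listed", "contact details", "catalogue link"],
    "desirable": ["social media links", "events calendar"],
}

@dataclass
class Assessment:
    """One row of the spreadsheet: a library and its yes/no answers."""
    library: str
    answers: dict  # criterion name -> True ("yes") or False ("no")

    def score(self):
        # Tally the "yes" answers within each group of criteria,
        # mirroring the per-library counts in the spreadsheet protocol.
        return {
            group: sum(self.answers.get(c, False) for c in items)
            for group, items in CRITERIA.items()
        }

a = Assessment("Example Public Library", {
    "hours listed": True,
    "contact details": True,
    "catalogue link": False,
    "social media links": True,
    "events calendar": False,
})
print(a.score())  # {'basic': 2, 'desirable': 1}
```

In the actual study each evaluator would record one such row per assigned library, and the group tallies support comparison of basic versus desirable coverage across the sample.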

To assess and evaluate the websites, eight groups of postgraduate students worked over eight semesters on 1342 Australian, Canadian and U.S. public library websites, using a pre-determined protocol based upon Bonnie Powers’ 2011 research on Pennsylvania public libraries. Each student was assigned 25 public libraries and was required to write a 1500-2000 word literature review on some aspect of website usability. This work formed part of the students’ capstone project course, under the supervision of the lecturer in charge of the course.

The paper will present three phases of Australian, three phases of Canadian, and two phases of U.S. results from March 2013 to June 2016.


