Issues in Science and Technology Librarianship
Summer 2008
DOI: 10.5062/F45X26VD
Nancy Allmang
Materials Science and Engineering Laboratory Liaison
NIST Center for Neutron Research Liaison
nancy.allmang@nist.gov
National Institute of Standards and Technology
100 Bureau Drive, MS 2500
Gaithersburg, MD 20899
This article describes a campus-wide customer satisfaction survey undertaken by the National Institute of Standards and Technology (NIST) Research Library in 2007. The methodology, survey instrument, data analysis, results, and actions taken in response to the survey are described. The outcome and recommendations will guide the library both strategically and operationally in designing a program that reflects what customers want -- in content, delivery, and services. The article also discusses lessons learned that other libraries may find helpful when planning a similar survey.
The 2001 survey measured customer satisfaction with resources and gauged the impact of earlier journal cancellations on customers' work. The 2007 survey was planned to assess usage of and satisfaction with resources and selected library services, including the laboratory liaison program, the information desk, interlibrary loan, and document delivery. The survey also aimed to correlate demographic data with customers' information-gathering habits in order to segment their research habits by work unit and length of time worked at NIST. Furthermore, it would examine the use of iPods, BlackBerry devices, and other wireless tools; preferences among material formats; and the use of collaborative technologies such as wikis to determine customer preferences.
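Although the actual tabulation was handled by a consultant (described below), the kind of segmentation the survey aimed for can be sketched in a few lines of Python. This is a minimal illustration only: the column names (work_unit, years_at_nist, satisfaction) and the sample values are hypothetical, not drawn from the actual survey instrument or its data.

    # Minimal sketch of segmenting survey responses by demographics.
    # All column names and values below are hypothetical.
    import pandas as pd

    responses = pd.DataFrame({
        "work_unit": ["MSEL", "NCNR", "MSEL", "NCNR", "MSEL"],
        "years_at_nist": ["<5", "5-15", ">15", "<5", "5-15"],
        "satisfaction": [4, 5, 3, 4, 5],  # 1 = very dissatisfied .. 5 = very satisfied
    })

    # Mean satisfaction cross-tabulated by work unit and length of service.
    segments = responses.pivot_table(
        index="work_unit",
        columns="years_at_nist",
        values="satisfaction",
        aggfunc="mean",
    )
    print(segments)

A cross-tabulation of this kind is what allows responses to be segmented by work unit and by length of time worked at NIST, as described above.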
Literature searches in preparation for the 2001 and 2007 surveys did not turn up information specifically useful in surveying customers of this unique research library. In order to gather data relevant to its institutional needs, NIST Research Library management elected to design its own survey instrument rather than use LibQUAL+, the Association of Research Libraries' survey instrument that "determines ... users' level of satisfaction with the quality of collections, facilities, and library services" (Kemp 2001). That survey allows academic libraries to compare themselves with other libraries that administer the same instrument (http://www.libqual.org). Although LibQUAL+ is commonly used by academic libraries, NIST librarians decided it would not be the most useful instrument for this library, with its distinct mission and clientele.
As in many academic and special libraries, NIST librarian/laboratory liaisons work closely with various portions of their customer community to provide individualized customer service (Ouimette 2006). NIST also has a Research Library Advisory Board, which acts as a two-way conduit, providing input to the library and carrying important information about the library to customers.
In addition, the Malcolm Baldrige National Quality Award, which provides useful performance excellence criteria, was established by Congress in 1987 and has been administered by NIST since that time (Baldrige National Quality Program 2008). The library is mindful of these criteria and over the years has strived for continuous quality improvement; customer feedback has been extremely important in guiding library decisions.
The 20 questions addressed the areas described above: usage of and satisfaction with resources; selected library services such as the laboratory liaison program, the information desk, interlibrary loan, and document delivery; demographics; and the use of wireless tools, varying formats, and collaborative technologies.
In the winter of 2006, a consultant was brought in to assist with the design of the survey instrument and the analysis of responses. The library staff worked closely with the consultant to ensure that the survey questions would ultimately produce useful trend data. It was also important to learn whether the time had come to implement new tools such as blogs to communicate with library customers, or perhaps to enhance communication among customers by means of a wiki. A library team reviewed the 2001 questions, identified those to repeat or drop, and developed new ones; of particular interest to staff was a question, new in 2007, about the impact of the library on customers' work. To ensure maximum participation, the library promoted the survey through its Research Library Advisory Board, a library newsletter, and electronic bulletin boards.
Lab liaisons launched the survey in March 2007 by sending email messages linking customers to the web questionnaires. The survey was open for responses for three weeks; during that time we received 629 responses from a target audience of 2900, a 21.6% participation rate. Results of the survey were captured and tabulated directly by the consultant, who considered this a good response rate.
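As a quick check, the participation rate is simple arithmetic on the figures above; note that the one-decimal figure comes out as 21.7% when rounded and 21.6% when truncated.

    # Participation rate computed from the reported figures.
    responses = 629
    target_audience = 2900
    rate = 100 * responses / target_audience
    print(f"{rate:.1f}%")  # prints 21.7% (rounded); truncating gives 21.6%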
Seventy-five percent of respondents stated that the greatest barrier to obtaining information was that the resources they needed were not available electronically; 49% stated they were too busy to obtain the information. We discovered that the majority of customers were not aware of the laboratory liaison program and its customized library services; of those who knew of it, 98% were satisfied with the service.
Customers were not shy about providing comments, not only answering the questions at hand but also making observations on everything else about the library. They recommended new journals, books, and databases, although recommendations were not solicited. The library basked in the survey's positive feedback ("Excellent staff. The best department in the entire NIST.") and noted opportunities for improvement ("Add a real Google search of your resources").
Dry run the questionnaire. Test the questionnaire with a small group to make sure the questions are clear. Testing questions with our library advisory board and revising those that were ambiguous improved them immensely. Even afterwards, ambiguous wording remained that we did not catch. For example, defining "satisfaction" in "Please rate your satisfaction" would have been helpful. In discussing electronic journals, did "dissatisfied" or "satisfied" refer to a customer's satisfaction with the collection of e-journals, or with the smoothness of access to them? Unless this meaning is clear, it will not be obvious how to make "satisfied" customers "very satisfied."
Craft an introduction. Overstating the length of time a survey will take to complete can be counterproductive. We heard that potential respondents were putting the survey aside after the first screen, where a welcome message estimated the survey would take 20-30 minutes to complete. Changing this to "Thank you for participating in this 20-question survey" produced more completions.
Include a question on impact. In these times of inflationary cost increases, tight budgets, downsizing, and a trend toward closing physical libraries, be sure to include a question that will demonstrate the library's impact on the larger organization.
Trend data. Asking questions that could be compared with the 2001 results yielded rich trend data (one way to compare results across survey rounds is sketched after these lessons). Without a previous survey, a library might use other baseline data. Asking a question without a way to use the resulting response is not beneficial.
Follow-up surveys. Subsequent surveys can be shorter and more effective if specifically targeted to, and rotated among, specific groups or work units with varying needs. A shorter, targeted survey may result in greater participation, and its creation and analysis will certainly be less labor-intensive for those conducting it. Currently the library is assessing customer satisfaction with the collection in new subject areas; we plan to ask two questions of a targeted customer base.
Identify potential participants. Identifying in detail, well ahead of time, all the groups who were to receive the survey was more difficult than we expected. Adding groups to a web survey once it had begun proved extremely difficult and time-consuming.
Don't be discouraged. Interestingly, despite a concentrated effort to build up our library's biosciences collection after the 2001 survey showed the lowest satisfaction in this area, trend data showed that satisfaction in the broad area of the biosciences had dropped by 2007. Building a comprehensive collection in a new area is a long process. In this electronic era, new resources may spring up faster than funds to purchase them.
Take action on the survey results. Taking suggestions for improvements seriously and doing something with the information that customers provided was extremely important (Silcox & Deutsch 2003a). Quick action remedied situations in the category of "low-hanging fruit."
Welcome negative results. Opportunities for improvement were our most useful survey results. Only after a need was clear could true improvements be made.
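As noted under "Trend data" above, here is a minimal sketch of comparing satisfaction across two survey rounds. The subject areas and mean scores are invented for illustration; they are not the actual NIST results, although the hypothetical biosciences dip mirrors the trend described above.

    # Hypothetical trend comparison between two survey rounds.
    # Subject areas and mean satisfaction scores are invented for illustration.
    mean_satisfaction = {
        "biosciences": {2001: 3.1, 2007: 2.9},
        "materials":   {2001: 3.8, 2007: 4.1},
        "physics":     {2001: 4.0, 2007: 4.0},
    }

    for area, scores in mean_satisfaction.items():
        change = scores[2007] - scores[2001]
        trend = "up" if change > 0 else "down" if change < 0 else "flat"
        print(f"{area:12s} {scores[2001]:.1f} -> {scores[2007]:.1f} ({trend})")

Keeping question wording identical between rounds is what makes a comparison like this meaningful; a reworded question produces a new baseline rather than a trend.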
Baldrige National Quality Program. 2008. [Online.] Available: https://www.nist.gov/baldrige [June 22, 2008].
Kemp, J.H. 2001. Using the LibQUAL+ Survey to assess user perceptions of collections and service quality. Collection Management 26(4):1-14.
NIST Virtual Library. 2008. [Online.] Available: http://nvl.nist.gov/ [June 23, 2008].
Ouimette, M. 2006. Innovative library liaison assessment activities: supporting the scientist's need to evaluate publishing strategies. Issues in Science and Technology Librarianship 46 (Spring). [Online.] Available: http://www.istl.org/06-spring/index.html [June 30, 2008].
Silcox, B.P. and Deutsch, P. 2003a. From data to outcomes: assessment activities at the NIST Research Library. Information Outlook 7(10):24-25, 27-31.
Silcox, B.P. and Deutsch, P. 2003b. The customer speaks: assessing the user's view. Information Outlook 7(5):36-41.
SurveyMonkey. 2008. [Online.] Available: http://www.surveymonkey.com/ [June 22, 2008].