Issues in Science and Technology Librarianship
Fall 2011
DOI:10.5062/F47P8W90
URLs in this document have been updated. Links enclosed in {curly brackets} have been changed. If a replacement link was located, the new URL was added and the link is active; if a new site could not be identified, the broken link was removed.
rhare@indiana.edu |
jihjo@indiana.edu |
emoreton@indiana.edu |
astamm@indiana.edu |
danhwint@indiana.edu |
Graduate Students
School of Library and Information Science
Indiana University
Bloomington, Indiana
Obtaining reliable information is essential to forming a balanced understanding of the scope and complexity of environmental sustainability, and it is essential for effective participation, decision-making, and research in sustainability-related activities. While the ACRL Standards for Information Literacy (2000) provide good guiding principles for obtaining needed information, they must be applied in conjunction with other standards and principles in order for students to acquire all the skills necessary to become information literate, lifelong learners. Though they come from traditionally unrelated fields, the principles of evidence-based practice mirror the ACRL standards, and applying these methods to sustainability reference interviews has the potential to increase the science literacy of students who otherwise lack background knowledge in the sciences.
Environmental sustainability is an increasingly important topic, but the science behind it is often not well understood by the general public. Sustainability, the meeting of society's current needs without compromising its future needs, has been a recognized issue since the early 1970s, as evidenced by the formation of the Environmental Protection Agency (Environmental Protection Agency 2011) and other groups for the conservation of natural resources. Over the last 30 years, sustainability as an issue has been gradually introduced to and adopted by the public, not only in fields directly related to the environment, but also in almost every sector of contemporary life and politics. In order for people to understand the impact of their actions, a more complete understanding of sustainability is essential. In turn, an efficient method of finding and evaluating sustainability information is also essential to acquiring the level of science literacy that the complexity of environmental decisions demands. These skills are especially needed by students who do not possess a background in the sciences. The processes necessary for finding scientific information are accommodated by the Information Literacy Standards created by the Association of College and Research Libraries (2000), but the specific implementation of those processes in the context of sustainability information has yet to be explored. Borrowing from the research structure established in evidence-based medicine may illuminate methods of application that will allow librarians to help students become information literate with regard to sustainability information. This article discusses the tenets of evidence-based medicine and the application of those principles to the search for sustainability information as a method of increasing the science literacy of students without a background in science.
The diverse and conflicting research in the environmental sciences may overwhelm inexperienced users, and the quantity of such information continues to increase exponentially. Librarians must provide constructive methods to ensure patrons are able to obtain the most credible and up-to-date resources in the environmental sciences. The most reliable way to provide users with valuable conservation materials lies in the filtering of information for its validity. Patrons should be trained to recognize authentic sustainability literature with simple quality filtering methods. The subject of filtering information is not new. Amitai Etzioni (1971), the chairman of the Department of Sociology at Columbia University, discussed his concern for users and their need for quality filters in Science in the early 1970s. He noted that the overflow of information was no longer benefiting users and their research, and that weeding through mass quantities of sources was time consuming and imprecise (Etzioni 1971). Furthermore, in addition to the struggle of sorting through prodigious amounts of literature, the information contained therein may be inaccurate or poorly supported. This becomes a significant problem when students lack the information literacy skills to analyze the information they are reading.
Within the information explosion, the wealth of knowledge has become limiting rather than helpful to students because the information tends to be shallow and lacks supporting evidence (Brem et al. 2001). As a result, students have difficulty identifying information as valuable. In Brem's study, researchers watched how student subjects analyzed different web sites containing varying levels of credible information. The students were more willing to accept a web site as accurate and reliable if it presented statistical data, without even considering that the data might be underdeveloped or arbitrary, which in fact they were (Brem et al. 2001).
Students must be taught to analyze science information and consider its purpose and validity, but traditional methods of doing so have brought less than stellar results. There is currently no system in place for evaluating the credibility of data and claims about sustainability. When implementing such a standard, librarians should look to ideas that have worked in the past. Evidence-based practice is already successful in the medical field and has also begun to spread to other areas of science, creating new partnerships with librarians in order to improve scientific practice. These partnerships also benefit those librarians, who learn how to assess credible sources of science information in the process. During a study at the University of Pittsburgh, medical librarians without backgrounds in the sciences were tested for their ability to locate valid information (Kuller et al. 1993). The outcome of the study indicated that librarians were as effective in the search process as researchers and clinical physicians. In addition, both the librarians and the physicians located similar relevant science literature, and their decision-making processes were parallel. This ability to locate and evaluate information without having experience in a certain field could ideally be passed on to students, who, like librarians, do not always have the capability of processing scientific data. Using the model of evidence-based medicine, patrons without science backgrounds will not only be able to locate information; they will also be able to evaluate and use it effectively.
One "sprouting" method for patrons who need reliable information on sustainability may lie in evidence-based practice (Sutherland 2003). The Centre for Evidence Based Conservation (CEBC) is an example of this, incorporating evidence-based practices in conservation through systematic reviews. These systematic reviews "summarise, appraise and communicate the results and implications of a large quantity of research and information," making it much easier for patrons to understand scientific data (Centre for Evidence Based Conservation 2011). The CEBC understands that information and research results may conflict; it works to "synthesise results" in order to avoid arbitrary literary bias (Centre for Evidence Based Conservation 2011). The CEBC helps translate raw scientific data into manageable information, but still leaves a gap between evidence-based science and the information literacy skills necessary to evaluate that particular area of science. Within the sustainability information search, the use of many technical resources and tools may be difficult for students who lack specialization in science-related fields, but librarians must show patrons more than just where or how to find information. Both the evaluation and incorporation of information are crucial to students' information literacy. Academic librarians in conjunction with faculty must educate students in these processes if they hope to increase information and science literacy in the undergraduate population.
While basic information literacy is interlaced through the K-12 and higher education models, it is too often presumed that information literacy skills are successfully imparted to all students. This is not always the case, even among students in higher education (Maughan 2001). Likewise, students have great difficulty with literacy in the sciences. Science literacy can be defined as: practical science, which is science used to solve practical problems, such as land management and ecosystem stabilization; civic science, which is the broader knowledge of the focus, scope, and environmental cost of scientific research and applied sciences that is needed in order to make informed decisions related to scientific problems; and cultural science, which is the knowledge and appreciation of the scientific achievements in environmental science (Shen 1975). Each of these aspects forms the base through which a science literate person can differentiate between biased and non-biased science information and have a basic understanding of how and where science is impacting the physical environment, as well as the social and political environments.
Librarians have the resources and knowledge to begin to bridge the gap between students and science literacy, but critical skills such as search methods and standards for assessing materials must be imparted to the student in order for this mediation to take hold. These skills will vary based on the literacy level of the student, so librarians must first recognize the abilities of their users and then be able to tailor their methods to each individual. Not all methods of assessment are appropriate for all users, and effective science information assessment tools and techniques must be able to compensate for a lack of a science background in some patrons. Evidence-based science in sustainability calls for an approach that integrates information literacy with the search process, allowing users to gain the critical skills necessary for making sustainability-related decisions.
The Centre for Evidence Based Medicine (CEBM) at the University of Oxford (2011) describes evidence-based practice as a process of five distinct steps that correlate with the ACRL's tenets of information literacy (2000). These steps consist of: asking focused questions; finding the evidence; critically appraising that evidence; making a decision; and evaluating performance.
This step in the evidence-based approach involves developing a very specific question to be answered. Within a medical context, focused questions follow a set structure and contain the elements of Patient or Problem, Intervention, Comparison, and Outcome (also known as PICO). PICO encourages information seekers to define their query with very specific detail, which leads to high precision in search results (Centre for Evidence Based Medicine 2011). The PICO process can be easily adapted to fit within other science settings, leading to an easier search process and providing better resources for students without a science background. This step in evidence-based practice is analogous to the first step in the ACRL standards, determining the extent of information needed (Association of College and Research Libraries 2000).
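The translation from a PICO question to a precise search can be sketched in a few lines of code. The example below is purely illustrative and is not part of any cited tool; the sustainability terms it uses are invented for demonstration. It models the four PICO elements as fields and joins the non-empty ones into a boolean search string, showing how a fully specified question yields a high-precision query.

```python
from dataclasses import dataclass

@dataclass
class PicoQuestion:
    """A focused question in the four-part PICO structure described above."""
    problem: str       # Patient or Problem
    intervention: str  # Intervention being considered
    comparison: str    # Alternative against which it is compared
    outcome: str       # Outcome of interest

    def as_query(self) -> str:
        """Join the non-empty PICO elements into a boolean search string."""
        parts = [self.problem, self.intervention, self.comparison, self.outcome]
        return " AND ".join(f'"{p}"' for p in parts if p)

# A hypothetical sustainability question framed with PICO
question = PicoQuestion(
    problem="urban stormwater runoff",
    intervention="permeable pavement",
    comparison="conventional asphalt",
    outcome="groundwater recharge",
)
print(question.as_query())
```

The more elements the seeker can specify, the narrower the resulting query; leaving an element blank simply broadens the search rather than breaking it.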
Finding the evidence is the process of the patron completing the search with the aid of librarians and databases. This requires a thorough and varied search process in order to meet the ACRL standards of accessing information effectively and efficiently (Association of College and Research Libraries 2000). Finding concise results is crucial in science research; the field is demanding and may require immediate information (Kuller et al. 1993). In their comparison of clinical physicians and librarians, Kuller et al. (1993) indicated that physicians used their prior knowledge to locate articles, but librarians relied on Medical Subject Headings (MeSH). MeSH descriptors are precise, allowing librarians to research efficiently. The librarians proved they could still return results without utilizing bibliographic biases and without earning a professional science degree (Kuller et al. 1993). If students are taught to understand classification systems and subject headings (and thus the structure of information), non-science students will be able to execute their own skilled searches and improve their science literacy.
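The advantage of controlled vocabulary over free-text searching can be shown in miniature. The thesaurus below is entirely hypothetical (the real MeSH vocabulary contains tens of thousands of descriptors); it illustrates how mapping a user's free-text terms onto a small set of subject headings collapses synonyms into one descriptor and keeps a search precise.

```python
# Hypothetical miniature thesaurus mapping free-text terms to
# controlled subject headings, in the spirit of MeSH descriptors.
THESAURUS = {
    "global warming": "Climate Change",
    "greenhouse gases": "Climate Change",
    "recycling": "Waste Management",
    "composting": "Waste Management",
}

def to_subject_headings(terms):
    """Translate free-text terms into controlled headings, deduplicating
    so that synonymous terms contribute a single, precise descriptor.
    Unmapped terms pass through unchanged."""
    headings = []
    for term in terms:
        heading = THESAURUS.get(term.lower(), term)
        if heading not in headings:
            headings.append(heading)
    return headings

print(to_subject_headings(["Global warming", "greenhouse gases", "recycling"]))
```

Two synonymous inputs collapse into the single heading "Climate Change", which is exactly the precision advantage the Kuller et al. librarians exploited.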
For sustainability and environmental sciences, critical appraisal of the information identified in the search process is essential in determining what is credible and useful. Information assessment is necessary because of the large quantities of available information and because that information is not necessarily accurate or well-supported. Critical appraisal mirrors the evaluation step of information literacy (Association of College and Research Libraries 2000) because it encourages information seekers to take a close look at the information they are collecting. Many assessment tools and methods have been developed, practiced, and revised to evaluate the significance and impact of research (Breckons et al. 2008; Eysenbach et al. 2002; Grafstein 2002; National Center for the Dissemination of Disability Research 2005). These tools, however, are used primarily by librarians; patrons typically do not employ them at all. Kuller et al. (1993) found that both clinical physicians and librarians select materials based on the literature's article title, abstract, and journal title. Neither filtered their scientific findings by author or author affiliation. The journal alone indicated authority and reliability. Teaching students the basics of evaluating sources will help them in the long run as they evaluate authors and sustainability claims.
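As a sketch of what a quality filter might look like once reduced to a form a student could apply, the function below scores a source against a checklist. The criteria are illustrative only and are not drawn from any of the instruments cited above.

```python
# Illustrative appraisal criteria; not taken from any published instrument.
CHECKLIST = (
    "peer_reviewed",         # appeared in a refereed journal
    "methods_described",     # study methods are reported
    "data_supports_claims",  # conclusions follow from the presented data
    "current",               # recent enough for the question at hand
)

def appraise(source: dict) -> float:
    """Return the fraction of checklist criteria a source satisfies,
    so that sources can be ranked before closer reading."""
    met = sum(1 for criterion in CHECKLIST if source.get(criterion, False))
    return met / len(CHECKLIST)

# A source meeting three of the four criteria
score = appraise({"peer_reviewed": True, "methods_described": True,
                  "data_supports_claims": True, "current": False})
print(score)
```

A numeric score like this is only a triage device: it ranks sources for closer reading, and the close reading itself remains the appraisal step.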
In this step, the user applies the information that has been obtained and evaluated in order to answer the original question. This parallels the information literacy standard that addresses the understanding and incorporation of information (Association of College and Research Libraries 2000). Making a decision about the information means that the patron must not only decide to include it in or leave it out of the collected research, but the user must also determine if he or she is going to believe that information and incorporate it into his or her life.
In the final step in the evidence-based approach, the information seeker determines if the original question has been fully answered or if more research is needed. Was the search effective and efficient? Is the information credible and useful? The final step in the information literacy standards deals with applying the information obtained (Association of College and Research Libraries 2000), but the process does not end here. A user should both determine if he or she has answered the original question and think about the implications, not just for closely related fields, but also for sustainability issues encompassing the environment, the economy, and society. In sustainability information, this is the area that will have the greatest impact on the user because these decisions have the potential to affect the user's daily life.
Traditional methods of conducting reference interviews generally leave students without the means to properly locate or evaluate the validity of sustainability information. This is critical within such an important, highly political field, and drawing inspiration from outside sources is quite possibly the only way to overcome the science literacy roadblock. Librarians should combine the tenets of information literacy with those of evidence-based practice in order to obtain positive results where past attempts have failed. This synthesis of information literacy standards and the evaluation tools of evidence-based practice will enable students lacking a science background to understand information in a new way and gain the science literacy skills they otherwise lack. By teaching students to use the evidence-based approach to information, librarians will not only be imparting the knowledge contained in the information literacy standards valued so highly in higher education, but they will also be the impetus for the science information fluency that those students will come to experience in the future.
We would like to thank Brian Winterman for playing an instrumental role in the development and writing of this article. His invaluable support and insight are greatly appreciated.
Association of College and Research Libraries. 2000. Information Literacy Competency Standards for Higher Education. [Internet]. [Cited May 2011]. Available from: http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm
Bernstam, E., Walji, M., Sagaram, S., Sagaram, D., Johnson, C., Meric-Bernstam, F. 2008. Commonly cited website quality criteria are not effective at identifying inaccurate online information about breast cancer. Cancer. Issue 6. 1206-1213. DOI: 10.1002/cncr.23308 http://onlinelibrary.wiley.com/doi/10.1002/cncr.23308/full
Breckons, M., Jones, R., Morris, J., Richardson, J. 2008. What do evaluation instruments tell us about the quality of complementary medicine information on the Internet? Journal of Medical Internet Research. DOI: 10.2196/jmir.961 http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2483844/
Brem, S.K., Russell, J., & Weems, L. 2001. Science on the web: student evaluations of scientific arguments. Discourse Processes 32(2/3). 191-213.
Curran, C. 1990. Information literacy and the public library. Public Libraries 29(6): 349-353
Centre for Evidence Based Conservation. 2011. Introduction to Systematic Review. Bangor University. [Internet]. [Cited June 6, 2011]. Available from: http://www.cebc.bangor.ac.uk/introSR.php?catid=&subid=6367
Centre for Evidence Based Medicine. 2011. EBM Tools. Center for Evidence Based Medicine. University of Oxford. [Internet]. [Cited May 2011]. Available from: http://www.cebm.net/index.aspx?o=1023
Environmental Protection Agency. 2011. Sustainability: Basic Information. United States Environmental Protection Agency. [Internet]. [Cited June 2011]. Available from: http://www.epa.gov/sustainability/basicinfo.htm#sustainability
Etzioni, A. 1971. The need for quality filters in information systems. Science 171(3967): 133.
Eysenbach, G., Powell, J., Kuss, O., and Sa, E. 2002. Empirical studies assessing the quality of health information for consumers on the World Wide Web. The Journal of the American Medical Association. 287(20): 2691-2700 {dx.doi.org/10.1001/jama.287.20.2691}
Grafstein, A. 2002. A discipline-based approach to information literacy. Journal of Academic Librarianship. 28(4):197.
Kuller, A.B., Wessel, C.B., et al. 1993. Quality filtering of the clinical literature by librarians and physicians. Bulletin of the Medical Library Association 81(1): 38-43.
Maughan, P. 2001. Assessing information literacy among undergraduates: a discussion of the literature and the University of California-Berkeley Assessment Program. College and Research Libraries 62(1): 71-85
National Center for the Dissemination of Disability Research. 2005. What are the standards for quality research? Focus. Technical Brief NO. 9 http://www.ncddr.org/kt/products/focus/focus9/Focus9.pdf
Petzold, J., Winterman, B., and Montooth, K. 2010. Science seeker: a new model for teaching information literacy to entry-level biology undergraduates. Issues in Science and Technology Librarianship 63. [Internet]. [Cited June 2011]. Available from: http://www.istl.org/10-fall/refereed2.html
Shen, Benjamin S.P. 1975. Science literacy and the public understanding of science. In: Day, Stacey B., Ed. Communication Of Scientific Information. New York: S. Karger. p. 44-52.
Sutherland, W. 2003. Evidence-based conservation. Conservation in Practice 4(3): 39-41.
Walji, M., Sagaram, S., Sagaram, D., Meric-Bernstam, F., Johnson, C., Mirza, N., and Bernstam, E. 2000. Efficacy of quality criteria to identify potentially harmful information: a cross-sectional survey of complementary and alternative medicine web sites. Journal of Medical Internet Research. [Internet]. [Cited November 8, 2011]. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1550600/
Wooding, S. and Grant, J. 2003. Assessing Research: The Researchers' View. RAND Europe. MR-1681-HEFCE - DTIC Document. [Internet]. [Cited November 8, 2011]. Available from: {https://web.archive.org/web/20110726182055/http://www.rareview.ac.uk/reports/assess/AssessResearchReport.pdf}