Issues in Science and Technology Librarianship
Winter 2010
DOI: 10.5062/F4TX3C9K
URLs in this document have been updated. Links enclosed in {curly brackets} have been changed. If a replacement link was located, the new URL was added and the link is active; if a new site could not be identified, the broken link was removed.
While interactive, hands-on instruction is usually considered the best approach for engineering students, both in their academic courses and in library instruction, the size of the engineering student population relative to the number of instructors and the available classroom space means that engineering librarians, like engineering faculty, may have to use a lecture format. One way to inject some interactivity into engineering library orientations is a newer instructional technology popular in university science and math classrooms called an "audience response system." This article describes and evaluates the use of an audience response system for first-year engineering library instruction at Rowan University.
The College of Engineering at Rowan University, located in Glassboro, New Jersey, is graduating its 10th class in 2010. The faculty enthusiastically embrace the latest educational research and teaching methods. There is an emphasis on undergraduate research and hands-on team learning, embodied in a course called Engineering Clinic, which students take each year. In their freshman and sophomore years, Engineering Clinic students work with Writing Arts faculty to develop writing skills; in Junior/Senior Clinic they apply their research and writing skills in their work with engineering faculty on industry-funded research. I provide library orientations to freshmen and sophomores, and office consultations for upper-level and graduate student research projects. I also support the Biology, Chemistry, Physics and Mathematics departments.
The content objectives of the Rowan University Freshman Engineering Clinic curriculum explicitly include learning about library research, making it fairly easy to develop a lesson plan; Objective 2, Library Research Skills, sets out the specific goals for that instruction.
As the new science and engineering librarian at Rowan in the Fall of 2005, I was immediately approached by several engineering faculty to teach library orientations. My librarian colleagues had been offering orientations in the library instructional lab (a computer lab that seats 20) and I initially offered orientations in this format to individual sections of Freshman Clinic. Like my colleagues supporting other departments, I demonstrated the use of the library catalog and relevant databases, and asked the students to follow along on their computers.
During my first two years (2005 and 2006), I provided freshman library orientations in response to faculty requests, after advertising my availability to the college as a whole. Over time I became aware that although I was getting more and more requests, I still wasn't reaching all the freshman engineering students with library orientations. Some sections were taught by adjunct instructors, while others met at times when either the library instructional lab or I was unavailable. I also started hearing complaints from faculty that students were not paying attention during the library orientations because the computer in front of them was too tempting -- they were surfing the web, playing games, or checking e-mail.
Over the summer of 2007, I met with the Freshman Clinic Coordinator to assess the situation. We decided that in order to provide a consistent library orientation to all the freshmen, I would give a presentation in the College's largest lecture hall instead of using the library instructional lab. Even then, the freshman class had to be divided in two, half on Tuesday and half on Wednesday. The auditorium has a projector and two huge screens, and can hold 75-80 people. For the class that entered in the fall of 2007, I prepared a PowerPoint presentation that covered all the concepts the faculty wanted me to cover, including demonstrating searches in several engineering databases. The Freshman Clinic faculty were happy with my presentation, but I detected some boredom among the students. I started thinking about what I might do to make the presentation more effective.
In recent years much research has gone into student learning styles and how they impact the effectiveness of various ways of teaching. Using various types of psychological learning inventories (Myers-Briggs, Holland Vocational), students can be categorized broadly according to their "preferred learning style" -- visual, auditory, hands-on, and so on. Williamson (2003) surveyed the literature concerning the personality types of engineering students and determined that most have a preference for hands-on learning. This is unsurprising since many students choose a career in engineering after experiences building model rockets, Lego robots, soapbox cars, and other hands-on projects. In her conclusion, Williamson states: "The applications that were gained from examining the body of literature on Myers-Briggs Personality Types and Holland Vocational Types were that librarians ought to appeal to engineering students with logical, objectively worded instruction containing causation sentences (reasons for statements and instructions); well-organized instruction; and hands-on learning opportunities. Librarians ought to appeal both to engineers' practical, competitive natures and their intellectual interests."
Most of the library literature on instruction for engineering students concludes that the best approach is instruction in a lab setting where students use computers to conduct their own searches, preferably for an actual assignment. For example, Quigley and McKenzie (2003) describe the evolution of their instruction from a lecture format to a more effective interactive format, and offer conclusions and tips that mirror my experiences (such as the desirability of instructor presence and students already knowing their topics, and focusing on two or three key concepts). I have had success collaborating with the Writing Arts program to offer workshops for the Sophomore Clinic/College Composition II classes which allow students to research their paper topics as part of the library orientation. However, my engineering faculty really want their students to have an introduction to the library prior to their sophomore year, and for all the students to get the same orientation.
During my undergraduate days, writing out the equations in colored chalk was the only innovation my professors used to keep students awake in lecture. Now, technology offers many new options. One newer technology is the audience response system, which allows students to respond to multiple choice questions posed by the instructor. This is accomplished by giving each student a wireless remote, often called a clicker. Instead of students raising their hands and responding out loud, potentially exposing their misunderstanding in front of their peers and their professor, they respond silently and anonymously. As soon as everyone has voted, the results are displayed on the screen as a bar chart (and stored for later analysis as well). The anonymity, especially from peers, is one of the greatest advantages of the technology. However, the remotes can be registered to specific students for the duration of the course if the instructor wants to use them for grading.
A recent literature review by Kay and LeSage (2009) indicates that audience response systems are being used extensively in higher education, particularly in high-enrollment freshman and sophomore level physics, engineering, biology, chemistry, and mathematics courses, where content is highly sequential. There is a substantial literature on the use of audience response systems in higher education, if you know how to find it. Unfortunately, there is no consistency in the way these systems are named. Kay and LeSage discuss this very problem, revealing that the most common terminology is Audience Response Systems, but they are also called personal response systems, electronic voting systems, and student response systems (Kay & LeSage, p. 820). In the library literature (I used Wilson Library Science & Information Full Text), none of these terms brought up any articles, but when I simply typed "clickers" I got three hits.
While I did not find any articles on the use of audience response systems for engineering library orientations, I did locate several studies on their use for other types of library instruction. Each of them used a different vendor, but no specific differences between the systems were mentioned. Deleo and her colleagues (2009) evaluated a system with graduate students in Educational Leadership. Reactions on the part of these students (who are classroom teachers) were very positive, and indicated that the interactive system helped bring less technologically-savvy students up to speed on modern library technology with a minimum of embarrassment.
Dill (2008) conducted a controlled study of the effectiveness of an audience response system in library instruction for a one-credit first-year experience course. Using a post-lecture quiz, she compared the group that used clickers during the lecture against the group that simply raised their hands, and found that recall of concepts was essentially the same. However, she concedes, "If one of the goals of a library's instruction program is to get students more engaged with and interested in the library, then perhaps clickers would serve this purpose in instruction sessions."
A third study, also of a freshman-level library orientation, was conducted by Collins (2008). Like Deleo, she asked about the use of the clickers as part of a post-workshop survey. Most students liked them and felt they were a beneficial and fun addition to the workshop, although a few students felt that they were a waste of time or were tired of being polled all the time (a result of too much telemarketing in modern life). Also, anecdotally, "Instructors are finding the clickers help the students stay focused on the lecture."
In early 2008, Rowan University's College of Engineering purchased an audience response system called iClicker from Information Learning Technology. They used it for several mechanical and chemical engineering classes with some success. Then the professor who had spearheaded the effort moved to another university and the initiative withered. When I learned about it, I decided it would be an interesting technology to try out in library orientations for freshmen. I asked if I could use the system, and they were happy to lend it to me and help me set it up.
The iClicker system came with about 55 remotes, the software application on a USB drive, a wireless receiver, and an instructor remote. The instructor develops a presentation which displays multiple-choice questions on the screen (using existing projectors and PowerPoint presentation software) at intervals during his or her lecture. When the instructor hits the Start Polling button, the students each enter their answer (A, B, C, D, or E) on their remotes. Then the instructor hits Stop Polling and Display Results, and the iClicker software displays a colorful histogram of the results. At this point, the instructor discusses the various answers and why the correct one is best.
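The question cycle described above -- open polling, collect one A-E vote per remote, close polling, display the tally -- can be sketched in a few lines. This is a toy simulation for illustration only, not the iClicker software or its actual API; the class and method names are invented for the example.

```python
from collections import Counter

class PollSession:
    """Toy model of one audience-response question cycle (illustration only)."""
    CHOICES = "ABCDE"

    def __init__(self):
        self.votes = {}       # remote_id -> latest choice received
        self.polling = False

    def start_polling(self):
        self.polling = True
        self.votes.clear()

    def submit(self, remote_id, choice):
        # Votes count only while polling is open; a remote may resubmit,
        # and only its latest choice is kept.
        if self.polling and choice in self.CHOICES:
            self.votes[remote_id] = choice

    def stop_polling(self):
        self.polling = False

    def display_results(self):
        # Text stand-in for the on-screen histogram of A-E counts.
        counts = Counter(self.votes.values())
        return {c: counts.get(c, 0) for c in self.CHOICES}

# One question cycle: four remotes vote, and remote 2 changes its answer.
session = PollSession()
session.start_polling()
for remote_id, choice in [(1, "A"), (2, "D"), (3, "D"), (4, "E"), (2, "B")]:
    session.submit(remote_id, choice)
session.stop_polling()
session.submit(5, "A")  # ignored: polling has already stopped
print(session.display_results())  # {'A': 1, 'B': 1, 'C': 0, 'D': 1, 'E': 1}
```

Keeping only each remote's latest vote is what allowed the Wednesday students, when polling was accidentally left open, to change their answers repeatedly and make the histogram bars jump.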
Among the many benefits of audience response systems found by the research reviewed by Kay and LeSage (2009) were greater attentiveness, engagement, and participation in lectures, as well as higher quality learning. Both students and instructors like the fact that the instant feedback allows difficult concepts to be clarified before moving on. Although the systems can be used to grade students, they are more often used for "formative assessment," in which student understanding is measured during the content delivery rather than afterwards. That type of spot-check allows the lecturer to modify lecture content in real time if necessary, although thinking on their feet is a skill at which not all lecturers are adept.
Kay and LeSage identify two main challenges to using audience response systems: technological issues and developing good questions. Technological issues include matters as basic as getting the projector, computer, PowerPoint presentation and remotes to talk to each other, and getting the clickers to work.
According to Kay and LeSage: "Writing good ARS questions can be a demanding task for instructors. Researchers agree that the most effective questions have the following characteristics: they address a specific learning goal, make students aware of opinions other than their own, uncover misconceptions and confusions, explore ideas in a new context, and elicit a wide range of responses." (p. 824). It's similar to writing good multiple-choice test questions, which is not necessarily a typical skill set for librarians.
Another challenging issue which can occur is student resistance to the system. If students are used to a passive lecture format, they may not want to cooperate with a learning approach that asks more from them. I saw some definite signs of that phenomenon in my use of the technology.
After I took a workshop offered by the College of Engineering on the iClicker system in the Spring of 2008, I decided I was ready to try it with my engineering freshmen in Fall 2008. The technology was new to most of the students, so trying out a new high-tech gadget seemed entertaining to them. I was happy enough with the improvement in attentiveness among the students that I used the iClicker system again in the Fall of 2009. The system overwrote my 2008 data with the 2009 data, so I only have the 2009 data available.
The iClicker system was easy and quick to set up, with some help connecting all the pieces from the engineering faculty who usually use the auditorium. I did take the precaution of a dry run the day before so that I wouldn't waste class time. For my Fall 2009 orientation, I asked six questions (see Appendix) which students answered with their clickers during my prepared presentation. The results are shown in the tables below.
Tues 9/15/09 | Q1 | Q2 | Q3 | Q4 | Q5 | Q6
A | 27 | 1 | 5 | 26 | 1 | 3
B | 1 | 22 | 6 | 4 | 1 | 2
C | 13 | 9 | 1 | 1 | 1 | 0
D | 8 | 12 | 24 | 11 | 38 | 31
E | 2 | 8 | 16 | 8 | 10 | 15

Wed 9/16/09 | Q1 | Q2 | Q3 | Q4 | Q5 | Q6
A | 41 | 3 | 6 | 22 | 1 | 3
B | 0 | 13 | 10 | 3 | 1 | 0
C | 5 | 9 | 1 | 3 | 1 | 1
D | 2 | 20 | 5 | 3 | 36 | 31
E | 3 | 7 | 30 | 18 | 11 | 12
The first two questions (Have you visited the library and How do you start a literature search) were intended to gauge the students' "before" state of information literacy and give me a chance to plug the advantages of using library resources. Considerably more students in the Wednesday class had apparently visited the library in person. It's likely that one of their other classes meets in a library room, since we have a shortage of classroom space at Rowan. Only one person out of both classes had visited the library web site, so the focus on the library online resources was appropriate. On the second question, more of the Tuesday students picked answer B, using Google to start a literature search, while the Wednesday students intuited that the desired answer was D, using library databases.
The next two questions (How do you get to engineering databases and How do you find electronic journals) served as a quick check on whether the students had followed my demonstrations. It is not as effective as having them try out the web site themselves, but it does provide reinforcement. Getting the correct answer to Question 3, "D", required paying attention to my database demonstration. The students in the Tuesday class did pay some attention, with about half of them selecting the correct answer. Most of the Wednesday class unfortunately chose to play around and select E, which was not one of the choices.
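To make the "about half" figure concrete, the Tuesday counts for Question 3 can be tallied directly. This is a quick sketch, with the counts transcribed from the results table above:

```python
# Tuesday 9/15/09 responses to Question 3, transcribed from the table above.
q3_tuesday = {"A": 5, "B": 6, "C": 1, "D": 24, "E": 16}

total = sum(q3_tuesday.values())         # 52 votes recorded
correct_share = q3_tuesday["D"] / total  # "D" was the correct answer
print(f"{q3_tuesday['D']} of {total} chose D ({correct_share:.0%})")
```

Note that the 16 "E" votes, which matched no on-screen choice, are still part of the denominator, so the share of attentive students may be understated.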
In addition, in Wednesday's class for question three I forgot to hit the Stop Polling button before I hit Display Results, and several students entertained the class by changing their answers over and over and watching the bars on the histogram jump up and down. Later that day one of the students saw me at the reference desk and apologized on behalf of her "immature" classmates. You can see from my results that in each class there were a few students who insisted on picking choice E on their remotes for every question, even though I pleaded with them to choose one of the four responses on the screen (next time I'll be sure to offer a real 5th option). No matter what format I use for my instruction, there are always engineering students who think they don't need to learn how to use the library!
Question four, concerning the process of finding electronic journals, was a case of one best answer (A) and several other less good but still acceptable answers, because there are multiple ways of finding e-journals in my library. The students in both classes overwhelmingly chose the correct answer, so apparently most of them were paying attention. Some were still more interested in flaunting their rebellion by choosing option E, but at least they weren't asleep.
The last two questions (Why is it important to properly cite your sources and What are the advantages of using scholarly articles from the library rather than web sources) were intended to emphasize the broader concepts I wanted to get across. Unfortunately it was obvious what the answer was from the phrasing of the questions -- "all of the above" is a strong giveaway. Only one person chose each of the first three choices for question five in each class. It would have been better to have one right answer and three wrong answers that could be discussed. Question six was slightly less obvious, with three students picking the benefit of library resources which is most emphasized by faculty (peer review), but most choosing All of the Above.
My biggest challenge with using the audience response system was writing good questions. In my opinion, the technology would have added more value if my questions had been better. While I did tie each question to a specific learning goal, some goals were not addressed, and some of the responses I offered were too obvious. The literature search for this article turned up some good examples of other librarians' questions which will help me develop better questions in the future. Student resistance was also a significant issue. Since answers were not graded, students didn't feel that they needed to take the process seriously. More questions, and questions that require students to think more, could help.
For example, I could add a question about the differences between scholarly and popular sources, and a question about the differences between searching for articles and books on the library web site. I would also like to improve my question about citing sources, as I continue to work on incorporating a meaningful discussion of plagiarism into my orientations. Perhaps I could include a short video clip on the subject, based on a tutorial that I created several years ago; this would add to the diversity of learning styles addressed in my presentation.
The combination of a well-organized PowerPoint presentation with the use of an audience response system does fulfill a number of the recommendations by Williamson for appealing to engineers' learning styles. There is an active learning component to engage student attention, which could also address their competitive natures if they compete to see who gets the most right answers. In my presentation I make sure to provide logical reasons for my instructions, for example why they should use library resources instead of free web sources, and why it is important to cite their sources. I also try to appeal to both their intellectual interests and their practical needs, by using search topics in my demonstrations that reflect their actual research areas (such as biofuels, solar panels, and permeable pavement).
I think the audience response system is an interesting technology which I am not yet using to its full potential. It is difficult to find ways to present content in a way that seems relevant to engineering students, despite the fact that they know their instructors will expect them to put the information to use in their assignments. As Quigley and McKenzie (2003) indicate, it's important to "find a hook" which motivates students to pay attention. However, most of them will be back for a more hands-on library workshop as sophomores, when they take Sophomore Clinic/College Composition II. So my presentation to the freshman class is a first introduction to the library, but not my only chance.
An audience response system can make a lecture format more palatable for engineering students and provide a way to engage their attention, thus making it more likely that they will retain some of the information. I still think a lab setting is more effective for teaching information literacy objectives, but in a situation where the size of the student body and the available resources make a hands-on approach impractical, a lecture using an audience response system can provide benefits in student engagement over a lecture alone.
Collins, B. L. 2008. 'Debating' the merits of clickers in an academic library. North Carolina Libraries (Online) Spring/Summer 2008. Available: {http://www.ncl.ecu.edu/index.php/NCL/article/viewFile/84/114}. [Accessed December 4, 2009].
Deleo, P. A., Eichenholtz, S., & Sosin, A. A. 2009. Bridging the information literacy gap with clickers. The Journal of Academic Librarianship 35(5): 438-44.
Dill, E. 2008. Do clickers improve library instruction? Lock in your answers now. The Journal of Academic Librarianship 34(6): 527-9.
Kay, R. H. & LeSage, A. 2009. Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education 53(3): 819-827.
Quigley, B. D. & McKenzie, J. 2003. Connecting engineering students with the library: a case study in active learning. Issues in Science and Technology Librarianship, Summer 2003. [Online]. Available: http://www.istl.org/03-spring/article2.html [Accessed December 4, 2009].
Williamson, J. 2003. Suiting library instruction to the Myers-Briggs personality types and Holland Vocational personality types of engineering students. Issues in Science and Technology Librarianship, Spring 2003. [Online]. Available: http://www.istl.org/03-spring/refereed2.html [Accessed December 3, 2009].
Appendix

Question 2: How do you start a literature search?
A. Go to the library and use the card catalog
B. Use Google to find information on my topic
C. Read about my topic in Wikipedia
D. Search my topic in one of the library databases on the Engineering E-Resources page
Question 3: How do you get to library databases of peer-reviewed engineering research?
A. Type "Engineering" in the catalog search box on the library home page
B. Select "E" from the list of letters under E-Resources by Title (A-to-Z)
C. Type "Engineering" in the Search E-Journal box
D. Select Engineering under E-Resources by Topic
Question 4: How do you search for electronic journals?
A. Click on Databases/E-Resources under Quick Links, then use the area labeled Search E-Journals
B. Enter the journal name in the main search box on the home page
C. Go into Engineering Village and search for your journal name
D. Come to the library and ask a librarian
Question 5: Why is it important to properly cite your sources?
A. The authors might sue you for using their work without permission, especially if your work later becomes part of a commercial product.
B. Plagiarism is a serious violation of the Rowan University student code of ethics and a potential offense under New Jersey state law.
C. It is a good habit to document where ideas came from so you (or others) can go back and find the information again if you need more details.
D. All of the above
Question 6: Which of the following are advantages of scholarly articles available by library subscription over free resources available over the Internet?
A. They have passed a peer review by one or more professionals with academic credentials in that scholarly discipline
B. They have been indexed by professional catalogers to allow retrieval of all relevant articles on a given topic
C. They cite other scholarly work upon which their research is based, allowing readers to verify methodology and trace related research
D. All of the above