Hutchins Library


Bibliographic Instruction Program Evaluation



A LONGITUDINAL STUDY OF A BIBLIOGRAPHIC INSTRUCTION PROGRAM:

AN EXPERIENCE IN ASSESSMENT AND ACCOUNTABILITY


Presenters:



The last decade has seen an increased demand for the assessment and accountability of library bibliographic instruction programs. A growing number of institutional self-studies and accrediting agencies have required that library instruction programs be proven essential to the teaching and learning process. As a result, an increasing percentage of the library literature deals with evaluative research of user education. Evaluation is recognized as a critical element of a successful bibliographic instruction program. Librarians are expected to be knowledgeable about assessment methodology and to demonstrate the effectiveness of their instruction programs both within and outside of the library.

Both assessment and accountability were major goals in a five-year evaluative study (1989-1993) of Berea College's bibliographic instruction program. This report describes the methodologies employed in the study, challenges and problems encountered, and changes implemented in Berea's instruction program as a result of the study.


THE BEREA COLLEGE BIBLIOGRAPHIC INSTRUCTION PROGRAM


Berea College, a four-year undergraduate liberal arts college with approximately 1,500 students and 150 teaching faculty, has had a course-related bibliographic instruction program since the early 1980s when Tom Kirk began as director of Berea's Hutchins Library. Influenced and mentored by Evan Farber of Earlham College, Kirk worked with faculty to develop an instruction program similar to Earlham's. Library instruction was integrated into the curriculum and was gradated with the intention that students become increasingly more sophisticated library users.

In 1989, when the evaluation of the instruction program began, formal library instruction was provided at several stages of a student's college career. At the freshman level, students received instruction through Freshman Composition, Freshman Seminar, and, in some cases, through English 015/016, a remedial English course. (Students were required to take English 015/016 and/or Freshman Composition before Freshman Seminar if they tested deficient in writing skills.) Library instruction for English 015/016 was provided at the instructor's request. The library instruction component for Freshman Composition, which required the writing of a brief research paper, consisted of a library tape, a follow-up exercise, and one library class session. The major portion of Freshman Seminar, which all students were required to take, was the writing of a research paper. Seminar was the central focus of the library's bibliographic instruction program. Two library class sessions, each with bibliography handouts and follow-up exercises, were provided during the semester. The sessions covered search strategy, reference sources, and book and periodical literature. Librarians worked closely with each seminar section to tailor the instruction to the instructor's and students' needs.

Library instruction at the sophomore through senior level was built upon the instruction the student had received as a freshman and focused on specialized tools in the discipline of the student's major. Instruction librarians were each assigned academic departments and worked with the teaching faculty to develop the appropriate kind of instructional support such as a bibliography, a class presentation, an exercise, and/or individual student conferences.

Since its inception, Berea's course-related bibliographic instruction program went through minor modifications as a result of some informal and sporadic evaluation. Beginning in 1987, the instruction librarians were challenged by director Tom Kirk to explore the options available for formally evaluating the library's instruction program. In February 1989, as the call for assessment and accountability in higher education increased, planning was officially begun for a formal evaluation of Hutchins Library's bibliographic instruction program.



PLANNING AND DESIGN OF THE STUDY


The first step in planning the program's evaluation was to survey the existing literature on evaluating bibliographic instruction programs. A review of the library literature revealed few published reports on evaluating bibliographic instruction at the program level. The reported studies used mostly quantitative methods, emphasized a cognitive approach, and focused on short-term objectives. There appeared to be no consensus within the library profession about what or how to evaluate a user education program. A survey of the non-library literature on program evaluation revealed useful information on qualitative methodologies, surveying the affective domain, and utilizing long-term objectives.

The literature survey clarified the purpose of and criteria for evaluating the library's instruction program, and it underscored the need to develop program goals and objectives before the evaluation began. ACRL's 1983 Evaluating Bibliographic Instruction and its 1987 "Model Statement of Objectives for Academic Bibliographic Instruction" were essential sources in this phase of the planning process.

The goals of the study were to evaluate, first, the skills and achievement of students in regard to library use and, second, the attitudes of students and teaching faculty regarding libraries and librarians. These goals demanded both quantitative and qualitative methodologies and a combination of unobtrusive and obtrusive evaluative methods. It was thought that applying multiple methods of evaluation would overcome the limits of any single approach. Furthermore, collecting a broad range of data from multiple sources was desirable when conducting a summative evaluation of a program. A longitudinal study was deemed most appropriate, with a representative sample of students tracked over a four-year period from the freshman through the senior level.

The survey methodologies chosen for the study included a pre/post test, an attitudinal survey for teaching faculty, focus group interviews, and student bibliography evaluations.


IMPLEMENTATION OF THE STUDY


The evaluation of Berea College's bibliographic instruction program focused on the students of the Class of 1993. The study was implemented when the students enrolled as freshmen in August 1989 and was completed in the spring of 1993, their senior year.


PRE-TEST/POST-TEST

In the summer of 1989, the "Hutchins Library Inventory" was developed as a pre/post test to: 1) assess change in a student's basic library skills, and 2) assess change in a student's attitudes toward and perceptions of libraries and librarians. Some changes were expected to occur between a student's enrollment as a freshman in August 1989 and the end of his/her first semester in November 1989. It was expected that the majority of students would receive formal library instruction during the fall semester, either through Freshman Composition or Freshman Seminar. However, students qualifying for English 015/016 may not have received any formal instruction.

The Inventory consisted of three sections. The first was informational to acquire the student's social security number and to determine how much prior formal library instruction the student had received and in what setting. In the second section, the student was asked to respond to nine attitudinal statements about libraries and librarians. The final section included 20 matching questions and three completion questions to assess the student's library skills.

To check the validity and reliability of the Hutchins Library Inventory, it was administered to 43 student library workers in June 1989. As a result of this trial testing, one question was rewritten for clarity.


Data Collection and Analysis

With the assistance of Berea College's Office of Institutional Research and Testing, the Inventory was administered as a pre-test to approximately 400 new students in August 1989. The Inventory accompanied the College's "1989 Student Information Form," which was completed by all incoming students. The data gathered from this form would provide demographic information about the students who would be studied during the next four years. Information included age, gender, average high school grades, parental income, reasons for attending college, goals and values.

In November 1989, the Inventory was administered as a post-test to students through all sections of English 015, Freshman Composition, and Freshman Seminar. (There were no sections of English 016 that semester.) The questions were identical to those asked in August, though the numbering of the sections was modified from the pre-test, which had been tailored to accompany the Student Information Form.

To ensure that data would be collected only for students who had completed both the pre- and the post-test, the tests were matched by the student's social security number. In late 1991, compilation and analysis of the completion questions were finished. A total of 193 tests were analyzed.
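The matching step described above amounts to a simple join on a shared identifier. The function name, dictionary layout, and sample identifiers below are illustrative assumptions, not the study's actual data format:

```python
# Sketch: pair pre- and post-test records by a shared student identifier,
# keeping only students who completed both instruments.
def match_tests(pre_tests, post_tests):
    """pre_tests and post_tests map a student identifier to that
    student's responses; only identifiers present in both are kept."""
    shared_ids = pre_tests.keys() & post_tests.keys()
    return {sid: (pre_tests[sid], post_tests[sid]) for sid in shared_ids}

# Hypothetical data: student "111" took only the pre-test and "444"
# only the post-test, so neither appears in the matched set.
pre = {"111": [3, 4], "222": [2, 5], "333": [4, 4]}
post = {"222": [4, 5], "333": [5, 4], "444": [3, 3]}
matched = match_tests(pre, post)
print(sorted(matched))  # ['222', '333']
```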

When responses from the Student Information Form were compared with results of the tests, the correlation figures for the comparisons were relatively low and thus deemed not particularly significant. The data collected from the first section of the Inventory indicated that more than half of the students had received formal library instruction by the end of the fall semester. An analysis of the second section, in which students responded to a set of attitudinal questions, revealed that students were feeling less confident and comfortable in the library after the first semester. A factor contributing to this change in attitude may be that most of Berea's students have had little exposure to large libraries and thus are overwhelmed as freshmen by the size of Hutchins Library and its collections. In addition, the challenges posed by the Freshman Seminar research paper may have contributed to the students' loss of confidence. Both the pre- and the post-tests indicated positive feelings about librarians. From the data collected in the third section of the Inventory, significant improvements in students' library skills were noted. It was recognized that while some of the differences might be attributed to library instruction, the changes might also be a result of growing familiarity with a new setting.


FACULTY SURVEY

Larry Hardesty's "Scale to Measure the Attitudes of Classroom Instructors Toward the Role of the Academic Library in Undergraduate Education"* was administered to Berea College teaching faculty as the second phase of the evaluation program. To ensure reliable comparisons with Hardesty's use of the survey, it was administered unchanged, although some informational questions asking for name, rank, etc., were removed. Adopting Hardesty's premise that faculty play a dominant role in determining undergraduate student use of the library, the survey was used to assess the attitudes of Berea's faculty regarding libraries and librarians.

A set of 30 attitudinal statements followed an introductory section, which asked for information such as academic department, tenure status, and age. For each of the statements on the survey, there were seven choices allowing an expression of attitude from strong agreement to strong disagreement. Comments concerning the study were solicited at the conclusion of the survey.


Data Collection and Analysis

In April 1990, Hardesty's "Scale to Measure the Attitudes...." was sent to 154 Berea faculty. Sixty-eight usable responses were returned, for a 44% response rate. The process of analysis used in Hardesty's initial study was replicated as much as possible.

By the spring of 1991, the collation of the data and comments from the faculty survey was completed. The first data examined were faculty comments. Thirty-five percent of the faculty surveyed commented about the library. Some of the comments were words of encouragement about the existing instruction program, some noted specific examples of how the faculty member used the library, some revealed their library use philosophy, and others addressed specific concerns about the survey itself.

Five factor labels were utilized in analyzing the survey data: library passive, library active, library interactive, library traditional, and library supportive. Five statements were discounted in the factor analysis due to nonresponse rates above 4%, leaving twenty-five statements distributed over the five factors.
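The exclusion rule described above amounts to filtering out any statement whose nonresponse rate exceeds the 4% threshold before the factor analysis. A minimal sketch, with invented counts (the study's actual per-statement figures are not given here):

```python
# Sketch: drop survey statements whose nonresponse rate exceeds a
# threshold before factor analysis. Counts are hypothetical.
def usable_statements(nonresponse_counts, n_respondents, threshold=0.04):
    """Keep statements whose share of missing answers is at or below
    the threshold (4% by default, as in the study)."""
    return [stmt for stmt, missing in sorted(nonresponse_counts.items())
            if missing / n_respondents <= threshold]

counts = {"S1": 1, "S2": 5, "S3": 0}  # missing answers per statement
# With 68 respondents, S2's rate is 5/68 (about 7.4%), so it is dropped.
print(usable_statements(counts, 68))  # ['S1', 'S3']
```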

Berea's administration of the faculty survey produced results similar to Hardesty's original study. Overall, Berea's faculty were supportive of the library. While some believed it was important for faculty to be actively involved in teaching students to use the library, others believed it was more the librarian's job to teach library skills. A major conclusion Hardesty reached as a result of his study, and one that appears applicable to the Berea study, is that faculty members' attitudes are primarily shaped by local conditions. Thus, academic librarians have the opportunity to be of greater influence in shaping faculty attitudes than they would if faculty attitudes were shaped by prior experiences.

*Copyright 1982 by Larry Hardesty. Reprinted and used with permission of Larry Hardesty.


Hardesty, Larry. Faculty and the Library: The Undergraduate Experience. Norwood, NJ: Ablex Publishing, 1991.


FOCUS GROUP INTERVIEWS

The focus group interviews conducted in early 1992 provided the most qualitative element of the evaluation program. The goals of the interview process were: 1) to discover what the students remembered about the library instruction they had experienced, 2) to informally assess their knowledge of the library research process, and 3) to discover what the students thought the role of the library had been in their education thus far and what its future role would be. These goals were based on the goals and objectives of the library instruction program.

In the spring of 1991, 60 students were identified to comprise 15 groups of four students each. The students were selected from a sample pool of students still enrolled for whom both pre- and post-test data existed. They were drawn evenly from the three courses they could have taken as freshmen--English 015, Freshman Composition, and Freshman Seminar. Students who had worked in the library at any time were eliminated from the pool.

The questions and interview script were developed in fall 1991. The College's Office of Institutional Research and Testing was consulted during this planning phase. Eight student library workers from the Class of 1993 were selected to conduct the interviews and received training in interview techniques. After two mock interview sessions, the script and questions were revised and finalized.

To facilitate scheduling the interview sessions, the class schedules of the 60 students were obtained from the Registrar. A listing of courses taken was also obtained for each student in the sample group. In the first week of January 1992, the students received a letter accompanied by their course listing. They were asked to check the courses for which library use was required and bring the list to the interview. A reminder memo was sent the second week, and phone call reminders were made the third week.


Data Collection and Analysis

The interviews were conducted during the third and fourth weeks of January 1992. Moderated by a library student staff member, each interview session lasted approximately one hour and included up to four students. After completing a brief questionnaire on library use, the students were asked twelve questions designed to give them an opportunity to describe their various library experiences at Berea. The sessions were taped with the students' knowledge to allow for later data collection and analysis. Following each session, the interviewer completed an "Interviewer Response Sheet," which provided continuous feedback and review of the process. An interview was conducted whenever two or more students came, or with a single student if he/she felt comfortable being the only interviewee. The response rate for the interview sessions was 36.7%.

The transcription of the tapes from the interview sessions continued into fall 1992, when the process of summary and analysis began. The tape transcripts of the group interviews were summarized question by question and student by student. Institutional Research assisted in the analysis, which was completed in spring 1993. Judging from the comments made in the interviews, the majority of the students seemed comfortable making candid remarks. This may have been because a student was conducting the interview.

Though the response rate was low, the information gathered from the interview sessions resulted in some immediate changes in the instruction program. One of the frequent observations made was that the library instruction sessions were dull, repetitive, and boring. As a result, the librarians began to vary the location of the sessions and provide a less formal setting with more hands-on experiences.

Two conclusions resulting from the focus group interviews were: 1) the timing of library instruction sessions and the direct application of skills to assignments are critical in order for students to discover value in the library research process, and 2) more opportunities for students to develop a feeling of success in their early library experiences should be offered.


STUDENT BIBLIOGRAPHY EVALUATIONS

The final phase of the evaluation of Berea's bibliographic instruction program involved an assessment of a product of the students' library research. The bibliographies of the fall 1992 Senior Requirement papers were evaluated on the basis of six characteristics thought appropriate for measuring library research skills: 1) appropriateness of sources, 2) appropriate number of sources, 3) appropriate form of citations, 4) integration of citations, 5) currency of sources, and 6) variety of types of sources.

Planning for the process to be used in evaluating the papers and bibliographies was begun in the spring of 1992. By the beginning of the spring 1993 semester, the criteria for evaluating the bibliographies were finalized and a "Bibliography Evaluation Checklist" was developed.


Data Collection and Analysis

Initially, the plan was to gather the bibliographies as a follow-up to the focus group interviews. Students were asked at the conclusion of the interviews to send the library a copy of their research paper for evaluation. However, the number of papers gathered in this manner was insufficient. Instead, the final papers for the five sections of the Senior Requirement course were obtained in the fall of 1992. Of the 74 papers obtained, 13 "reflection papers" were judged inappropriate for the developed evaluation criteria and removed.

Each of the remaining 61 papers was assigned a number and all identifying marks were removed. Four librarians served as readers and each initially received one-fourth of the papers to evaluate. Using a scale of one to five (1=not appropriate, 3=appropriate, 5=highly appropriate), the papers and bibliographies were evaluated and scored for the six characteristics by two readers. If a discrepancy of two or more points resulted on any given characteristic, a paper was assigned to a third reader for balance. The scores for each characteristic of each paper were averaged and entered into a database.
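The two-reader rule and the averaging step described above can be sketched as follows; the function names and sample scores are hypothetical:

```python
# Sketch: each paper receives two readers' 1-5 scores on six
# characteristics; a discrepancy of two or more points on any
# characteristic sends the paper to a third reader, and each
# characteristic's scores are then averaged across readers.
def needs_third_reader(scores_a, scores_b):
    """True if any characteristic's two scores differ by 2+ points."""
    return any(abs(a - b) >= 2 for a, b in zip(scores_a, scores_b))

def averaged_scores(*readers):
    """Average each characteristic's score across all readers."""
    return [sum(vals) / len(vals) for vals in zip(*readers)]

r1 = [3, 4, 2, 3, 5, 3]
r2 = [3, 2, 3, 3, 4, 3]            # second characteristic differs by 2
print(needs_third_reader(r1, r2))  # True
print(averaged_scores(r1, r2))     # [3.0, 3.0, 2.5, 3.0, 4.5, 3.0]
```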

The reading and assessment of the senior papers were completed during the spring of 1993. Twenty-five papers were read by two readers and 36 were read by three. In the assessment of the bibliographies, there was a concern about whether the readers were applying the criteria consistently and evenly. The results indicated, however, that the readers were largely successful in applying the criteria in a consistent manner; "appropriateness of sources" was the one characteristic on which the majority of papers scored above the scale midpoint of three.

For each of the six characteristics, several types of statistics were calculated, including the mean, median, mode, standard deviation, and variance. The characteristic with the highest mean score was "appropriateness of sources," while the lowest mean score was recorded for "variety of types of sources."
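All of the statistics listed above are available in Python's standard library; a minimal sketch, using hypothetical averaged scores for one characteristic:

```python
import statistics

# Hypothetical averaged 1-5 scores for one characteristic across papers.
scores = [3.0, 3.5, 4.0, 3.0, 2.5, 3.5, 3.0]

print(statistics.mean(scores))      # arithmetic mean
print(statistics.median(scores))    # middle value of the sorted scores
print(statistics.mode(scores))      # most frequent score
print(statistics.stdev(scores))     # sample standard deviation
print(statistics.variance(scores))  # sample variance
```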

Though the scoring for the majority of the established criteria was average, the readers were disappointed with the overall scholarship of the papers. Higher level skills, such as the integration and solid support of ideas through documentation of sources, were too infrequent in occurrence. Use of a consistent form of citations was disappointingly lacking in many papers. While the need for improvement was evident in each of the areas evaluated, the librarians identified "citation format" and "variety of sources" as key areas in which students needed help.


CONCLUSIONS


In the summer of 1993, the final analysis and summary of each segment of the evaluative study of Berea College's bibliographic instruction program were completed. An "Executive Summary Report" and an in-depth report providing detailed analysis of the data were prepared.

The longitudinal nature of the instruction program's evaluation, as well as the use of multiple methodologies, contributed to the overall success of Berea's evaluative study. While some of the survey methodologies were more revealing than others, the combination of both quantitative and qualitative methods assisted in evaluating the students' library skills and attitudes. The challenges inherent in a longitudinal study such as changes in the student population, library staff, and technology were met by a staff committed to the project and by a supportive administration. Among the factors enabling the success of the study were the small size of the student population, the College's course-related and integrated bibliographic instruction program, and the support and assistance of the College's Office of Institutional Research and Testing.

The results of Berea's five-year evaluation program have been most rewarding and gratifying. The evaluation process has served as an energizing and learning experience for the instruction librarians and has provided fresh insights and ideas for incorporation into the instruction program. The results of the process have guided the program changes required for the College's new general education curriculum, implemented after the conclusion of the study.

It is evident that some form of program evaluation should be ongoing if a bibliographic instruction program is to be responsive to the ever-changing needs of students and if it is to accurately reflect the rapid changes occurring in the information society. While a second longitudinal study of Berea's library instruction program may not be feasible or desirable in the near future, administering specific elements of the original study might be appropriate. For example, the Hutchins Library Inventory could be administered periodically. Modifications to the original would be necessary, however, to reflect technological changes within the library since 1989. Hardesty's faculty survey could once again be distributed and compared to the 1990 results. The process of discovering students' perceptions of library instruction through the focus group interviews was most enlightening and perhaps could be continued in some similar fashion. A library portion might be added to the College's senior exit interviews, for example, or seniors could be surveyed on an annual basis.

Continual evaluation of Berea College's bibliographic instruction program is now recognized as an important element of the program's success. The call for assessment and accountability is an ongoing challenge for the instruction librarians to explore new and innovative ways to strengthen the program through evaluation and study.


"Measuring Up! Improving Instruction Through Evaluation: A Longitudinal Study of a Bibliographic Instruction Program," ALA 114th Annual Conference, June 25, 1995.




Susan_Henthorn@berea.edu
mroyse@utk.edu



