
College & Research Libraries News

Evaluating Bibliographic Instruction: A Handbook


Editor’s Note:C&RL News is pleased to have the opportunity to reprint (with permission) the Contents, Preface and Introduction to the new handbook published by the Bibliographic Instruction Section, Evaluating Bibliographic Instruction: A Handbook. The 129-page publication was coordinated by the BIS Research Committee’s Subcommittee on Evaluation, whose members are: Mignon S. Adams, David Carlson, Bonnie G. Gratch, Larry Hardesty, David N. King, John Mark Tucker, Richard Hume Werking, and Virginia Tiefel (chair). It is now available from ACRL/ALA, 50 E. Huron St., Chicago, IL 60611, at $13 for ACRL members, and $17 for nonmembers.

Contents

“Evaluation and Its Uses,” David N. King.

“Evaluating in Terms of Established Goals and Objectives,” Virginia Tiefel.

“Research Designs Appropriate for Bibliographic Instruction,” Bonnie G. Gratch.

“Data-Gathering Instruments,” Mignon S. Adams.

“Data Management and Statistical Analysis,” T. Mark Morey and Jacqueline Reihman.

“Significant Works,” Richard Hume Werking.

“Glossary,” John Mark Tucker.

“Bibliography” (Part I, Annotated List; Part II, Basic Textbooks).

Preface

Because the rapid growth of bibliographic instruction is a comparatively recent phenomenon, many instruction librarians have simply not had the time to evaluate what they are doing; and many lack the specialist knowledge about, or experience in, program evaluation. This handbook has been written by librarians (with one exception) for librarians, both to offer an introduction to the basic precepts of evaluation and to furnish some direction for moving beyond the material covered in the handbook.

The subtitle of A Handbook has been chosen because the authors wish to stress that this is not the definitive work in library instruction evaluation. It is an introduction to the topic, intended to give general direction, and especially encouragement, to librarians attempting evaluation in this important area.

The handbook is a collection of chapters on various aspects of evaluation and each reflects the author’s approach to the subject and style of writing. There is, then, diversity in the writing; and many who critiqued the manuscript noted some inconsistency in style. But as there was no agreement on a uniform style (e.g., between more formal and informal styles), the decision was made not to attempt a major editing of the manuscript. It is hoped that readers will not be distracted by this, and that, indeed, they might enjoy it. Also, the authors invite comments and suggestions, all of which will be retained and referred to the authors of a revised edition.

In closing, as chair I want to thank the members of the committee responsible for the creation of this handbook. No finer group of professionals— working over a long time and distance—could have been found. The contributions of several people beyond the committee proved invaluable. They are Katherine Branch, William Crowe, Evan Farber, Constance Finlay, Elizabeth Frick, Martin Gibson, James Kennedy, Tom Kirk, Maureen Pastine, Linda Phillips, Ronald Powell, Daniel Ream, Anne Roberts, and Nancy Taylor.

[Chart: Steps in the Evaluation Process]

Deserving of special thanks are Mignon Adams, who collected the manuscripts and supervised their preparation for printing, and Glenda King, who prepared the artwork for several of the charts and tables. — Virginia Tiefel, Ohio State University Libraries.

Introduction

The evaluation of library instruction programs has been a subject of much discussion during the past two decades. Although there seems to have been an increase in the number of published evaluation studies in recent years, it is difficult to tell whether or not this reflects any significant increase in the use of evaluation by the majority of instruction librarians. Evaluation is often assumed to be a complex, time-consuming process; and for those unfamiliar with the methods and tools, it may seem an intimidating prospect.

In fact, evaluation is what you make of it. The process may indeed take on the aspect of a sophisticated educational research endeavor, if that is what you wish. But it is not always necessary to develop large-scale, complex projects in order to profit from evaluation. The process itself is straightforward, easily described in six basic steps.

Step 1. Describe the purpose of the evaluation. The first step in any evaluation effort is to make sure that you fully understand the reasons for evaluating. Who wants to know, and what you hope to learn from the information you obtain, will determine the kinds of information you need to collect and how you can best collect it. Chapter 1, “Evaluation and Its Uses,” discusses many of the factors which should be considered, and introduces some of the prominent approaches to systematic evaluation.

Step 2. Describe the program in terms of its goals and objectives. Once you have a clear idea of why the evaluation is to be undertaken, it is important to develop a description of the program as it currently exists. Educational programs tend to evolve, and statements of program goals and objectives may not fully detail current practice. Be sure to look for any implied goals and objectives that might not have been included in a formal statement. Chapter 2, “Evaluating in Terms of Established Goals and Objectives,” provides an overview of behavioral goals and objectives, and explains a taxonomy of educational objectives.

Step 3. Determine the criteria to be used for evaluation. When the program has been described in sufficient detail, and the goals and objectives are clearly identified, evaluation criteria can be determined. If goals and objectives have been written with care, this step is relatively easy. Just decide which goals and objectives should be studied within the context of the purpose of the evaluation, and what standards would indicate success. But be sure to consider outcomes which may not have been anticipated by the goals and objectives, or which may indicate undesirable or counterproductive results of instruction. Chapters 1 and 2 include sections discussing the criteria that might serve as standards.

Step 4. Develop the evaluation procedures and overall design of the study. After the criteria have been determined, procedures for conducting the evaluation can be developed. If the evaluation is to be a major summative effort, or if you intend to monitor your program on a continuing basis by means of evaluation, you may find it useful to develop an evaluation plan based upon one of the approaches discussed in Chapter 1. An appropriate evaluation design should be selected at this point. The selection of the design will clarify most of the procedures to be used in the evaluation, which instruments might be adopted, and the kinds of statistical analysis that will be necessary. You may find it helpful at this point to set timetables, determine personnel and training needs, and budget for any costs you anticipate. Chapter 3, “Research Designs Appropriate for Bibliographic Instruction,” introduces concepts related to experimental designs and describes and illustrates their use.

Step 5. Develop instruments and collect data. The fifth step in the evaluation process is to develop the instrument or instruments to be used to implement the study. The keys to success in this step involve pretesting the instrument to eliminate any unforeseen errors, and applying the instrument systematically. Deviations in the way data are collected can irreparably compromise the entire evaluation process and render the data useless. Chapter 4, “Data-Gathering Instruments,” describes a variety of instruments which may be used to collect information, and offers practical advice on choosing an instrument and/or developing your own tests and questionnaires.

Step 6. Analyze the data and report the results. Once the data are collected, they should be interpreted within the context of the criteria established in Step 3. The analysis should point out which criteria were successfully achieved, as well as identify components of the program that need improvement. Chapter 5, “Data Management and Statistical Analysis,” presents the core concepts of statistical analysis and describes some of the procedures most useful for the analysis of instructional data.
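To make Step 6 concrete, the following Python sketch shows one of the simplest analyses an instruction librarian might run: a paired-samples t statistic comparing the same students' scores before and after instruction. The scores here are invented for illustration; they are not data from the handbook, and the handbook itself does not prescribe this particular test.

```python
import math
import statistics

# Hypothetical pretest and posttest quiz scores for the same ten
# students (illustrative data only, not from the handbook).
pre = [52, 60, 45, 70, 58, 63, 49, 55, 66, 61]
post = [61, 72, 50, 78, 64, 70, 56, 60, 75, 66]

# Difference for each student: post-instruction minus pre-instruction.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

mean_d = statistics.mean(diffs)       # average improvement in points
sd_d = statistics.stdev(diffs)        # sample standard deviation of the differences
t = mean_d / (sd_d / math.sqrt(n))    # paired-samples t statistic

print(f"mean improvement: {mean_d:.2f} points")
print(f"t statistic: {t:.2f} (df = {n - 1})")
```

The resulting t value would then be compared against a t distribution with n − 1 degrees of freedom to judge whether the improvement is statistically significant; interpreting it against the criteria set in Step 3 remains a matter of judgment, as the introduction notes.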

Even if a report to outside clientele is not required, writing up the results will help in interpreting and assessing the evaluation.

The techniques and methodologies introduced in this handbook may be employed in a number of ways to help understand the effects of your program and improve the quality of instruction. Careful planning and attention to detail are necessary throughout the process.

You will not find the answers to all your evaluation questions here. Each author provides references for further reading, and an annotated bibliography of suggested readings is included. Examples of evaluations and some of the more important sources for information on the evaluation of bibliographic instruction programs are included in Chapter 6, “Significant Works.” A glossary is also provided. Even if you follow up on all of the sources mentioned, we encourage you to take advantage of the expertise of others on your campus.

You need not be an expert to succeed at evaluation. Evaluation is not an exact science, and creativity, resourcefulness, and perseverance yield impressive results. Your programs and your students will benefit from the effort.—David King, Houston Academy of Medicine, Texas Medical Center Library.

Copyright © American Library Association
