Conducting formative evaluation online: From planning to execution

Amanda Nichols Hess; James L. Moseley


Assessment in academic libraries is of growing importance, especially in data-driven higher education environments. Demonstrating value and proving effectiveness are especially important in instruction, and formative evaluation is one strategy that instructional librarians and designers alike can use to measure potential value and effectiveness ahead of time. This kind of evaluation is conducted while a learning object, educational tool, or curriculum is still under development or before it has been widely implemented.

While formative evaluation is not a new idea, the increasing prevalence of online learning in academic libraries means that formative assessment’s charge to be as true as possible to the environment where learning will happen may take a different form.1 One such instance of formative evaluation involved an online learning module and made use of free, easy-to-use online tools. These tools and strategies can be translated to other environments where academic librarians are seeking to engage in formative assessment of instructional programs or objects.

As noted by Martin Oliver, online teaching and learning programs present a particular set of possibilities and challenges for evaluation and evaluators.2 However, Melody M. Thompson asserted that an evaluation, whether for an online or in-person product or program, is not an end in itself; instead, it provides a vehicle for asking the right questions in the right ways.3 Asking these questions in the right ways involves not only wordsmithing but also the right evaluation format. In fact, James L. Moseley and Nancy B. Hastings indicated that selecting the appropriate medium for the delivery of a formative evaluation is an essential component of an effective and complete evaluation design.4 For many online learning environments, that appropriate evaluative medium may also be online.

Formative evaluation of Copyright and You

This formative evaluation used electronic data collection tools and focused on a three-part online learning module, Copyright and You. Developed by three librarians at Oakland University (OU) Libraries, the course content was originally created in response to a request from Art and Art History department faculty. When it became apparent that it could be used more broadly, an online formative evaluation was designed to determine how to extend Copyright and You’s academic reach.

The eCourse contains lessons on basic copyright information, the student as a user of content, and the student as a creator of content.5 Each lesson delivers written instruction and concept explanation, and concludes with relevant practice questions. At the conclusion of the three-part eCourse, learners can take a ten-question assessment to test their knowledge. Badges of completion are awarded for scores of at least 8 out of 10.

Evaluation questions

Since Copyright and You had not undergone previous formal evaluation, this process’s central concern was whether the course was applicable and useful to students across academic disciplines. From this overarching focus of determining academic applicability, two secondary questions developed:

  • Is the coverage of content appropriate and clear, particularly for learners with no copyright knowledge or experience?
  • Is the instructional design of the module responsive to users, and does it help enhance understanding and build knowledge?

Evaluation participants

To answer these questions, subject matter experts and undergraduates from a range of academic areas were engaged in the formative evaluation process. Two librarians not affiliated with the course, but with considerable experience in copyright instruction and instructional design, were asked to serve as the external expert reviewers and share their perspectives on the course.

Also, a small group of library student employees from across academic majors was sampled and asked to consider the applicability, instructional content, and design of Copyright and You. These two groups provided different kinds of formative feedback on the module’s content, design, clarity, and usefulness. Moreover, their perspectives offered insight into how students and instructors, the primary and secondary user groups, would perceive Copyright and You when used in courses.

Collecting data

Perhaps most importantly, all data collection for this formative evaluation happened online using self-designed Google Forms. Collecting data online most closely simulated the actual learning environment, and the course’s online design is meant to provide convenience and accessibility for a wide range of courses and students, including distance learners. By using Google Forms, both expert and student respondents could work through the module’s content without impediment while concurrently providing their thoughts. Also, Google Forms’ simple and easy-to-use interface allowed for the creation of surveys that captured both qualitative and quantitative data through Likert-style and guided free-response questions.

Separate Google Forms were created for each respondent group, and these documents considered respondents’ different perspectives and points of access to the online learning module. The two external experts evaluated the learning module as a whole with attention toward its content and instructional design. One expert focused solely on the coverage of content in Copyright and You, while the other focused on its instructional design.6 The student participants’ questionnaire asked them to consider each section of Copyright and You in terms of its clarity of directions, design, content, and perceived usefulness.7 Each questionnaire also offered respondents free-response space to share any additional thoughts on the module’s strengths and weaknesses.

Collecting data online through Google Forms surveys took advantage of several other technological affordances. For instance, the online questionnaire was free and easy to share. Also, it was convenient for both the evaluator and respondents. By hosting the survey online, it could be delivered to participants instantly, and they could then immediately submit their feedback. Finally, offering the survey through a clickable link delivered via email meant students and experts alike could access it when convenient.

However, survey data were not the only information collected from respondents. Because the eCourse exists within the university’s course management system, Copyright and You stores enrollees’ performance and participation data. Student responses to both the module’s practice questions and concluding quiz were captured, along with their page views by time and frequency. This information proved helpful in better understanding and framing students’ thoughts on the online learning module.
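The badge criterion and the kinds of performance data described above can be sketched as a small script. This is a hedged illustration, not the evaluators’ actual tooling: the record layouts, student IDs, and field names are hypothetical stand-ins for a course management system export; only the 8-of-10 badge threshold comes from the course design itself.

```python
from collections import Counter

# Hypothetical export: one record per enrollee's concluding quiz attempt.
quiz_attempts = [
    {"student": "s01", "score": 9},   # scores are out of 10
    {"student": "s02", "score": 7},
    {"student": "s03", "score": 10},
]

# Hypothetical page-view log: (student, page) pairs capturing view frequency.
page_views = [
    ("s01", "lesson-1"), ("s01", "lesson-2"), ("s01", "lesson-3"),
    ("s02", "lesson-1"),
    ("s03", "lesson-1"), ("s03", "lesson-2"), ("s03", "lesson-3"),
]

BADGE_THRESHOLD = 8  # badge of completion awarded for at least 8 out of 10

# Which enrollees earned the badge, per the course's stated threshold.
badge_earned = {a["student"]: a["score"] >= BADGE_THRESHOLD for a in quiz_attempts}

# How many pages each enrollee viewed, as a rough engagement measure.
views_per_student = Counter(student for student, _ in page_views)

for student, earned in sorted(badge_earned.items()):
    status = "badge" if earned else "no badge"
    print(student, status, views_per_student[student], "page views")
```

Summaries like these give the evaluator quick, always-available context for interpreting what students say about the module.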

Lessons learned

By using Google Forms and pulling student performance data from the course management system, the evaluator was able to identify recommendations for Copyright and You’s development team, and support those recommendations with qualitative and quantitative data. However, considering the formative evaluation process illuminates several applicable lessons for other librarians interested in conducting formative program evaluations online.

  • Gather data from multiple inputs. Collecting both student response and performance data was a key component of this formative evaluation. In collecting multiple data inputs from student participants (evaluation surveys, review question performance, certificate attempts/performance), the evaluator could frame student feedback in terms of performance. These multiple data sets allowed students’ comments on the online learning module to be considered in light of their performance in the module, showing whether students’ understanding of content and their perception of that understanding matched or were incongruous. It also helped the evaluator determine which student comments were valid, and which students had completed the evaluation without working through the course at all. Again, the technology used helped enable this type of data collection: the self-recording feature of the online learning module meant that performance data were instantly accessible to the evaluator, anywhere and at any time. This is one advantage of an online testing tool over a printed assessment. So, whenever possible, collecting more than just performance data, survey response data, or any other single kind of data is very valuable, especially online.
  • Seek diversity in feedback. With an online learning module like Copyright and You that can have a wide academic impact, diversity of opinion should be encouraged and sought. Diversity can mean many things. For instance, consulting with instructors or faculty members outside of the University Libraries and the Art and Art History Department could provide future direction on implementation of the module in courses. A faculty member in the Business school, or in the Sociology department, may be able to provide useful guidance on how the module could be made more useful for their students. Diversity of student respondents should also be encouraged, particularly if taking a small-scale evaluation (such as this) to a larger group. In this particular instance, diversity can be achieved in several ways. While student respondents were from a diverse range of academic backgrounds, a more concerted effort to recruit participants from a variety of the university’s schools and colleges could offer insight on how the module can be shaped to be applicable to the broadest possible student audience. Diversity of experience is also important. All student respondents worked for the library system, and this could have led to bias, or even a greater level of copyright knowledge than is typical. Using a broader cross-section of the population in a wider-scale evaluation would offer a logical next evaluative step. Here, too, technology can help: recruitment of diverse populations can be done via email, announcement on a website, or even social media. These technology tools can help widen the net cast by this, and other, formative evaluations.
  • Improve response rates through greater supervision. This formative evaluation was very hands-off by design, in part to replicate the true nature of the learning experience. While this approach may provide more true-to-life feedback, increased scaffolding (and, yes, more structure) could also improve the quality of responses and response rates. An in-person think-aloud protocol, or a technologically advanced adaptation of that procedure using screen capture and voice recording software (e.g., Camtasia), could provide respondents with a better understanding of the kinds of feedback requested and desired. Such scaffolding could also help the evaluator collect data from both students and experts on thoughts, processes, or difficulties not recorded in the online forms or performance data.
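As a hedged illustration of the first lesson above, gathering data from multiple inputs, the short script below joins survey self-ratings with performance figures to flag respondents whose perception and performance diverge, or who submitted feedback without any recorded course activity. All student IDs, ratings, scores, and thresholds here are invented for illustration; they are not drawn from the actual evaluation data.

```python
# Hypothetical data: 1-5 Likert self-rating of understanding from the survey,
# share of practice questions answered correctly, and whether any course
# activity was recorded at all.
survey_rating = {"s01": 5, "s02": 5, "s03": 2}
practice_score = {"s01": 0.9, "s02": 0.0, "s03": 0.8}
practice_attempted = {"s01": True, "s02": False, "s03": True}

flags = {}
for student, rating in survey_rating.items():
    if not practice_attempted[student]:
        # Survey submitted but no recorded course activity: feedback may not
        # reflect actual engagement with the module.
        flags[student] = "no course activity"
    elif (rating >= 4) != (practice_score[student] >= 0.7):
        # Self-reported understanding and measured performance disagree.
        flags[student] = "perception/performance incongruous"
    else:
        flags[student] = "perception matches performance"

for student, flag in sorted(flags.items()):
    print(student, flag)
```

A join like this is what lets an evaluator weigh a glowing (or scathing) survey comment against what the respondent actually did in the module.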

Conclusion

In responding to the increased importance of assessment data, finding meaningful—yet simple—ways to conduct formative evaluation can enhance librarians’ practice and improve library services, especially in instruction. As more learning content is available online, both synchronously and asynchronously, it is important for academic librarians to evaluate these resources before they are deployed to patrons.

Using a free survey tool like Google Forms with intended patron groups and subject matter experts is one way to collect valuable feedback that allows librarians to improve an online learning object before it is finalized (if indeed any learning object can ever really be considered finalized). Conducting, and learning from, formative evaluations in situ can help academic librarians improve their services, practices, and instructional offerings.


Notes
1. Tessmer, M., Planning and Conducting Formative Evaluations: Improving the Quality of Education and Training (London: Kogan Page, 1993).
2. Oliver, M., “Evaluating Online Teaching and Learning,” Information Services & Use 20, no. 2/3 (2001): 83-94.
3. Thompson, M. M., “Evaluating Online Courses and Programs,” Journal of Computing in Higher Education 15, no. 2 (2004): 63-84.
4. Moseley, J. L. and Hastings, N. B., “Is Anyone Doing Formative Evaluation?,” in The 2008 Pfeiffer Annual: Training, ed. E. Biech (Hoboken: Wiley, 2007), 233-40.
5. Rodriguez, J., Greer, K., and Shipman, B., “Copyright and You: Copyright Instruction for College Students in the Digital Age,” Journal of Academic Librarianship 40, no. 5 (2014): 486-91.
6. See the sample forms at bit.ly/CRLNewsformative.
7. Ibid.

Copyright © 2016 Amanda Nichols Hess and James L. Moseley
