Interactive tutorials—the platform matters: Learning from a comparative evaluation of tutorial platforms

Graham Sherriff


There is a growing body of theory and research on the characteristics of effective learning objects and library tutorials, but understanding the platforms used to create them presents its own challenges. New products continue to emerge, while established products continue to develop and upgrade. This article describes what one library accomplished by conducting a comparative evaluation of two platforms used to create the same tutorial model: the frame-based, live-web tutorial.

Homegrown tutorials for local needs

For several years, the Bailey/Howe Library at the University of Vermont (UVM) has flipped library instruction with online tutorials. In particular, UVM librarians use them to support one-shot sessions integrated into the university’s first-year Foundational Writing and Information Literacy (FWIL) initiative.

We aim to give the FWIL library sessions maximum impact by timing them to take place soon after the students have begun the research component of their course and by focusing the sessions on the challenges they are beginning to encounter. These are often difficult questions involving the evaluation of search results and sources, for example, how to find diverse perspectives or how to find sources with the right level of technical detail. Addressing these questions in the class setting gives students the tools and techniques to move their research forward.

The timing and focus of these sessions depend on flipped instruction in foundational concepts that enable students to begin their research in advance of the library session. We provide this instruction through a suite of five interactive tutorials covering the evaluation of information, periodicals and scholarly communication, databases, and search techniques.1 This flipped instruction ensures students have a common baseline of knowledge and allows librarians to focus their one-shots on students’ application of what they have learned.

Each tutorial contains several questions, and two have additional quizzes, but assessment is not the priority. They are primarily designed for formative learning: a learning experience made effective through the learner’s self-guided discovery, one that brings each student to the same point of knowledge and understanding.

Frame-based, live-web tutorials

UVM created these five tutorials in Guide on the Side, the University of Arizona’s open source platform for creating frame-based, live-web tutorials.2 The learner accesses the tutorial through their web browser, where a narrow left-side frame presents directions and poses questions. This frame is also where the learner submits responses. The rest of the browser window is a larger right-side frame that allows the learner to engage with live web content, navigating, scanning, and scrolling in order to follow directions, complete tasks, and find answers.
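To make the structure concrete, the sketch below mocks up this kind of two-pane layout: a narrow left pane for directions, a question, and a response box, and a larger right pane showing live web content in an iframe. It is a minimal, hypothetical illustration of the general model, not code from Guide on the Side or LibWizard; the element structure, function name, and example URL are all assumptions.

```typescript
// Minimal mock-up of a frame-based, live-web tutorial layout (hypothetical,
// not the actual markup of any platform). The left pane holds directions and
// a question; the right pane is an iframe pointed at live web content.

function buildTutorialLayout(liveUrl: string, directions: string): void {
  const container = document.createElement("div");
  container.style.display = "flex";
  container.style.height = "100vh";

  // Narrow left frame: directions, question, and response submission.
  const instructionPane = document.createElement("aside");
  instructionPane.style.width = "25%";
  instructionPane.style.overflowY = "auto";
  instructionPane.innerHTML = `<p>${directions}</p>
    <label>Your answer: <input type="text" id="response"></label>
    <button id="submit">Submit</button>`;

  // Larger right frame: the live website the learner navigates and searches.
  const liveFrame = document.createElement("iframe");
  liveFrame.src = liveUrl;
  liveFrame.style.width = "75%";
  liveFrame.style.border = "none";

  container.append(instructionPane, liveFrame);
  document.body.append(container);
}

// Example: point the live frame at a (hypothetical) library database page.
buildTutorialLayout(
  "https://library.example.edu/databases",
  "Run a search on your topic, then note how many results you retrieve."
);
```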

This model of tutorial strengthens learning, formative learning in particular, in several ways. The use of live web content lends the experience more authenticity than captured content like screencasts. Learners have the autonomy to make their own decisions about how best to complete tasks or respond to what they are seeing, in addition to having control over their pacing and the time taken to complete the tutorial. For example, a student can be prompted to navigate to a library database and submit a search query that is relevant to their topic and answer questions that require critical reflection on the results.

Frame-based tutorials also offer advantages to the librarian who produces them. They may have a significant production time (one study estimated that a Guide on the Side tutorial requires three times as long to develop as a screencast with equivalent content3), but they may provide time savings in the long run by being more scalable. Frame-based tutorials can be reproduced and easily updated in response to changes in web content, or customized to meet the needs of different courses, learning outcomes, or student groups. UVM librarians have used templates and adapted the five FWIL tutorials to tailor tutorials to the needs of other courses and disciplines.

Evaluating products

UVM adopted Guide on the Side principally because of this scalability—each FWIL tutorial may be taken by up to 2,500 freshmen every year—and its suitability for formative learning. Another important consideration was the fact that Guide on the Side is free to install and maintain. As an open source program, it may be “free like kittens” but, once installed, the requirement for back-end maintenance is low.

Yet Guide on the Side also has characteristics that, for UVM, are limitations. Most frustratingly, it has no functions for aggregating or reporting data generated by students’ responses to tutorial questions and quizzes. Instead, each tutorial can route an HTML email “certificate” to the librarian or platform administrator. Analyzing the data in these certificates is then a time-consuming matter of manual data processing, which for UVM has been prohibitive beyond small samples.

In 2016, Springshare launched LibWizard Tutorials, a new module within its LibWizard product and an alternative platform for frame-based tutorials. The two platforms are ostensibly very similar: they create tutorials with the same two-frame structure, they both integrate interactions with live-web content, and they have similar quizzing features. But on closer inspection, LibWizard offered several useful functions that were not available in Guide on the Side, while lacking others that were integral features of our existing tutorials. What might be gained from a switch to LibWizard, and what might be lost or compromised?

Developing an evaluative approach

We needed a detailed review, evaluation, and comparison of the two products. To do this, our instructional design librarian (also our Guide on the Side administrator) and our coordinator of library instruction developed a set of criteria embodying our needs and priorities. We collectively reviewed our existing tutorials, paying closest attention to the ones with the highest usage and the deepest integration into the curriculum, and documented the characteristics that would be necessary in any platform. Our drafting was enhanced, and to a certain degree validated, by considering the research literature on learning behavior and the characteristics of effective learning objects.4


One of UVM’s Guide on the Side tutorials, created by Erica DeFrain.

The same tutorial, recreated in LibWizard.

We then sought input from our library’s director of instructional services, our administrator of Springshare licenses, and our lead users of tutorials across UVM’s two libraries. Facilitating this collective review of our tutorial needs ensured nothing significant had been missed.

We now had our evaluative criteria, which we placed into a simple rubric, organizing them into four categories:

  • Formative learning. Features that support the student’s ability to achieve a tutorial’s learning outcomes completely and in a manner that confers learning and confidence.
  • Summative assessment. Features that support quizzing and grading.
  • Data management. Features that support the aggregation and reporting of performance data.
  • Ease of use. Features that simplify the platform’s adoption, use, and administration.

Each criterion was assigned a level of priority:

  • Essential. Characteristics that are integral to how our tutorials work and the attainment of learning outcomes.
  • Important. Elements that are broadly desirable but not essential.
  • Optional. Elements that offer minor enhancements and would not determine a decision on platform adoption.

We were then able to apply this rubric to each platform, identify their respective strengths and weaknesses, consider the significance of each, and make a final decision.5
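As a rough illustration of the mechanics, the sketch below shows one way a rubric like this could be represented and scored in code: each criterion belongs to one of the four categories and carries a priority that weights its contribution. The criteria, weights, and ratings here are invented placeholders, not our actual rubric (which is linked in note 5).

```typescript
// Illustrative representation of an evaluation rubric (placeholder criteria
// and ratings, not UVM's actual rubric).

type Category =
  | "formative learning"
  | "summative assessment"
  | "data management"
  | "ease of use";
type Priority = "essential" | "important" | "optional";

interface Criterion {
  description: string;
  category: Category;
  priority: Priority;
}

// Hypothetical weighting: essential criteria count most toward the total.
const weights: Record<Priority, number> = { essential: 3, important: 2, optional: 1 };

const criteria: Criterion[] = [
  { description: "Custom feedback for each incorrect option", category: "formative learning", priority: "essential" },
  { description: "Aggregated reporting of student responses", category: "data management", priority: "important" },
  { description: "Graded quizzes exportable to the LMS", category: "summative assessment", priority: "optional" },
];

// ratings[i] is 1 if the platform meets criterion i, otherwise 0.
function score(ratings: number[]): number {
  return criteria.reduce((sum, c, i) => sum + weights[c.priority] * (ratings[i] ?? 0), 0);
}

console.log("Platform A:", score([1, 0, 1])); // meets the 1st and 3rd criteria
console.log("Platform B:", score([0, 1, 1])); // meets the 2nd and 3rd criteria
```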

Advantages of an evaluation rubric

This method of evaluating tutorial platforms offers several benefits for any library that needs to select from a choice of comparable products.

1. Defining your needs

What does your library need from a tutorials platform? The process of creating an evaluation rubric helped us to define and articulate our tutorial needs, based on local circumstances: our students’ levels of information literacy, the points in the curriculum where we have instructional contact with them, the objectives of the library instruction program, the objectives of the FWIL initiative, and the practices of our instructional librarians.

It can be a way to define tutorial needs on a program level. For example, our rubric shows very clearly that our library—especially our instruction in the FWIL initiative—needs a tool supporting formative assessment more than a tool for summative assessment.

It can also be a way to define the features that make a platform suited to different objectives. Creating the rubric drew our attention to the value of customized feedback for each incorrect option in a multiple-choice question. If a student selects a particular incorrect option, it may reflect a particular misunderstanding, and custom feedback can explain what the learner needs to do differently to reach the right answer. The capacity to present this kind of feedback is a feature of Guide on the Side that we were using but had previously undervalued.
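The sketch below illustrates the idea in the abstract: each option in a multiple-choice question carries its own feedback message, so the response a learner sees depends on the specific mistake they made. The question and feedback text are invented for illustration and are not taken from our tutorials or from either platform.

```typescript
// Illustration of per-option feedback in a multiple-choice question
// (invented example content, not taken from the FWIL tutorials).

interface Option {
  text: string;
  correct: boolean;
  feedback: string; // shown when the learner selects this option
}

const question = {
  prompt: "Which source is most likely to be peer reviewed?",
  options: [
    { text: "A scholarly journal article", correct: true,
      feedback: "Correct: scholarly journals typically use peer review." },
    { text: "A newspaper editorial", correct: false,
      feedback: "Editorials are reviewed by editors, not by outside experts; look for a scholarly journal instead." },
    { text: "A trade magazine feature", correct: false,
      feedback: "Trade magazines are written for practitioners but are rarely peer reviewed; check the publication's editorial policy." },
  ] as Option[],
};

// Return the feedback tied to whichever option the learner chose.
function respond(selectedIndex: number): string {
  return question.options[selectedIndex].feedback;
}

console.log(respond(1)); // feedback tailored to that particular wrong answer
```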

2. Evaluating platforms, relative to needs

A rubric makes it possible to evaluate a tutorials platform, relative to the needs that have been defined. To what extent does a platform contain the features you need? If some features are lacking, are they essential, important, or optional? A rubric is also an effective means for guiding a group’s collective evaluation and creating a uniform structure for everyone’s input.

3. Comparing platforms

The comparative aspect of the evaluation helps to identify the areas where one platform is better suited to local needs than the other, and the significance (or otherwise) of each advantage. In our case, the evaluation showed that we were considering similar platforms whose small functional differences, as we see it, embodied significant differences in tutorial design. Our evaluation led to the conclusion that Guide on the Side is oriented towards formative learning, while LibWizard is more oriented towards summative assessment.

4. Creating a tool for the evaluation of future products

We developed our rubric to support the immediate need to select a platform, but we now have a tool for re-evaluating a product in case of future upgrades or for evaluating any new frame-based tutorial platform that might come to market. Given the value of data on student performance, it seems likely that more platforms will emerge in the near future.

5. Creating a tool for identifying unsatisfied needs and advocating for product development

In one sense, our rubric is a wish list of all the things we would like to see in a tutorials platform and, unsurprisingly, neither of the two platforms offered every feature on the list. But identifying “missing” features is important for thinking about a platform’s prospects for further development. It can be the basis for communicating with vendors. By providing detailed feedback on LibWizard, UVM has had some very positive dialogue with Springshare about platform design and development. Finally, identifying missing features is a basis for continuing the conversation within instructional librarianship about what we need from a tutorial platform.

Learning from the process

UVM’s comparative evaluation of these two tutorial platforms has not led to a final decision. We have adopted LibWizard for some tutorials, but continue to use Guide on the Side when the objective is formative learning. Those decisions aside, the process itself has been instructive.

The evaluation showed us clearly, and with much specificity, that the two platforms offer qualitatively different learning experiences that relate to our local instructional needs in different ways. The possibility of switching from one to another requires us to think deeply about how our design practices would need to be rewired in a new platform. In the meantime, our evaluation is a useful resource as we continue to monitor product developments and engage in dialogue with platform users and providers.


Notes
1. Originally created by Erica DeFrain, UVM’s then-instructional design librarian.
2. L. Sult, Y. Mery, R. Blakiston, and E. Kline, “A New Approach to Online Database Instruction: Developing the Guide on the Side,” Reference Services Review 41 (2013): 125–33.
3. S. Mikkelsen and E. McMunn-Tetangco, “Guide on the Side: Testing the Tool and the Tutorials,” Internet Reference Services Quarterly 19 (2014): 271–82.
4. A good overview of the literature is provided in Katherine Stiwinter, “Using an Interactive Online Tutorial to Expand Library Instruction,” Internet Reference Services Quarterly 18 (2013): 15–41.
5. Our rubric is posted at http://go.uvm.edu/libtutrubric.
Copyright © 2017 Graham Sherriff
