ACRL

College & Research Libraries News

User reaction to a computerized periodical index

By Douglas J. Ernest, Acting Business & Economics Librarian, Colorado State University

and Jennifer Monath, Assistant Reference Librarian, Colorado State University

InfoTrac’s track record at Colorado State University.

Periodical indexes, a standby for accessing library resources, have remained essentially unchanged since their introduction many decades ago. Now a new approach to indexing is at hand. InfoTrac, a turnkey software/hardware/data package utilizing videodisc technology, was developed by Information Access Company (IAC) and made available to libraries in the spring of 1985. At its introduction the database consisted of half a million citations to articles found in approximately one thousand business, general interest, and legal periodicals, as well as several newspapers.

The citations are mounted on a videodisc, which is changed and updated monthly. Coverage is retrospective to January 1982. Hardware consists of IBM personal computers, Hewlett-Packard ThinkJet printers, and Pioneer videodisc players. The system can accommodate up to six user stations, each consisting of a PC and printer. From the first, IAC intended that InfoTrac provide a search capability for general users that required little or no assistance from library staff.1

The user searches InfoTrac by keying in a word or phrase for the topic sought and pressing the color-coded “search” key. The program searches the videodisc and either displays the full index of citations for that subject or responds with “no exact match for your request” and displays the alphabetical portion of the thesaurus that most closely corresponds to the suggested subject. The automated search process should last no more than ten seconds, even when four users are searching simultaneously.2 The searcher can move the cursor to any subject heading in the thesaurus or to any see or see also reference in the main index and, by pressing the “search” key, have the system display the citations entered at that reference. Subject headings and see or see also headings do not have to be keyed in once they have been selected with the cursor. The user can browse line by line or screen by screen through either the thesaurus or the full index. Subject headings and citations can be printed at any point, either entry by entry or a screen at a time. Upon completion, the user can return the system to its original display by pressing a “start/finish” key.3
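For readers who want a concrete picture of this flow, the short sketch below mimics it in Python. The headings, citations, and matching rule are invented for illustration and deliberately simplified; this is not IAC’s software, merely a model of the behavior described above (an exact subject match displays citations, anything else yields “no exact match for your request” plus the nearest alphabetical stretch of the thesaurus).

```python
# Illustrative sketch only; the index contents and matching rule are invented.
import bisect

# Toy subject index: heading -> list of citation strings.
SUBJECT_INDEX = {
    "AIRLINES": ["Citation A", "Citation B"],
    "COLORADO": ["Citation C"],
    "MARKETING": ["Citation D", "Citation E"],
}

# Alphabetical thesaurus of headings, consulted when no exact match is found.
THESAURUS = sorted(SUBJECT_INDEX)

def search(term: str):
    """Return ('citations', [...]) on an exact subject match, otherwise
    ('no exact match for your request', nearby thesaurus headings)."""
    key = term.strip().upper()
    if key in SUBJECT_INDEX:
        return "citations", SUBJECT_INDEX[key]
    # Show the alphabetical portion of the thesaurus closest to the term,
    # which the user can then browse and select with the cursor.
    pos = bisect.bisect_left(THESAURUS, key)
    return "no exact match for your request", THESAURUS[max(0, pos - 1):pos + 2]

print(search("marketing"))   # exact heading: citations are displayed
print(search("colleges"))    # no match: nearby headings offered for browsing
```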

At present the InfoTrac database includes selected material from several IAC sources: Magazine Index, National Newspaper Index, Business Index, and Legal Resources Index. The videodisc has a capacity of 2-2.5 million references, approximately one gigabyte, the equivalent of five years of citations. The database uses mostly Library of Congress subject headings, with each citation including article title, author, periodical title, date, beginning page number, and number of pages in the article. Volume numbers have, however, been omitted. Menu screens provide the user with assistance in regard to the scope of the database, searching procedures, and function keys.4
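As an illustration of the citation fields just listed, the hypothetical record layout below captures what each InfoTrac entry carries, including the absence of a volume number. The field names and example values are our own assumptions, not IAC’s internal format.

```python
# Hypothetical record layout for the citation fields described above.
from typing import NamedTuple

class Citation(NamedTuple):
    article_title: str
    author: str
    periodical_title: str
    date: str           # e.g. "June 1985"
    start_page: int     # beginning page number
    page_count: int     # number of pages; note there is no volume field

example = Citation(
    article_title="Sample article title",
    author="Doe, Jane",
    periodical_title="Sample Periodical",
    date="June 1985",
    start_page=42,
    page_count=3,
)
print(example)
```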

IAC first demonstrated InfoTrac to the library community at ALA Midwinter in January 1985. Believing that the system could only receive a thorough test under actual user conditions, the company offered InfoTrac to a number of libraries on a trial basis.5 The University of California, San Diego, was one of these “beta” sites. Use of the system there was continual, and patrons were pleased with it; the ability to have citations printed was especially valued. One potential difficulty did emerge. Whereas InfoTrac designers had intended that a typical search could be accomplished in five minutes or less, UC San Diego librarians found that some users spent considerable time searching and printing. Hours could be consumed by one person. Another problem proved to be a familiar computer system bugaboo: users assumed the database was comprehensive in scope and consulted it accordingly. On the whole, though, the library staff, like the patrons, reacted favorably to InfoTrac.6

The Colorado State University Libraries (CSUL) became involved early on as a test site for InfoTrac. Beyond the beta test sites, CSUL was the first library to express an interest in testing the system. In the spring of 1985, Stephen Green, the head of the Reference Department, met with an IAC representative, who agreed to the use of CSUL as a test site. In July 1985 InfoTrac was exhibited at ALA again, this time in Chicago. The system drew considerable interest from libraries around the country, and the University of Colorado (CU) and the University of Northern Colorado (UNC) also became interested in participating as test sites. The testing period for all three libraries took place in the fall of 1985; all then chose to purchase a four-station InfoTrac.

The InfoTrac system at CSUL was set up on September 24, 1985. For the first four to five weeks the system worked without a hitch and was well received by users. Beginning the last week of October, however, instability problems developed, and the system slowly degraded to the point where it was down completely. Every component in the laser control cabinet was replaced three times. An IAC microsystems analyst visited the site, replaced some parts, and brought the system up, but it stayed up for only three days before the same problems recurred: the system would lock and die. No matter what was done, the problems returned; instability became permanent and unreliability became the status quo, leading to increasing and persistent frustration among students, staff, and faculty. Because InfoTrac was working very smoothly in the other Colorado libraries, it became quite apparent that this was not a system design flaw. As time passed, with much equipment replacement occurring, we began to look beyond the system to its environment. We noticed that fluorescent bulbs on the same electrical circuit flickered from time to time, and surmised that these flickerings indicated power surges; in other words, we had a dirty line. A dedicated line was installed, and since then virtually no problems have occurred.

Another factor contributing to the system’s instability may have been the daily powering on and off of the monitors, memory, and laser disc. InfoTrac is now left on 24 hours a day under normal circumstances. As a result of our experience with system instability, we strongly recommend providing a dedicated electrical line for the system and leaving it powered on as much as possible.

A user survey was conducted during the test period at CSUL. A total of 229 surveys were returned and analyzed. Undergraduates proved to be the heaviest users. All nine CSU colleges were represented: Business; Professional Studies; Natural Resources and Forestry; Veterinary Medicine; Agriculture; Arts, Humanities and Social Sciences; Engineering; Natural Sciences; and Human Resources Sciences. Of these, the Business College and the College of Arts, Humanities and Social Sciences were the predominant users. The three leading majors were Marketing, Computer Information Science, and Management. Business topics were searched most frequently, with social sciences a close second.

The majority of respondents were already familiar with either Magazine Index or Business Index; approximately 95% preferred InfoTrac because it was easier to use, saved time, and was more accurate. They also found the instructions for use very clear, validating IAC’s claim that InfoTrac is user-friendly. Respondents overwhelmingly wanted InfoTrac purchased in spite of the fact that there was often a wait to use it. Another frequently offered comment was that more stations were needed.

A similar user survey from UNC indicated that patrons thought InfoTrac was a great enhancement to the library, that it could be used without any formal training, that it was easy to use and timesaving, and that the “help” screens provided valuable information for using the system. However, data collected showed that users wanted a better printer and preferred to have UNC’s serials list on the system.

InfoTrac users in action.

A user survey at CU also indicated that patrons were highly satisfied with InfoTrac and were particularly happy with its ease of use. Respondents preferred the system to printed periodical indexes by a margin of four to one. Suggestions were made to obtain more terminals and to establish a thirty-minute time limit on usage.

Overall, InfoTrac was a big success at all three Colorado test sites. Users readily adapted to the system and quickly became dependent on it; as a result, great dissatisfaction arose at CSUL any time the system was down. At least one student was seen to walk up to a terminal, curse when he noted that it was out of order, and leave the library. Others showed great persistence in gaining access to the system. Some drove to UNC, thirty miles distant, to use the InfoTrac there. Others were willing to postpone their literature searches until the system had been repaired, refusing to use print indexes in the interim because “InfoTrac is so much faster!”

Apart from the lengthy periods of down time, CSUL experienced many of the same successes and frustrations with InfoTrac as UC San Diego did. Chief among the frustrations was the extended use of the system by individual patrons. Such users would monopolize a station for anywhere from thirty minutes to two hours, sometimes accompanied by a tendency to print reams of citations. One student was observed copying every citation under the heading “airlines” as fast as the entries appeared on the screen, making no attempt to discriminate among the various subheadings. One way to curb lengthy searches would be to set a time limit on at least some of the terminals and to place the terminals at a height that requires the user to stand. Still, the ability to print citations free of charge is one of the major attractions of the system, and a certain amount of excess use can be expected.

As the survey indicated, most students were able to use InfoTrac without asking for assistance from library staff. The searches they conducted apparently were satisfactory, as few complained about not finding information on their subject. Probably not all were using the system as effectively as possible, however. Although one searches by entering a term or terms, the search itself is not a keyword search; instead the system matches the entry against the subject thesaurus to find an appropriate heading. For example, the best strategy for finding articles on parks in Colorado is to search “Colorado,” browse the subject thesaurus until the subheading “Colorado—Parks” is found, then use the cursor to command the system to search that heading. A direct search using “Colorado—Parks” or “Colorado Parks” as a heading or phrase would not work. We at CSUL have yet to determine how many users are aware of the value of the thesaurus in framing their searches. A closer examination of their searching techniques would probably reveal what has already been noted in other library end-user contexts: “…an expectation that the computer is intelligent or at least has some powers of interpretation.”7
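To make the distinction concrete, the sketch below models the lookup as an exact match against stored top-level headings, so that a typed phrase such as “Colorado—Parks” or “Colorado Parks” finds nothing, while “Colorado” succeeds and exposes its subheadings for selection with the cursor. The stored headings and the matching rule are assumptions chosen for illustration; InfoTrac’s actual matching logic was proprietary.

```python
# Hypothetical model of heading-based (not keyword) matching.
# Headings below are invented; "--" stands in for the subdivision dash.

# Top-level headings mapped to their subdivisions.
HEADINGS = {
    "COLORADO": ["COLORADO--DESCRIPTION AND TRAVEL", "COLORADO--PARKS"],
    "PARKS": ["PARKS--LAW AND LEGISLATION"],
}

def lookup(term: str):
    """Match the typed term against top-level headings only; return the
    subdivisions the user could then browse and select with the cursor."""
    return HEADINGS.get(term.strip().upper())

print(lookup("Colorado"))          # lists subheadings, incl. COLORADO--PARKS
print(lookup("Colorado--Parks"))   # None: the subdivided phrase is not a key
print(lookup("Colorado Parks"))    # None: nor is the unpunctuated phrase
```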

The idea that a computer-based system has some ability to interpret requests may, in part, account for the popularity of InfoTrac. The system is utilized by both undergraduate and graduate students to search a large variety of topics, including some that are technical and scientific in nature, and therefore not appropriate considering the contents of the database. It is commonplace for library patrons to wait patiently to use InfoTrac rather than consult printed periodical indexes, despite the fact that the latter are situated only a few feet away. Indeed, for some, it was as though printed indexes had ceased to exist; when InfoTrac crashed, they could conceive of no other method of locating information.

Another facet of the response to the system is probably the fact that it is the first end-user system to become available at CSUL. An online catalog is in the design stage, but still some months away. As the only computer show in town, InfoTrac has drawn a real crowd. On the other hand, CSUL users are therefore perhaps less computer-sophisticated than library patrons in other settings; hence there have been only a few requests for search features such as keyword and Boolean searching, both absent from InfoTrac. Most complaints received thus far concern the fact that CSUL does not subscribe to all the periodicals indexed in InfoTrac, rather than any perceived limitations of the database itself. Curiously, there has been little comment here on the fact that InfoTrac provides more timely citations than do the print indexes. Perhaps users simply assume that a computer by nature must be more current than a book.

Library users at Colorado State University have been highly satisfied with InfoTrac, particularly since the down-time problems have apparently been solved. The system is quite user-friendly and is used by undergraduates, graduate students, and faculty alike. InfoTrac, like any index, has limitations, but these are less apparent and less significant to users than to librarians. The fact that InfoTrac is computer-based lends it an aura of “magic” that print indexes lack. This suggests that users need careful instruction to inform them of the limitations of a computer-based system; otherwise, they may assume it is universal in scope. Further, one suspects that end-user searching techniques are not as sophisticated or effective as they could be, just as users’ handling of the card catalog often defies the efforts of catalogers. ■■

Notes

  1. Richard Carney, “InfoTrac: an Inhouse Computer-Access System,” Library Hi-Tech 3 (Issue 10, 1985): 91-92; “Information Access Company Introduces Videodisc System,” Information Technology and Libraries 4 (March 1985): 70-71.
  2. Richard Carney, “Information Access Company’s InfoTrac,” Information Technology and Libraries 4 (June 1985): 152.
  3. “Information Access Company Introduces Videodisc System,” p. 70.
  4. Carney, “Information Access Company’s InfoTrac,” p. 152; Carney, “InfoTrac,” pp. 91-92.
  5. Carney, “Information Access Company’s InfoTrac,” p. 150; Carney, “InfoTrac,” p. 91.
  6. Carney, “InfoTrac,” p. 92; Judith Herschman and Kristen Maultsby, “InfoTrac: Impressions from a Beta Site,” Library Hi-Tech 3 (Issue 10, 1985): 93-94.
  7. Jean Dickson, “An Analysis of User Errors in Searching an Online Catalog,” Cataloging and Classification Quarterly 4 (Spring 1984): 35.
Copyright © American Library Association
