Wordtrade.com


Review Essays of Academic, Professional & Technical Books in the Humanities & Sciences


Library Science

Content Evaluation of Textual CD-ROM and Web Databases by Peter Jacso, Carol Tenopir (Database Searching Series: Libraries Unlimited) This richly illustrated professional guide from the award‑winning author and renowned expert Peter Jacso provides comprehensive guidelines for evaluating a wide variety of textual database types available on CD‑ROM and on the Web. As libraries remain the center of guidance for information, librarians and media specialists find themselves in the crucial role of having to critically evaluate database content to meet the information requirements of their patrons. Offering a systematic approach to a set of evaluation criteria, this outstanding textbook and professional resource is essential for librarians, media specialists, faculty and students of library and information science, and those who critique and review databases.

 Information systems identify a database by a variety of database (file) numbers or acronyms. In such situations, an explanatory description, such as "the DIALOG version of Ulrich's" or "Bowker's CD-ROM version of Ulrich's," is used instead of file 480, or Ulrich's Plus on Disc. In other cases, when only one version of a database is used for illustration throughout the book, the name as it appears on the online system is used, such as Social SciSearch for the DIALOG version of the ISI citation database. Graphs may use shorter acronyms for space reasons (such as SocSCI), but the related narrative text will use the preferred longer name format. In the case of search screen shots, the names or codes used by the host system will appear, such as file 7 for Social SciSearch. Again, the surrounding text will provide adequate clues.

Many of the tests for evaluating database content characteristics are software‑dependent. DIALOG provides by far the best possibilities for testing and is used most often in this book. It is also the service most widely used by schools of library and information studies. However, examples are also shown from other implementations of the databases, including the online and CD‑ROM search software of Ovid, SilverPlatter, Bell & Howell Information and Learning, H. W. Wilson, and several others (mostly designed for Web implementations) that are uniquely developed for a particular database by the original content provider, such as Barnes & Noble, Amazon.com, or the All Movie Guide. (Only the URL of free Web databases will be given in the text, and only when a specific example may make it necessary for the reader to consult the site or a specific page of the site. The sites are easy to find by their names cited here, using the various Web directories.)

You may not have the time, resources, or interest to evaluate databases by all the criteria discussed in this book. Some criteria may not apply to the database in which you are interested. Nevertheless, these criteria will be helpful in creating a checklist of questions to ask the database publishers, especially in the case of the fee‑based services, when you are entitled to get detailed answers to such questions as the pattern of coverage of purportedly core journals, the time lag between the publishing of primary documents and the availability of their records in a database, or the availability of specific data elements, such as document type or LC classification number across the entire database.

The content of databases keeps changing, mostly for the better, although there are exceptions. Information Science Abstracts was taken over in 1998 by Information Today. Although it has been improving in some respects, it does not come close in quality (as was expected) to Internet & Personal Computing Abstracts (IPCA), formerly known as Microcomputer Abstracts, the flagship database of Information Today.

Although records added since 1998 are of better quality (certainly the ones "handed down" by IPCA), the database added half as many records a year as before and dropped essential information science journals or reduced their coverage to a bare minimum, as illustrated in chapters 4 and 5 about database dimensions and database source coverage. Other database publishers may have solved some of the problems discussed in this book. In your evaluation you will need to examine the then‑current version of the databases. Unless noted otherwise, the tests for this book were done between July 1999 and November 2000, with some of them being run again in July 2001 when reading the galley proofs of the book.

I have provided consulting services for several of the companies mentioned in this book, and I am a columnist for Information Today and the Gale Group, which are, in turn, producers of files and in some cases also publishers of databases. I have made every effort not to let this fact influence my judgment about the quality of their databases.
