
An examination of evaluation methods for comparing two information retrieval systems supporting teacher performance

Posted on: 2005-06-08
Degree: Ph.D.
Type: Dissertation
University: Indiana University
Candidate: Schatz, Steven Craig
Full Text: PDF
GTID: 1459390008492299
Subject: Information Science
Abstract/Summary:
This study examines existing and new methods for evaluating the success of information retrieval systems. The theory underlying current methods is not robust enough to handle the current volume of information. Traditional measures rely on judgments of whether a document is relevant to a particular question: a good system returns all the relevant documents and no extraneous ones. There is a rich literature questioning the efficacy of relevance judgments. Questions such as relevant to whom, when, and for what purpose are not well answered by traditional theory.

In this study, two new measures (Spink's Information Need and Cooper's Utility) are used to evaluate two systems, and the new measures are compared with traditional measures and with each other.

Two very different search systems were used to search the same set of 500 documents. One, a text-based system, resembled most common web search engines. The other used a series of metadata tags for searching.

Thirty-four educators searched for information using both search engines and evaluated the information retrieved by each. The participants searched a total of four times, twice with each system. Construct measures, derived by multiplying each of the three measures (traditional, information need, and utility) by a rating of satisfaction, were compared using two-way analysis of variance.

Results indicated a significant correlation among the three measures, so the new measures provide an equivalent way of evaluating systems while offering some significant advantages, including no need for relevance judgments and easy application in situ. While the main focus of the study was on the methods of evaluation, the evaluation in this case showed that the text-based system performed better than the tag-based system.
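The analysis described above can be illustrated with a minimal sketch in Python, assuming hypothetical column names and simulated scores; this is not the author's code, and it treats the design as a plain two-way ANOVA (system by measure type) on the construct scores using statsmodels, rather than a repeated-measures model.

# Minimal sketch of the analysis: construct scores = measure x satisfaction,
# correlations among the three construct measures, and a two-way ANOVA.
# Column names and data are hypothetical placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 136  # 34 participants x 4 searches (two per system)
df = pd.DataFrame({
    "participant": np.repeat(np.arange(34), 4),
    "system": np.tile(["text", "text", "tag", "tag"], 34),
    "traditional": rng.uniform(0, 1, n),   # traditional relevance score
    "info_need": rng.uniform(0, 1, n),     # Spink's Information Need score
    "utility": rng.uniform(0, 1, n),       # Cooper's Utility score
    "satisfaction": rng.integers(1, 6, n), # satisfaction rating, 1-5
})

# Construct measures: each measure multiplied by the satisfaction rating.
for m in ["traditional", "info_need", "utility"]:
    df[f"{m}_construct"] = df[m] * df["satisfaction"]

# Correlation among the three construct measures.
print(df[["traditional_construct", "info_need_construct",
          "utility_construct"]].corr())

# Two-way ANOVA: construct score by system and by measure type.
long = df.melt(id_vars=["participant", "system"],
               value_vars=["traditional_construct", "info_need_construct",
                           "utility_construct"],
               var_name="measure", value_name="score")
model = smf.ols("score ~ C(system) * C(measure)", data=long).fit()
print(sm.stats.anova_lm(model, typ=2))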
Keywords/Search Tags: System, Methods, Information, Evaluation, New