Computer Science – Information Retrieval
Scientific paper
2010-10-09
7 pages, 3 figures, LWA 2010, Workshop IR
This paper presents an information retrieval evaluation of three retrieval-supporting services. All three services were designed to compensate for typical problems that arise in metadata-driven Digital Libraries and that are not adequately handled by simple tf-idf based retrieval. The services are: (1) a query expansion mechanism based on co-word analysis, and re-ranking via (2) Bradfordizing and (3) author centrality. The services are evaluated with relevance assessments conducted by 73 information science students. Since the students are neither information professionals nor domain experts, the question of inter-rater agreement is taken into consideration. Two important implications emerge: (1) the inter-rater agreement rates were mainly fair to moderate, and (2) after a data-cleaning step that removed the assessments with poor agreement rates, the evaluation data show that the three retrieval services returned disjoint but still relevant result sets.
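
The abstract does not name the agreement measure behind the reported "fair to moderate" rates; on the Landis and Koch scale, fair corresponds to kappa values of 0.21-0.40 and moderate to 0.41-0.60, so a kappa-style statistic is the likely instrument. As a minimal sketch, assuming Fleiss' kappa (a standard choice for more than two raters) and an entirely hypothetical judgment matrix not taken from the paper, agreement could be computed as follows:

    import numpy as np

    def fleiss_kappa(counts):
        """Fleiss' kappa for an (items x categories) count matrix.

        counts[i][j] = number of raters who judged item i to be in
        category j; every row must sum to the same number of raters.
        """
        counts = np.asarray(counts, dtype=float)
        n_items, _ = counts.shape
        n_raters = counts[0].sum()

        # Observed agreement: mean fraction of agreeing rater pairs per item.
        p_i = ((counts ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
        p_bar = p_i.mean()

        # Chance agreement from the marginal category proportions.
        p_j = counts.sum(axis=0) / (n_items * n_raters)
        p_e = (p_j ** 2).sum()

        return (p_bar - p_e) / (1 - p_e)

    # Hypothetical data: 4 documents, 5 raters each, binary judgments
    # (columns: not relevant, relevant). Not taken from the paper.
    judgments = [[1, 4], [2, 3], [0, 5], [3, 2]]
    print(fleiss_kappa(judgments))  # ~0.05: only slight agreement

In a data-cleaning step like the one the paper describes, assessments with such low agreement would be erased before comparing the three services; the exact measure and threshold the authors used are not stated in the abstract.
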
Philipp Mayr, Peter Mutschke, Philipp Schaer