An Evaluative Model for Information Retrieval System Evaluation: A User-centered Approach
Authors
Abstract
Information retrieval (IR) systems are a key technology for knowledge management, guaranteeing access to large corpora of both structured and unstructured data; search engines are the variety most commonly used on an everyday basis. This study developed and validated an evaluative model for assessing these systems from the user's perspective, following a user-centered approach. Items used and validated in related studies were employed to elicit responses from over 250 users. The reliability and validity of the measurement instrument (MI) were demonstrated using statistics such as internal consistency, composite reliability, and convergent validity. After the MI was assessed for reliability and validity, the resulting evaluative model was estimated for goodness-of-fit using the structural equation modeling (SEM) technique. The results confirmed that the proposed model is valid and will be useful to researchers who wish to apply it. Thus, this study offers both the parameters and the methods employed to formulate the model for use in user-centered studies evaluating IR systems. The evaluative model and the factor-analytic methods of data analysis could both be used to identify and present additional factors (parameters) usable for IR system evaluation under the user-centered approach.
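The reliability and validity checks named in the abstract (internal consistency, composite reliability, convergent validity) are standard scale-assessment statistics. The sketch below is not the authors' code: it only illustrates how such checks are commonly computed, with a hypothetical construct, made-up item columns, and assumed standardized loadings; a full SEM goodness-of-fit assessment would additionally require an SEM package (e.g., lavaan in R or semopy in Python).

```python
# Minimal sketch (not the study's actual analysis): reliability and convergent
# validity statistics of the kind the abstract describes, computed with
# NumPy/pandas. Item columns and loadings below are hypothetical placeholders.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency of a set of Likert-scale items for one construct."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """CR from standardized factor loadings (error terms assumed uncorrelated)."""
    error_var = 1 - loadings ** 2
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + error_var.sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE; convergent validity is commonly judged adequate when AVE > 0.5."""
    return (loadings ** 2).mean()

# Example usage with simulated Likert responses for one hypothetical construct.
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 6, size=(250, 4)),
                         columns=["q1", "q2", "q3", "q4"])
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))

loadings = np.array([0.78, 0.81, 0.74, 0.69])  # hypothetical standardized loadings
print("CR:", round(composite_reliability(loadings), 3))
print("AVE:", round(average_variance_extracted(loadings), 3))
```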
Similar resources
Behavioral Considerations in Developing Web Information Systems: User-centered Design Agenda
The current paper explores the design of a web information retrieval system with regard to the searching behavior of users in real, everyday life. Designing an information system that is closely linked to human behavior is equally important for providers and end users. From an Information Science point of view, four approaches to designing information retrieval systems were identified as system-...
Review of ranked-based and unranked-based metrics for determining the effectiveness of search engines
Purpose: Traditionally, many metrics have been used for evaluating search engines; nevertheless, various researchers have proposed new metrics in recent years. Awareness of these new metrics is essential for conducting research in the field of search engine evaluation. So, the purpose of this study was to provide an analysis of important and new metrics for evaluating search engines. Methodology: This is ...
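For readers unfamiliar with the distinction that review draws, the short sketch below illustrates one unranked-based metric (set-based precision/recall) and one ranked-based metric (nDCG@k). The relevance judgments and document IDs are made-up values for demonstration, not data from the cited work.

```python
# Illustrative sketch: an unranked (set-based) metric vs. a ranked metric.
import math

def precision_recall(retrieved: set, relevant: set) -> tuple:
    """Unranked effectiveness over the whole result set."""
    hits = len(retrieved & relevant)
    return hits / len(retrieved), hits / len(relevant)

def dcg(gains, k):
    """Discounted cumulative gain over the top-k ranked results."""
    return sum(g / math.log2(i + 2) for i, g in enumerate(gains[:k]))

def ndcg(gains, k):
    """nDCG@k: DCG normalised by the DCG of the ideal (sorted) ranking."""
    ideal = dcg(sorted(gains, reverse=True), k)
    return dcg(gains, k) / ideal if ideal > 0 else 0.0

# Example: graded relevance of the top 5 results returned for one query.
gains = [3, 2, 0, 1, 0]  # hypothetical judgments
print("nDCG@5:", round(ndcg(gains, 5), 3))
print("P, R:", precision_recall({"d1", "d2", "d3"}, {"d1", "d3", "d7", "d9"}))
```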
Applying the Expertise Retrieval Model to Find Expert Authors
This research applied the Expertise Retrieval model to finding expert authors, and used the evaluation methods of Information Retrieval systems to measure the performance of those models. The current research is experimental. In addition, a variety of methods, including the survey method, were used in the research process. Various models were developed for finding expert authors, all built on a known ...
User-centered Measures vs. System Effectiveness in Finding Similar Songs
User evaluation in the domain of Music Information Retrieval (MIR) has been very scarce, while algorithms and systems in MIR have been improving rapidly. With the maturity of system-centered evaluation in MIR, the time is ripe for MIR evaluation to involve users. In this study, we compare user-centered measures to a system effectiveness measure on the task of retrieving similar songs. To collect us...
Evaluation in Information Retrieval
Over its 40-year history, information retrieval evaluation has evolved with an emphasis on laboratory experimentation, the "Cranfield paradigm", in response to early demands for experimental rigour. The current and highly successful TREC experiments follow this model. However, the demand for results that are robust and scalable has caused an emphasis on system-centered evaluation at the expense of ...
Journal title:
Volume Issue
Pages -
Publication date: 2011