Automated evaluation of search engine performance via implicit user feedback

Himanshu Sharma, Bernard Jansen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

19 Citations (Scopus)

Abstract

Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data regarding system performance. We capture user relevance judgment actions, such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision. We can calculate an overall system performance metric based on a collection of weighted metrics.
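The abstract outlines a concrete pipeline: treat actions such as print, save, and bookmark as implicit relevance judgments, compute per-metric scores such as precision from them, and combine the metrics under weights into one overall score. The following is a minimal sketch of that idea in Python; the specific action set, the `overall_performance` helper, and the weight values are illustrative assumptions, not details taken from the paper.

```python
# Actions treated as implicit relevance judgments (illustrative assumption).
IMPLICIT_RELEVANCE_ACTIONS = {"print", "save", "bookmark"}

def precision(result_actions):
    """Fraction of returned documents the user implicitly judged relevant.

    result_actions: one set of observed actions per returned document.
    """
    if not result_actions:
        return 0.0
    relevant = sum(1 for actions in result_actions
                   if actions & IMPLICIT_RELEVANCE_ACTIONS)
    return relevant / len(result_actions)

def overall_performance(metrics, weights):
    """Weighted combination of metric scores, normalised by total weight."""
    total = sum(weights.values())
    return sum(weights[name] * metrics[name] for name in weights) / total

# Example: 10 results; the user printed one document and bookmarked another.
actions = [set() for _ in range(10)]
actions[0] = {"print"}
actions[3] = {"bookmark", "view"}
p = precision(actions)  # 2 implicitly relevant documents out of 10 -> 0.2
score = overall_performance({"precision": p, "recall": 0.5},
                            {"precision": 0.7, "recall": 0.3})
```

In a deployment like the one described, the `result_actions` data would arrive at the central server from the client application along with document identifiers, so the same metrics could be recomputed continuously as users interact with results.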

Original language: English
Title of host publication: SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
Pages: 649-650
Number of pages: 2
DOIs: https://doi.org/10.1145/1076034.1076172
Publication status: Published - 2005
Externally published: Yes
Event: 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2005 - Salvador
Duration: 15 Aug 2005 - 19 Aug 2005

Other

Other: 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2005
City: Salvador
Period: 15/8/05 - 19/8/05

Keywords

  • implicit user feedback
  • search engine evaluation

ASJC Scopus subject areas

  • Information Systems

Cite this

Sharma, H., & Jansen, B. (2005). Automated evaluation of search engine performance via implicit user feedback. In SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 649-650) https://doi.org/10.1145/1076034.1076172

Automated evaluation of search engine performance via implicit user feedback. / Sharma, Himanshu; Jansen, Bernard.

SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. 2005. p. 649-650.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Sharma, H & Jansen, B 2005, Automated evaluation of search engine performance via implicit user feedback. in SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. pp. 649-650, 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2005, Salvador, 15/8/05. https://doi.org/10.1145/1076034.1076172
Sharma H, Jansen B. Automated evaluation of search engine performance via implicit user feedback. In SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. 2005. p. 649-650 https://doi.org/10.1145/1076034.1076172
Sharma, Himanshu ; Jansen, Bernard. / Automated evaluation of search engine performance via implicit user feedback. SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. 2005. pp. 649-650
@inproceedings{0d872ddfc51247a3a5e5986c4faca7c8,
title = "Automated evaluation of search engine performance via implicit user feedback",
abstract = "Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data regarding system performance. We capture user relevance judgment actions, such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision. We can calculate an overall system performance metric based on a collection of weighted metrics.",
keywords = "implicit user feedback, search engine evaluation",
author = "Himanshu Sharma and Bernard Jansen",
year = "2005",
doi = "10.1145/1076034.1076172",
language = "English",
isbn = "1595930345",
pages = "649--650",
booktitle = "SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval",

}

TY - GEN

T1 - Automated evaluation of search engine performance via implicit user feedback

AU - Sharma, Himanshu

AU - Jansen, Bernard

PY - 2005

Y1 - 2005

N2 - Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data regarding system performance. We capture user relevance judgment actions, such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision. We can calculate an overall system performance metric based on a collection of weighted metrics.

AB - Measuring the information retrieval effectiveness of Web search engines can be expensive if human relevance judgments are required to evaluate search results. Using implicit user feedback for search engine evaluation provides a cost- and time-effective way of addressing this problem. Web search engines can use human evaluation of search results without the expense of human evaluators. An additional advantage of this approach is the availability of real-time data regarding system performance. We capture user relevance judgment actions, such as print, save, and bookmark, sending these actions and the corresponding document identifiers to a central server via a client application. We use this implicit feedback to calculate performance metrics, such as precision. We can calculate an overall system performance metric based on a collection of weighted metrics.

KW - implicit user feedback

KW - search engine evaluation

UR - http://www.scopus.com/inward/record.url?scp=84879848684&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84879848684&partnerID=8YFLogxK

U2 - 10.1145/1076034.1076172

DO - 10.1145/1076034.1076172

M3 - Conference contribution

SN - 1595930345

SN - 9781595930347

SP - 649

EP - 650

BT - SIGIR 2005 - Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval

ER -