Research Repository

Reproduction of experiments in recommender systems evaluation based on explanations

Polatidis, Nikolaos; Pimenidis, Elias

Contributors

Chrisina Jayne
Editor

Abstract

The offline evaluation of recommender systems is typically based on accuracy metrics such as the Mean Absolute Error (MAE) and the Root Mean Squared Error (RMSE), while Precision and Recall are used to measure the quality of the top-N recommendations. However, results are difficult to reproduce: different libraries can be used to run the experiments, and even within the same library there are many settings that, if not taken into consideration, can cause replicated results to vary. In this paper, we show that it is challenging to reproduce results using a different library, but that, with the same library, an explanation-based approach can be used to assist in the reproducibility of experiments. Our proposed approach has been experimentally evaluated using a real dataset, and the results show that it is both practical and effective.
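For readers unfamiliar with the metrics named above, the following is a minimal sketch of their standard textbook definitions, not the paper's own code; the function names and the sample data are illustrative only:

```python
import math

def mae(actual, predicted):
    """Mean Absolute Error over parallel lists of ratings."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root Mean Squared Error over parallel lists of ratings."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def precision_recall_at_n(recommended, relevant, n):
    """Precision and Recall of the top-N recommended items against the relevant set."""
    top_n = set(recommended[:n])
    hits = len(top_n & set(relevant))
    return hits / n, hits / len(relevant)

# Illustrative ratings: four (actual, predicted) pairs.
actual = [4.0, 3.0, 5.0, 2.0]
predicted = [3.5, 3.0, 4.0, 2.5]
print(mae(actual, predicted))                              # → 0.5
print(round(rmse(actual, predicted), 3))                   # → 0.612
print(precision_recall_at_n(["a", "b", "c"], ["b", "d"], 2))  # → (0.5, 0.5)
```

Even for definitions this simple, libraries differ in details such as how unrated items or ties in the top-N ranking are handled, which is one source of the replication variance the abstract describes.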

Publication Date Aug 26, 2018
Peer Reviewed Peer Reviewed
Series Title Communications in Computer and Information Science
Series Number 893
Book Title Engineering Applications of Neural Networks
ISBN 9783319982038
APA6 Citation Polatidis, N., & Pimenidis, E. (2018). Reproduction of experiments in recommender systems evaluation based on explanations. In E. Pimenidis, & C. Jayne (Eds.), Engineering Applications of Neural Networks. Springer International Publishing AG. https://doi.org/10.1007/978-3-319-98204-5
DOI https://doi.org/10.1007/978-3-319-98204-5
Keywords recommender systems, evaluation, explanations, reproducibility
Publisher URL https://doi.org/10.1007/978-3-319-98204-5
Additional Information The final publication is available at Springer via https://doi.org/10.1007/978-3-319-98204-5
