
Explaining recommendations in an interactive hybrid social recommender

Chun-Hua Tsai, Peter Brusilovsky

IUI '19: 24th International Conference on Intelligent User Interfaces
Session: Explanations in Recommender Systems

Abstract
Hybrid social recommender systems use social relevance from multiple sources to recommend relevant items or people to users. To make hybrid recommendations more transparent and controllable, several researchers have explored interactive hybrid recommender interfaces, which allow for a user-driven fusion of recommendation sources. In this line of work, the intelligent user interface has been investigated as an approach to increase transparency and improve the user experience. In this paper, we attempt to further promote the transparency of recommendations by augmenting an interactive hybrid recommender interface with several types of explanations. We evaluate user behavior patterns and subjective feedback in a within-subject study (N=33). The evaluation results show the effectiveness of the proposed explanation models. The results of the post-treatment survey indicate a significant improvement in the perception of explainability, but this improvement comes with a lower degree of perceived controllability.
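The user-driven fusion described in the abstract can be pictured as a weighted blend of per-source relevance scores, with the weights exposed to the user (for example, through sliders in the interface). Below is a minimal Python sketch of this idea; the source names, the `fuse_scores` function, and the normalization scheme are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of user-driven fusion of recommendation sources.
# Source names and weighting scheme are illustrative assumptions only.

def fuse_scores(source_scores, user_weights):
    """Blend per-source relevance scores into a single ranking.

    source_scores: {source_name: {item_id: relevance score}}
    user_weights:  {source_name: weight chosen by the user, e.g. via a slider}
    """
    fused = {}
    total_weight = sum(user_weights.values()) or 1.0
    for source, scores in source_scores.items():
        weight = user_weights.get(source, 0.0) / total_weight
        for item, score in scores.items():
            fused[item] = fused.get(item, 0.0) + weight * score
    # Rank items by their fused score, highest first.
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)


# Example: two hypothetical social relevance sources and user-chosen weights.
scores = {
    "co-authorship": {"alice": 0.9, "bob": 0.4},
    "interest-similarity": {"alice": 0.2, "bob": 0.8},
}
weights = {"co-authorship": 0.7, "interest-similarity": 0.3}
print(fuse_scores(scores, weights))
```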


Recorded at the 24th International Conference on Intelligent User Interfaces, Los Angeles, USA, March 16-20, 2019

Keywords: SIGCHI, IUI 2019, hybrid recommendation, recommender systems, explanation, user-driven fusion, HCI design and evaluation methods
