Is It Really Fun? Detecting Low Engagement Events in Video Games
The gaming industry has witnessed remarkable growth in recent years, attracting millions of people who engage with its products both as a hobby and professionally (e.g., in e-sports). Video games are software products with a unique and fundamental requirement: They must be engaging. Previous research introduced approaches aimed at measuring engagement, some of them specifically designed for video games. Such approaches could be useful for practitioners since they can be applied to the large collection of gameplay videos published daily on platforms such as Twitch and YouTube, allowing developers to monitor players’ engagement and detect areas in which it is low. These specialized approaches have been evaluated on datasets in which engagement was manually assessed by \textit{external} evaluators based on the player’s face (we call this \textit{perceived} engagement). We still do not know whether such approaches can capture the \textit{real} engagement of players. Also, it is unclear to what extent practitioners would be willing to adopt such approaches in practice. In this paper, we provide two contributions. First, we ran an experiment with 40 human players aimed at defining a dataset of gameplay sessions in which participants self-reported their \textit{real} engagement after every minute. We captured both their face and the gameplay. Based on this data, we compared state-of-the-art machine learning-based approaches to detect low-engagement sessions. Our results show that the best model correctly classifies engagement in 74.7\% of the cases and ranks video games in terms of their \textit{real} engagement very similarly to how players would rank them (Spearman’s $\rho = 0.833$). Second, to assess the practicality of adopting such approaches in an industrial setting, we conducted two semi-structured interviews with senior game developers, who provided generally positive feedback and interesting insights for future developments.