Given an unexpected query result, the query explanation problem aims to automatically generate an explanation that resolves it. Existing provenance-based explanation methods identify a group of records to delete from a table. However, this deletion-based output is often not practically actionable. Inspired by counterfactual explanations from the ML community, we propose AutoAction, an actionable query explanation framework that automatically generates an actionable explanation by identifying updates to a group of records. There are three challenges in developing AutoAction: i) how to design an effective objective function to quantify explanations, ii) how to devise an efficient search algorithm, and iii) how to support real-world constraints. We propose novel solutions to address these challenges. We also conduct extensive experiments on both real-world and synthetic datasets, as well as a user study. The results show that AutoAction can automatically generate useful actionable explanations more efficiently than baselines.
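To illustrate the contrast the abstract draws, here is a minimal, hypothetical toy example (not AutoAction's actual algorithm or data): a deletion-based explanation removes a suspicious record outright, while an update-based (actionable) explanation keeps the record and changes an attribute value instead.

```python
# Toy table and an aggregate query whose result surprises the analyst.
# All names and values here are illustrative assumptions.
rows = [
    {"id": 1, "region": "EU", "sales": 100},
    {"id": 2, "region": "EU", "sales": 900},   # suspicious outlier
    {"id": 3, "region": "US", "sales": 120},
]

def eu_total(table):
    """Aggregate query: total sales in the EU region."""
    return sum(r["sales"] for r in table if r["region"] == "EU")

print(eu_total(rows))  # unexpected result: 1000

# Deletion-based (provenance-style) explanation: drop record 2 entirely.
deleted = [r for r in rows if r["id"] != 2]
print(eu_total(deleted))  # 100 -- but deleting real data is rarely actionable

# Update-based (actionable) explanation: keep record 2, update its value.
updated = [dict(r, sales=110) if r["id"] == 2 else r for r in rows]
print(eu_total(updated))  # 210 -- the record survives with a corrected attribute
```

The update-based explanation suggests a concrete fix an analyst could apply, whereas the deletion-based one only says which records to discard.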
Copyright is held by the author(s).
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Thesis advisor: Wang, Jiannan