ACI Journal Articles

Title

Explaining Autonomous Drones: An XAI Journey

Document Type

Article

USMA Research Unit Affiliation

Army Cyber Institute

Publication Date

2021

Abstract

COGLE (COmmon Ground Learning and Explanation) is an explainable artificial intelligence (XAI) system in which autonomous drones deliver supplies to field units in mountainous areas. Mission risks vary with topography, flight decisions, and mission goals. The missions engage a human plus AI team in which users determine which of two AI-controlled drones is better suited for each mission. This article reports on the technical approach and findings of the project and reflects on the challenges that complex combinatorial problems present for users, machine learning, user studies, and the context of use for XAI systems. COGLE creates explanations in multiple modalities. Narrative “What” explanations compare what each drone does on a mission, and “Why” explanations justify those actions in terms of drone competencies determined from counterfactual experiments. Visual “Where” explanations highlight risks on maps to help users interpret flight plans. One branch of the research studied whether the explanations helped users predict drone performance. In this branch, a model induction user study showed that post-decision explanations had only a small effect in teaching users to determine by themselves which drone is better for a mission. Subsequent reflection suggests that supporting human plus AI decision making with pre-decision explanations is a better context for benefiting from explanations on combinatorial tasks.
