Date of Award

2023

Document Type

Thesis

Degree Name

Master of Science (MS)

Department

Computer Science

Committee Chair

Vineetha Menon

Committee Member

Bryan Mesmer

Committee Member

Harry Delugach

Subject(s)

Human-computer interaction, Artificial intelligence, Virtual reality, Pattern recognition systems, Explainable AI (XAI)

Abstract

The exponential growth of Artificial Intelligence (AI) models enables ever more complex tasks to be accomplished. Because these tasks tend to augment human capability, interacting with these models offers a distinctive way to deliver assistive automation. Explainable AI (XAI) enables AI stakeholders to converse with models, understand model reasoning, and recognize biases while accounting for reliability requirements. Introducing this explainable capability into Human-AI teams offers a unique approach to addressing the complexities of Human-Computer Interaction. Our application of Explainable AI to intelligent systems that augment decision-making sheds light on the challenges, feasibility, and considerations for adopting real-time explainable systems. This thesis presents several contributions: a virtual environment for measuring Human-AI teaming dynamics, an analysis of the experimental behaviors observed in this team, and the framework we use to integrate AI model explanations into our virtual environment.
