Eastern Academy of Management International 2024


Explainable AI, Gig Workers’ Acceptance and Worker-Management Relationships in the Gig Economy

Authors:

Miles M. Yang (miles2yang@gmail.com)
Macquarie University, Australia

Ying Lu (candy.lu@mq.edu.au)
Macquarie University, Australia

Fang Lee Cooke (Fang.Cooke@monash.edu)
Monash University, Australia

Keywords: acceptance of AI-driven decision, explainable AI (xAI), gig worker, scenario-based field experiment, worker-management relationship


Abstract: This study examines the intersection of explainable artificial intelligence (xAI) and the gig economy, questioning the prevailing assumption that detailed explanations inherently enhance workers’ understanding and acceptance of AI-driven decisions. Applying cognitive load theory and cognitive fit theory, we argue that counterfactual explanations enhance worker acceptance and improve worker-management relationships, while the complexity of local explanations may increase cognitive load and undermine these benefits. Through a field experiment involving 1,107 gig workers, we find that both counterfactual and local explanations independently foster AI decision acceptance. However, local explanations can adversely affect the relationship between counterfactual explanations and acceptance. Moreover, acceptance mediates the influence of counterfactual explanations on worker-management relationships, with local explanations moderating this mediation. Our study enriches the theoretical understanding of xAI in the gig economy and underscores the importance of a nuanced approach to AI explanation design and communication.

 

