21st AIAI 2025, 26–29 June 2025, Limassol, Cyprus

Uncovering Gender Biases in AI-Powered Low-Code and No-Code Solutions

Tsoukalas Spyridon, Voutyrakou Dialekti Athina, Katsiampoura Gianna, Karelis Marios, Mikalef Patrick, Avlonitis Markos

Abstract:

  Low-Code and No-Code (LCNC) platforms are gaining popularity due to their affordability and accessibility, enabling users without extensive technical expertise to create software solutions. Artificial Intelligence (AI) is increasingly integrated to enhance these platforms, enriching their capabilities in software and product development. However, this convergence raises ethical concerns regarding biases embedded in the underlying algorithms and training datasets of these AI models. This paper investigates the intersection of AI and LCNC platforms, focusing on gender bias. Specifically, it explores whether gender stereotypes are perpetuated within AI-Powered LCNC solutions, contributing to unfair decision-making and reinforcing societal inequalities. To examine this, we developed an AI-Powered Low-Code solution incorporating several well-established AI tools, including those from OpenAI, and designed two experiments. The first experiment tested gender-role biases in professional associations, while the second explored gender biases in sentence completion tasks. Our findings indicate that AI-Powered LCNC solutions can inadvertently perpetuate gender biases, reinforcing stereotypes related to professions, education, and traditional gender roles. This paper also discusses the implications of these findings, offering insights to support ongoing efforts to promote fairness and reduce bias in AI-Powered LCNC solutions.
