UNDERSTANDING HOW UNDERGRADUATE STUDENTS PERCEIVE BIASES IN AI-GENERATED IMAGES, A RESEARCH-THROUGH-DESIGN PROBE

DS 131: Proceedings of the International Conference on Engineering and Product Design Education (E&PDE 2024)

Year: 2024
Editor: Grierson, Hilary; Bohemia, Erik; Buck, Lyndon
Author: Silva, Veronica; Buzzo, Daniel; Hernández-Ramírez, Rodrigo; Ayanoglu, Hande
Series: E&PDE
Institution: IADE, Portugal; CODE, Berlin; The University of Sydney, Australia
Page(s): 258 - 263
DOI number: 10.35199/EPDE.2024.44
ISBN: 978-1-912254-200
ISSN: 3005-4753

Abstract

The speed and proficiency of generative Artificial Intelligence (AI) systems have proliferated in recent years, enabling more people, including design students, to use AI-generated images in their projects. However, it is well documented that the Large Language Models underpinning AI generators incorporate troublesome gender and race biases during training (Wellner et al., 2020). Undergraduate students, whose visual culture and critical skills are still developing, often lack the capacity to identify such biases in the images they obtain from AI generators. This can lead to visual outputs that perpetuate prejudiced representations of people (Hall et al., 2023). To better understand the nature of this problem and potential ways to mitigate it, we conducted a design probe study with a group of first-semester undergraduate design students in Lisbon, Portugal. The results of this study can be used by teachers to better guide their students and by researchers to develop methodologies that help younger generations identify biases in generative AI systems. The impact of this research extends beyond the classroom and can benefit other educators and the designers of future generative AI systems. Most importantly, it can contribute to curtailing the perpetuation of race and gender biases in today's society.

Keywords: AI-Generated Images, Bias in AI, Undergraduate Design Students, Design Probes, Research Through Design
