Author: Mark Paterson, University of Pittsburgh

Problems of racial and gender bias in artificial intelligence algorithms, and in the data used to train large language models like ChatGPT, have drawn the attention of researchers and generated headlines. But these problems also arise in social robots, which have physical bodies modeled on nonthreatening versions of humans or animals and are designed to interact with people. A subfield of social robotics, socially assistive robotics, aims to interact with ever more diverse groups of people. Its practitioners’ noble intention is “to create machines that will best help people help themselves,”…
