
Apple’s AI summaries include racial & gender biases if the query is vague enough

Researchers testing Apple Intelligence with developer tools found that when tailored queries are left intentionally ambiguous about race and gender, biases surface in the resulting summaries.

Examples of notification summaries on an iPhone

AI Forensics, a German nonprofit, analyzed over 10,000 notification summaries created by Apple’s AI feature. The report suggests that Apple Intelligence treats White people as the “default” while applying gender stereotypes when no gender has been specified.

According to the report, Apple Intelligence has a tendency to ignore a person’s ethnicity if they are Caucasian. Conversely, when a message mentioned another ethnicity, the notification summary regularly retained that detail.
