Olay’s #DecodeTheBias Campaign Reflects Anti-Tech Bias
The makeup company Olay recently launched a public campaign to #DecodeTheBias—an initiative to “help decode bias in search engines.” Search engines are biased, the company argues, because search results for “beautiful skin,” “beautiful face,” or “beautiful woman” do not include enough women of color. While it is easy to blame the algorithm—there is perhaps no activity more likely to win you friends on the Internet than complaining about Big Tech—attacking the technology is misguided and ultimately distracts from the real issues.
As Olay states on its website, “#DecodetheBias highlights one major issue—how data, computer code, and AI reinforce exclusionary beauty standards and exclude women of color. These algorithmic systems rule our world and we have to ensure these systems are not continuing to perpetuate harmful forms of bias and discrimination.” According to the company, the solution to this problem is to increase the number of female coders, and it has offered to send up to 1,200 girls to coding camp—one for every social media post that uses its hashtag.
To be fair, part of Olay’s message is legitimate. Yes, companies should consider how to improve the accuracy of their algorithms to remove bias. And yes, we need more women and minorities in STEM.
But there is also something ironic and self-serving about a company in the cosmetics industry—an industry premised on the idea that individuals must alter their appearance to be considered beautiful—leveling the charge that the problem is the technology.
After all, as Naomi Wolf described over 30 years ago in The Beauty Myth, it is the cosmetics, fashion, and thinness industries—from glamour magazines to mass media—that create and perpetuate the standards of beauty that are particularly harmful to women and girls. The lack of diversity in the fashion and beauty industry is likewise a well-known problem, as is the cultural dominance of a white standard of beauty that is especially harmful to Black women. The algorithms might be trained on data reflecting societal biases, but those biases originated with the beauty industry.
The implication of Olay’s campaign is that a better algorithm would produce unbiased results. But what would be the “right” search results for “beautiful woman”? Yes, women of color should be fairly represented, as should women from the trans, queer, disabled, and plus-sized communities. But any classification of some people as “beautiful” will necessarily leave out others.
Or consider a search for “beautiful skin.” Again, all skin colors should be well represented. But what about people whose skin has acne, wrinkles, sagging, uneven tone, or visible pores? Olay actively markets products for all of these skin conditions. Not to mention that the company still sells skin-whitening cream. Isn’t the company merely “perpetuating harmful forms of bias and discrimination”—the very activity it says it is trying to prevent?
Finally, consider that searches for “ugly skin,” “ugly face,” or “ugly woman” also tend not to include women of color. Is the goal of #DecodeTheBias to change that? The issue lies less with the answers search engines give and more with the questions we ask of them.
Data, code, and AI might make a convenient scapegoat, but blaming them ignores the source of the problem. Technology holds a mirror up to society. If you don’t like what you see, work to change society; don’t blame the mirror.
Image credit: Unsplash