Yet another view...
...When it comes to trailblazers in the field of color equity, Google doesn't grace the top of many lists. But there's a contingent within the company trying to change that. At its I/O 2022 conference, Google introduced a tool it intends to use to improve color equity through representation. It's a set of ten color swatches that correspond to human skin tones, running the whole gamut from very light to very dark. And Google open-sourced it on the spot.
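To make the idea concrete: a ten-swatch scale like this is, in practice, a ten-point categorical label that can be attached to images and then tallied. The sketch below is not Google's implementation; the dataset, field names, and tone numbers are assumptions made purely for illustration. It only shows how one might audit how evenly an annotated image set covers the ten buckets.

```python
from collections import Counter

# Hypothetical annotations: each image tagged with a tone bucket from
# 1 (lightest) to 10 (darkest). The labels and records here are invented
# for illustration; they are not Google's released scale or data.
annotations = [
    {"image_id": "img_001", "tone": 2},
    {"image_id": "img_002", "tone": 2},
    {"image_id": "img_003", "tone": 7},
    {"image_id": "img_004", "tone": 9},
    {"image_id": "img_005", "tone": 1},
]

def coverage_report(items, n_buckets=10):
    """Print how many images fall into each of the ten tone buckets."""
    counts = Counter(item["tone"] for item in items)
    total = len(items)
    for bucket in range(1, n_buckets + 1):
        share = counts.get(bucket, 0) / total if total else 0.0
        print(f"tone {bucket:>2}: {counts.get(bucket, 0):>3} images ({share:.0%})")

coverage_report(annotations)
```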
Fairness is a major problem in machine learning. It's already difficult enough to reduce human values to an algorithm. But there are different kinds of fairness -- twenty-one or more, according to one researcher. Statistical fairness is not the same as procedural fairness, which is not the same as allocational fairness. What do we do when different definitions of fairness are mutually exclusive? Instead of trying to write one formula to rule them all, Google has taken a different approach: "Start where you are."
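To see how these definitions can pull in different directions, consider two common statistical criteria, demographic parity and equal opportunity, computed on the same toy predictions. The groups, labels, and predictions below are invented solely for illustration: the point is that a classifier can satisfy one criterion while clearly failing the other.

```python
# Toy example: two groups, binary ground-truth labels, binary predictions.
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
y_true = [1,   1,   0,   0,   1,   0,   0,   0]
y_pred = [1,   0,   1,   0,   0,   1,   1,   0]

def selection_rate(group):
    """Fraction of the group the model selects (predicts 1 for)."""
    idx = [i for i, g in enumerate(groups) if g == group]
    return sum(y_pred[i] for i in idx) / len(idx)

def true_positive_rate(group):
    """Fraction of the group's actual positives the model catches."""
    idx = [i for i, g in enumerate(groups) if g == group and y_true[i] == 1]
    return sum(y_pred[i] for i in idx) / len(idx) if idx else float("nan")

for g in ("A", "B"):
    print(g, "selection rate:", selection_rate(g), "TPR:", true_positive_rate(g))

# Demographic parity compares selection rates across groups; equal
# opportunity compares true positive rates. Here the selection rates
# match (0.5 vs 0.5) while the TPRs do not (0.5 vs 0.0), so the model
# satisfies the first criterion and violates the second.
```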
Where we are is in a state of desperately unequal digital representation.
Google is the largest search purveyor on the planet, by a long shot. Run an incognito search on Google Images for "CEO," and what you get is a sea of white male faces, two of which belong to Elon Musk.
Search for "woman," and it's absolutely true that the results skew young, slender, white, able-bodied.
But one of the faces the search returned was a deepfake of a pale young woman, generated by Nvidia's StyleGAN. I've written about this specific deepfake before in a different article, so it surprised me to see her face again. I had to double-check that I was in incognito mode -- but I was....