I remember two cases:
One was Amazon, whose automated resume filter almost always recommended male candidates. The training data was dominated by men's resumes, so the model picked up the spurious correlation between gender and getting hired.
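A minimal sketch of the mechanism (all data and numbers here are made up, using scikit-learn): if gender correlates with the historical "hired" label, a classifier will happily latch onto gender (or a proxy for it) even when real skill is available as a feature:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
skill = rng.normal(size=n)            # the signal we actually care about
is_male = rng.integers(0, 2, size=n)  # proxy feature the model shouldn't use
# Skewed historical labels: past hires were mostly skilled, but male resumes
# were over-represented among them, so gender correlates with "hired".
hired = ((skill + 1.5 * is_male + rng.normal(scale=0.5, size=n)) > 1.0).astype(int)

X = np.column_stack([skill, is_male])
model = LogisticRegression().fit(X, hired)
print(model.coef_)  # gender gets a large positive weight: at equal skill,
                    # the model still scores male resumes higher
```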
The other was YouTube, whose recommender kept suggesting paedophilic videos because of a botched engagement metric. It led to an echo chamber.
It's always tricky to pick metrics
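A toy illustration of why (synthetic numbers, nothing to do with either case): on imbalanced data, plain accuracy makes a do-nothing model look excellent, which is exactly the kind of trap a badly chosen metric sets:

```python
from sklearn.metrics import accuracy_score, recall_score

y_true = [1] * 10 + [0] * 990  # only 1% of items are the ones we care about
y_pred = [0] * 1000            # a "model" that never flags anything

print(accuracy_score(y_true, y_pred))  # 0.99 -- looks great
print(recall_score(y_true, y_pred))    # 0.0  -- misses every positive
```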
Glad you wrote about this. I’ve been thinking about it since xkcd ran a comic on the topic last week.
Glad you liked it