4 Comments

I remember two cases:

One from Amazon, whose automatic resume filter almost always favored male candidates. Because the training data contained mostly men, the model picked up the spurious correlation.

The other is when YouTube recommended more paedophilic videos because of a botched metric. It led to an echo chamber.


Glad you wrote about this. I’ve been thinking about it since xkcd published a comic about it last week.
