Discussion about this post

Vaibhav:

I remember two cases:

One is Amazon's automatic resume filter, which almost always favored male candidates because the training data contained mostly men; the model picked up the spurious correlation.

The other is when YouTube recommended more pedophilic videos because of a botched metric. It led to an echo chamber.

Wood Rodgers:

Glad you wrote about this. I've been thinking about it since xkcd published a comic on the subject last week.

2 more comments...