Thursday, October 6, 2016

Cathy O'Neil on EconTalk

I listened to this podcast episode with Cathy O'Neil on EconTalk.
O'Neil argues that the commercial application of big data often harms individuals in unknown ways. She argues that the poor are particularly vulnerable to exploitation. Examples discussed include prison sentencing, college rankings, evaluations of teachers, and targeted advertising. O'Neil argues for more transparency and ethical standards when using data.
She's a clear leftist, to the point where she almost went off on a tangent criticizing Fox News. But she's a leftist who can correctly articulate the objections the other side makes. For example, she, like any good leftist, believes that tuition is so high because universities are in an arms race for the best students. But she also understands that when the government throws a bunch of money into demand, prices are going to rise. Both seem true, so I guess all that's left to talk about are magnitudes. At the end of the day, she gave a lot of reasons why some statistical points are true, but didn't say much to convince me that they were big.

Mostly she stuck to criticizing the quality of the algorithms themselves, but more than once she articulated a moral opposition to an algorithm being used at all, regardless of whether it measured what it was supposed to measure. My skeptic's ears perk up when someone believes that something everyone thinks is great is useless, and that even if it were useful it would be unjust to use anyway. Her moral objections to using data that proxies race or class make me wonder why she doesn't advocate banning employers from using university degrees in hiring.

Overall it sounds to me like the solution is an algorithm that spots lousy algorithms. That sounds coy, but I'm serious.

She advocates more transparency about what metrics are embedded in an algorithm. What isn't mentioned is how that open information can be used to cheat the algorithm. She's very aware that school districts can exploit certain metrics to look better to the algorithm without actually enhancing their performance. If they didn't know what was in the algorithm, they wouldn't be able to do that.

I felt like she and Russ were tiptoeing around each other to avoid getting into an argument. It worked, and I felt the interview went well for both of them. But a lot of what I heard was, "yeah, I agree that what you say happens, but my thing is what happens all the time!"

Cathy has a lot of firsthand experience working as a statistician in the private sector, and she likes to use firsthand stories as evidence. Some of these stories involve a sort of pure evil that only exists when it's being interpreted by the out-group. I'm thinking of the predatory venture capitalist who "wants to be treated like a first class citizen, and wants other people who are being preyed upon to be separated."

Yeah, I'm sure that's how it all went down.