Found in 11 comments on Hacker News
nix0n · 2023-02-13 · Original thread
People are already misusing statistical models, in ways that are already causing harm to people.

See this HN thread from 2016 [0], which also points to [1] (a book) and [2] (a PDF).

I definitely agree with you that it's going to get a lot worse with AI, since it makes it harder to see that it is a statistical model.

[0] https://news.ycombinator.com/item?id=12642432

[1] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

[2] https://nissenbaum.tech.cornell.edu/papers/biasincomputers.p...

randcraw · 2021-10-12 · Original thread
"Explainable" just means you didn't build your app/service using any technology that isn't interpretable. Her argument is that the strategy of reverse engineering a method to convert it from inexplicable to explicable is inherently less effective than maintaining explicability at all times in the app's genesis -- from the design phase through implementation.

But Rudin's Premise is more philosophical than practical. If the problem at hand is better solved using a black box (in terms of accuracy, precision, robustness, etc.), her premise says, simply: don't do it. Unfortunately, in the cutthroat world of capitalism, that strategy can't compete with the cutting edge.

Where Rudin's Premise is more suitable is in writing regulations to address AI applications where social unfairness goes unchecked (like the COMPAS app that advises legal authorities on meting out parole decisions without explaining its reasoning). There are many such (ab)uses of AI today in social services or policing which merit rethinking, since AI-based injustice so often hides behind the proprietary lack of transparency in such apps.

Another excellent discussion of problems like these is Cathy O'Neil's book "Weapons of Math Destruction". Too bad she couldn't share the Squirrel prize. https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

Jtsummers · 2021-03-23 · Original thread
The data usually has clear biases against certain ethnic groups and economic classes. You also have to look at which broken laws actually get recorded and fed into the data (which reflects back on the first sentence). If jaywalking and other minor crimes go into the prediction algorithms, are those crimes enforced equally across the area and population? Is it really the case that there's no jaywalking in the middle-class neighborhoods, or is it just that the police only enforce it in the poor neighborhoods? This creates a bias in patrols: they are stepped up in areas with more recorded charges, which makes sense on the surface until you examine which areas those are and why they have more charges.
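
A toy simulation of that feedback loop (Python, with invented numbers) makes the mechanism easy to see: both areas have the same underlying offense rate, but charges can only be recorded where officers are present, and patrols are stepped up wherever more charges were recorded.

    # Two areas with identical true offense rates. Patrols follow recorded
    # charges, and charges are only recorded where patrols are present.
    TRUE_OFFENSES = 100      # underlying minor offenses per period, in BOTH areas
    DETECTION_RATE = 0.02    # fraction of offenses recorded per patrol unit
    PATROLS = 10

    recorded = {"A": 9, "B": 11}   # small historical skew to start

    for period in range(1, 9):
        hot = max(recorded, key=recorded.get)
        # "Step up" patrols in the area with more recorded charges.
        patrols = {area: PATROLS * (0.7 if area == hot else 0.3) for area in recorded}
        for area in recorded:
            recorded[area] += TRUE_OFFENSES * DETECTION_RATE * patrols[area]
        print(f"period {period}: A={recorded['A']:.0f}  B={recorded['B']:.0f}")

    # The gap between A and B keeps widening even though the true offense
    # rates never differed, and the data appear to "confirm" the allocation.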

For a fuller treatment on this I recommend Weapons of Math Destruction by Cathy O'Neil (https://www.amazon.com/Weapons-Math-Destruction-Increases-In...).

flother · 2018-03-01 · Original thread
And Cathy O’Neil's book Weapons of Math Destruction, mentioned in the episode, is also well worth reading.

https://www.amazon.co.uk/dp/0141985410

> In other words, by the time we notice something troubling, it could already be too late.

For me, this is the key motivating point - the horse may have left the barn by the time we act. A lot of times people say this is an exaggeration, but "Weapons of Math Destruction" is a nice read on the unintended side effects of this phenomenon [0].

[0] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

Calling BS on big data is really important, but this article is weak. The New Yorker should be doing better. Try Weapons of Math Destruction by Cathy O'Neil for a much more informed critique.

https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

jasode · 2017-04-23 · Original thread
The "critical thinking" Rob Kitchin is talking about is analyzing algorithms' impact with a social lens. Because algorithms affect people's lives, we shouldn't be content with letting them be opaque black boxes.

It seems to overlap with the themes in Ed Finn's book "What Algorithms Want: Imagination in the Age of Computing".[1]

Both say that algorithms are intensely studied from a technical perspective. E.g. O(log n) is better than O(n^2), etc.

Their idea is that the algorithms themselves are creating their own "culture" or "reality" and this should be studied through the lens of "humanities" or "sociology" instead of just "mathematics".

E.g. a neural net or statistical algorithm computes that Person A is a better credit risk than Person B. However, observers notice that Person B is always black and therefore claim that algorithms are (re)creating racial inequality. Or algorithms that provide sentencing guidelines for convicted felons. Or algorithms that diagnose medical problems.
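
A small synthetic sketch (Python, made-up numbers) of how that happens even when the protected attribute is never given to the model -- a correlated proxy such as neighborhood carries it in anyway:

    import random
    random.seed(0)

    # Synthetic history: group membership correlates with neighborhood, and past
    # approvals were driven by neighborhood rather than individual behavior.
    people = []
    for _ in range(10_000):
        group = random.choice(["blue", "green"])
        neighborhood = "north" if random.random() < (0.8 if group == "blue" else 0.2) else "south"
        approved = random.random() < (0.7 if neighborhood == "north" else 0.3)
        people.append((group, neighborhood, approved))

    # "Model": the approval rate learned per neighborhood. Group is deliberately
    # excluded from the features, as a naive fairness fix.
    def approval_rate(rows):
        return sum(approved for _, _, approved in rows) / len(rows)

    learned = {n: approval_rate([p for p in people if p[1] == n]) for n in ("north", "south")}

    # Score applicants using only neighborhood, then measure outcomes by group.
    for g in ("blue", "green"):
        rows = [p for p in people if p[0] == g]
        avg = sum(learned[p[1]] for p in rows) / len(rows)
        print(f"group {g}: average predicted approval = {avg:.2f}")

    # The two groups end up with very different average scores even though
    # group was never an input -- the proxy did the work.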

Other writings with somewhat similar themes:

- Cathy O'Neil, "Weapons of Math Destruction - How Big Data Increases Inequality and Threatens Democracy"[2]

- Eli Pariser, "The Filter Bubble"[3]

There doesn't seem to be a universal term coined that generalizes the ideas in all 4 of those books, but nevertheless I'm sure more and more writers will notice they are talking about similar ideas.

Side observation about language usage... What I notice in all 4 books is that the authors are using the word "algorithms" as a catch-all term for "machine learning". They're not really concerned with building-block algorithms such as "quick sort" or "discrete Fourier transform". What they're all talking about is that "Facebook's machine learning" is imposing X on us, or "Google's machine learning" is making us think Y. For some reason, the word "algorithm" has gained more currency than "machine learning" in these pop science books.

[1] https://mitpress.mit.edu/books/what-algorithms-want

[2] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

[3] https://www.amazon.com/Filter-Bubble-Personalized-Changing-T...

Dowwie · 2016-11-21 · Original thread
For more information about "ethics and algorithms", read "Weapons of Math Destruction" [1], or at least listen to the EconTalk podcast episode with the author and host Russ Roberts [2].

[1] https://www.amazon.com/Weapons-Math-Destruction-Increases-In...

[2] http://www.econtalk.org/archives/2016/10/cathy_oneil_on_1.ht...

clumsysmurf · 2016-08-20 · Original thread
Yet another book out recently which explores this topic:

"Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy"

https://www.amazon.com/Weapons-Math-Destruction-Increases-In...
