Sentiers No.20

Ursula K. Le Guin, Lanier, AI, Luddites, Cities


Feedback is always appreciated, but this week I have a small request: it’s always a challenge to decide exactly which articles to feature more prominently, which to discard so this newsletter remains readable, and which to include only as links. If you have opinions on things I spend too much time on, or on pieces you would have liked a better grasp of before clicking (in other words, not just a link), hit reply and tell me about it!

Housekeeping: When I quote a tweet that isn’t a thread, I’ll add a 🐦 emoji so you know it’s a tweet and there’s nothing else to read beyond the quote. (Except of course everything else the often great people have to say.)

Ursula K. Le Guin
No doubt the biggest news in my feed this week: the passing of Ursula K. Le Guin. A monumental writer who will be sorely missed, especially in these times.

“One of the many, many things Le Guin gave us was a subtle one: that the “science” in science fiction could also be the social sciences, and that, indeed, without it, no science fiction could be entirely complete” 🐦

Jeet Heer with an excellent thread on Ursula K. Le Guin, Boasian anthropology & the trajectory of 20th century science fiction.

Le Guin was part of a great shift in science fiction, often called New Wave, which had many dimensions (literary, countercultural, feminist) but was also a move from xenophobia to xenophilia.

“Capitalism’s grow-or-die imperative stands radically at odds with ecology’s imperative of interdependence and limit. The two imperatives can no longer coexist with each other; nor can any society founded on the myth that they can be reconciled hope to survive.” 🐦

Ursula K. Le Guin: A Rant About “Technology”

Technology is the active human interface with the material world.

… But the word is consistently misused to mean only the enormously complex and specialised technologies of the past few decades, supported by massive exploitation both of natural and human resources.

Jaron Lanier interview: on VR, LSD, and where Silicon Valley went wrong
Fantastic interview by Ezra Klein that touches on a lot of topics. Thought-provoking ‘recap’ by Lanier of the Arab Spring, Gamergate, Black Lives Matter, Alt-Right, Me Too sequence of events, where movements based on good intentions end up reverberating into gigantic negative feedback loops and negative movements. There’s also a good part on openness and being ourselves in early blogs vs today, and on social media vs podcasts. I think the thing they don’t specifically mention but that runs through their discussion is intimacy. There is less negative feedback in podcasting (according to Klein) and in newsletters (according to me) because we feel just enough more intimacy to see the people: hearing the voice, getting the message from the person in email. Different than through a feed on social media. That doesn’t mean it’s perfect and nothing bad happens, but those two ways of communicating feel more comfortable. So far.

What I find interesting about the three following articles is that, although they are right about the problems with black boxes and transparency, they also point out that individuals have biases and are black boxes themselves (i.e. hunches based on years of experience). Algos could be more knowable (documented, transparent), quicker, and more widely available. The big gaps in prior research (for bias, ethics, etc.) and the massive overstating of current capabilities shouldn’t push us away from the great potential, when done right.

We Need to Open Algorithms’ Black Box Before It’s Too Late

Problems such as this one are fairly easy to fix, but many companies simply don’t go to the trouble of doing so. Instead, they hide such inconsistencies behind the shield of proprietary information. Without access to details of an algorithm, in many cases even experts can’t determine whether or not bias exists.

Others are working to come up with ways to test the fairness of algorithms by creating a system of checks and balances before an algorithm is released to the world, the way a new drug has to pass clinical trials.

A Popular Algorithm Is No Better at Predicting Crimes Than Random People
Good look at the COMPAS algorithm, random people, information overload, bias, and knowing how our algos operate.

Artificial Intelligence’s ‘Black Box’ Is Nothing to Fear
(Although there are some problems with this article and Rachel Goodman has a good thread on Twitter highlighting some of them.)

Human intelligence can reason and make arguments for a given conclusion, but it can’t explain the complex, underlying basis for how we arrived at a particular conclusion.

++ Element AI opens London outpost with focus on ‘AI for good’

Why the Luddites Matter
And just in case I sound too positive (😉) above, the true Luddite vision is good to keep in mind when looking at technologies.

That which makes the Luddites so strange, so radical, and so dangerous is not that they wanted everyone to go back to living in caves (they didn’t want that), but that they thought that those who would be impacted by a new technology deserved a voice in how it was being deployed.

Smart City Portrait: Barcelona
A number of the projects mentioned here aren’t new, but it’s a good overview of everything they’re doing, the general ideas behind it, and especially the citizen focus.

Data ownership and data protection issues must then be addressed so that Barcelona can make this important resource – its ‘city data commons’ – available to its people and businesses, who could use the information to strengthen their economic position.

++ A flaneur kind of thread by Craig Mod on Twitter with lots of photos.

I love Tokyo. It’s a city that itself becomes a tool — moving through it, leaning on its infrastructure, efficient, dependable, complex but operating rationally (kind of), this is what a healthy city feels like.

++ It’s not you. Commuting is bad for your health.
The subtitle, “My side hustle is commuting,” nicely integrates two of my most hated words. Nice short video piece on commuting, though.

++ The ‘Retail Apocalypse’ Has a Silver Lining
A lot of what’s in here reminds me of Cory Doctorow’s novel Makers from a few years back. Reclaiming empty shopping malls.

The Churn
We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now

An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts in the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.

Which is closely related to this from No.17: Artificial Intelligence Is Killing the Uncanny Valley and Our Grasp on Reality.

++ Plastic Is Riddling Corals With Disease

Their branches and crevices were frequently festooned with plastic junk. “We came across chairs, chip wrappers, Q-tips, garbage bags, water bottles, old nappies.”

Montana just showed every other state how to protect the open internet

Bullock’s executive order stipulates that in order to receive any contract from the state government, an internet service provider must not engage in paid prioritization, block or impair access to online content, or unreasonably interfere with a user’s ability to select and access broadband internet service.