Monthly Archives: March 2015

Mandatory Mind Reading

A tragedy in three acts. You are the doomed hero.


You go to a website on your phone. It's legible for the first split second, and then the "mobile optimized" version renders. That text you started reading? Gone. None for Gretchen Weiners. The scroll lags and life is no longer worth living. There's no button to go back to the real version.


You're thinking of moving to New York because you're tired of the Bay Area. You search for a semi-famous restaurant by name. Google gives you a page full of matches with one keyword in common from San Francisco. You add "New York" to the query. Google does that thing where it looks like it's tired and hangs for a second and a half before giving you total garbage. Three months later in Manhattan you look up the restaurant and it autocompletes the search for you three characters in. The restaurant uses too much salt.


You finish a Slate article. You see two recommended articles you want to read, one about growing marijuana and one about women in tech or something else Slateful. You middle click the first article and it opens in the same tab because Slate is the worst. When you press the back button the other article is gone and all of your recommendations are about marijuana. You smoke a blunt and your company's next twelve technical hires are men.

This is the face of mandatory mind reading. You know what you want and the machines don't care; they know better than you, because look, who are they going to believe, you-you or data-you?

(The one-page play is over, now for 400 pages of commentary)

When I started writing this, I tried to itemize how much of what I see was algorithmically tailored specifically for me. I stopped because it was everything.

  • Facebook feed
  • Google results
  • Gmail priority inbox
  • Google Drive rankings (I'm really dependent on Google, aren't I?)
  • Every news site I read except Hacker News

It's 2015 and there are APIs to detect what capabilities a user's machine has and adjust accordingly. It's 2015 and data mining has gotten sophisticated enough that pretty much everything can be customized on the fly and be kind of close to what the user wants.
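Those capability APIs exist (media queries, client hints like Save-Data), but here's a sketch of the responsible version, with invented names: adapt the presentation from what the client reports, without throwing away the page the user is already reading.

```typescript
// Hypothetical sketch: what the client reports about itself.
// "saveData" is modeled on the real Save-Data client hint.
interface Capabilities {
  viewportWidth: number;
  touch: boolean;
  saveData: boolean;
}

// Pick a stylesheet variant instead of swapping the document out
// from under the reader -- same content, styled to fit.
function stylesheetFor(caps: Capabilities): string {
  if (caps.saveData) return "lite.css";
  if (caps.viewportWidth < 600) return "narrow.css";
  return "full.css";
}
```

The point isn't the three-line decision tree; it's that the decision changes presentation, not content, so the text you started reading stays where it was.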

These are both good things, but electric voodoo telepathy should be used responsibly. I spend so much time trying to trick software into working when it should notice what I'm doing instead of going off its model of who I am. I have to copy and paste web pages because I need what I'm seeing right now on these pixels here, not whatever the server deigns to give me the next time I ask. I've been working on a personal app that just opens whatever I've got open in desktop Chrome on Android Chrome, and vice versa, because I have to switch so often just to get through some sites.

Mind reading doesn't really work for tools. I sometimes use my bottle opener as a screwdriver: it's got a little flat part that fits into the screw head and I have no idea where my screwdriver is. However, when the time comes to open a beer, I would be really disappointed if it were magically replaced with a screwdriver.

Please, let me drink in peace.

Some Problems Shouldn't Be Solved

I recently tried to register for an online account with the post office.

[Screenshot: the post office's password requirements]

I did not succeed.

Ridiculous password requirements are a subset of a larger problem: computers make it possible to enforce ridiculous rules, and so those ridiculous rules are made. If I had to wait in line at the post office to see a clerk who would register me, how would they possibly enforce this? How much training would they have to have?

I'd hand them a word, they'd see if it fit and tell me, and I'd appeal if they rejected it. They'd call in someone from the back and we'd waste about 15 minutes trying to figure out what the rules actually are:

CLERK 1: Your password's got to be exactly 10 characters.

ME: I thought that meant at least 10.

CLERK 1: A little help!?!?

Clerk 2 emerges from the back

CLERK 1: Does "password need 10 characters" mean at least 10, or exactly 10?

CLERK 2: At least ten.

CLERK 1: Alright, well it doesn't matter, you didn't use a special character.

ME: I did, I used a caret.

CLERK 1: I don't think that's special.

ME: Come on, that's a special character.

CLERK 2: Not special enough.

ME: What about a pound sign?

CLERK 1: Special enough.

CLERK 2: I don't think that counts.

WOMAN IN LINE: Excuse me, I'm on my lunch break, and I just have one password to change, would —

CLERK 1: You'll be helped when it's your turn!

CLERK 2: How about a question mark?

ME: Good enough.

CLERK 1: Well then you need another character, because a question mark is a special character, not a character.

ME: That's ridiculous, that totally—

CLERK 2: Not a character.

Computers make this kind of stupidity possible.
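The whole argument above costs a machine a few lines of code, which is exactly the problem: enforcement is free, so nobody has to justify the rules. A sketch with invented rules (not the post office's actual policy):

```typescript
// Hypothetical rule set for illustration. Note the arbitrary whitelist:
// the caret is pointedly absent, just like at the counter.
const SPECIAL = new Set("#?!@$%".split(""));

// Returns every rule the password breaks, so at least the rejection
// can explain itself -- something the real form didn't bother with.
function passwordErrors(pw: string): string[] {
  const errors: string[] = [];
  if (pw.length < 10) {
    errors.push("need at least 10 characters");
  }
  if (!pw.split("").some((c) => SPECIAL.has(c))) {
    errors.push("need a special character (but only from our list)");
  }
  return errors;
}
```

Clerk 1 and Clerk 2 are gone, the 15-minute argument is gone, and the caret still doesn't count. No one will ever have to defend that whitelist to a customer's face, so no one ever will.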

Let's say you're in a desert walking along in the sand when all of a sudden you look down, and you see me, who happens to be a tortoise in this story, crawling toward you. You reach down, you ask me if I want to share some files with you. I try to click the button to share, but I can't, not without your help. But you're not helping. Why is that?


I know why you greyed it out: you wanted to let me share under some circumstances, and this is not one of them. This UI pattern is very widely and very justly loathed, but a better design only helps me if the reasons why sharing is disabled are sane.

Is sharing "blah" prohibited because it belongs to another user? That's simple, just tell me. But there's a decent chance it's the fault of my employer's enterprise groupware package with 4000 business rules added on. How do you tell me that I can't share it because it contains a file that has a naming scheme that matches with a pattern that when combined with another present pattern means that it's the output of program A, which when circumstance X happens, then means that if...
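For the sane cases, the fix is cheap: make the check return its reason along with its verdict, so the greyed-out button can at least explain itself. A hypothetical sketch (the names and rules are invented):

```typescript
// A permission check that carries *why*, not just *whether*.
type ShareCheck = { allowed: true } | { allowed: false; reason: string };

function canShare(ownerId: string, userId: string, locked: boolean): ShareCheck {
  if (locked) {
    return { allowed: false, reason: "This file is locked by an admin" };
  }
  if (ownerId !== userId) {
    return { allowed: false, reason: "Only the owner can share this file" };
  }
  return { allowed: true };
}
```

This only works when the reason fits in a sentence. The 4000-rule groupware case fails earlier: there's no reason worth reading to return.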

Again, think of how a person behind a desk would enforce that kind of rule system. They wouldn't, that's how.

Computers enable a certain kind of product micromanagement. Any complicated whim can be enforced fully and without question. "Well then," we think, "we'll just make the user do what we want them to." This is at the root of a lot of software sadness. Please, let's think before we make someone jump through a hoop: a hoop takes less time to implement than it does to jump through.