Human Assumption Behavior

A couple of months ago, I returned from a cross-country trip during which I met with business and education leaders in Raleigh, New York, and Chicago. It was a busy but useful trip. After returning home, my mind was of course flooded with ideas to sort out and investigate, but I found myself distracted by memories of the "human assumption behavior" I had observed along the way.

We humans think of ourselves as advanced, usually right, intelligent, and observant of our surroundings. And yet, when we're not paying close attention, we often find ourselves doing some pretty dumb things. We make assumptions; we skip thinking through the consequences of our actions (or inactions) because we assume we're smart, that we're making wise decisions. We do things that, if we saw others doing them, we'd point and laugh. So I'm going to point and laugh (and pretend that I have never, ever done anything similar, which of course is foolishness and itself an example of human assumption behavior).

“I’m not trying to micromanage, but…”

On one of my flights, as the plane was loading, I overheard a gentleman sitting behind me talking on his cell phone. I wasn’t paying close attention, but what I overheard made me giggle a little on the inside. He was clearly some sort of leader, a team manager or perhaps the CEO of a small company. He was on a conference call with his team, which seemed to consist of at least 4 or 5 people, but it could have been more. Someone on the other end of the phone, I think it was a UX engineer or some sort of designer, was giving a verbal walk-through of a user flow of some sort.

The CEO on the phone "listened" while gathering his things, then interjected: "I'm not trying to micromanage; and you know me, I don't want to micromanage, but…" From that point until the plane came to a full stop, the doors opened, the passengers deplaned, and everyone seated around me had exited into the terminal, the CEO micromanaged non-stop. He was (I'm not joking) "drawing" out and designing a UI for his UX engineer over the phone.

What I found particularly interesting was that by prefacing his feedback with “I’m not trying to micromanage,” it somehow became OK in his mind to micromanage. By saying he didn’t want to do something, he convinced himself it was then OK to do it.

Describe and Enable, Not Prescribe

Recently, I’ve been working with a very smart and experienced individual to design a new technology + process product. I can’t get into the details of it right now for confidentiality reasons, but basically we’re trying to solve a pretty significant pain-point for most mid-size to large companies. Part of the solution is technology, but a healthy part of the solution (I’m going to guess better than 50%) is a change to process. The technology helps support the new processes, but it doesn’t on its own fix anything.

To have the solution work, we need employees to use the technology in a specific way. My business partner thought it best to have the technology force or compel users to comply with the specific flow we’re thinking up, to prescribe behavior with the technology. I disagreed. I think UX should describe and technology should enable, but neither should prescribe behavior.

How does a word processor prevent users from writing bad content? It doesn’t. It has tools that help improve quality, but no widely-used word processor actually prescribes how a user should write content. The most they’ll do is offer up a few example templates.

I believe it's best to trust users to do the right thing, knowing full well that some will make mistakes and pervert technology in dumb ways. Give users the tools necessary to do their jobs, but don't treat them like parts of a machine. Just as the CEO who "didn't want to micromanage" felt compelled to micromanage, UX should not try to micromanage users.
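To make the distinction concrete, here's a minimal sketch (the step names and functions are hypothetical, not from the product we're designing): the prescriptive version refuses any action outside the designed flow, while the enabling version records the action and merely surfaces a warning.

```python
# Hypothetical example: a four-step document workflow.
EXPECTED_STEPS = ["draft", "review", "approve", "publish"]

def prescribe(step_history, next_step):
    """Hard-enforce the flow: refuse anything out of order."""
    expected = EXPECTED_STEPS[len(step_history) % len(EXPECTED_STEPS)]
    if next_step != expected:
        raise ValueError(f"Step '{next_step}' not allowed; expected '{expected}'")
    return step_history + [next_step]

def enable(step_history, next_step):
    """Describe the flow, but trust the user: record the step,
    attach a warning if it deviates from the suggested order."""
    expected = EXPECTED_STEPS[len(step_history) % len(EXPECTED_STEPS)]
    warning = None
    if next_step != expected:
        warning = f"'{next_step}' is out of the suggested order (expected '{expected}')"
    return step_history + [next_step], warning
```

The prescriptive version turns every deviation into a dead end; the enabling version lets the user proceed while still describing the intended flow, which is the posture I'm arguing for.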

“Information causes change. If it doesn’t, it’s not information.”

There are these “facepalm” moments, times when someone says or writes something that sounds profound but is so obviously incorrect and ridiculous. I’ve noticed a lot of folks (sometimes myself included) will hear these things and be “emotionally impressed” by what we think is profound insight or wisdom; but if we think through the statement even just a little, we can quickly realize how stupid it is. Here are a few examples I’ve recently encountered:

Information causes change. If it doesn’t, it’s not information.

When I'm flying my airplane, I have a lot of instruments that provide information about the status of the airplane and the flight. Most of the time that information doesn't evoke change. It tells me I don't need to change. It tells me I'm on course. It's extremely useful to have this information, because if I'm on course, I don't want to get off course, which is exactly what will happen if I make changes while on course. If I'm off course, then yes, the information should evoke a change. Change is not inherently good; it's good when you're off course.

If you continue to question what you believe, you can enter a world where there’s nothing you can do to see anything as negative.

It's in vogue these days to think it's best not to judge, not to discriminate between good and bad. It comes from a deep desire not to be ridiculed or judged. If I fear being judged, then I will promote the idea that being critical of anything is wrong (except that it's OK to be critical of being critical). But it's critical that we be critical, most especially of ourselves.

Being critical is built into the first half of the buffoonish quote. "If you continue to question what you believe…" To question what you believe is to be critical of what you believe. It means you're routinely, if not persistently, re-evaluating, judging, and discriminating among past assumptions.

There are universal truths. There are things in the world that are indeed negative. We can (and should) question what we believe, but the goal of questioning is not to get stuck in an infinite loop, never finding answers. The goal is to find answers, resolve the questions, and move on. It's wise to come back to the same questions later and re-evaluate our previous findings in light of new experience, but that's different from getting stuck in an infinite and pointless loop of indecision.

Artists are not included in the debate on how we build the economy for the future.

Last I checked, nobody is giving me any preferential inclusion over artists in the "debate" on the future economy. And what is this debate, anyway? There are plenty of people writing about it and arguing with each other, mostly on blogs. Do artists not have blogs?

Notice the passive voice: “Artists are not included.” That passive voice tries to hide the assumption that someone’s excluding artists. Who’s excluding them? If the “debate” on the future economy is raging mostly on blogs, and if artists aren’t involved, isn’t it the artists who are excluding themselves by not participating?

Assuming Policy Compliance

This morning, I was reading an article about Target's recent bank-card security breach. Of particular interest was that Trustwave, the security firm that had conducted Target's security audits, was named in a lawsuit to recover losses.

I’ve long said that typical outside security audits asymptotically approach worthlessness. Most “security experts” I’ve worked with in my technology career have read a lot of books but have a difficult time contemplating real-world technical security. They audit based on best practices, not on what makes sense. If you design a system that complies with their documented best practices but hides critical flaws, you’re golden. If you design a system that doesn’t comply but is actually far more secure, you’ll get dinged on your audit.

There’s an assumption that once an audit is complete and best practices are prescribed, a company that puts in place a policy to review the best practices on a routine basis will always be in compliance and will therefore be secure. Essentially, it’s the belief that if there’s a policy, then folks will be in compliance with the policy, and all will be well.

I've seen this pattern elsewhere. When we were boarding that same flight with the CEO who "didn't want to micromanage," one of the flight attendants made an announcement asking folks to turn their luggage a particular way when stowing it in the overhead compartments so as to use the storage space most efficiently. She assumed that by making the announcement, the passengers would universally comply. (I'm not sure why she made this assumption, since I can only imagine that any flight attendant would know the folly of it after even her very first real-world flight.)

A bit later, a late-arriving passenger seated in a forward section of the aircraft had a carry-on item. Because of her assumption, the flight attendant never bothered to look inside any of the overhead compartments to see whether space was available; she assumed they were all full. She was about to send the carry-on off the plane to checked baggage when the passenger stood up, opened the overhead compartment above his head, rearranged the luggage, and found plenty of room for his carry-on.

This wasn’t a good blog post

I don't have some grand conclusion or insight here. I suppose this post is more a series of rambling observations about a particular human behavioral phenomenon. Maybe they're a set of behaviors that I only think are related. I think it's valuable to take note of the behaviors of others, analyze them, and apply the lessons to our own lives. It's easier to spot flaws in other people than it is to spot our own. So instead of starting with introspection and "discovering" that we're doing pretty well, maybe we ought to look for patterns in other people (both good and bad), then filter the bad patterns out of our own behavior and practice the good ones until they become habits.