End lusers are only part of the problem

16.08.2006
The response to last week's column about introverts had me thinking about some of the experiences I've had with technical professionals. Last year, I wrote an article titled "Dumb and Dumber" that discussed security problems that resulted from "stupid users." My book, Spies Among Us, also presents case studies of some of my penetration tests, where the behavior of users resulted in billions of dollars of potential loss.

As I wrote that column, I remembered the old cartoon of one programmer telling other programmers, "You all start programming, I'll go find out what the users want." Sadly, there's a lot of truth to that cartoon. I've had many experiences where the obliviousness of computer programmers rivals that of the users.

It can run faster -- but it can't get out the door

In one case, a fellow programmer came to me to complain about how some people he promised to write a program for were suddenly all over him for the program. I asked him how long they'd been waiting. He replied that it was over a month but asked, "Why are they all of a sudden complaining now?" A week or so later, I asked him if he ever got the program delivered. He replied that he had finished it, but was rewriting it in assembly language because he thought it would run faster.

Geek-to-user translation service

In another case, I was on a project team that was improving a system in use by an Army field unit. After we got the first phase of the update implemented, a user came over to me and the lead programmer and said something to the effect of, "This system sucks. It's too slow and it doesn't give me what I asked for when I want it."

The bewildered (and upset) programmer replied, "I can't fix 'sucks.' I need to know specifically what 'sucks' means."

The user was standing there, equally bewildered. I told the programmer, "The system performance level is unacceptable."

The programmer replied, "Well, the reason the system may run slow is that the semaphores inherent in the Unix operating system, as well as the overhead typically associated with a relational database management system, use a great deal of the processor availability."

The user turned back to me. I explained, "There's nothing we can do about it. It's as fast as we can make it."

The user said, "OK" and walked away "knowing" that we understood his concerns.

Created in his own image

I worked on a project team installing a system in the Pentagon. The project manager left, and our boss picked the most senior programmer to take over as project manager. He decided that he wanted to review every subroutine.

Without getting too geeky here, everyone else on the programming team had a uniform way of indenting "if-then" statements. The new manager preferred a different way and said that he couldn't read the code the way everyone else wrote it. He then started going through every program file to reindent the code his way.

He also started to microanalyze the program logic. For example, there are sometimes multiple ways to create a logic test, such as "If x=True" or "If x is not False." The project manager went through every program with the programmers to find out why they wrote the logic the way it was. He then argued with the people to find out why they didn't do it the way he would have done it.
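The manager's quibble can be seen in a small sketch. This is Python, chosen purely for brevity (the original system predates it), and the function names are hypothetical: the two tests below are written in different styles but are logically equivalent, so arguing over which form to use changes nothing about the program's behavior.

```python
# Two styles of writing the same logic test, with two indentation
# habits. Both functions return identical results for any boolean x.

def check_v1(x: bool) -> str:
    if x == True:  # explicit "If x = True" style
        return "go"
    return "stop"

def check_v2(x: bool) -> str:
        if not (x == False):  # "If x is not False" style, indented differently
            return "go"
        return "stop"

# The behavior is the same either way.
for flag in (True, False):
    assert check_v1(flag) == check_v2(flag)
```

Reformatting every file and relitigating every such choice consumed the team's time without changing what the code did, which is why the schedule slipped.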

This was only the beginning. Needless to say, the project, which was ahead of schedule when the manager took the reins, fell weeks behind. Also needless to say, nobody came to his defense when the client lodged major complaints about the delay.

Where's the joystick?

I was in an acquisition office overseeing the acquisition of a system for signal analysts. You analyze a signal by determining the high points and the low points of the wave on a computer screen. During the predeployment testing of the system, while it was still in the developer's facility, a user from the field took one look at the system and said, "Where's the joystick?"

The universal response was, "What joystick?"

The user then told everyone that the signal analysts had a joystick on the current system that allowed them to move the cursor around the screen to the points of the signal. The developers told him that there was no joystick in the requirements document that the user helped to write, just a requirement saying, "The system will allow the user to graphically mark the points of the signal." The developers then showed him how he could use the arrow keys to move the cursor around the screen, albeit one cursor block at a time -- naturally, a process that took 10 times as long as it would have with the joystick.

When presented with real-world information on how users currently interacted with the system, the developers naturally chose to throw the facts out -- telling the user that they didn't want to specify a joystick in case the developers found something better and faster (e.g., thumbwheels). Meanwhile, here are the arrow keys...

Jam tomorrow, and jam yesterday, but ...

Many years ago, I worked at a large government contractor that was trying to reach Capability Maturity Model Level 5. That standard defines a stringent systems development process, and all defense contractors were supposed to achieve it. I believe the Software Engineering Institute was in on the process of defining that standard.

As part of achieving Level 5, the contractor must be able to train its staffers in its methodology. Because I have teaching experience, I was chosen to attend the "Train the Trainer" session for the systems engineering course. The methodology and the course were developed by one of the company's most senior and best-respected engineers. Of course, he had his doctorate and years of experience developing systems.

I hadn't forgotten the earlier joystick incident, and sometime toward the end of the course, when we covered the whole methodology, I told the doctor that it appeared that there was very little user involvement in the process. He told me that I was mistaken, because during the requirements phase, systems engineers interviewed users to determine what requirements they might have. He also pointed out that users were closely involved in the final acceptance testing.

I asked him why there couldn't be end-user reviews throughout the process to provide input. He replied that since the users were involved in the final acceptance testing, this was unnecessary. When I pointed out that years could pass between the requirements-gathering phase and acceptance testing, and that things could change or developers could misinterpret user requirements, the instructor told me that this wouldn't happen: the methodology was so strong that the requirements would be well written. He also added that I probably didn't have experience on good projects. Long story short, I was informed the following week that I had not been chosen as a trainer. The contractor, meanwhile, continues to get hit for failing megacontracts, even though it has such a strong methodology -- according to the person who developed it, anyway.

And beyond

I have more stories, and I suspect many of you do too. Clearly, improperly socialized security professionals create their own issues, whether because they don't bother taking reality into account or because they arrogantly assume users will make more than minor changes to their ways in order to accommodate the security system. Security policies that are difficult to follow will be bypassed by the users, and there's a good chance they'll create their own security problems in the resulting confusion.

It's also important to remember that when a regular user screws up, generally there's only so much damage they can do on a well-managed system. A small programming error, on the other hand, can wreak major havoc. A couple of lines of bad code in a power-management facility, for instance, can cause and have caused major outages. The recent leak of America Online user searches was caused by a small mistake from a privileged user. Just remember the new old saying, "To err is human; to really screw up takes an administrator."