Over and over again in this course, we see echoes of a deep pattern that we exposed in the writing of the one-bit program. We have the problem domain, the system semantics, and a mapping between the two, created by the programmer in the light of the desire. This pattern is the central act of computer programming. It may not be an understanding in itself, but the ability to do this is the only evidence one can have that one has actually understood a problem in the terms of a given semantics. If the semantics are rigorous and testable, like those of a digital computer, one might claim a `deep' or `true' understanding, but this is suspect, because someone can always pop over the horizon and say, `See it this way!'
This pattern is so important that we want to focus attention on it. Although we have avoided fatuous jargon with no real meaning behind it, we want to introduce a term, `The Knight's Fork', to tag this pattern. We've borrowed the term from chess. There, a Knight sits on the board and can make a number of L-shaped moves. The other pieces are all constrained to move on diagonals or orthogonal paths, but the Knight's L-shaped moves allow it to threaten two pieces at once, each constrained to its own world, and thus accomplish something useful whatever happens.
This kind of pattern occurs over and over again, and everywhere we can track it back to the writing of the one-bit program. A computer system can be in many states and evolve according to its own internal logic. The reality the computer is following can also be in many states, and itself evolves. Because of the designer's insight, a critical aspect of the problem can be abstracted and captured in the computer, the same pattern in both cases, so that whatever happens, computer and reality will conform with each other. The test cases, informed by a model of the problem and of the system, will cover the permissible (and possibly impermissible) state space of the inputs according to an insight of the writer, so that whatever happens, the system's state evolution will be verified. The designer, looking at a need to perform data manipulation, will exploit features of the data that genuinely indicate structure in the data, and map these to features of the language, as in the canonical:
while((c = getchar()) != EOF)
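The idiom earns its canonical status in context. As a minimal sketch (the surrounding filter function is our illustration, not part of the original), the whole mapping from problem to language lives in the loop condition, which reads a character, binds it, and tests for end of input in one expression:

```c
#include <stdio.h>

/* Copy characters from in to out until end of file, returning
   how many were copied.  The loop condition does the whole job:
   read a character, assign it to c, and test against EOF in a
   single expression. */
long copy_stream(FILE *in, FILE *out)
{
    int c;          /* int, not char, so that EOF stays distinguishable */
    long count = 0;

    while ((c = getc(in)) != EOF) {
        putc(c, out);
        count++;
    }
    return count;
}
```

Called as `copy_stream(stdin, stdout)` from main, this is the classic filter skeleton.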
All architectural design involves teasing apart a problem by looking at the needs from as many directions as possible until it reveals the structure within itself that the system designer can use to defeat it...
The Knight's Fork always uses an inherent deep structure of the problem domain. Checking that a proposed deep structure is real and not just a coincidence is very important. If a designer exploits a coincidence, the result will be
`clever' rather than `elegant', and it will be fragile, liable to explode into special-case provisions all over the resulting system code, with all design integrity lost. Weinberg gives the example of a programmer writing an assembler. He discovered that he could do table lookups based on the opcode number, and designed his program accordingly. But the hardware people did not hold the opcode numbering scheme sacrosanct, and when they made a valid change, the program design broke.
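Weinberg's assembler story can be sketched in C. The opcodes and mnemonics below are invented for illustration; the point is the contrast between indexing on a numbering coincidence and searching on the genuine structure of the problem, which is only that every opcode has a name:

```c
#include <stdio.h>
#include <stddef.h>

/* Fragile: works only while the opcodes happen to be 0, 1, 2, 3.
   A perfectly valid renumbering by the hardware people silently
   misaligns every entry. */
static const char *mnemonic_by_index[] = { "NOP", "LOAD", "STORE", "ADD" };

/* Robust: depends only on the genuine structure -- each opcode
   has a name -- and not on the numbering coincidence. */
struct op { int code; const char *name; };

static const struct op ops[] = {
    { 0x00, "NOP"   },
    { 0x10, "LOAD"  },
    { 0x11, "STORE" },
    { 0x20, "ADD"   },
};

const char *mnemonic_for(int code)
{
    for (size_t i = 0; i < sizeof ops / sizeof ops[0]; i++)
        if (ops[i].code == code)
            return ops[i].name;
    return NULL;    /* unknown opcode: fail visibly, not silently */
}
```

The search costs a few cycles more than a direct index, but it survives a renumbering unchanged; the `clever' version has to be rewritten.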
The Personal Layered Process
A Zen koan tells of a wise monk who visited a great teacher. He entered the teacher's room and sat before him.
`When you came in,' asked the teacher, `on which side of the door did you leave your stick?' The monk did not know. `In that case, you have lost your Zen.'
After you have seen the structure of your program and are ready to implement it, there is still a great deal to keep control of. Even if you can see the critical lines of code there are still a great many others to type in. The discipline required is far greater than any formal process could control and must be applied intelligently in each new situation.
Your process will break a task down so far, and then you must take over. Like a track-laying vehicle, you must structure your work as it develops. After a while, you get to the point where you can do this in your head, very quickly indeed, because you can get leverage out of two techniques.
The first is that you need only expand the part of your plan that you are working on. At one point, an activity to add a change to some source might be held in your mind as:
1. Identify all files that include the functions.
2. Book all files out of version control.
3. Change the files:
   3.1. Change modread.c
        3.1.1. Hack ModelOpen()
        3.1.2. Hack ModelRead()
        3.1.3. Hack ModelWrite()
        3.1.4. Hack ModelClose()
   3.2. Change appfile1.c
   3.3. Change appfile2.c
4. Book files back in.
5. Update conman system.
The fact that the process definition can't spell out every little step, and so doesn't insult your intelligence in a futile attempt to do so, doesn't absolve you from the duty to do the job for yourself. And it's quite proper to leave how this is done up to you - it allows you to do the necessary organization in your head, or any other way that pleases you. Some people like to write little lists of files to modify on scraps of paper and cross them off as they go, but leave the rest of the process in their heads. They can remember where they are in the big picture, but if they're interrupted in the middle of a big list, they might get confused.
The second important technique is that you can change your plans. The core concept of TQM is that we must understand what we are setting out to achieve if we are even going to know when we have got there. This means that we need to be able to say honestly what we think we are doing at any time, but does not stop us from changing our minds! For example, we might add to the example above,
3.1.5. Sort out all the headers :-(
at any time, as we are changing the function definitions and our bored little minds, roving backwards and forwards, realize that the prototypes will be wrong too.
We do not need to remember which bin we threw our morning coffee beaker into to have a total understanding of where we are in our organized work. Instead, we can take control of the spirit of TQM and organize ourselves with full consciousness of what we are doing. As we do this, all the usual benefits of thinking about what we are doing come about. We can see opportunities to automate the boring typing with scripts and macros, and within the PLP we can always ask the question `How would I undo this action?', which is what marks out the people who don't accidentally delete all their source and then have to wait two hours for the administrator to retrieve last night's tape backup.
As a final comment on this topic, we often need to use a PLP to control the complexity of even the simplest job in a professional engineering environment. The ritualisation of PLP can become hypnotic. To keep proportion, always ask yourself if there is a 30-second hack that would accomplish the task, and if you can just do it, don't waste time on elaborate self-created rituals. Always keep a backup!