Saturday, August 2, 2008

Pattern for Pattern

I was buried deep in the coding trenches, again. Once I get going, the lines fly by, each fitting comfortably into place, finding its own groove.

As I sat back and looked over what I had written, I had one of those "Aha" moments. That instant when you come to a deep understanding that has eluded you for a long time, possibly years. The type of thing that is so obvious once you know it that you're left wondering why it took so long to really see it.

But before dumping out my revelation, it is best to understand this type of knowledge in its own context. A little bit of history helps.


SOME EXPOSURE

My first exposure to the Object Oriented (OO) paradigm was back in the late eighties. I saw C++ used successfully for developing complex yet highly visual programs. The one-to-one correspondence between the things on the screen and the objects in the source code meant that any desired changes to the presentation of the program were easily mapped back to the responsible source code. If you saw something you didn't like, it was obvious where it needed to be changed in the code. The extra work of decomposing the code into objects was, in this case, paying off hugely.

While I appreciated the strengths of the paradigm, it was a long time before my employers came to allow the related technologies into production systems. Software development is often slow to change.

The early nineties left me programming with a strict Abstract Data Type (ADT) philosophy, similar to objects but a little different. The code is decomposed into lists, arrays, trees, etc. These form the building blocks for the system. It's a very similar decomposition to Object Oriented, but it is not enforced by the implementation language. It is close enough that you could use Object Oriented design techniques.

In those days Object Oriented programming was gaining in popularity, but a movement towards modeling real-world objects had taken over. The result was 'brute force' applied to the development of objects, and it was ugly.

Instead of thinking about it in abstract terms, the dumbed-down solution was to just start pounding as much behavior into every object as possible. Presumably, this was to get going fast and avoid deep thought. As with all "ain't it easy if you just follow these 3 simple rules" approaches, it worked for simple code, but bloat and poor performance soon followed. Then came un-maintainability, the death nail to any fragile system.

At this point two good things happened: Java arrived, and the Design Patterns book by the GoF was published. The language was clean and simple, but the patterns were the big leap. To me it was a renewal of abstract programming techniques in the Object Oriented world. I spent considerable time on a pet project implementing many of the design patterns in Java, just to see how well they worked.

Bending the focus back towards the abstract helped to elevate the quality of the code. Programming isn't hard, but it isn't thoughtless either; you still need to spend some internal wetware cycles to find clean, readable ways to accomplish your goals. You have to be doing it for a very long time before it becomes so instinctual that you no longer need to think about it. Thus schemes to make programming simple and easy never work.


OVER THE YEARS

I ping-ponged between ADTs and OO code over the next few years. In one notable case -- possibly because of the language, but also because it was easier to debug -- I shied completely away from OO code and fell back into ADTs. I didn't have the luxury of being able to spend a lot of time fixing problems, so the code had to be ultra simple. ADTs don't have the 'philosophical' problems that objects have; there are no conflicting 'right' ways to use them. Put them together, and off you go.

It was around that point that I kept feeling that decomposing the world into this object-type model was exceedingly awkward at times. Some things fit easily into objects, but many did not. If there isn't enough value to justify the decomposition, all of the extra work is wasted, and quite possibly a net negative.

I was following the developments and discussions, but I kept failing to climb on the "objects rule" train. But worse, I was beginning to suspect that there was something horribly wrong with patterns as well.

At one point in a job interview, someone asked me what patterns I would use to create an email server. That question really bothered me. I fumbled through my answer, not convincingly, but in the back of my head I knew that the question itself was completely wrong. Had I been keen on the job I might have challenged it there and then, but it was just one of many warning signals to avoid that particular position. Some fights aren't worth having.

A while back, I returned to the Object Oriented world, specifically using Java. For all of the advances in libraries, frameworks and tools, I was very disappointed with the state of the code I was seeing. It should have been more elegant and less fragile. Most of it wasn't. It seems that the standards and conventions have skewed back towards brute force again, probably because programmers get scared and just belt out the code.

What was different this time was how Design Patterns, which I thought were a great movement, had somehow become subservient to more brute force techniques. How could something so abstract be wound down into such crude implementations? I knew there was a reason for this, but I couldn't put my finger on it, until now.


A BIT OF A REVELATION

At this point in my career it's fairly safe to say that I am proficient with Objects and Design Patterns, but I'm not what you might call a "true believer". I know all the techniques and rules, but for readability I'll often stray from them. I know all of the styles and conventions, but I only pick from some of them. I'm comfortable building up a huge Object Oriented system, but if I were extremely tight on resources, again, I'd prefer ADTs. I use the technologies because they are there and popular, but I am far from convinced that they truly pay for all of their insecurities. For some code the paradigm works well, for some it does not.

Back to the problem at hand.

So here I was trying to leverage polymorphism to save me having to splat out a lot of permutations of simple code. I also wanted to use inheritance to cut down on duplicated sequences of code. The thing I was trying to write broke down into three base objects, one of which broke out into dozens of different but similar ones. The parent I used as an abstract factory for creating the kids. Each of the child objects was to act as a proxy controlling a specific set of underlying system widgets. The proxy aspect was critical because it meant that I could explicitly limit resource use in the case where there were huge structures, and there were going to be huge structures. From the outside the kids shouldn't be visible, just the parent. I wanted to encapsulate their differences away from the rest of the system. And most of this should be in the same file, so it's easier to see and visually inspect the code for inconsistencies.

So, it was a hierarchy/composition type model, where one of the nodes acted as a singleton factory facade towards proxies that were partially flyweights. And there was an observer to collate and redirect events. Really, there were a whole lot of design patterns that I wanted to get working in this tiny code space.
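To make the shape of that concrete, here is a minimal sketch of the kind of arrangement I'm describing. The names (Screen, Panel, SystemWidget) are hypothetical stand-ins, and the real code was considerably larger, but the parent-as-factory and kids-as-proxies relationship is the same.

// A minimal sketch of the arrangement described above; all of the names
// are hypothetical stand-ins, not the real code.

// The parent: the only class the rest of the system sees. It doubles as
// the factory for its own children, hiding which concrete kind comes back.
abstract class Screen {
    static Screen create(String kind) {
        if (kind.equals("panel")) {
            return new Panel();
        }
        return new SimpleScreen();
    }

    abstract void draw();
}

// One of the dozens of similar kids: a proxy over the real system widgets,
// allocating them lazily so huge structures don't eat resources up front.
class Panel extends Screen {
    private SystemWidget widget;    // the expensive underlying resource

    void draw() {
        if (widget == null) {
            widget = new SystemWidget();   // only pay for it when it's needed
        }
        widget.render();
    }
}

class SimpleScreen extends Screen {
    void draw() { /* nothing expensive behind this one */ }
}

// Stand-in for the underlying system widget being proxied.
class SystemWidget {
    void render() { System.out.println("rendering"); }
}

Notice that nothing outside this one file ever mentions Panel or SimpleScreen; callers only ever hold a Screen, which is the encapsulation I was after.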

Now at this point, one has to understand that once I became suspicious of design patterns I stopped trying to push them hard. I felt it wasn't right to name something xxxxFactory, for instance, but I had no clue as to why. It just felt vaguely Hungarian. I kept using them, but I wanted to de-emphasize their presence.

In this code I kept to the rule of calling all objects by their real-world names, while avoiding any mention of their underlying pattern construction. Also, even though I know the patterns, I didn't limit myself to their strict implementations. I simply wanted the classes to be vaguely like a singleton or a factory, not exactly like one. They are patterns after all, just starting places for code construction, not Lego-like building blocks.

And that is where and when it hit me. Hit me hard.

Of course, patterns are just patterns, not building blocks. Not ADTs. Not things that you use to compose a system. They are just suggestions. But that is not really the big revelation; I had always believed that. The real understanding came from looking back at my code.

I had just badly implemented seven or eight patterns, but I had only done so with a few basic classes.

Ohh. Dang. Ahhhh. Yes. It hit like a meteorite.

A pattern, being just a pattern, means that for any sequence of code multiple patterns can and will "overlap".

They'll all be placed in the same small set of code. You can have a Singleton/Proxy/Factory or a Composite/FlyWeight/Iterator, or whatever crazy combination of patterns will make the underlying objects most appropriately behave in the simplest manner that makes them usable.

The way we model the data inside the computer is not composed of a bunch of patterns as building blocks; instead it is composed of a bunch of objects that span several different patterns all at once. It was how this had been flipped that was bothering me so much.
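As a tiny, hypothetical illustration of that overlap: the class below is simultaneously a Singleton (one shared instance), a Factory (callers ask it for connections rather than constructing them), and a Flyweight-style cache (connections are shared rather than duplicated), yet it is just one small, ordinary class with an ordinary name.

import java.util.HashMap;
import java.util.Map;

// Hypothetical example: one small class, several overlapping patterns.
class ConnectionPool {
    // Singleton: a single shared instance for the whole program.
    private static final ConnectionPool INSTANCE = new ConnectionPool();
    static ConnectionPool instance() { return INSTANCE; }

    private final Map<String, Connection> open = new HashMap<String, Connection>();

    private ConnectionPool() {}

    // Factory: callers never construct a Connection themselves.
    // Flyweight: connections to the same host are shared, not duplicated.
    Connection get(String host) {
        Connection c = open.get(host);
        if (c == null) {
            c = new Connection(host);
            open.put(host, c);
        }
        return c;
    }
}

class Connection {
    private final String host;
    Connection(String host) { this.host = host; }
    String host() { return host; }
}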

Then, because there are many possible patterns existing in the same space, the obvious corollary is that you clearly don't want the pattern names to influence the names of the objects.

My hunch about not calling the objects xxxxFactory was absolutely correct. That's just ugly and will cause trouble over time, for exactly the same reason it was a bad idea to stick the 'data type' into a variable name. Hungarian notation was clearly wrong. Code changes, but programmers are lazy, so things quickly become misleading, which wastes time, lots of it.
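A hypothetical before-and-after makes the point: the behaviour is identical, only the names differ, and only one of them stays honest as the code underneath shifts.

// Pattern names baked into the class name: accurate today, misleading the
// moment the implementation changes, and noisy to read in the meantime.
class ScreenProxyFactorySingleton { /* ... */ }

// Named for the thing it represents in the problem domain.
class Screen { /* ... */ }

// The same argument as Hungarian notation for variables:
//   String strCustomerName;   // encodes the type into the name
//   String customerName;      // just says what it is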


AND FINALLY

So, that is it underneath. Patterns, I've always known, shouldn't be taken as literal building blocks for code construction. Doing so results in massively complex code, because implementing overlapping patterns as many disconnected, artificial objects increases the complexity for no real reason. That type of "right" way to do it isn't even close.

The worst thing we can do as developers is to deliberately introduce unnecessary complexity into the solution. Things are complicated enough without making them more so.

Following some rigid, fixed breakdown for coding comes at the expense of making it readable. Oddly, the industry has always had a bipolar nature where many of the younger programmers go on about how programming is an art form, while looking for quick and easy ways to mindlessly build code. It's a funny sort of hypocrisy that pulls programmers towards the extremes.

Patterns are a place to start, and they should overlap each other. They are very different from data structures, and should not be confused with a set of axiomatic building blocks. Patterns shouldn't appear in class or method names, but then again neither should types or data structures. The "thing" that the object represents in the real world, abstract or not, should be used as the simple name for the object. That keeps it obvious and cuts down on the length of the names.

That fixes some of my complaints about OO, but I still have trouble applying the paradigm to very long sequences of actions. Noun-based code is easy, but protocol-based stuff is just damned ugly. Still, I am looking for ways to just plunk an algorithm into one single big containing object, ignoring the proper way to use the paradigm, so that it is easy to debug. It might not be "right", but at 4am I'm less interested in being right, and more interested in getting back to sleep. One has to keep their priorities straight.
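For what it's worth, the "one single big containing object" I keep reaching for looks something like this hypothetical sketch: the whole sequence reads top to bottom in one place, which is exactly what you want at 4am.

// Hypothetical sketch: a long, protocol-style sequence kept in one object,
// in order, instead of scattered across a dozen tiny collaborating classes.
class NightlyImport {
    void run() {
        connect();
        fetchBatch();
        validate();
        load();
        disconnect();
    }

    // Each step is a plain private method, so the whole story reads in order
    // and a stack trace points straight at the step that failed.
    private void connect()    { /* open the remote session */ }
    private void fetchBatch() { /* pull down the raw records */ }
    private void validate()   { /* reject anything malformed */ }
    private void load()       { /* write the good records out */ }
    private void disconnect() { /* close the session */ }
}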

9 comments:

  1. Spot on.

    I've always used patterns in a back announcing fashion. Describing things I've already built.

    I tend to let the requirements mold the algorithm into the shortest path from input to output data. Obviously experience comes into that a lot, including knowing the patterns, but I certainly don't place any weight into representing the patterns via specific classes and standard representations.

    That's probably the difference in thinking of patterns as general ideas, or specific implementations. For instance I think of the iterator pattern as embodying the loop control statement within a class so that it can be passed around. Not as a specific implementation of IIterator with moveNext etc.

  2. >>...one of those "Aha" moments...

    the feeling of these moments are called "metanoia"
    =)))

  3. Thanks for the comments :-)

    @trb,

    Before design patterns became popular I used to refer to the same basic idea as "mechanisms", but they tended to be a bit broader than patterns, yet more general. They're all just ways to recapture the thinking, so we can work faster.

    @contrapunctus,

    From the wikipedia definition of metanoia, I'm not sure if the idea of "repentance" is actually correct or not :-) True repentance in coding only happens a decade later, when you're fixing your own code. Only then will you find that you're truly sorry :-)

    Paul.

  4. Really good reading with morning coffee. Thank you.

  5. Just use Python :-)

  6. @jan,

    Glad to hear that. I've seen a few people complaining that I didn't just reduce it all to a giant set of bullet points, but I don't think reference formats are particularly sticky (the material is easily forgotten).

    @anonymous,

    Technologies by themselves don't make your life any easier unless you understand how to use them. In programming, there is always some structural layer that you must add on top of the code in order to achieve simple, readable, thus elegant code. A language, or its culture may help a bit in this regard, but it's certainly not guaranteed.

    Paul.

  7. Hello Paul,

    Your espoused view of design patterns falls more in line with what Christopher Alexander had in mind when he wrote the book A Pattern Language. With that book, he was beginning a twenty to thirty some-odd year investigation into what constitutes good design in architecture. Those 'patterns' were simply a catalog of features that he identified as commonly present in buildings that could be said to exhibit good design. He later extended this work quite a bit in looking for the underlying principles whose application led to those 'patterns' naturally falling out from the design process. Most of this later work is captured in his four-volume work "The Nature of Order".

    So, I suspect when you look back at the system you're building you'll see where certain patterns exist in the code even though your design goals were to write concise, clear, and abstract code and not to implement Java design patterns.

  8. The correct term is "death knell".

  9. @js,

    Sorry, I'm a bit late in replying. Christopher Alexander comes up a great deal in "Patterns of Software", by Richard P. Gabriel. It's been a while since I've read it, but I remember it was an interesting read.

    @tony,

    Thanks for the comment. I'm guessing that "death knell" is a substitution for "aha moment". In as much as I think people use them incorrectly, I still think patterns are important. In the early days I used to call them mechanisms -- prescribed ways to implement certain repeatable problems.

    We want these types of abstractions, we just have to be careful how we use them. Going to the extremes always causes a problem whether or not the underlying ideas were reasonable.

    Paul.


Thanks for the Feedback!