When are design patterns the problem instead of the solution?

Come on, folks, please read the whole quote, and read it carefully. Or even better, read the essay.

Paul Graham criticizes, among other things, C-like languages for not providing adequate means of abstraction. Within the scope of the essay, his criticism of patterns is an afterthought, or rather a case in point for his main argument. His reasoning goes like this:

It is logical to use common strategies to solve recurring problems. In really abstract languages it is possible to formalize those strategies and put them into a library. Whenever you need them, you merely #include them, instantiate them, expand them, or whatever. C-like languages, in contrast, do not provide the necessary means of abstraction. That is witnessed by the very existence of "patterns": a pattern is a common strategy that cannot be expressed as library code and therefore has to be written out explicitly every time it is applied.
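To make that concrete, here is a hypothetical sketch in Java (my example, not Graham's). The classic GoF Strategy pattern has to be spelled out with an interface and a concrete class every time it comes up, whereas a better means of abstraction such as first-class functions turns the same idea into ordinary library code (java.util.function.UnaryOperator), so there is no longer a "pattern" to re-express:

```java
import java.util.function.UnaryOperator;

public class StrategyDemo {
    // Classic GoF Strategy: the recurring idea has to be spelled out
    // with an interface and a concrete class every time it is needed.
    interface PricingStrategy {
        double apply(double price);
    }

    static class HolidayDiscount implements PricingStrategy {
        public double apply(double price) { return price * 0.9; }
    }

    static double checkoutWithPattern(double price, PricingStrategy strategy) {
        return strategy.apply(price);
    }

    // With first-class functions the same idea is ordinary library code:
    // java.util.function.UnaryOperator already exists, so there is no
    // pattern left to re-express, only a lambda to pass.
    static double checkout(double price, UnaryOperator<Double> pricing) {
        return pricing.apply(price);
    }

    public static void main(String[] args) {
        System.out.println(checkoutWithPattern(100.0, new HolidayDiscount())); // 90.0
        System.out.println(checkout(100.0, p -> p * 0.9));                     // 90.0
    }
}
```

The point is not that the second version is always better; it is that once the language (or its standard library) can express the strategy directly, the "pattern" stops being something you have to hand-write on every use.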

Paul Graham does not think that patterns are evil by themselves. They are a symptom of languages that fall short in providing means of abstraction. In that respect he is almost certainly right. Whether we should use different languages because of that is, of course, another discussion.

The original poster of the question, on the other hand, is wrong: patterns are not symptoms of not having enough abstraction in your code; they are symptoms of not having enough means of abstraction in your language.


Patterns are really just a way of describing how things work; they are a way of classifying them. Are there some programs that overuse them? Sure. The biggest advantage of having patterns is that by classifying something as this or that, everyone is on the same page (assuming they have the knowledge to know what is being talked about). When you have a system with 10,000 lines of code, it becomes necessary to be able to quickly determine how something is going to work.

Does this mean you should always use patterns? No. Forcing things into a classification they don't fit will lead to problems, but you shouldn't shy away from patterns either.


I don't think patterns per se are the problem, but rather the fact that developers can learn patterns and then overapply them, or apply them in ways that are wildly inappropriate.

The use of patterns is something that experienced programmers just learn naturally. You've solved some problem X many times, you know what approach works, you use that approach because your skill and experience tell you it's appropriate. That's a pattern, and it's okay.

But it's equally possible for a programmer who's less skilled to find one way to do things and try to cram every problem they come across into that mold, because they don't know any other way. That's a pattern too, and it's evil.
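As a purely hypothetical illustration (not from the original answer), here is what that cramming can look like in Java: a Factory and a Strategy wheeled in because Strategy is the one hammer the author knows, even though the problem is trivial.

```java
// Hypothetical, deliberately over-engineered example: a Strategy plus a
// Factory pressed into service where a plain expression would do.
public class OverappliedPattern {
    interface ArithmeticStrategy {
        int execute(int a, int b);
    }

    static class AdditionStrategy implements ArithmeticStrategy {
        public int execute(int a, int b) { return a + b; }
    }

    static class ArithmeticStrategyFactory {
        static ArithmeticStrategy create(String name) {
            if ("add".equals(name)) return new AdditionStrategy();
            throw new IllegalArgumentException("unknown strategy: " + name);
        }
    }

    public static void main(String[] args) {
        // Three types and an indirection where "2 + 3" would have done.
        System.out.println(ArithmeticStrategyFactory.create("add").execute(2, 3)); // 5
    }
}
```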