Readers of this column may have noticed a certain predilection towards superannuated preoccupations. Like Software Verification (February 5, 2009). Like Structured Programming (April 22, 2009). On both of these topics I came out with a rather nuanced opinion, to put it euphemistically. Here I continue the tradition of the superannuated, except you’ll find me out-raving the consensus: I’ll argue that Object-Oriented Programming is not only the greatest thing since sliced bread, but even greater than what the consensus has been able to fathom.
To see why this is so, let’s step way back and note the often overlooked fact that programming requires thought. Deep thought. Sometimes so deep that it hurts. That this is my experience may say more about me than about programming. But listen to E.W. Dijkstra, who was a famously clever programmer. He was a student of physics in Leiden, where Lorentz and Ehrenfest had been professors of theoretical physics. He was ambitious and was being groomed for a career in theoretical physics. This was theoretical physics in the 1950s, when quantum electrodynamics was unfolding, with sanguine expectations for the unification of the four fundamental forces of nature. As a part-time job he was programming (what’s THAT?). Yet he decided to switch to programming because he found it a greater intellectual challenge [1].
Programming requires thought. Programming is done in a language. Now, some philosophers came up with the idea that language shapes thought. There’s a lot of controversy about that, but that’s caused by the mistake of applying the idea to natural languages. We may think that our mother tongue is very far removed from that of the Hottentots, but in the universe of all possible languages, natural and artificial, the natural ones all occupy the same small corner. Probably they all severely constrain thought, but in much the same way. In mathematics, however, a new notation (i.e. an addition to the language) may enable breakthroughs. For example, the philosopher Alfred North Whitehead (1861–1947) seems to have written
By relieving the brain of unnecessary work, a good notation sets it free to concentrate on more advanced problems.
Something to keep in mind when comparing existing programming languages or designing new ones.
Which brings me to object-oriented programming. Here the idea is to model an application from the ground up in terms of objects. In one widely used approach [2], one notes the nouns and the verbs appearing in an English description of the application. The nouns are then considered as candidates for the classes; the verbs as candidates for the methods. This is a bit like inventing a new notation. Is it a good one? At least it has the advantage that its methods and classes are adapted to the application.
Something more powerful is possible. For example, prototyping a compiler in Prolog is so much easier than it is in other languages that it looks like magic. Those familiar with Lisp or functional programming languages probably have plenty of examples of similar magic. From this I would just like to take away the notion that we aim at a more powerful aid to thinking than the nouns and verbs of a description in English. Whitehead would more likely have been excited by the phenomenon that some programming tasks are miraculously easier in one language than in another. Whatever makes this magic happen, it is likely to be at a higher level than words.
But is object-oriented programming restricted to translating the nouns and verbs of an English description into classes and methods? If we can model transactions, invoices, customers, and that kind of thing in an object-oriented way, why not also the key concepts that make a particular programming paradigm miraculously effective for the application?
Let us consider functional programming. In procedure-oriented languages, numbers can be (1) named, (2) assigned to variables, and (3) returned as function results. In these languages, functions are underprivileged in that they share with numbers property (1), but not (2) and (3). The motivation for functional programming languages is to accord functions all the privileges of numbers, and thus make them “first-class objects”, as the going terminology was in functional programming long before object-orientation hit the headlines.
It was only a matter of time for the consequences of this view to be worked out. In fact, it was done twenty years later by Felleisen and Friedman. It took the form of a one-page appendix to [3]. However, because Java lacked generics at the time, it only works with integers as the datatype. The lesson was that object-oriented programming may indeed be a way to implement functional programming, but that one needs generics. What about logic programming, a very different paradigm, and possibly complementary to functional programming? If it has not happened already, someone will soon analyze a high-level description of the datastructures and operations of SLD-resolution, the basis of logic programming, in the manner of [2]. He will duly note the important nouns and verbs in the English description and create classes and methods accordingly. The magic of logic programming, in the situations where it is effective, is then available to the user of these classes.
In this way we have available two suites of classes: one arising from functional programming and one from logic programming. Just as classes serve to organize what would otherwise be an unruly jumble of methods and globals, we need a higher level of organization to bring order to classes. Gamma, Helm, Johnson, and Vlissides called this level that of “frameworks” and distinguished it from design patterns:
When we wrote “Design Patterns” we were excited about frameworks and forecast that there would be lots of them in the near future. Time has shown, however, that frameworks aren’t for everyone. Developing a good framework is hard and requires a large up-front investment that does not always pay off. [4]
In the case of frameworks for functional and logic programming the hard work has already been done. The existing modest success of these ghetto-ized programming paradigms is a reasonable assurance of sufficient pay-off for corresponding frameworks.
This is my argument for using a high-level description in English of a promising programming paradigm and subjecting it to the standard object-oriented modeling treatment. The resulting object-oriented framework acts like a virtual machine tailored to the targeted programming paradigm. Such a framework is only interesting if it gives rise to efficient code. How efficient is “efficient”? My position on this is that, say, application programming in Java is efficient (one level of virtual machine), but using a virtual machine written in Java is not. Unfortunately, that leaves C++ as our only choice. As Felleisen and Friedman showed [3], generics are needed, and it is not clear that those of C++ are up to the task.
Here is another approach, one suggested, to me at least, by Paul Graham’s intriguing essay “The Roots of Lisp”. The best summary is Graham’s own:
In 1960 John McCarthy published a remarkable paper in which he did for programming something like what Euclid did for geometry. He showed how, given a handful of simple operators and a notation for functions, you can build a whole programming language. He called this language Lisp, for “List Processing”, because one of his key ideas was to use a simple data structure called a list for both code and data.
Graham continues by explaining the primitives. He implements them in Common Lisp.
For me the exercise is less convincing because of the worry that one may be cutting corners via the machinery of the underlying Common Lisp. More familiarity with Lisp than I have may be enough to dispel such suspicions. My unease would be completely laid to rest if one were to implement lists at a lower level, one that is not connected to Lisp. A way to make such a list implementation really painless would be to implement it as an abstract data type by means of a page or so containing a few class declarations in C++. The result would be an object-oriented framework ready for use in those situations arising in practice where functional programming and/or Lisp-ish tricks would make the programming task at hand miraculously easy to do.
The method followed by Graham in his resurrection of McCarthy’s paper is also interesting for logic programming. The method suggests asking whether logic programming can be construed as a handful of primitive operators on a simple datastructure. The answer is No. Or, maybe, Almost. Because one only needs to make a little sideways step, from logic programming to Prolog, and then we do have this desirable situation. Prolog is based on a simple datastructure and a handful of operators on it. The datastructure is the Rational Tree of Colmerauer. It may be argued that the Warren Abstract Machine, the standard implementation of Prolog, can be interpreted as an implementation of rational trees, but that would be to miss the point. The power of Lisp resides in the fact that the seven primitive operators of McCarthy are as easily and as naturally accessible from any user’s Lisp code as are the functions defined by the user. This reflexivity is lacking in Prolog.
Two of the commanding heights of the programming landscape, functional and logic programming, find in one of these two ways a natural basis in a good-enough object-oriented programming language. As for the third commanding height, state-oriented programming, why, that is endemic in any object-oriented programming language. That’s why I find myself mightily impressed by the magic of functional and logic programming, but even more by that of object-oriented programming.
[1] P.L. Frana. Oral history interview with Edsger W. Dijkstra. Charles Babbage Institute, University of Minnesota, Minneapolis.
[2] Rebecca Wirfs-Brock, Brian Wilkerson, and Lauren Wiener. “Designing Object-Oriented Software”. Prentice-Hall, 1990.
[3] Matthias Felleisen and Daniel P. Friedman. “A Little Java, a Few Patterns”. MIT Press, 1997.
[4] Erich Gamma. “Design patterns — ten years later”. In Manfred Broy and Ernest Denert, editors, Software Pioneers, pages 688–700. Springer-Verlag, 2002.