In my previous APP article I recounted how the Japanese Fifth-Generation Computer Systems (FGCS) project was launched with much fanfare in 1982 and came down in flames in 1992, taking logic programming with it. In this article I tell something about how logic programming came to be preferred over the more obvious choice of Lisp for FGCS. I’m interested in the phenomenon of people developing a fierce loyalty to one language or the other. By considering the early history of these languages I hope to give some insight into this phenomenon.
A note about terminology. Logic programming is a method: use logic clauses to represent procedures and data; use resolution inference as the computation step. Prolog is the name of the programming language that embodies these ideas. Both the method and the language played a central role in the FGCS project.
In the opening conference of 1981 Fuchi identified logic programming as the central software methodology for the project. In the closing conference in 1992, Furukawa recounted that in 1976 the research group at the Electro-Technical Laboratory in Tokyo (from where FGCS was later to be launched) had extensive experience with Lisp but was new to Prolog. As was just about everybody else: in 1976 the language had spread to only a few centres beyond Marseille.
Fuchi had been interested in logic programming [1, page 9] since reading Kowalski’s 1974 paper [7]. This probably caused Furukawa to be on the look-out for Prolog when he visited Stanford Research Institute (SRI) in Palo Alto in 1976. Indeed he did find a copy of the Fortran source of the first version of the Prolog interpreter to be distributed outside Marseille. In 1974 David H.D. Warren had brought a copy to Edinburgh. By 1976 Harry Barrow had taken a copy from Edinburgh to SRI. According to Warren [1, page 9] “Harry had not been able to get the system running himself, but gave the sources to Furukawa.” At the Electro-Technical Laboratory (ETL) in Tokyo a group including Furukawa started experimenting with Prolog in 1976. The US had acted as transmitter of the Prolog bug without being itself infected.
Of the activities at ETL Furukawa reports some Prolog programs for database indexing. Apparently the experience was sufficiently positive for ETL to acquire David H.D. Warren’s DEC-10 implementation of Prolog. Furukawa in [2]:
I wrote a program to solve the Rubik cube problem in DEC-10 Prolog. It ran efficiently and solved the problem in a relatively short time (around 20 seconds). It is a kind of expert system in which the inference engine is a Production System realized efficiently by a tail-recursive Prolog program. From this experience, I became convinced that Prolog was the language for knowledge information processing.
We can take this to mean that Furukawa had fallen in love with Prolog.
For a history of Prolog I turned to “The Birth of Prolog” by Alain Colmerauer and Philippe Roussel [3]. The first steps were taken in 1971 by a group in the University of Aix-Marseille led by Colmerauer. After several preliminary versions aimed solely at natural language processing, the “Final Version” was completed in 1973. Under the influence of Kowalski [7] clausal logic had become the language and a restricted form of resolution served as procedure call mechanism. This was the first version to go under the name “Prolog” and it was the first that was used for applications other than natural language processing. In early 1974 a copy of the source was taken to Edinburgh after a month’s visit by Warren to Marseille. From Edinburgh it rapidly spread to several European centres (Budapest, Leuven, and Lisbon were early recipients) and to Canada.
From 1974 to 1977 there was much joyous experimentation, following the lead of Marseille. In Lisbon Helder Coelho started collecting neat programs in a draft book called “How to Solve It in Prolog”. In 1975 Warren started writing a Prolog compiler in Prolog for the DEC-10. When this was completed in 1977, it turned out to be a spectacular improvement on the 1973 Marseille interpreter. Warren and his co-workers were emboldened to compare their implementation to a state-of-the-art Lisp compiler, with surprising results [6]. Prolog’s list reversal ran at 50% to 70% of the speed of Lisp. This is surprising because lists are translated to logic terms and processed by unification, just like any other term. Symbolic differentiation ran (depending on data) 1.1 to 2.6 times faster than in Lisp. In 1977, Lisp was 18 years old. At that time Prolog was six years old and only a handful of people had worked on its implementation between 1971 and 1977.
The examples that follow are not useful, but are chosen to convey the seductive nature of Prolog.
Consider a program for the predicate “move” such that move(X1,Y1,X2,Y2) means that a Knight can move on an empty chess board from square (X1,Y1) to square (X2,Y2).
With a suitable program, one can ask, for example:
?- move(U,U,V,V).
meaning: can a knight move from one diagonal square to another in a single jump? The answer is No. To the question
?- move(U,U,X,Y),move(X,Y,V,V).
meaning: can a knight move from one diagonal square to another in two jumps? The answer is Yes, together with, if required, all the possibilities for the variables. The program is:
move(X1,Y1,X2,Y2) :- diff1(X1,X2), diff2(Y1,Y2).
move(X1,Y1,X2,Y2) :- diff2(X1,X2), diff1(Y1,Y2).

% diff1(X,Y): the difference between coordinates X and Y is 1.
diff1(X,Y) :- succ(X,Y).
diff1(X,Y) :- succ(Y,X).

% diff2(X,Z): the difference between coordinates X and Z is 2.
diff2(X,Z) :- succ(X,Y), succ(Y,Z).
diff2(X,Z) :- succ(Z,Y), succ(Y,X).

% succ(X,Y): Y is the successor coordinate of coordinate X.
succ(1,2). succ(2,3). succ(3,4). succ(4,5). succ(5,6). succ(6,7). succ(7,8).
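The program can be exercised directly. For instance (a query of my own, not part of the original example), asking for the squares reachable in one jump from the corner square (1,1) gives, in a typical Prolog system, something like

?- move(1,1,X,Y).
X = 2, Y = 3 ;
X = 3, Y = 2 ;
no

The exact form of the answers differs from system to system, but the two solutions do not.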
As another example, consider the problem of the twenty-seven digits. This is solved by the lists
[1,9,1,6,1,8,2,5,7,2,6,9,2,5,8,4,7,6,3,5,4,9,3,8,7,4,3]
[1,9,1,2,1,8,2,4,6,2,7,9,4,5,8,6,3,4,7,5,3,9,6,8,3,5,7]
[1,8,1,9,1,5,2,6,7,2,8,5,2,9,6,4,7,5,3,8,4,6,3,9,7,4,3]
[3,4,7,9,3,6,4,8,3,5,7,4,6,9,2,5,8,2,7,6,2,5,1,9,1,8,1]
[7,5,3,8,6,9,3,5,7,4,3,6,8,5,4,9,7,2,6,4,2,8,1,2,1,9,1]
[3,4,7,8,3,9,4,5,3,6,7,4,8,5,2,9,6,2,7,5,2,8,1,6,1,9,1]
with the property that for each of the digits d = 1, …, 9 there are three occurrences of d that are d places apart. The solution requires the rather elaborate query
?- equals(List, [_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_]),
   sublist([9,_,_,_,_,_,_,_,_,_,9,_,_,_,_,_,_,_,_,_,9],List),
   sublist([8,_,_,_,_,_,_,_,_,8,_,_,_,_,_,_,_,_,8],List),
   sublist([7,_,_,_,_,_,_,_,7,_,_,_,_,_,_,_,7],List),
   sublist([6,_,_,_,_,_,_,6,_,_,_,_,_,_,6],List),
   sublist([5,_,_,_,_,_,5,_,_,_,_,_,5],List),
   sublist([4,_,_,_,_,4,_,_,_,_,4],List),
   sublist([3,_,_,_,3,_,_,_,3],List),
   sublist([2,_,_,2,_,_,2],List),
   sublist([1,_,1,_,1],List).
but the program is small, containing just a few general-purpose predicates:
% sublist(X,Y) iff all elements of X occur contiguously in Y
% in the same order as in X.
sublist([],List).
sublist([X|Xs],[X|Ys]) :- append(Xs,_,Ys).
sublist(List,[Y|Ys]) :- sublist(List,Ys).

append([],Y,Y).
append([U|X],Y,[U|Z]) :- append(X,Y,Z).

equals(X,X).
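To see the sublist predicate at work in isolation (a small check of my own, not part of the puzzle), one can test the pattern for the digit 2 against a hand-made list:

?- sublist([2,_,_,2,_,_,2], [1,2,3,4,2,5,6,2,7]).
yes

The answer is yes (or true, depending on the system) because the three occurrences of 2 in the second list are exactly two places apart.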
I think the early history of Prolog is remarkable: with the involvement of a handful of people, in six years from inception to an elegant, powerful, and efficient language, still one of the Big Four today, in the conceptual universe if not in mind share. The early history of Lisp is no less remarkable. In the 1950s McCarthy was searching for a language suitable for programming his Advice Taker project. Candidates were Newell and Simon’s IPL-V and the Fortran compiler then being developed by Backus’s group at IBM. IPL had the advantage of a flexible data structure in the form of linked lists. In other respects McCarthy preferred Fortran because of the generality of its nested expressions. Accordingly he experimented with FLPL (Fortran List Processing Language), a version of the Fortran compiler to which lists had been added. It must be remembered that at this time user-defined functions existed in Fortran only as an undocumented late feature and that function definitions were restricted to a single line [5, page 302]. It is not surprising, then, that McCarthy embarked on a list-processing language of his own. By November 1958 an interpreter was running at MIT. The transition to a form that we today recognize as Lisp came soon after, as a side-effect of writing a paper on the project.
One of the features to be checked off in the paper was the universality of the language: could everything computable indeed be computed by programs written in it? Showing a program for a universal Turing machine was one option. One can sympathize with McCarthy preferring the universal function of recursive function theory. Until then this theory had always been developed in the domain of the natural numbers. McCarthy saw that the numerical nature of the domain was not essential; that any denumerable domain would do. This led to a universal function for functions on symbolic expressions (“S-expressions”). The universal function took as input a description of any function and the description of its arguments. It produced as output the result of evaluating the application of the function to the arguments. To write such a universal function the syntax of function definitions had to be changed. It needed to be represented as data, that is, as an S-expression.
… This EVAL was written and published in the paper and Steve Russell said, look, why don’t I program this EVAL and you remember the interpreter, and I said to him, ho, ho, you’re confusing theory with practice, this EVAL is intended for reading not for computing. But he went ahead and did it. That is, he compiled the EVAL in my paper into 704 machine code fixing bugs and then advertised this as a Lisp interpreter which it certainly was, so at that point Lisp had essentially the form that it has today, the S-expression form …[Quoted by Stoyan [5] from a transcript of “History of Lisp”, a 1974 talk by McCarthy.]
In the paper McCarthy chose ATOM, EQ, CONS, CAR, CDR as a small set of operations for manipulating S-expressions. To these he added QUOTE and COND as logical functions. The status of LABEL and LAMBDA is intriguing: EVAL rewrites lists in which these occur as atoms in such a way that LABEL and LAMBDA are implemented in the process.
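To make this description concrete, here is a rough sketch of such an EVAL. To stay within the language of this article’s examples I write it in Prolog rather than in Lisp; it is my own illustration, not McCarthy’s definition. S-expressions are represented by Prolog atoms and lists, truth values by the atoms t and f, environments by lists of Name-Value pairs, and the environment is assumed never to bind the names of the primitives.

% eval(+Expression, +Environment, -Value): a sketch, not McCarthy's EVAL.
eval(X, Env, V)                :- atom(X), lookup(X, Env, V).
eval([quote, X], _, X).
eval([atom, X], Env, V)        :- eval(X, Env, A),
                                  ( A = [_|_] -> V = f ; V = t ).
eval([eq, X, Y], Env, V)       :- eval(X, Env, A), eval(Y, Env, B),
                                  ( A == B -> V = t ; V = f ).
eval([car, X], Env, V)         :- eval(X, Env, [V|_]).
eval([cdr, X], Env, V)         :- eval(X, Env, [_|V]).
eval([cons, X, Y], Env, [A|B]) :- eval(X, Env, A), eval(Y, Env, B).
eval([cond|Clauses], Env, V)   :- evcon(Clauses, Env, V).
% LAMBDA: evaluate the arguments, bind them to the parameters,
% evaluate the body in the extended environment.
eval([[lambda, Params, Body]|Args], Env, V) :-
    evlis(Args, Env, Vals),
    bind(Params, Vals, Env, Env1),
    eval(Body, Env1, V).
% LABEL: make the name visible in the body, so that recursion works.
eval([[label, Name, Lam]|Args], Env, V) :-
    eval([Lam|Args], [Name-[label, Name, Lam]|Env], V).
% A named function: look up its definition and apply it.
eval([F|Args], Env, V) :-
    atom(F), lookup(F, Env, Def), eval([Def|Args], Env, V).

% evcon: evaluate the first clause whose test evaluates to t.
evcon([[Test, Expr]|Rest], Env, V) :-
    eval(Test, Env, T),
    ( T = t -> eval(Expr, Env, V) ; evcon(Rest, Env, V) ).

% evlis: evaluate a list of argument expressions.
evlis([], _, []).
evlis([E|Es], Env, [V|Vs]) :- eval(E, Env, V), evlis(Es, Env, Vs).

% bind: push parameter-value pairs onto the environment.
bind([], [], Env, Env).
bind([P|Ps], [V|Vs], Env, Env1) :- bind(Ps, Vs, [P-V|Env], Env1).

lookup(X, [X-V|_], V).
lookup(X, [_|Rest], V) :- lookup(X, Rest, V).

With this sketch loaded, queries such as

?- eval([car, [quote, [a,b,c]]], [], V).
?- eval([[lambda, [x,y], [cons, x, y]], [quote, a], [quote, [b]]], [], V).

bind V to a and to [a,b] respectively.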
We can view McCarthy’s universal function as an elegant axiomatization of the concept of computation [8]. What started as a theoretical exercise became Lisp itself. The universal function forced the language to change so that not only data but also functions were expressed as lists. Due to Russell’s insight and programming skill, the universal function became the interpreter.
To answer the question of why people fall in love with Prolog I could not think of anything better than to show some neat examples. There is a better method to do the same for Lisp: read the universal function EVAL with its inseparable twin APPLY. This is indeed what happens early on in expositions of Lisp. It is the pons asinorum: my guess is that those who drop out of the course do so at this point. Once the bridge has been crossed the novices sail on to the end of the course and remain Lisp devotees to the end of their lives.
The relative merits of Prolog and Lisp can be, and have been, endlessly debated. In such debates we should ignore the products of people who write Fortran in Lisp or who use Prolog as Pascal plus pattern matching. In short, we should ignore what McCarthy presumably means by “pornographic programming” [4, page 220]. In such debates only code should be considered that is written with the “Zen of the language” in mind. Characterizing the Zen of Prolog is perhaps better left to another occasion. For Lisp I would do it in the following way.
From a contemporary point of view, what McCarthy has done is to define a virtual machine. When he did this, it was a paper exercise, solely for expository purposes. As a result he was free to make the machine as simple as was possible from a logical point of view: seven primitives acting on lists (let’s ignore here the distinction between S-expressions and lists) consisting of nothing more than a head and a tail. The amazing thing is that it led to a language that was already feasible at the time. Intriguingly, the utterly simple machine has only become more feasible over the decades, as memories expanded and processors became faster. On this virtual machine is built the EVAL/APPLY interpreter found in every introductory Lisp course.
Once the novice has acquired the Zen of Lisp it is natural to write any program as an extension to the limited capabilities of the original interpreter. The seven primitives are not just virtual machine instructions to be buried and hidden away from sight after the interpreter is completed. On the contrary, they occur frequently in user code. Hilbert said to the opponents of set theory: “No one will drive us from the paradise that Cantor created for us.” Lispers feel the same way about the paradise created by McCarthy.
The FGCS project was started by people familiar with Lisp who had recently encountered Prolog as an intriguing novelty. The vision (or perhaps mirage) of a new generation of computers based on inference and enabled by parallelism tilted the balance toward Prolog. Had the choice gone the other way, the fate of the FGCS project would not have affected the fortunes of Lisp: that language was already thoroughly established. As history played out, the fatal choice caused the fragile early promise of Prolog to be crushed by the failure of the FGCS project to reach its stated objectives.
[1] “A View of the Fifth Generation and Its Impact” by D.H.D. Warren, Technical Note 265, SRI International, July 1982.
[2] “Launching the New Era”, several authors, contribution by K. Furukawa, Proc. FGCS92, Comm. ACM, March 1993, Vol.36, No.3, pp. 60-65.
[3] “The Birth of Prolog” by A. Colmerauer and P. Roussel. Second ACM SIGPLAN conference on history of programming languages, pp. 37-52, 1993.
[4] “History of Lisp” by John McCarthy. ACM SIGPLAN Notices, vol. 13, no. 8, Aug. 1978; pp 217-223.
[5] “Early LISP history (1956 – 1959)” by Herbert Stoyan. Proceedings of the 1984 ACM Symposium on LISP and functional programming, pp 299-310, 1984.
[6] “Prolog — the language and its implementation compared to Lisp” by D.H.D. Warren, Luis M. Pereira, and Fernando Pereira. SIGPLAN Notices vol. 12, 1977.
[7] “Predicate logic as programming language” by Robert Kowalski. Proceedings of IFIP 1974, North-Holland, pp. 569-574.
[8] “The Roots of Lisp” by Paul Graham. http://www.paulgraham.com/rootsoflisp.html