Python slams into its exponential wall

April 6, 2014

My first Python sighting was around 1999, in the part of the building where the hackers hang out. Somebody had a poster on the door saying something like “If you had used Python, you would have been done by now.”

Next stop: 2005. Since 1986 I had been a fan of “Structure and Interpretation of Computer Programs” by Harold Abelson and Gerald Sussman. At the University of Waterloo it was only in a fourth-year topics course that this book could make a brief appearance. I was envious of MIT, where masses of first-year students took EECS 6.001, in which Abelson and Sussman was used from day one. That was class. In 2005 I heard the sad news that, after two glorious decades, EECS 6.001 was closed down, replaced by a course whose text was … a book written for high-school students. Maybe EECS wouldn’t have picked that book if its language had been BASIC. Perhaps it had to do with the book’s choice of Python. It was from this news item that I learned that Python is a programming language.


Boltzmann’s Brood

March 8, 2014

When Ludwig Boltzmann (physicist, 1844–1906) was criticized for his ugly brute-force calculations, his defence was: “Elegance should be the concern of shoemakers and tailors”. The criticism was of course not about elegance in the sense of fashion. It was about the fact that an elegant structure would have had the power to convince the reader of whatever the calculations were intended to demonstrate.

I think this episode has something to do with the fact that much of what we do on computers depends on software that, among other weaknesses, is in need of frequent and urgent “security updates”. Those who criticize this state of affairs are dismissed as unrealistic. To be realistic is to understand that any useful piece of software consists of millions of lines of code written by many programmers, mostly no longer traceable and certainly not accountable. No one has a clear view of the system as a whole; there is no description that enables anyone to gain such a view. The realists of today, who accept this state of affairs as inevitable, are of Boltzmann’s brood.


The H-bomb and birth of the computer, part III: triumph of brains and slide-rules

April 2, 2013

The question that led to von Neumann’s involvement with Edvac in 1944-1945 was whether Teller’s design (the “Super”) for a hydrogen bomb was feasible. The question that von Neumann wanted to settle was whether in Teller’s design a fission explosion causes a self-sustained propagation of fusion in fusible material.

When Eniac was ready for its first trials in December 1945, von Neumann had convinced its owner, the Ballistics Research Laboratory, to give priority to the ultra-secret computation from Los Alamos. Only the researchers from Los Alamos, Metropolis and Frankel, had the requisite security clearance to know the subject of the computation. The personnel needed to help operate the Eniac did not. The problem was finessed by ruling that only the equations and the data were demoted from their lofty top-secret classification. Eniac performed splendidly, all 18,000 tubes working in unison for a sufficient proportion of the time to get the computation completed in six weeks.


The H-bomb and the computer, part II: the revolution that almost didn’t happen

April 2, 2013

In spite of the tumultuous development of computers, the architecture, in the form of the fetch-execute cycle, has remained the same from the EDVAC design of 1945 to the present day. And we are used to calling this basic architecture the “von Neumann machine”. This makes John von Neumann a sort of patron saint of our field.

Not everyone accords this status to the great mathematician who lived from 1903 to 1957. One of my favourite architecture books refuses [5, page 32] to use the term because of the supposedly equal contributions by J. Presper Eckert and John Mauchly. Many of those who do accord full credit to von Neumann for the architecture invoke the name in the pejorative sense of “von Neumann bottleneck” and suggest that the architecture has hindered rather than helped the development of computers.

In this essay I review publications that shed light on the origin of the computer and conclude that it was von Neumann who made the critical contributions in 1944 and 1945. In addition I will argue that it was von Neumann’s basic architecture that propelled the computer along its miraculous trajectory, covering three orders of magnitude in size and five in cost and processor speed. Finally I will reflect on the fact that von Neumann was not only an extraordinary genius, but also brought together in his background a most unusual combination of disciplines — a combination that was essential to the birth of the computer. Without this fortuitous confluence of circumstances the development of computers in the period 1950 – 2000 would not have had the explosive character that we actually experienced. Hence “The Revolution That Almost Didn’t Happen.”
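The fetch-execute cycle referred to above can be sketched as a toy interpreter. The instruction set here (LOAD/ADD/STORE/HALT) and its encoding are invented for illustration and correspond to no historical machine; the point is only the loop itself, and the fact that instructions and data share one memory:

```python
# Toy illustration of the fetch-execute cycle: a made-up accumulator
# machine whose instructions live in the same memory as its data.
def run(memory):
    acc, pc = 0, 0
    while True:
        op, arg = memory[pc]   # fetch the instruction at the program counter
        pc += 1                # advance to the next instruction
        if op == "LOAD":       # acc := memory[arg]
            acc = memory[arg]
        elif op == "ADD":      # acc := acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":    # memory[arg] := acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Compute memory[6] := memory[4] + memory[5].
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(mem)[6])  # 5
```

Everything that distinguishes one stored program from another lives in `memory`; the cycle itself never changes, which is the sense in which the architecture has remained the same since 1945.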


The H-bomb and the birth of the computer, Part I: Edward Teller’s obsession

April 2, 2013

In 1945 the stored-program computer was invented; by 1950 every country wanted computers. None were for sale, so every country was trying to build them. Not just the big players, like the US and Britain; by 1955 computers had also been built in Switzerland, Russia, the Netherlands, Sweden, and Belgium. At the origin of this wave of enthusiasm was J. von Neumann’s prestige and advocacy. In 1945 he had not only been the first to describe how to build a computer in the modern sense, but he was also infused with the conviction that this was nothing less than a new, universal research instrument with the potential to revolutionize science. As a result he wanted his design to become as widely known as possible, as soon as possible.

How von Neumann got involved in electronic computing will be the topic of the next instalment. Here I first want to recount why he got involved. It had to do with the unholy alliance of science and war.


Flowcharts, the once and future programming language

April 8, 2012

Job-control language, punched cards, and flowcharts are such relics of the past that those who remember them are starting to die off. There are also research goals of that era that are long forgotten, not because they have been reached or found not useful, but because they are, well, forgotten. In this essay I consider the ancient research goal of developing code and proof of correctness in parallel. I find that flowcharts are a good starting point, provided one goes back much further than job-control language.
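To give a flavour of what “developing code and proof of correctness in parallel” can mean (the example is mine, not taken from the essay): in the flowchart tradition of Floyd, each point in the chart carries an assertion that must hold whenever control passes through it. Executable assertions can play that role in code:

```python
# Integer division by repeated subtraction, annotated in the style of
# flowchart verification: each assert stands for an assertion attached
# to a point in the flowchart.
def divide(a, b):
    assert a >= 0 and b > 0                  # precondition
    q, r = 0, a
    while r >= b:
        assert a == q * b + r and r >= 0     # loop invariant
        q, r = q + 1, r - b
    assert a == q * b + r and 0 <= r < b     # postcondition
    return q, r

print(divide(17, 5))  # (3, 2)
```

The invariant is chosen before the loop body is finalized; code and proof obligations then grow together, which is the research goal the essay revisits.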


McCarthy’s recipe for a programming language

October 31, 2011

On October 23, 2011 John McCarthy died. He is remembered as the inventor of Lisp, for contributions to the theory of computation, and for contributions to artificial intelligence. In this essay I argue that he did more than invent Lisp, more even than originate the concept of high-level programming: he showed, to those willing to see it, a recipe for designing high-level languages. I support my argument by turning the clock back to 1950 and imagining how McCarthy’s recipe could have been followed at that point in time. By taking this unusual point of view I hope to bring out more clearly McCarthy’s contribution to programming languages.
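The heart of McCarthy’s recipe was to define a language by writing a short evaluator for it. A minimal sketch of that idea in Python (my own illustration, covering only a token handful of forms, not McCarthy’s actual eval) looks like this:

```python
# A toy evaluator in the spirit of McCarthy's eval: expressions are
# nested lists, environments are plain dicts.
def evaluate(expr, env):
    if isinstance(expr, str):          # a string is a variable reference
        return env[expr]
    if not isinstance(expr, list):     # anything else atomic is a constant
        return expr
    op, *args = expr
    if op == "quote":                  # (quote e) -> e, unevaluated
        return args[0]
    if op == "if":                     # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                 # (lambda (params) body) -> closure
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)             # otherwise: function application
    return fn(*(evaluate(a, env) for a in args))

env = {"add": lambda x, y: x + y, "t": True}
print(evaluate(["add", 1, ["if", "t", 2, 3]], env))  # 3
```

Once such an evaluator exists, the language is defined by a page of code rather than by a manual; that compactness is what makes it a recipe rather than a one-off invention.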

Scruffies and Neats in Artificial Intelligence

September 11, 2011

In a previous essay [0] I traced the Lighthill Affair to the tension between the scruffies and the neats in Artificial Intelligence. As a reminder, here is the official [1] definition of these terms:

“… the neats — those who think that AI theories should be grounded in mathematical rigor — versus the scruffies — those who would rather try out lots of ideas, write some programs, and then assess what seems to be working.”

For a characterization of only a few lines, this is a pretty good one. But I think it only scratches the surface. In this essay I will explore this contrast in temperament and attitude, which exists along several dimensions and is found elsewhere in science.


The MIT style in Artificial Intelligence 1958 – 1985

June 7, 2011

In a previous article I illustrated the tension in Artificial Intelligence (AI) between two mentalities, generally referred to as “scruffy” and “neat”. I started by accepting the characterization by Russell and Norvig [1].

“… the neats — those who think that AI theories should be grounded in mathematical rigor — versus the scruffies — those who would rather try out lots of ideas, write some programs, and then assess what seems to be working.”

Critics of my article have opened up so many dimensions of scruffy versus neat that it is hard to know where to start. From my disorganized thoughts one recurrent theme emerged: the distinctive MIT style during the formative years of AI.


From the Chronicles of Scruffy Versus Neat: the Lighthill Affair

February 18, 2011

The 1973 Lighthill Affair was an Affair in the sense of the Dreyfus and Profumo Affairs. And although it was scaled down to teacup size, it was big enough to make it into a textbook published twenty years later:

… the Lighthill Report, which formed the basis for the decision by the British government to end support of AI research in all but two universities. (Oral tradition paints a somewhat different and more colorful picture, with political ambitions and personal animosities that cannot be put into print.) [1]

In this article I will put in print some of the things hinted at here, and elaborate on the issues that have remained topical.


