Archive for the ‘Uncategorized’ Category

Conceptual Integrity: why it matters and how to get it

September 8, 2017

In “No Silver Bullet” [1] Frederick Brooks addresses the intriguing question of why some programming languages garner fanatical adherents while others are merely tolerated by their users. Brooks’s answer is that the critical criterion is whether a language has, in his words, “conceptual integrity”. In this article I try to nail down this nebulous concept and see how it can be used as a guide in language design.

“The unreasonable effectiveness of mathematics in the design and implementation of software systems”

July 22, 2017

Alas! If only the current state of affairs were such that a future scholar would feel impelled to write a paper with this title, just as in 1960 Eugene Wigner wrote his widely quoted “The unreasonable effectiveness of mathematics in the natural sciences”. In the hope that mathematics will at some future time improve the sorry state of software, let us consider how it came to be “unreasonably” (i.e. surprisingly, mysteriously) effective in the natural sciences.

Explosive Knowledge: Cryptology in the 20th Century

March 28, 2017

In August 1960 the Pentagon announced that William Martin and Bernon Mitchell had not returned from vacation and said “there is a likelihood that they have gone behind the Iron Curtain”. On September 6 they appeared at a joint news conference at the House of Journalists in Moscow and announced they had requested asylum and Soviet citizenship. They revealed that they had worked for the National Security Agency (NSA). In this way the mission and activities of the NSA were made public for the first time [1]. Although these activities are much more wide-ranging than cryptology, this post will only be concerned with that small part.

All branches of knowledge developed vigorously in the first half of the 20th century, sustained by what I like to call a conversation: an open exchange of knowledge in books and journals. Before World War I this was also true for cryptology; afterwards, traffic on that channel fell silent. By the end of the 20th century the cryptology conversation was intense, wide-ranging, and immensely productive of innovations, of which bitcoin technology is but one example. In this post I trace the chain of events that led cryptology from its dark age, which lasted from 1918 to 1967, to its renaissance. Unless otherwise noted, my material comes from Crypto, a book by Steven Levy published in 2001 [2].

Alan Robinson

September 16, 2016

On August 5, 2016 John Alan Robinson died. In him a great scientist departed, and I mourn a dear friend. His great discovery was the resolution principle in mathematical logic, a discovery that capped two decades of research. In its turn, it spawned a plethora of new developments in computer programming. It became important enough in Artificial Intelligence to become controversial.
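
To give a flavour of the rule itself, here is a minimal sketch in Python, confined to propositional logic; everything in it is my own illustration, since Robinson’s actual principle adds unification to handle first-order logic.

    def resolve(clause_a, clause_b):
        """Binary resolution on propositional clauses.

        A clause is a frozenset of literals; a literal is a string,
        with negation marked by a leading '~'.
        """
        resolvents = []
        for lit in clause_a:
            complement = lit[1:] if lit.startswith("~") else "~" + lit
            if complement in clause_b:
                # Cancel the complementary pair, merge what remains.
                resolvents.append((clause_a - {lit}) | (clause_b - {complement}))
        return resolvents

    # From (P or Q) and (not-Q or R), resolution derives (P or R).
    print(resolve(frozenset({"P", "Q"}), frozenset({"~Q", "R"})))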

Wildcat Crypto

May 13, 2016

With his “Pretty Good Privacy” (PGP), Phil Zimmermann launched the era of what I call Wildcat Crypto. Up to that point discussions on cryptology had focussed on whether DES, the US government standard, was secure and whether government involvement in its development might have jeopardized its security. To avoid government involvement, PGP used IDEA as an alternative to DES for its bulk encryption. Yet IDEA is a close relative of DES and shares its weakness: small, constant-size blocks. True Wildcat Crypto calls for a radical departure in the form of blocks that are much larger and, moreover, vary in length under control of the key. In this article I trace the relevant history and outline my implementation of such a radical alternative.
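
To make the idea of key-controlled block lengths concrete, here is a minimal sketch in Python. The SHA-256 length schedule and all of the parameters are my own assumptions for illustration; this is not the implementation the article describes.

    import hashlib

    def key_block_lengths(key: bytes, total: int, lo: int = 64, hi: int = 256):
        """Split `total` bytes into blocks whose lengths, between `lo` and
        `hi`, are derived from the key, so that without the key one cannot
        even tell where one block ends and the next begins."""
        lengths, covered, counter = [], 0, 0
        while covered < total:
            digest = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            n = lo + int.from_bytes(digest[:4], "big") % (hi - lo + 1)
            n = min(n, total - covered)  # the last block absorbs the remainder
            lengths.append(n)
            covered += n
            counter += 1
        return lengths

    # Different keys induce different block boundaries over the same message.
    print(key_block_lengths(b"key-one", 1000))
    print(key_block_lengths(b"key-two", 1000))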

IA or AI?

October 9, 2015

I’ve been reading a paper [1] written two decades ago, which itself is an account of events two decades before that. It is Alan Kay nominally writing about the history of a programming language called Smalltalk. I say “nominally” because already by Smalltalk 72, Kay was losing interest in language matters and wanted to concentrate on what the language, as part of the computer-as-medium, could do for thought. Along with various other sources of inspiration that led to Smalltalk (Sutherland’s Sketchpad, the Burroughs B220, the Burroughs B5000, …) Kay mentions attending in 1968 the conference presentation by Douglas Engelbart [2] that has since become known as the “Mother of All Demos”.

In 1968, when timesharing by users behind teletype terminals was regarded as avant-garde, Engelbart gave a demo that featured a number of firsts: a screen display for both text and graphics, interactive text-editing, a mouse. All of it was integrated into a fluidly handled medium. Kay gives testimony to the huge impression this made on him: “Engelbart was a prophet of Biblical proportions”.

The H-bomb and the birth of the computer, part III: triumph of brains and slide-rules

April 2, 2013

The question that led to von Neumann’s involvement with EDVAC in 1944-1945 was whether Teller’s design (the “Super”) for a hydrogen bomb was feasible. Specifically, von Neumann wanted to settle whether in Teller’s design a fission explosion would cause a self-sustained propagation of fusion in the fusible material.

When ENIAC was ready for its first trials in December 1945, von Neumann had convinced its owner, the Ballistic Research Laboratory, to give priority to the ultra-secret computation from Los Alamos. Only the researchers from Los Alamos, Metropolis and Frankel, had the requisite security clearance to know the subject of the computation; the personnel needed to help operate the ENIAC did not. The problem was finessed by a ruling that demoted the equations and the data, but only those, from their lofty top-secret classification. ENIAC performed splendidly, all 18,000 tubes working in unison for a sufficient proportion of the time to get the computation completed in six weeks.

The H-bomb and the computer, part II: the revolution that almost didn’t happen

April 2, 2013

In spite of the tumultuous development of computers, the basic architecture, built around the fetch-execute cycle, has remained the same from the EDVAC design of 1945 to the present day. And we have come to call this basic architecture the “von Neumann machine”. This makes John von Neumann a sort of patron saint of our field.
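
The cycle itself can be sketched in a few lines. The toy machine below is my own illustration, with an invented three-instruction repertoire rather than EDVAC’s actual order code; what it shares with the 1945 design is the essential point: instructions and data live in one memory, and the processor endlessly fetches and executes.

    # Toy von Neumann machine: instructions and data share one memory,
    # and the processor loops through fetch, decode, execute.
    memory = [
        ("LOAD", 4),   # acc <- memory[4]
        ("ADD", 5),    # acc <- acc + memory[5]
        ("HALT", 0),
        None,          # unused cell
        2,             # data
        3,             # data
    ]

    acc, pc = 0, 0
    while True:
        op, addr = memory[pc]   # fetch the next instruction
        pc += 1
        if op == "LOAD":        # decode and execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "HALT":
            break
    print(acc)  # -> 5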

Not everyone accords this status to the great mathematician who lived from 1903 to 1957. One of my favourite architecture books refuses [5, page 32] to use the term because of the supposedly equal contributions by J. Presper Eckert and John Mauchly. Many of those who do accord full credit to von Neumann for the architecture invoke the name in the pejorative sense of “von Neumann bottleneck” and suggest that the architecture has hindered rather than helped the development of computers.

In this essay I review publications that shed light on the origin of the computer and conclude that it was von Neumann who made the critical contributions in 1944 and 1945. In addition I will argue that von Neumann’s was the basic architecture that propelled the computer along its miraculous trajectory, covering three orders of magnitude in size and five in cost and processor speed. Finally I will reflect on the fact that von Neumann was not only an extraordinary genius, but also brought to the task a most unusual combination of disciplines — a combination that was essential to the birth of the computer. Without this fortuitous confluence of circumstances the development of computers in the period 1950 – 2000 would not have had the explosive character that we actually experienced. Hence “The Revolution That Almost Didn’t Happen.”

The H-bomb and the birth of the computer, Part I: Edward Teller’s obsession

April 2, 2013

In 1945 the stored-program computer was invented; by 1950 every country wanted computers. None were for sale, so every country was trying to build them. Not just the big players, like the US and Britain; by 1955 computers had also been built in Switzerland, Russia, the Netherlands, Sweden, and Belgium. At the origin of this wave of enthusiasm were J. von Neumann’s prestige and advocacy. In 1945 he had not only been the first to describe how to build a computer in the modern sense, but he was also imbued with the conviction that this was nothing less than a new, universal research instrument with the potential of revolutionizing science. As a result he wanted his design to become as widely known as possible, as soon as possible.

How von Neumann got involved in electronic computing will be the topic of the next instalment. Here I first want to recount why he got involved. It had to do with the unholy alliance of science and war.

Scruffies and Neats in Artificial Intelligence

September 11, 2011

In a previous essay [0] I traced the Lighthill Affair to the tension between the scruffies and the neats in Artificial Intelligence. As a reminder, here is the official [1] definition of these terms:

“… the neats — those who think that AI theories should be grounded in mathematical rigor — versus the scruffies — those who would rather try out lots of ideas, write some programs, and then assess what seems to be working.”

For something only a few lines long, this is a pretty good characterization. But I think it only scratches the surface. In this essay I will explore this contrast in temperament and attitude, which runs along several dimensions and is found elsewhere in science.
