Sunday, October 15, 2006

something from computer science.

i like reading essays from a particular author/programmer/"startupper" by the name of paul graham.

irrationally, i distrust capitalists and the rich, but any hacker who can use the word "isomorphic" colloquially and correctly is all right in my book; paul graham does exactly that in his essays, so he is all right.

it could also be that he writes essays that i want to believe, like "why nerds are unpopular" (an uplifting piece) and "what you'll wish you'd known" (something like a speech).

anyways, here are two excerpts from his essay, "the hundred-year language":

Any programming language can be divided into two parts: some set of fundamental operators that play the role of axioms, and the rest of the language, which could in principle be written in terms of these fundamental operators.

I think the fundamental operators are the most important factor in a language's long term survival. The rest you can change. It's like the rule that in buying a house you should consider location first of all. Everything else you can fix later, but you can't fix the location.

I think it's important not just that the axioms be well chosen, but that there be few of them. Mathematicians have always felt this way about axioms-- the fewer, the better-- and I think they're onto something.
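to make "the rest of the language could in principle be written in terms of these fundamental operators" concrete, here's a tiny sketch of my own (in python rather than lisp, and only an illustration of the idea, not anything paul actually wrote): a let-binding and a while-loop rebuilt out of nothing but function application and recursion.

def let(value, body):
    # "let x = value in body" is just calling a one-argument function
    return body(value)

print(let(5, lambda x: x * x))  # 25

def while_(cond, step, state):
    # a while-loop rebuilt from recursion: keep applying `step`
    # to the state as long as `cond` holds
    return state if not cond(state) else while_(cond, step, step(state))

# count down from 10 to 0 without any built-in loop construct
print(while_(lambda n: n > 0, lambda n: n - 1, 10))  # 0

the point being: the loop and the binding are conveniences layered on top of the "axioms" (here, functions and application); you could strip them out of the language and lose nothing but comfort.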


so paul knows axioms! dandy! but this second bit seems to me a little insightful.

Languages evolve slowly because they're not really technologies. Languages are notation. A program is a formal description of the problem you want a computer to solve for you. So the rate of evolution in programming languages is more like the rate of evolution in mathematical notation than, say, transportation or communications. Mathematical notation does evolve, but not with the giant leaps you see in technology.

i can't resist another excerpt. as you can imagine, i didn't finish the essay before posting this .. but this is the last, i promise.

it's worth reading because to mathematicians, numbers are numbers, and what paul describes is something like the ordinals from set theory. but from a programmer's perspective .. mmmmrrrppgh.

There are more shocking prospects even than that. The Lisp that McCarthy described in 1960, for example, didn't have numbers. Logically, you don't need to have a separate notion of numbers, because you can represent them as lists: the integer n could be represented as a list of n elements. You can do math this way. It's just unbearably inefficient.
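just to see how ugly/beautiful that is in practice, here's a little sketch of my own (python, not mccarthy's lisp, and purely illustrative): the integer n is a list of n placeholder elements, addition is concatenation, and multiplication is repeated addition.

def to_num(n):
    # represent the integer n as a list of n elements
    return [()] * n

def from_num(xs):
    # read the integer back out: it's just the length
    return len(xs)

def add(xs, ys):
    # addition is list concatenation
    return xs + ys

def mul(xs, ys):
    # multiplication: one copy of ys for every element of xs
    out = []
    for _ in xs:
        out = add(out, ys)
    return out

print(from_num(add(to_num(2), to_num(3))))  # 5
print(from_num(mul(to_num(2), to_num(3))))  # 6

it works, exactly as paul says, and it is exactly as unbearably inefficient as he says: adding two numbers near a billion means building a two-billion-element list.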
