In comparing C to lisp, one of the first things that comes to mind is that lisp has closures, whereas C does not. This is an example of the expressive "power" of lisp over C. In a similar fashion, I am very interested in hearing what lisp cannot express; or, at least, what it cannot easily express.
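To make the closure point concrete, here's roughly what I mean (sketched in Haskell rather than Lisp, purely for illustration): a function that captures a variable from its defining environment, something C can only imitate by hand-rolling a struct of function pointer plus saved data.

    -- addN returns a function that captures n from its enclosing scope;
    -- that captured environment is what makes it a closure.
    addN :: Int -> (Int -> Int)
    addN n = \x -> x + n

    main :: IO ()
    main = print (map (addN 3) [1, 2, 3])   -- prints [4,5,6]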
Sure, Qi was implemented in CL, but that doesn't mean Lisp isn't blub compared to Qi. Think Greenspun's Tenth Rule, but instead of C and Lisp, Lisp and Qi respectively.
(Too bad I'm posting 11 days late... I would have liked to have heard a rebuttal.)
An interesting article is "The Origins of the Turing Thesis Myth", which explains why it's a myth that a Turing machine can do anything a computer can.
The quick summary is that Turing machines can compute any algorithmic function. However, real computers do more than compute functions, and today's applications cannot be modeled by Turing Machines.
And this is what comes of trying to be flippant around people who know more than I do :)
That's a very interesting paper. I hadn't really thought about the theoretical ramifications of everything we let computers do these days... On the "I agree" front, there really isn't much to say.
Though I'm not convinced that it's strictly untrue that "all computable problems are function-based." Take the robotic car, for instance: you could model it as a sequence of functions being called with new inputs, where each call represents the motion of the car over one "tick," which could be as short or as long as you like. And if there's a hole in that argument (which there may well be), then if worst comes to worst, we can use quantum mechanics to model the section of the universe containing the computer running the program and obtain a mathematical description that way :)
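A sketch of the tick idea, in Haskell, with made-up types (CarState, Sensors, step, and run are all hypothetical names, not anything from the paper):

    data Sensors  = Sensors  { obstacleAhead :: Bool }
    data CarState = CarState { speed :: Double }

    -- One "tick" of the car: a pure function from the old state and the
    -- new sensor readings to the new state.
    step :: CarState -> Sensors -> CarState
    step s sensors
      | obstacleAhead sensors = s { speed = 0 }
      | otherwise             = s { speed = speed s + 1 }

    -- Driving over a whole stream of readings is then just a fold.
    run :: CarState -> [Sensors] -> CarState
    run = foldl step

    main :: IO ()
    main = print (speed (run (CarState 0) [Sensors False, Sensors False, Sensors True]))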
Eh, but the problem is the assumption that the world itself cannot be modelled as part of the tape that the Turing Machine eats.
From a quick glance through the paper and the LtU comments it seems that its point is that interactive I/O cannot be modelled by the Turing Machine.
But as I've learned in Haskell, I/O can itself be treated mathematically, specifically with monads: the world before an I/O event is the value fed into a function, and the world after the I/O event is the value it returns; the IO monad just threads that world through. And I'm pretty sure that can itself be modeled by a TM: the world-state can be represented by a part of the tape that the TM gets to and modifies, just like any other function-to-TM mapping.
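Concretely, the toy picture I mean is something like this (a sketch only, not how GHC actually implements IO; World, WorldIO, and andThen are names I just made up):

    -- Pretend the outside world is a value we can pass around.
    type World = [String]                   -- stand-in for "the rest of the universe"

    -- An "I/O action" is a function from the world before the event
    -- to a result plus the world after the event.
    type WorldIO a = World -> (a, World)

    -- Sequencing two actions threads the world through, which is
    -- essentially what the monad's bind operation does.
    andThen :: WorldIO a -> (a -> WorldIO b) -> WorldIO b
    andThen act k = \world ->
      let (x, world') = act world
      in  k x world'

    -- A fake "read a line" that consumes one line from the world.
    readLine :: WorldIO String
    readLine (l:rest) = (l, rest)
    readLine []       = ("", [])

    -- Example: read one line and measure its length, starting from a toy world.
    example :: WorldIO Int
    example = readLine `andThen` \l -> \world -> (length l, world)

    main :: IO ()
    main = print (example ["hello", "world"])

And that whole world-value is exactly the sort of thing you could lay out on a piece of the tape.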
The problem is that the paper uses words like 'model' and 'function-based' rather vaguely. You can model I/O with a TM, but you can't actually do it, which is what they're getting at.
Really? Can you list the lisps (common lisps, schemes, whatever) that let you do this kind of stuff? And tell me why the ones that don't aren't "serious", according to you?
So, according to you, Python isn't "general-purpose" and Haskell isn't "serious"? I could list many more languages here, of course, but my point is obvious.
It's useful to be able to do low-level bit-flipping stuff, I don't deny that; and I am suitably impressed by the number of lisps that have put thought into allowing it. But just because a language lacks such capabilities doesn't mean I'd rather use C, or even Lisp! And often, if I'm really interested in low-level work, I'm also intimately concerned with things like speed and memory usage that I can't necessarily address even in a highly optimized lisp. For example, all lisps need a GC (and what if I want to write a GC? I'd rather not have my GC itself need a GC, and I don't know the internals of, say, SBCL well enough to write a GC in it that needs no GCing, the way the writers of T did), and any Common Lisp necessarily comes with a huge standard library attached.
This depends on the implementation, not the language. With the great majority of implementations this is true, though. LispWorks and Allegro CL have support for this, but I don't know the real size of the executables, because I've never used them. The problem is that you have to carry the runtime along with the application, because if your application uses 'eval, then executing it requires the compiler and the whole runtime. The comparison with 'grep' isn't fair: grep uses the C runtime that comes pre-installed on the system. You don't notice it, but it is there and it is quite big. If you had a pre-installed lisp system, then you could deliver small, fast-starting executables.
> grep uses the C runtime that comes pre-installed on the system.
Quite right. The fact that the OS itself is (usually) written in C means that nearly every OS-using computer has a C runtime.
As an aside, consider executable file sizes in the Windows world, where the OS does not provide its C library to other programs. Many programs in Windows include their own versions of the C library, increasing their sizes.