PG won't be alive in 100 years... and there's a difference between slowly searching for an optimal language and not working on Arc at all.
It is a bit sour for the community if their efforts are not affecting the progress of the language towards its goal.
To be fair, PG has a point though that n months isn't that much if Arc is supposed to be 'finished' in about 30 years from now. We might just have wrongly expected things to move at a somewhat faster pace.
Just in general - is there anything in arc which gives it a big edge for programmers compared to clojure?
Here are some of the differences, but I can't tell if one of them is crucial. I listed three below for which I've no clue what the impact is.
http://clojure.org/lisps
* The read table is currently not accessible to user programs
* Keywords are not Symbols
* Symbols are not storage locations (see Var)
* immutable data and first-class functions
* *first* is clojure's *car* ;)
> Just in general - is there anything in arc which gives it a big edge for programmers compared to clojure?
Mutation. It's one thing to allow functional programming. It's another thing to force it.
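For example, plain Arc lets you destructively update a list anywhere you like (a trivial sketch):
(= xs (list 1 2 3))
(scar xs 99)   ; mutate the first cell in place
xs             ; => (99 2 3)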
The only thing constant is change.
> * The read table is currently not accessible to user programs
Arc doesn't give access to it either, although Anarki does allow redefining 'ssyntax and 'ssexpand, which almost gives you the same thing.
IMO not giving access to the read table is a good thing. There are very subtle problems with this, starting with: the read table affects all code loaded after the readtable modification.
It affects them whether or not the code was written by you, and whether or not it was written with that readtable definition in mind.
This can cause unfortunate library breakage when two libraries try to modify the same readtable entry; the poor user is thus left at the mercy of which library loaded last.
In fact Arc-F has revoked Anarki's feature which allows 'ssyntax and 'ssexpand to be modified; redefine them all you want, Arc-F will still use the four traditional built-in ssyntaxes (: ~ ! .).
HOWEVER, there are currently two reserved context metacommands which will eventually allow ssyntax redefinition at the level of the individual file: 'import-ssyntax and 'interface-ssyntax.
The important thing is that they are context-based, and because they are context-based, they are not global and they will (in general) affect only one loaded file.
> * Keywords are not Symbols
Unimportant - salt to taste.
> * Symbols are not storage locations (see Var)
> * immutable data
Oh, Clojure is not quite completely immutable. Clojure has refs, and they can be mutated within the context of a 'dosync form. Kind of like the Haskell "do" syntax. It's more that Clojure defaults to immutability, and has special syntax to define portions that are imperative.
General computer philosophy.. shouldn't the structures that work well with CPUs be present in the language, and used by the language, but not necessarily accessible to humans, while the structures that work well with humans are the ones that are accessible to humans?
So while Anarki might need to support arrays internally for efficiency, they need not necessarily be part of the language as far as the users can tell.
If the above is true, then right now implementing the things that make the language's communication with the CPU efficient is a lower priority than implementing the things that make communication between the user and the language efficient.
I think we can have the best of both worlds. I would love Arc to have an intelligent compiler that figures out the best representation for your data, but I would also like access to the low-level data types.
Why? Because Lisp is supposed to be a reprogrammable programming language, where the user can do anything the compiler can do. If the user has access to the underlying bits and bytes, then they can implement new representations of high-level structures, rather than relying on the compiler writer. In fact, I would argue that Arc should only provide low-level structures in the axioms, and then high-level ones can be implemented on top.
If I wanted to represent a matrix of objects.... say a world populated by virtual creatures... I might prefer to predefine the size of the world and identify creature locations using numeric coordinates. I might then want to use an array of arrays to represent the world ^^. I might even abstract away the position of an object using a "location" type composed of a pair of numeric coordinates and a reference to the world, then have "north" and "east" etc. functions to get locations in those directions. Then I might get the reference or set the reference of a location object and thus query and/or change the state of the creatures in that world.... ^^
In Arc-F for example I might use:
(using <vector>v2) ; vector and vector-of
(def make-world (xwidth yheight)
  (apply vector
    (w/collect:for i 0 (- xwidth 1)
      (collect:vector-of yheight nil))))
; create a location type
(def location (world x y)
  (annotate 'location
    (list world x y)))
; calling a location object returns whatever is at that spot in the world
(defcall location (r)
  (let (world x y) r
    (world.x y)))
; sref on a location updates the world in place
(defm sref ((t loc location) val)
  (let (world x y) (rep loc)
    (sref world.x val y)))
; 0---> INF
; |
; |
; v
; INF
(def south (loc (o step 1))
  (err "'north and 'south expect a location"))
(defm south ((t loc location) (o step 1))
  (let (world x y) (rep loc)
    (location world x (+ y step))))
(def north (loc (o step 1))
  (south loc (- step)))
(def east (loc (o step 1))
  (err "'east and 'west expect a location"))
(defm east ((t loc location) (o step 1))
  (let (world x y) (rep loc)
    (location world (+ x step) y)))
(def west (loc (o step 1))
  (east loc (- step)))
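Untested, but given the definitions above, usage might look something like this:
(= w (make-world 5 5))     ; a 5x5 world, all nil
(= home (location w 2 2))
(sref home 'wolf)          ; put a creature at (2,2)
(home)                     ; => wolf
((south home))             ; => nil, nothing one square to the south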
I should put some more work into it. The good news is that I just found a nice and simple editor that plays well with arc-f on windows (LispIDE - http://www.daansystems.com/lispide/)
btw, Arc-F's arc.bat doesn't play nice yet with "Program Files"-style directories (paths containing spaces). Works fine in C:\arc-f.
Got this error:
> default-load-handler: cannot open input file: "C:\Program" (The system cannot find the file specified.; errno=2)
We could give only lists to the user, and implement them internally as arrays when they are frequently used with indexed access and as linked lists when they are mainly used with car/cdr. Not an easy thing to implement, though.
I suggested an implementation of lists as unrolled linked lists, which preserves the use of 'scdr, on the Arc Forum a long time ago. http://arclanguage.com/item?id=4146 . Basically it would effectively be arrays.
The problem with the proposed implementation however is that the "cons pointer" is a data structure of at least two cells (a pointer to the root of the underlying array, which includes a pointer to the cdr-table somewhere, and a pointer to the actual entry within the array).
However recently I thought of a method by which all that is necessary would be a (probably tagged) pointer to the actual entry within the array. This would require tagged pointers (i.e. no Boehm-Weiser!).
Basically we would define a tagged pointer type which, when found in a list-as-array, would not be a valid 'car of that entry, but rather would be a pointer to an object containing the real 'car and 'cdr for that object. It might be possible to also use a tagged pointer (potentially the same tag, but say with a null pointer) to denote the end of a list.
Let's call this invalid tag a "NON_CAR_TAG", and let's call the tag for a pointer to an array-as-list-cons-cell a "CONS_ARRAY"
Basically a (list 1 2 3 4) would be represented with the array:
[0] INTEGER(1) // tagged with an INTEGER
[1] INTEGER(2)
[2] INTEGER(3)
[3] INTEGER(4)
[4] NON_CAR_TAG( NULL ) // null pointer
A 'cons object would either be a (potentially nontagged) pointer to a real cons cell, or a CONS_ARRAY() tagged pointer to an entry in an array such as the above.
Now suppose we have the operation (= foo (list 1 2 3 4)). 'foo would then contain a CONS_ARRAY() tagged pointer to the above. Suppose the pointer to the above array is 0xDEADBEE0. 'foo would then be CONS_ARRAY(0xDEADBEE0).
Now suppose that a cell is exactly 16 bytes (just an example for easier reasoning). So (cdr foo) would be CONS_ARRAY(0xDEADBEF0), (cdr:cdr foo) would be CONS_ARRAY(0xDEADBF00), etc. Basically, 'cdr would first check if the input is a CONS_ARRAY() tagged pointer, and just add 0x10 (or the cell size). If the resulting address points to an entry that happens to be NON_CAR_TAG(NULL), it should return a NILOBJ, otherwise it returns the result of the addition.
Now, suppose we then do (scdr (cdr foo) 42). 'scdr should first detect if the pointer it is given is a CONS_ARRAY() tagged pointer. If so, it determines if the value being pointed to is a NON_CAR_TAG() or not. If it's a NON_CAR_TAG(), it gets the underlying cons cell pointed to by the NON_CAR_TAG and modifies that. Otherwise, it allocates a new cons cell, populates it with the existing 'car and the new 'cdr, tags the cons cell pointer with NON_CAR_TAG(), and replaces the entry:
                 +--------> CONS: A INTEGER(2)
                 |                D INTEGER(42)
[0] INTEGER(1)   |
[1] NON_CAR_TAG( * )
[2] INTEGER(3)
[3] INTEGER(4)
[4] NON_CAR_TAG(NULL)
Note however that it has the drawback that l.index is still O(N) T.T
Somewhat annoying, as doing a 'diff' is probably not enough.
It'll also become slightly messy if pg's statement from a month ago comes true and there are new versions of arc coming out in October/November. If we pick up features from different versions of arc, would the changes compared to arc2, as well as the changes compared to arc3, need to be documented?
I guess the value of forcing good documentation of the changes is that it becomes easier for pg to distill a new arc version out of the mutants that the community is spawning, as they'll have their features well documented.
@echo off
set arc_dir=C:\Program Files\ARCF
mzscheme -mf "%arc_dir%\as.scm"
This one worked for me. I'm not sure if removing the []'s from the file and converting it to Windows line breaks mattered, but I at least had to move some quotes around to get it to work.
Now I'm just trying to get the launch-an-arc-script script working (from outside the repl). The script that is elsewhere on the website works on Linux, but not Windows ;). Damn pipes.
This link came from reddit, and the blog post mentions some fun but little-highlighted things in Clojure. It is sort of funny that for the second 'cool' thing the blog post mentions, it also says 'just like in arc'.
I quoted the relevant section of the blog here.
2. Unnamed arguments for short lambdas
Pretty straightforward:
(map #(+ % 4) '(1 2 3))
-> (5 6 7)
;; Multiple arguments.
(map #(* %1 %2) '(1 2 3) '(4 5 6))
-> (4 10 18)
This is roughly equivalent to Arc’s [+ _ 4] form, though it allows for more than one argument. The standard lambda form is also similar to Arc’s:
(map (fn [x] (+ x 4)) '(1 2 3))
-> (5 6 7)
Note that this is currently bugged in Arc-F though. Will fix ^^.
The auto-gensyms thing looks cute. It might be possible to hack something like that, although it would require modifying the axiomatic quasiquote operator.
"t and nil are now boolean types, not symbols. Sorry. It's hard to reason about them with the newfangled type-based methods if they're symbol types instead of booleans."
It's a bit hard for me to understand the consequences of this, and as you seem to be sorry about the change, I assume there are some consequences worth noting. Could you elaborate?
(def car (x)
  (err "can't apply 'car to object" x))
(defm car ((t x cons))
  ...some scheme-side code....)
(defm car ((t x bool))
  (if x
      (err "attempt to scan t")
      nil))
Of course we could probably still retain t and nil as symbols. However the user might want to define iterating over symbols for some arcane purpose (exploratory, exploratory...). If 'nil is a symbol, the user has to specifically check for the nil symbol when overloading 'car for symbols.
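Something like this hypothetical overload, where do-arcane-symbol-scanning is just a made-up stand-in:
; if nil stayed a symbol, overloading 'car for symbols would
; need an explicit special case for the nil symbol
(defm car ((t x sym))
  (if (is x 'nil)
      nil
      (do-arcane-symbol-scanning x)))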
The reason I'm sorry is really because I'm not 100% sure it's the Right Thing (TM), and because I really had to go on with Arc-F instead of dithering over 'nil.
Ah ah ah.... no! The cute thing is that a 'bool presents the "scanner abstraction". Basically, a scanner is anything that overloads the functions 'car, 'cdr, 'scanner, and 'unscan. Thus, you don't have to check for the type of an object: you just need to pass it through 'scanner. If 'scanner throws, it's not a "list". If 'scanner returns a value, you can be sure that it returns a value that will be legitimately passed to 'car and 'cdr.
From this point of view, a "list" is anything that happens to be a scanner.
So nil is a list. So are 'cons cells. So are anything that overloads 'scanner, 'unscan, 'car, and 'cdr.
Try:
(using <lazy-scanner>v1)
...then play around with (lazy-scanner a d)
Or for a bit of ease of use generate an infinite list of positive integers (after doing the 'using thing):
(generate [+ _ 1] 1)
Edit: to summarize: a list is not a cons cell. A cons cell is a list. ^^
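As a rough illustration (untested, and I may be misremembering the exact 'unscan signature), here is what a made-up list type could look like; it only has to overload those four functions:
; intspan: a made-up type representing the integers a..b-1, acting as a list
(def intspan (a b)
  (annotate 'intspan (cons a b)))
(defm car ((t x intspan))
  (car (rep x)))
(defm cdr ((t x intspan))
  (let (a . b) (rep x)
    (if (< (+ a 1) b) (intspan (+ a 1) b) nil)))
(defm scanner ((t x intspan))
  (let (a . b) (rep x)
    (if (< a b) x nil)))    ; an empty span scans as nil
(defm unscan ((t x intspan) scan)
  scan)    ; sketch: mapping over an intspan just yields an ordinary list
With those in place, generic code like (map [+ _ 10] (intspan 0 3)) should be able to treat an intspan like any other list.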
Ah...but...what if you want to define a generic function that operates differently on lists and bools (i.e. not a scanner, but a general generic function)? I haven't had a close look at Arc-3F yet, so maybe I need to play around a bit more to understand what you're saying :)
Well, a "list" is a "scanner". So your "not a scanner" doesn't make sense, at least from the point of view of Arc-F.
However if you mean "list" as in sequence of cons cells:
(def works-on-cons-cells-and-bools (x)
  (err "this works only on cons cells and bools!"))
(defm works-on-cons-cells-and-bools ((t x cons))
  (work-on-cons-cells x))
(defm works-on-cons-cells-and-bools ((t x bool))
  (work-on-bool x))
Note that you can even define a unifying "type class" function which ensures that the given data is a cons cell or a bool, or is convertible to one (i.e. an analog to 'scanner). For example, you might want a "hooper" type class:
(def hooper (x)
  (err "Not convertible to a bool or cons cell" x))
(defm hooper ((t x cons))
  x)
(defm hooper ((t x bool))
  x)
Then you can convert works-on-cons-cells-and-bools with the type class:
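(a sketch of what that might look like; the base case simply retries through 'hooper:)
(def works-on-cons-cells-and-bools (x)
  (works-on-cons-cells-and-bools (hooper x)))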
Then, someone can make a type which supports the "hooper" type class by either overloading hooper (and returning a true hooper), or overloading hooper and works-on-cons-cells-and-bools:
choice one:
(defm hooper ((t x my-type))
  (convert-my-type-to-cons x))
choice two:
(defm hooper ((t x my-type))
  x)
(defm works-on-cons-cells-and-bools ((t x my-type))
  (work-on-my-type x))
in reply to the summary: nice.. some day I'll have to look into the belly of the beast. So Arc3F no longer has lists built up from conses? Or are lists built from conses, which are a special kind of list?
Well, I prefer to think of cons-as-lists as one implementation of lists. It's possible to define alternative implementations of lists; all that is necessary is to define the overloads for 'car, 'cdr, 'scanner, and 'unscan. With generic functions in Arc-F, that is enough to iterate, cut, search, filter, join, map, and more on any list, regardless of whether it's made of 'cons cells or 'lazy-scanner objects.