Both nil and (len str) work. I see both sides of the argument, since counting from -1 eliminates the 0 corner case.
I like indices that are intuitive with literal numbers. Counting from 0 at one end and from 1 at the other is jarring. When -1 points to the end of the string rather than before the last char, (cut str 0 (- (len str))) returns the first char instead of the empty string.
With -1 -> before last char:

  (def chop ((o str "abcdef"))
    (pr "Chop how many chars off the end of \"" str "\"? ")
    (= n (coerce (cut (readline) 1) 'int))  ; bug: readline prepends #\newline
    (prn "Chopped: \"" (if (is n 0) str (cut str 0 (- n))) "\""))  ; handle the n = 0 corner case
With -1 -> end of string:

  (def chop ((o str "abcdef"))
    (pr "Chop how many chars off the end of \"" str "\"? ")
    (= n (coerce (cut (readline) 1) 'int))  ; bug: readline prepends #\newline
    (prn "Chopped: \"" (cut str 0 (- -1 n)) "\""))  ; no corner case, but there's this -1 there
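Concretely, with the default "abcdef", my reading of the two conventions gives:

  ; -1 -> before last char:  (cut "abcdef" 0 -2) => "abcd"
  ;                          (cut "abcdef" 0 0)  => ""         <- the n = 0 corner case
  ; -1 -> end of string:     (cut "abcdef" 0 -3) => "abcd"
  ;                          (cut "abcdef" 0 -1) => "abcdef"      no corner case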
I probably made a stronger argument for -1 pointing to the end of the string, as it leads to shorter code.
Precedence levels and such are a known problem with a known solution; there are fairly easy algorithms out there that deal with this. The only problem is that, because you lack typing, you could end up with things that are not parsable.
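For reference, here is a sketch of one such algorithm (precedence climbing) over an already-tokenized list; ops*, parse-expr and parse-rhs are names I'm making up, and this ignores the typing problem above:

  (= ops* '((+ 1) (- 1) (* 2) (/ 2)))   ; operator -> precedence

  ; Returns (cons parse-tree remaining-tokens).
  (def parse-expr (toks (o minprec 1))
    (parse-rhs (car toks) (cdr toks) minprec))

  (def parse-rhs (lhs toks minprec)
    (let prec (and toks (alref ops* (car toks)))
      (if (and prec (>= prec minprec))
          ; consume the operator, parse a tighter right-hand side, loop
          (let sub (parse-expr (cdr toks) (+ prec 1))
            (parse-rhs (list (car toks) lhs (car sub)) (cdr sub) minprec))
          (cons lhs toks))))

  (car (parse-expr '(1 + 2 * 3 - 4)))   ; => (- (+ 1 (* 2 3)) 4)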
Fixed the hanging. Was kind of a fun fix, too: I manually CPS'd the loading of the libraries, then inserted a call to window.setTimeout after each iteration, which gives the browser's event loop a chance to run.
I keep getting "fatal: sha1 file '<stdout>' write error (Bad file descriptor)" when I try to git-push to the wiki repository, so if someone for whom git works wants to grab the tarball and import it, feel free.
This was introduced in the git version of Arc by a few users in #arc; it is not part of the original language.
Let's try to keep all this information in one post; that was the idea behind my original poll: to collect it all in one place. This just fragments it again.
1) proper phase-distinction between compilation and evaluation
- not possible with the way that macros are currently done
2) if not (1), then there's no excuse for not making macros first-class
- see how in: http://arclanguage.com/item?id=842
3) modules (like everyone :)
4) Extend the concept of applying data as a function to all data types, by making it programmable in Arc (see the sketch after this list)
- Currently you can do ((list 1 2 3) 0) and you'll get 1.
- Why not allow this for all data types by having a callback into Arc, so that Arc code can switch on the type?
5) Hygienic macros
6) Removal of 'nil
- 'nil does nothing more than '(), and it often breaks things: e.g. when printing cyclic structures, or because not all '()s get replaced by 'nil.
- Seriously, what's the need for 'nil? Just use '()
7) That's it for now; Unicode if you want it, but bleh
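For (4), a rough sketch of what that callback could look like; call* and defcall are invented names, and the compiler would have to be changed to consult call* whenever a non-function is applied:

  (= call* (table))   ; type -> function to run when such a value is applied

  (mac defcall (type args . body)
    `(= (call* ',type) (fn ,args ,@body)))

  ; e.g. make values tagged 'point callable as field accessors:
  (defcall point (p field)
    (case field
      x (car (rep p))
      y (cadr (rep p))))

  (let p (annotate 'point '(3 4))
    ((call* 'point) p 'x))   ; => 3; with compiler support, (p 'x) itself would work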
In general, I think the focus on conciseness, while a good thing, should not be the sole purpose. I think pg is slightly misguided in focusing solely on that. If your language is powerful enough, you can get conciseness through libraries. Small changes like fn and the way if works are easily done with some macrology.
4 is crucial in my mind. You cannot let the language have more power than the programmer. Also, it lets you build more consistent programs instead of needing your own gets and sets all over the place. This would be a huge win in decreasing the tokens a programmer has to remember and grok when reading a new program.
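For instance, a conventional cond-style form is just a few lines of macro on top of Arc's if; mycond is an illustrative name, not anything in arc.arc:

  (mac mycond clauses
    (if (no clauses)
        nil
        `(if ,(car (car clauses))
             (do ,@(cdr (car clauses)))
             (mycond ,@(cdr clauses)))))

  (mycond ((is 1 2) 'nope)
          (t        'yep))   ; => yep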
I really like #4, but what other data types would it support? Arc doesn't currently have anything other than lists, hashtables, and primitives. Unless you mean specifying call and set operations for tagged data types...
For instance, if you look at arc.arc, you'll see the first two macros are defined without using the macro mac. Instead they use
  (set foo (annotate 'mac some-function))
which sets the type to 'mac (i.e. a macro :)
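So you can build a macro by hand the same way; for example (ifnot is just a made-up name):

  (set ifnot (annotate 'mac
               (fn (test then else)
                 `(if ,test ,else ,then))))

  (ifnot (is 1 2) 'unequal 'equal)   ; => unequal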
So yes, I guess I do mean tagged data types, but also any possible future 'native' types that Arc comes to support. Or even redefining it within the context of your application.
The way I solved this in the Wiki was to give tagged values (except macros) the same calling semantics as their non-tagged equivalents. Thus, you can represent arbitrary datatypes as functions, tagging them to keep track of their type but still calling them.
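For example (make-point is a made-up name; this assumes the modified calling semantics just described, where applying a tagged fn falls through to the fn inside):

  (def make-point (x y)
    (annotate 'point
      (fn (field)
        (case field
          x x
          y y))))

  (let p (make-point 3 4)
    (list (type p) (p 'x) (p 'y)))   ; => (point 3 4)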
a) The ability to define within Arc what happens when a list is used as a function, instead of it being predefined in Scheme.
b) The ability to do this for any type, including custom types.
And potentially:
c) For annotated values to be able to do this per object, by storing an extra field that holds the getter, where the default is the one for the type.
   E.g.: (annotate 'blabla value getter)
   If there is no getter, then use the one registered for type 'blabla.
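A rough sketch of the dispatch side, with invented names; callers* holds the per-type getters, apply-obj is what application would fall back to for a non-function, and for (c) a per-object getter stored by annotate would be tried first:

  (= callers* (table))                                 ; type -> getter, points (a)/(b)

  (= (callers* 'cons)   (fn (xs i)  (xs i)))           ; lists keep today's behaviour
  (= (callers* 'matrix) (fn (m i j) (((rep m) i) j)))  ; a custom tagged type

  (def apply-obj (x . args)
    (apply (callers* (type x)) (cons x args)))

  (apply-obj (annotate 'matrix '((1 2) (3 4))) 1 0)    ; => 3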