The source of this problem is not only in chaining itself: redefining standard functions will almost always lead to problems when distinct libraries redefine the same function. That's why other languages have module systems. Chaining also brings a second problem: performance. The longer the chain, the more function calls are needed to reach the original behavior.
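To make the chaining cost concrete, here is a small Common Lisp sketch (MY-CAR and the two "libraries" are invented for illustration): each library wraps the previous definition, so a call has to walk the whole chain of wrappers before it reaches the original behavior.
(defun my-car (x) (car x))

;; "Library A" wraps the existing definition to handle strings.
(let ((old #'my-car))
  (defun my-car (x)
    (if (stringp x) (char x 0) (funcall old x))))

;; "Library B" wraps A's definition to handle (non-string) vectors.
(let ((old #'my-car))
  (defun my-car (x)
    (if (and (vectorp x) (not (stringp x))) (aref x 0) (funcall old x))))

;; (my-car '(1 2 3)) now goes through B's wrapper, then A's wrapper,
;; before finally reaching the original CAR.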
Were you thinking about multimethod dispatch such as that of CLOS?
> The source of this problem is not only in chaining itself: redefining standard functions will almost always lead to problems when distinct libraries redefine the same function.
But this should be supported. For instance, in C++, you can redefine the "standard function" operator+ :
struct Foo { int n; };
Foo operator+(Foo a, Foo b){ return Foo{a.n + b.n}; }  // user-defined "+" for Foo

Foo something(Foo a, Foo b){
return a + b;  // calls the operator+ defined above
}
Or in Ruby, you can redefine the "standard function" each:
class Foo
  def each; yield 1; end  # Foo's own "each"
end

a = Foo.new
a.each{ |i| puts i }
C++ does it by strict static typing, while Ruby does it by attaching to the object. Neither way feels very Arcish; what is a better Arc solution?
> Were you thinking about multimethod dispatch such as that of CLOS?
The CLOS solution is pretty similar to the Ruby solution: it checks the types passed to the function and dispatches to the correct method. The big difference is that CLOS looks at all the arguments, while Ruby looks only at the first argument (the object). The CLOS approach seems to me the best way to solve the problem, but I think that applying it to functions as basic as car would kill performance without a really good optimizer.
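For reference, a small Common Lisp sketch of dispatching on more than one argument (COMBINE and its methods are invented names, just to show the mechanism):
(defgeneric combine (a b))

;; CLOS picks the method by looking at the classes of *all* required arguments.
(defmethod combine ((a number) (b number)) (+ a b))
(defmethod combine ((a string) (b string)) (concatenate 'string a b))
(defmethod combine ((a string) (b number)) (format nil "~a~a" a b))

(combine 1 2)      ; => 3
(combine "x" "y")  ; => "xy"
(combine "x" 2)    ; => "x2"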
> The CLOS solution is pretty similar to the Ruby solution: it checks the types passed to the function and dispatches to the correct method. The big difference is that CLOS looks at all the arguments, while Ruby looks only at the first argument (the object)
IMO the difference is big enough for CLOS Way != Ruby Way. ^^
Still, I wonder - how does CLOS implement this? How about for variadic functions?
> but I think that applying it to functions as basic as car would kill performance without a really good optimizer.
Hmm. I've been trying to grok through dynamic dispatching speedup techniques - the Sun Self papers are pretty good; Sun's index doesn't have the papers themselves, but you can ask Google for them.
I think that for every method with the same name a hash table indexed on the types of the arguments is created, and when the method is called, a lookup is made on that table.
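Something like this toy Common Lisp sketch of the table idea (MY+ and TYPE-KEY are invented names; real CLOS implementations are far more sophisticated than this):
;; One table per generic function, keyed on the argument types.
(defun type-key (x)
  (typecase x (integer 'integer) (string 'string) (t 'other)))

(defparameter *my+-methods* (make-hash-table :test #'equal))
(setf (gethash '(integer integer) *my+-methods*) (lambda (a b) (+ a b)))
(setf (gethash '(string string) *my+-methods*)
      (lambda (a b) (concatenate 'string a b)))

(defun my+ (a b)
  ;; Every call pays for building the key plus a table lookup.
  (funcall (gethash (list (type-key a) (type-key b)) *my+-methods*) a b))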
> Hmm. I've been trying to grok through dynamic dispatching speedup techniques - the Sun Self papers are pretty good; Sun's index doesn't have the papers themselves, but you can ask Google for them.
I've found those papers in the past (a few weeks ago), but I've never found the will to read them: they are written for people already in the compiler world, and I'm still learning the basics about compilers.
BTW, a good type inferencer + a good function inliner could solve the problem, but I wouldn't even know where to start to implement them :(.
Use a "Polymorphic Inline Cache" (PIC). Basically if a piece of code could call several different methods, we figure out which one it is, then we create a copy of the calling function which does the type checking at the top and specialize all method calls to that type:
> Still, I wonder - how does CLOS implement this? How about for variadic functions?
I don't think CLOS lets you check types on &rest, &optional, or &key parameters. So you couldn't use CLOS for the current behavior of '+.
Also note that CLOS only works on methods with "congruent lambda lists", that is, methods with the same number of required, optional, and keyword arguments. So you can't have, say, two methods on the same generic function that take different numbers of required arguments.
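A quick Common Lisp illustration (SCALE is an invented name):
(defgeneric scale (x factor))
(defmethod scale ((x number) (factor number)) (* x factor))

;; Rejected: this lambda list has a different number of required
;; parameters than the generic function's, so it is not congruent.
(defmethod scale ((x number)) (* x 2))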
ah, I see. So I suppose this greatly simplifies things then.
Hmm. This certainly seems easier to hack into arc. We could have each method start with a generic function whose lambda list we have to match.
As an aside, currently the base of Arc lambda lists is simply &rest parameters (i.e. optional parameters are converted into rest parameters with destructuring). Should we match on the plain rest parameters or should we properly support optionals?