1 point by aw 5137 days ago | link | parent

I guess I don't understand the problem then. I thought by your example you were concerned that someone might overwrite ssyntax-rules* with invalid data.

So why wouldn't it work if I overwrote ssyntax-rules* with valid data?

(I.e., by analogy, it's no problem to redefine a function in Arc as long as I give it a working function).



1 point by Pauan 5137 days ago | link

Because my idea was that ssyntax-rules* would be a special data type (rather than an ordinary alist). This would let me give it a hook, so I can know whenever somebody gets or sets it. With those hooks, I could easily implement optimizations that could potentially speed things up a lot. Without those hooks, I might not be able to apply those optimizations.
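
(Roughly, the idea in Python, as a made-up sketch rather than the actual PyArc code:)

  # A hooked wrapper: the interpreter stores ssyntax-rules* in one of
  # these, and on_change fires after every mutation so the internal
  # table can be rebuilt.
  class HookedAlist:
      def __init__(self, pairs, on_change):
          self.pairs = list(pairs)
          self.on_change = on_change

      def set(self, key, value):
          for i, (k, _) in enumerate(self.pairs):
              if k == key:
                  self.pairs[i] = (key, value)
                  break
          else:
              self.pairs.append((key, value))
          self.on_change(self.pairs)   # notify the interpreter

  # usage: rules = HookedAlist(default_rules, rebuild_internal_table)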

But if you overwrite the global variable, the new value (an ordinary alist) would not have those special hooks, which means the Arc interpreter is no longer notified about changes, and thus, the internal table is never updated.

Like I said, it's an implementation detail. The problem is that if I decide to do it like that, this particular implementation detail is visible to the user in a nasty way. Thus, I'm wary about doing it like that. I guess I should wait until it's actually implemented, then profile to see if it's a big performance hit or not.

-----

1 point by aw 5137 days ago | link

Oh, I see. Here's a thought: another place to check whether ssyntax-rules* has changed is at the start of reading or evaling an expression.

The tradeoff there is that we'd no longer be able to change ssyntax and use the new ssyntax in the same expression. That is, this would work:

  (make-changes-to-ssyntax)
  use->new%@ssyntax
but this wouldn't:

  (do (make-changes-to-ssyntax)
      use->new%@ssyntax)

-----

1 point by Pauan 5137 days ago | link

Yeah, I had the same idea: only checking ssyntax-rules* at the start of each top-level form. That's a reasonable tradeoff too, but since I don't expect the rules to change frequently, checking every top-level form is still far less efficient than reacting only when an assignment actually happens.

As hacky and clunky as it may be, it looks like hacking eval would be my best bet, as far as efficiency goes (while still behaving as the user expects). I might be able to generalize it somewhat, in case I want to add hooks to other stuff later.

The only major disadvantage is that it works based on the name, not the value. Still better than the alternative, though. Actually, come to think of it, working based on the name is what the user expects, so that should be just fine.

-----

1 point by rocketnia 5137 days ago | link

If at some point you say this in Arc...

  (def foo ()
    (look-up-something-in ssyntax-rules*))
...then (foo) will always use the most recent version of 'ssyntax-rules* , regardless of how many times it's been reassigned since the definition of 'foo.
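
(The same late binding in Python, just to pin the behavior down; a toy example:)

  # foo reads the global when it's called, not when it's defined, so
  # every reassignment of ssyntax_rules is visible to later calls.
  ssyntax_rules = [("a", 1)]

  def foo():
      return ssyntax_rules

  foo()                        # -> [("a", 1)]
  ssyntax_rules = [("b", 2)]
  foo()                        # -> [("b", 2)]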

This is one reason it's challenging to do modules in Arc. You have to construct 'foo so that its references to 'look-up-something-in and 'ssyntax-rules* refer to the meanings that exist at the time 'foo is defined (already done in Arc, but something to consider as you do things from scratch), and then you have to somehow make the same name have different meanings in different contexts (hard to do on top of official Arc, thanks to Racket's opaque namespaces, but very possible to do from scratch).

I think the least surprising thing to do would be to have your core functionality work the same way as 'foo does in Arc: It would look up the value of 'ssyntax-rules* , and that particular binding (meaning) would be a fundamental constant of the language. Ultimately, that's the same thing as hardcoding the name, but with an extra layer of indirection to account for modules.

-----

1 point by Pauan 5137 days ago | link

I'm not sure what you mean. Do you mean that it would always refer to the most recent version of ssyntax-rules* in the current module? Isn't that expected behavior? If you mean it would refer to the most recent version defined in any module, then that shouldn't be an issue, since they are/should be isolated.

My eval hack is only about updating the internal rule table when somebody changes ssyntax-rules*. This avoids checking it too frequently, potentially making it run faster. So, the following should work fine:

  (= foo ssyntax-rules*)
  (= ssyntax-rules* nil)
  
  foo            -> previous ssyntax rules
  ssyntax-rules* -> nil

  (= ssyntax-rules* foo) -> works fine again
And yes, I plan to have lexical scope in PyArc. Each function has its own environment, and a reference to the outer environment, so it can walk up the chain until it either finds the value or realizes it's undefined.
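
(A simplified sketch of that lookup in Python; not the real PyArc classes:)

  class Env:
      def __init__(self, parent=None):
          self.vars = {}        # this scope's bindings
          self.parent = parent  # reference to the outer environment

      def lookup(self, name):
          env = self
          while env is not None:       # walk up the chain
              if name in env.vars:
                  return env.vars[name]
              env = env.parent
          raise NameError("undefined: " + name)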

There's a bug with it right now, but I should be able to fix it. The environments get created properly, but when doing the actual lookup, it thinks the value is undefined when it isn't. That's pretty normal, since the interpreter is still missing lots of functionality.

-----

1 point by rocketnia 5137 days ago | link

"I'm not sure what you mean. Do you mean that it would always refer to the most recent version of syntax-rules * in the current module? Isn't that expected behavior? If you mean it would refer to the most recent version defined in any module, then that shouldn't be an issue, since they are/should be isolated."

The core is a module, right? At least in concept? And shouldn't you be able to modify something in another module when you need to, to fix bugs etc.? (Not that you'd be able to fix bugs in the core, necessarily. :-p )

---

"There's a bug with it right now, but I should be able to fix it. The environments get created properly, but when doing the actual lookup, it thinks the value is undefined when it isn't. That's pretty normal, since the interpreter is still missing lots of functionality."

It sounds almost like you shallowly copy the outer scope when you create a local scope, which means you miss out on later definitions made in the outer scope. Is this right? It's a pretty wild guess, and even if it is right, then I'm interested to know how you're shallowly copying arbitrary environment types. :-p
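
(In Python terms, the failure mode I'm guessing at would look like this:)

  outer = {"x": 1}
  local = dict(outer)   # shallow snapshot taken at creation time
  outer["y"] = 2        # a later definition in the outer scope
  "y" in local          # -> False: the local scope never sees it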

(EDIT: Oh, you've fixed this already. I'm just late to the game.)

-----

1 point by Pauan 5137 days ago | link

Yes and yes. But there's supposed to be a clean separation between modules. If you assign to a variable, it assigns it in your module. If you want to assign to a different module, you need to be explicit, like so:

  (import foo "foo.arc")
  (= foo!bar 'qux)
So in order to assign to the core module, you would need something like built-ins* to access it:

  built-ins*!ssyntax-rules*  ; modify ssyntax globally
Assuming I choose the name built-ins* of course.

"then I'm interested to know how you're shallowly copying arbitrary environment types. :-p"

Environments are just wrappers around Python's dict type, so I can use the copy() method to make a shallow copy. I might need to make my own custom copy function later, though. Arc tables are just annotated dicts, by the way. But Python lets me modify how classes behave, so for instance I should be able to get cons cells to work in a dict, even though they're mutable (and thus not normally allowed).
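
(For instance, something like this, hashing by identity so mutation doesn't break the dict; an untested sketch:)

  class Cons:
      def __init__(self, car, cdr):
          self.car, self.cdr = car, cdr

      def __hash__(self):
          return id(self)        # identity hash, stable under mutation

      def __eq__(self, other):
          return self is other

  d = {Cons(1, 2): "ok"}         # a mutable cons cell as a dict key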

-----

1 point by rocketnia 5137 days ago | link

"If you want to assign to a different module, you need to be explicit, like so:"

Ah, I don't know why I didn't see that. ^_^

---

"Environments are just wrappers around Python's dict type, so I can use the copy() method to make a shallow copy. I might need to make my own custom copy function later, though."

I was harking back to "What is an environment? Anything that supports get/set on key/value pairs." Would it also need to support a copy function?

-----

1 point by Pauan 5137 days ago | link

Hm... it shouldn't, no. Only global_env is copied, so Arc types wouldn't need to worry about copying. In the hypothetical case that I would need to copy them, I could coerce them into a table and then do a copy on that. This could all be done transparently, so Arc code wouldn't need to do anything.
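
(Hypothetically, something like:)

  def copy_env(env):
      table = coerce_to_table(env)   # hypothetical coercion to a plain table
      return dict(table)             # then an ordinary shallow copy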

global_env is treated specially because it's the base for all other modules: it contains the built-in functions that modules cannot create (at least not in Arc), and also the core functions in arc.arc for convenience.

There are other strategies I could use as well, like special-casing (assign) or similar, but I figured a shallow copy would be easiest.

-----

1 point by Pauan 5136 days ago | link

"...and then you have to somehow make the same name have different meanings in different contexts (hard to do on top of official Arc, thanks to Racket's opaque namespaces, but very possible to do from scratch)."

I'm curious, how are Racket's namespaces opaque?

-----

1 point by rocketnia 5136 days ago | link

That's a good question. It's probably unfair to Racket to just call the namespaces opaque without an explanation. In fact, I'm not sure there isn't a way to do what I want to do with Racket namespaces, and even if there weren't, there might be a way to configure Racket code to use a completely different kind of name resolution.

With a Racket namespace, you can get and set the values of variables, you can reset variables to their undefined state, and you can get the list of variables, so it's very much like a hash table. But suppose I want to have two namespaces which share some variable bindings. I may have been able to get and set the values of variables, but it turns out I can't manipulate the bindings they're (in my mind) stored in. I can't create two namespaces which share a binding.

Fortunately, Racket namespaces can inherit from modules, which sounds like pretty much exactly what I want, and there's also the "racket/package" module, which may or may not act as an alternative (for all I know). However, you can't just create modules or packages dynamically; you have to define them with particular names and then get them back out of those names somehow. I haven't quite been able to get anything nifty to work, maybe because of some issue with phase levels.

-----

1 point by Pauan 5136 days ago | link

So... you want two different namespaces that inherit from the same namespace? Easy in PyArc (or Python):

  (= shared (new-namespace))
  (new-namespace shared)
  (new-namespace shared)
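
(And roughly the same thing in plain Python, using a dict subclass that falls back to a shared parent:)

  class NS(dict):
      def __init__(self, parent):
          self.parent = parent
      def __missing__(self, key):
          return self.parent[key]   # fall back to the shared namespace

  shared = {"x": 1}
  ns1, ns2 = NS(shared), NS(shared)
  ns1["x"], ns2["x"]                # -> (1, 1): both see the shared binding
  shared["x"] = 2
  ns1["x"]                          # -> 2: changes propagate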

-----

1 point by rocketnia 5136 days ago | link

I did say "very possible to do from scratch." PyArc's behavior doesn't help in pg's Arc. ^^;

-----

1 point by Pauan 5136 days ago | link

Yeah, that's true. It's one of the (few?) benefits of writing the interpreter from scratch, rather than piggybacking on an existing implementation.

-----

2 points by rocketnia 5136 days ago | link

The less an interpreter piggybacks, the less will be left over to implement when porting the interpreter to another platform. That benefit's enough by itself, IMO. ^_^

But then I tend to care more about the work I'll need to do in the future than the work I do today. When my priorities are the other way around, like for fun-size DSLs that'll hopefully never need porting, being able to piggyback is nice.

-----

1 point by aw 5137 days ago | link

"The only major disadvantage is that it works based on the name, not the value."

Is this because you're still thinking in terms of creating a new data type that would notify you of changes?

My thought was that at the beginning of read, or at the beginning of eval (whichever worked better), it would check whether ssyntax-rules* had the same contents as the last time it looked.

Now you don't have any special data types or magic going on, just ordinary variables and values.
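
(Roughly, in Python; the rebuild step is a hypothetical name:)

  import copy

  last_rules = None

  def check_rules(global_env):
      global last_rules
      rules = global_env.get("ssyntax-rules*")
      if rules != last_rules:              # contents changed since last look
          last_rules = copy.deepcopy(rules)
          rebuild_table(rules)             # hypothetical internal rebuild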

-----

1 point by Pauan 5137 days ago | link

My idea for hacking eval was that for (assign), it would check if it was assigning to ssyntax-rules* and if so, do something. In other words, implement the hook in eval rather than as a special data type.

That avoids the issue, because it works based on the name rather than the value. If I implemented ssyntax-rules* as a special data type, it would work based on the value, rather than the name. But this way, it works no matter what is assigned to ssyntax-rules*.
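
(As a Python sketch, with a hypothetical rebuild step:)

  def eval_assign(name, value, env):
      env[name] = value             # ordinary assignment
      if name == "ssyntax-rules*":  # the hook is keyed on the name
          rebuild_table(value)      # hypothetical internal-table rebuild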

P.S. Checking ssyntax-rules* at the beginning of each eval sounds dreadfully slow. Eval is called a lot, after all.

-----

1 point by aw 5137 days ago | link

"Checking ssyntax-rules* at the beginning of each eval sounds dreadfully slow."

Maybe. But then, eval invokes the Arc compiler, and you may find that comparing two lists is negligible in comparison.

-----

1 point by Pauan 5137 days ago | link

I don't have an Arc compiler. Everything is evaled in my interpreter, for better or worse. It can be made fast later, I just want to get it working right now.

-----

1 point by rocketnia 5137 days ago | link

It's not just a matter of performance. That's the kind of semantic difference that'll make it really hard for people with existing Arc code to use their code on your interpreter; in a sense, it won't actually "work" in the sense of being a working Arc implementation. Are you concerned about that?

No need to be; it could very well become a language that's better than Arc (and if modules are in the core, that's definitely a possibility ^_- ).

-----

1 point by Pauan 5137 days ago | link

I'm not terribly concerned about existing code. I would like existing code to work, but it's not a huge priority as in "it has to work no matter what." I've even toyed around with the idea of changing the default ssyntax, which would definitely break code that relies on Arc's current ssyntax.

I'm curious, though, what's the semantic difference? In PyArc, when eval sees a function call, it calls apply on it. Apply then calls eval on all of the function's arguments, and then calls the function. How does MzArc differ significantly in that regard?
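
(The shape of it in Python; a toy sketch, not the actual PyArc source, with calls represented as lists:)

  def eval_(expr, env):
      if isinstance(expr, str):                 # a variable reference
          return env[expr]
      if isinstance(expr, list):                # a function call
          f = eval_(expr[0], env)
          return apply_(f, expr[1:], env)
      return expr                               # self-evaluating literal

  def apply_(f, args, env):
      return f(*[eval_(a, env) for a in args])  # eval the args, then call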

-----

1 point by rocketnia 5137 days ago | link

In MzArc (RacketArc?), 'eval is only called once per top-level command (unless it's called explicitly). If you define a function that uses a macro and then redefine the macro, it's too late; the function's code has already been expanded.

I rely on this behavior in Lathe, where I reduce the surface area of my libraries by defining only a few global variables to be macros that act as namespaces, and then replacing those global variables with their old values when I'm done. In an Arc that expands macros at run time, like old versions of Jarc, these namespace macros just don't work at all, since the variables' meanings change before they're used.

I also rely on it in my Penknife reimplementation. I mentioned (http://arclanguage.org/item?id=14004) I was using a DSL to help me in my pursuit to define a function to crank out new Penknife core environments from scratch. The DSL is based on a macro 'pkcb, short for "Penknife core binding," such that pkcb.binding-get expands to something like gs1234-core-binding-get and also pushes the name "binding-get" onto a global list of dependencies.

There's also a global list of Arc expressions to execute as part of the core constructor function. Adding to the list of expressions invalidates the core constructor function.

Whenever you call the core constructor, if it's invalid, it redefines itself (actually, it redefines a helper function) by evaluating code that has a big (with ...) inside to bind all those 'pkcb variables. But at that point I don't actually have a clue what 'pkcb variables to put in the (with ...). Fortunately, thanks to Arc's macro semantics, the act of defining triggers the 'pkcb macro forms, and it builds a list telling me exactly what I need to know. Then I define the core constructor again using that list in the (with ...).

I do all this just to save a little hash-table lookup, which I could have done with (or= pkcb!binding-get (pk-make-binding)). :-p If that's not important to you, I'm not surprised. It's hardly important to me, especially since it's a hash table of constant size; I mainly just like the thought of constructing the bindings eagerly and ditching the hash table altogether when I'm done.

So that example falls flat, and since you're taking care of namespaces, my other example falls flat too. I've at least established the semantic difference, I hope.

-----

1 point by Pauan 5137 days ago | link

Actually, PyArc expands macros at read-time and run-time, as needed. You can even use (apply) on macros.

If that's what aw meant by "Arc compiler", then I misunderstood. I took it to mean "compiling Arc code into a different language, such as assembly/C/whatever".

Given what aw said, though, eval doesn't invoke the Arc compiler... once read-time is over, the data structure is in memory so eval can just use that; it doesn't need to do any more parsing.

So, I'm not sure what aw meant by "eval invoking the Arc compiler."

-----

1 point by rocketnia 5137 days ago | link

"Actually, PyArc expands macros at read-time and run-time, as needed."

Ah, okay. The semantic differences may not exist then. ^_^

In ac.scm, the eponymous function 'ac takes care of turning an Arc expression into a Racket expression. Arc's 'eval invokes 'ac and then calls Racket's 'eval on the result, so it is actually a matter of compiling Arc into another language. As for PyArc, in some sense you are compiling the code into your own internal representation format, so it may still make sense to call it a compile phase. The word compile is just too general. XD

-----

1 point by Pauan 5137 days ago | link

Then in that case, I think my previous answer is correct: PyArc does not have an Arc compiler, only an interpreter.

What you referred to as compile phase I refer to as read-time, since I think that's more specific. In any case, everything passes through read/eval, which is actually exposed to Arc, unlike MzArc where eval is just a wrapper.

Thus, implementing an additional check in eval could add up, if I'm not careful. So I think I'll add a hook to (assign) instead, that way it can be really fast.

-----

1 point by rocketnia 5137 days ago | link

"What you referred to as compile phase I refer to as read-time, since I think that's more specific."

I don't know about calling it read-time. Arc being homoiconic and all, there's a phase where a value is read into memory from a character stream, and there's a phase where a value is converted into an executable format, expanding all its macro forms (and in traditional Arcs, expanding ssyntax).

Read time is kinda taken, unless (read) really does expand macros. If compile time isn't specific enough for the other phase--and I agree it isn't--how about calling it expansion time?

---

"In any case, everything passes through read/eval, which is actually exposed to Arc, unlike MzArc where eval is just a wrapper."

I don't know what consequences to expect from that, but I like it. It seems very appropriate for an interpreter. ^^

---

"Thus, implementing an additional check in eval could add up, if I'm not careful."

I see what you mean.

-----

1 point by Pauan 5137 days ago | link

Read actually does expand macros. At least for now. In other words, it does the "read characters into memory" and "convert into executable format" phases at the same time.

Calling it expansion time isn't really correct because it doesn't always expand; sometimes it just takes data like 10 or (foo bar) and converts it into the equivalent data structure.

---

"I don't know what consequences to expect from that, but I like it. It seems very appropriate for an interpreter. ^^"

Well, it means you can overwrite them in Arc, possibly to implement defcall or similar, as opposed to in MzArc where they're hidden.

-----

1 point by shader 5137 days ago | link

Maybe we need an official jargon file for Arc, given that we're having a lot of confusion over things like "compile time", what to call pg's original Arc, etc.

I realize that there was an attempt at starting a wiki, but it requires permission to edit and is somewhat different from most wikis I'm familiar with. Maybe a more open platform would be more accessible, such as Wikia? I would quickly set one up there if anyone else is interested (and aw isn't too offended ;)

-----

2 points by aw 5137 days ago | link

Well, http://sites.google.com/site/arclanguagewiki/ is up, running, works, and is available for editing by anyone who wants to edit it, and I've been planning to continue contributing to it myself. So... certainly anyone can post material anywhere they want, including on a different wiki platform if they like it better, and if it's something useful to Arc we'll include links to it, just as we link to other useful material, whether it's published in a wiki or a blog post or whatever... but I personally would want a more definite reason before posting somewhere else myself.

Is there something specific you don't like about Google Sites?

Some of the things I was looking for:

- no markdown. I despise markdown syntax (when writing about code), with its myriad special syntaxes that I can never figure out how to escape.

- a way to plug in code examples from my upcoming "pastebin for examples" site, which I hope I'll be able to create a Google gadget to do.

- a clean layout without a lot of junk gunking up the page

- simple and easy to use

-----

1 point by Pauan 5136 days ago | link

I propose pgArc, personally. Short, simple, and unambiguous.

-----

2 points by shader 5136 days ago | link

Traditionally we've used "vanilla" to refer to the main branch, but then as a community we don't really care much for tradition anyway :P

-----

1 point by aw 5137 days ago | link

What's the semantic difference?

-----