Arc Forum | antiismist's comments

Hi Georgi

For testing, my workflow is:

- have a local version of my site running: (thread (asv 8080))

- have the production running remotely: (thread (asv 80))

- code some function in your favorite editor

- when ready, copy the code to your clipboard, and paste it into the local version

- after testing, paste that code into your remote REPL - no need to do a (load "...") (a minimal sketch of this loop follows below)
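
A minimal sketch of the kind of definition I paste around, assuming the standard asv server from srv.arc/app.arc (the op name here is hypothetical):

  ; start the local test server in a background thread (port 8080)
  (thread (asv 8080))
  ; a page handler; pasting a new version of this into the REPL of a
  ; running server takes effect on the next request to /hello
  (defop hello req
    (pr "hello, world"))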

-----

1 point by antiismist 6104 days ago | link | parent | on: Poll: Destructive operations naming

It is probably best to avoid any standard that involves non-alphanumerics, because such a character could become part of the syntax if pg feels like it one day.

-----

2 points by absz 6104 days ago | link

There's already one existing syntax for !, and I think it's unlikely that pg will add another. We're taking advantage of something which is, essentially, defined to be meaningless in arc2.tar but which we can make work on Anarki, and we're using it. And of course pg can do anything he wants, but that could involve both new ssyntax and new functions; we have been warned that what we're doing is "unsafe," after all. If we avoid non-alphanumerics and choose, say, djoin, what happens if pg chooses njoin for arc3.tar? Or uses the d prefix for something else? Using existing ssyntax characters is probably pretty safe.

As you might have guessed, by the way, I support the join! standard :)

-----

2 points by almkglor 6104 days ago | link

> If we avoid non-alphanumerics and choose, say, djoin, what happens if pg chooses njoin for arc3.tar? Or uses the d prefix for something else?

We say "I suggest running a poll on this, pg" ^^

But I still like module syntax T.T However, the appended ! convention is winning by a really large margin (waaa, hopeless!). So we need to modify the built-in ssyntax/ssexpand to ignore a trailing "!", and/or standardize on my ssyntaxes.arc.
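
A rough sketch of what "ignore a trailing !" could mean, as a hypothetical helper (not the actual Anarki ssexpand):

  ; treat a symbol as plain if its only special character is a final !,
  ; so a name like join! is left alone rather than split by ssyntax
  (def trailing-bang-only (s)
    (let str (string s)
      (and (is (str (- (len str) 1)) #\!)
           (~find #\! (cut str 0 (- (len str) 1))))))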

-----

1 point by antiismist 6103 days ago | link

OK, another point. There is a virtue in having a standard that is backwards compatible with the base Arc release.

For example, I recently borrowed classifier.arc from Anarki and ran it with the base release of Arc with no trouble. If people start using the ! standard, then to use Anarki libs one would have to run all of Anarki. Which may be OK; it just has to be accepted that this is the case.

(As a matter of style I like ! too)

(somewhat related tangent:

[One of the best of these is a Gosperism. Once, when we were at a Chinese restaurant, Bill Gosper wanted to know whether someone would like to share with him a two-person-sized bowl of soup. His inquiry was: "Split-p soup?" -- GLS]

http://www.everything2.com/title/the%2520-P%2520convention )

-----

1 point by absz 6103 days ago | link

We're already not backwards compatible: ssyntax.arc is an extreme example, but there are also the functions we've added to places like arc.arc, e.g. butlast. That doesn't exist in arc2.tar, but we have it anyway. Anything that uses such functions requires Anarki. This is more extreme (you wouldn't be able to copy just those functions over), but not unique.

Nice story :)

-----

11 points by antiismist 6107 days ago | link | parent | on: Show and Tell: ballerinc.com

Sure, I can share some things I've learned. The caveat of course is that I am no elite hacker, so some of this is going to be obvious or not really correct...

- macros: In news.arc they are all over the place, so I tried to code in a similar style. But it turns out that if something can be done either with a macro or with a function, it is more natural for me to use a function. I work a lot with the REPL, and it is really annoying to change a macro and then have to track down all the places that use it and "refresh" them so they pick up the new definition - especially with macros that call macros that call macros, when I change the macro at the end of the chain. Whereas with a function, I just paste in the new version and it takes effect right away (see the first sketch after this list).

- news.arc stores posts in a hash table, and I started off modeling ballerinc that way. But as time goes on I am moving things over to plain old lists: working with a hash table pushes me toward code that gets things done with side effects, like using each, whereas with lists the code is usually cleaner and shorter (see the second sketch after this list).

- repl: using the repl on the running web server is pretty amazing. Already I have pushed changes directly to the live site. For ballerinc restarting the server takes about 10 minutes, so this has been a life saver. I don't worry about "releases" at all anymore - just make the feature, test it, and deploy it.

- the codebase is about 386 LOC. For the past week I've been adding features and doing code cleanup, and the LOC keeps going down.
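
A minimal sketch of the macro point, with hypothetical names. In Arc a macro is expanded when its caller is compiled, so pasting a new macro into the REPL doesn't change callers that were already defined, while a redefined function is picked up on the next call:

  ; as a macro: 'welcome expanded 'greeting at definition time, so
  ; redefining 'greeting later does not change 'welcome until 'welcome
  ; itself is pasted in again
  (mac greeting (name) `(string "hi " ,name))
  (def welcome (u) (prn (greeting u)))
  ; as a function: callers look 'greeting2 up at run time, so pasting a
  ; new definition changes the behavior of 'welcome2 immediately
  (def greeting2 (name) (string "hi " name))
  (def welcome2 (u) (prn (greeting2 u)))

And a rough sketch of the table-versus-list point, again with hypothetical names (posts*, league):

  ; posts* as a table keyed by id: collecting a subset tends toward
  ; side effects (an accumulator mutated inside the traversal)
  (let acc nil
    (maptable (fn (id p) (if (is p!league 'nba) (push p acc))) posts*)
    acc)
  ; posts* as a plain list: the same query is one pure expression
  (keep [is _!league 'nba] posts*)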

-----

1 point by skenney26 6107 days ago | link

Any advice related to setting up and running your web server would be highly appreciated. I recently got a hosting account and was stuck trying to figure out how to keep the server running after logging out of ssh. Also, have you had any security concerns while developing your applications?

-----

3 points by antiismist 6107 days ago | link

I use the screen utility. Basically, I always have screen running, with the repl and vim running inside it. With the repl running in screen you can detach via control-a d, then log off, and the repl keeps running. Then you ssh back in and can reattach via screen -dr.

I haven't had any security concerns - I have a couple admin pages but of course one has to be an admin to see them.

-----

2 points by zitterbewegung 6107 days ago | link

How do you parse the RSS feeds?

-----

2 points by antiismist 6107 days ago | link

That part is a big hack... I have a Ruby script that generates an s-expr of the feed, and I work from that:

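       ; shell out to bench.rb, which prints the feed as an s-expression
       ; (a list of alists); read that output back and add each item as a post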
       (each i (read (tostring (system (string "ruby bench.rb " r!url))))
           (add-post (alref i 'title) (alref i 'link) r!league))

-----

2 points by zitterbewegung 6106 days ago | link

There is a function (xml->xexpr) in PLT Scheme which could also generate an s-expr of the RSS feed.

-----

1 point by antiismist 6106 days ago | link

I had trouble figuring it out ...

If you had a URL for an RSS feed, how would you get an s-expr containing all the links and titles in the items?

-----

1 point by antiismist 6114 days ago | link | parent | on: Help w/ continuations

Thanks for the pointer. Looks like ccc is not going to help me here.

-----

2 points by antiismist 6130 days ago | link | parent | on: Response to the Arc Challenge

- Arc is a language, not a web framework. The framework part comes in app.arc.

- "It is worth noting that the quality of Arc's HTML is atrocious, but this may improve with time." I wouldn't say that this is the case. news.arc makes big use of tables, which some people don't like. But you are free to make whatever HTML you want - you get to specify exactly what output you want.

- The part discussing models seems off somehow.
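
As a minimal illustration (hypothetical op name), a handler is just code that prints whatever markup you choose:

  ; nothing in the server forces table-based layout; the response body
  ; is exactly what the handler prints
  (defop clean-page req
    (pr "<!DOCTYPE html><html><body><p>hand-written markup</p></body></html>"))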

-----

2 points by gidyn 6130 days ago | link

http://validator.w3.org/check?uri=http%3A%2F%2Farclanguage.o...

-----

4 points by antiismist 6130 days ago | link

So?

My site, http://pageonetimes.com, validates a lot better, and it also uses Arc. Most of the errors are from "bad" URLs in links.

FWIW yahoo.com has more errors...does that mean that PHP makes "atrociouser" html?

-----

3 points by gidyn 6130 days ago | link

PHP doesn't make any HTML :-)

My point is that the Arc libraries do not "encourage" high quality markup. Compare to WASH, where the libraries will only produce strictly compliant HTML.

-----

5 points by jmatt 6130 days ago | link

As an Arc developer, I create compliant or non-compliant HTML. It's the developer's choice, not the framework's job. It is nice that there are some frameworks out there that always create strictly compliant HTML, but that's not necessary for a great web framework. I think a lot of innovations will continue to come from non-compliant, hacked-together web code.

-----

2 points by conanite 6130 days ago | link

Markaby (a Ruby HTML-generation framework) did something similar - it would raise an error if you tried giving the same id to more than one element, for example. Unfortunately, it was quite slow, and last time I checked it was not being actively developed. But it was so, so, so readable. A lot like almkglor's http://arclanguage.org/item?id=5608

-----

3 points by almkglor 6130 days ago | link

But! But! I didn't actually implement it, which means it doesn't actually count. Code or it didn't happen.

What I did implement is: http://arclanguage.org/item?id=5570

I feel that it's almost as good as the other solution I presented, but I wonder what others think. One advantage it has is that it's the non-Arc code that has the special syntax, unlike the 'marcup version, where it's the Arc code that has the special syntax.

Apparently the mockup 'marcup is a bit more popular ^^

Would anyone prefer the 'marcup version over the current, existing 'w/html?

-----

4 points by antiismist 6136 days ago | link | parent | on: search?

- use Google: "site:arclanguage.org ..."

- check out arcfn.com

-----

4 points by antiismist 6138 days ago | link | parent | on: Poll: Priorities

I agree that the lack of libs is a real problem. My real-world example: I have a bunch of RSS feeds, and I want to regularly poll those feeds, extract some content, and add it to my site (pageonetimes.com).

Doing it with Arc alone is too much for me - for mysterious reasons get-http.arc isn't working for me, and even if it did, I would have to write my own RSS parser. Instead I am using Ruby for this: grabbing the RSS content, parsing it, and saving the results is about 10 lines of code using standard libraries.

-----

4 points by stefano 6138 days ago | link

What kind of problems exactly does get-http.arc give you? It's a translation of a piece of Common Lisp code I had written to be able to fetch RSS feeds. The CL version worked well for the task.

-----

1 point by antiismist 6137 days ago | link

Here is what happens:

  arc> (get-request (str->url "http://yahoo.com/"))
  Error: "procedure ...narki/arc/ac.scm:1231:11: expects 2 arguments, given 3: 
  #hash((\"CONTENT-TYPE\" . \"text/... \"LOCATION\" #hash((\"CONTENT-TYPE\" . \"text/..."

(I think the problem lies with me, in that I don't know how to use the library...)

-----

3 points by stefano 6137 days ago | link

I've tried your example and it gives me (obviously) the same error, but trying

  (get-request (str->url "http://www.yahoo.com/"))
works. I will investigate further.

BTW, get-request returns a list containing the header of the response and a string containing the page (when it doesn't raise an error, of course).

Edit: bug solved. New version on github!

-----

1 point by antiismist 6136 days ago | link

Thanks, you are awesome. So what do you use for RSS parsing?

-----

2 points by stefano 6136 days ago | link

In my CL project I planned to parse it manually, but I abandoned the project after finishing the HTTP part. I've never tried using it myself, but you could have a look at treeparse in Anarki, if you haven't already.

-----

1 point by antiismist 6150 days ago | link | parent | on: Ask AL: pprint + vim

Is there any way to shoot the text over to an already running interpreter and capture the output (to avoid the startup time..)?

-----

2 points by almkglor 6150 days ago | link

Possibly some mkfifo hackery in Linux?

Possibly do something stupid like:

  mkfifo to_arc
  mkfifo from_arc
  ./arc.sh < to_arc > from_arc

Then just pipe commands through the to_arc and from_arc FIFOs? (this is untested)

Edit: Alternatively, you might want to do this in a thread on an existing arc session, again still using mkfifo hackery:

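  ; read forms from the to_arc FIFO (using a gensym as the eof marker),
  ; eval each one, and write the result to the from_arc FIFO, looping forever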
  (w/infile i "to_arc"
    (w/outfile o "from_arc"
      (w/uniq invalid
        (def foo ()
          (let ip (read i invalid)
            (if (isnt ip invalid)
                (write (eval ip) o)))
          (foo)))))
  (thread (foo))

-----

3 points by antiismist 6152 days ago | link | parent | on: Ask AL: pprint + vim

AL was supposed to be Arc Language, but good point on almkglor too.

-----


I didn't see the alien, but it was one of the reddit cofounders who broke the news, so it is likely.

-----
