Concurrency is the next big thing in programming languages. If you want to take advantage of increasingly parallel hardware, you have to support it. If Arc can't support concurrency, languages like Erlang that can will become the de facto standard.
I've heard that a lot, but I don't quite buy it. Thing is, distributed computing is the current big thing in architectures. If you can't scale your architecture across multiple commodity PCs, you're in for lots of pain, regardless of whether your language supports multiple cores. Nobody wants to have to buy a SunFire just because their website got big.
If you have distribution, it's no big deal just to run multiple processes on one box and let the OS handle concurrency. You can treat each processor like it's a separate machine, let the OS handle scheduling, use your architecture's distributed-computing features, and silently take advantage of the blazingly fast IPC between different processes on the same machine.
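Here's a minimal sketch of that idea, assuming Python's multiprocessing module (nothing Arc- or Erlang-specific, and the squaring "work" is just a placeholder): one worker process per core, work handed out over pipes, so the OS does the scheduling and the IPC never leaves the box.

  # Sketch only: treat each core as its own "machine" by starting one
  # worker process per CPU and talking to it over a pipe.
  import os
  from multiprocessing import Process, Pipe

  def worker(conn):
      # Each worker acts like a separate node: read a request, send a reply.
      for item in iter(conn.recv, None):      # None is the shutdown signal
          conn.send(item * item)
      conn.close()

  if __name__ == "__main__":
      ncpu = os.cpu_count() or 2
      pipes, procs = [], []
      for _ in range(ncpu):
          parent, child = Pipe()
          p = Process(target=worker, args=(child,))
          p.start()
          pipes.append(parent)
          procs.append(p)

      # Round-robin the work across the "nodes", like a distributed setup would.
      for i in range(20):
          pipes[i % ncpu].send(i)
      print([pipes[i % ncpu].recv() for i in range(20)])

      for conn in pipes:
          conn.send(None)                     # tell the workers to exit
      for p in procs:
          p.join()

Roughly the same shape works across machines if the pipes become sockets, which is the point: the single-box case is just the distributed case with faster transport.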
I think in a case like hosting web apps, you can get away with using the OS's scheduling to run lots of instances of your server software and call it architectural concurrency. Witness Rails' standard deployment model with a pack of mongrels: each Rails instance is single-threaded, but you run several on the machine and let the OS juggle them.
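A rough sketch of that pack-of-mongrels shape, again in Python rather than Rails (the ports and handler are made up for illustration): several deliberately single-threaded HTTP servers, one per port, each in its own process, with the OS spreading them over the cores and a front-end proxy (not shown) balancing across the ports.

  # Sketch only: four single-threaded servers on ports 8000-8003.
  from http.server import BaseHTTPRequestHandler, HTTPServer
  from multiprocessing import Process

  class Handler(BaseHTTPRequestHandler):
      def do_GET(self):
          self.send_response(200)
          self.end_headers()
          self.wfile.write(b"hello from a single-threaded worker\n")

  def serve(port):
      # HTTPServer handles one request at a time -- single-threaded on purpose.
      HTTPServer(("127.0.0.1", port), Handler).serve_forever()

  if __name__ == "__main__":
      workers = [Process(target=serve, args=(8000 + i,)) for i in range(4)]
      for w in workers:
          w.start()
      for w in workers:
          w.join()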
I think the appeal of Erlang and languages like it is when you have a single app that you want to speed up with additional cores. (It's hard to run a database on a bunch of commodity PCs as separate instances... it's done that way today, but will multi-master DBs really scale to 10,000 machines? You'd probably have to ditch the central DB for something more parallel.) If we really have hit the single-core speed limit, then this will eventually matter, but not until a single thread becomes too big for one core to handle. (Assuming software keeps demanding more resources over time... who knows.)
To get away from the webapp example... I imagine, for the foreseeable future, your web browser is really only going to need one core, and having more cores just means you can run more browsers at a time (like a pack of foxes, if you will), but a single browser won't get any speedup. In 10 years, when everything on the web is rendered in OpenGL 5 instead of text markup, maybe your browser will want some more parallelism (or maybe we'll just farm that off to dedicated graphics hardware).
I don't think it's really something critical for Arc right now. It's hard to follow the Erlang model without becoming Erlang (just like it's hard to do everything Lisp does without becoming Lisp), but I think that having parallelism baked into the language (or at least thought about) would be neat from a personal-coding-fun standpoint.
Right now, two cores make JavaScript-intensive apps not suck. For instance, without two cores I can't handle Google's apps; with two cores they function nicely.