Moore's law has actually been failing as of late. I suspect that the exponential increase in computer capability is a feature of the still-nascent state of computer technology, and assuming it will continue into the foreseeable future is a bad idea. Performance is important; it's just not as important as some think. In particular, it's not so much all-around performance as the ability to avoid bottlenecks that matters. This is why a profiler for Arc would be nice, as would some easy, standard way to hook into a lower-level language for speed boosts.
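As a rough illustration of the workflow I mean (in Python rather than Arc, since neither tool exists there yet; the function names are made up for the example): profile first, then push only the hot spot down a level.

    # Hypothetical sketch: profile to find the bottleneck, then replace just
    # that hot spot with something backed by lower-level code.
    import cProfile
    import pstats

    def slow_dot(xs, ys):
        # Naive inner product in pure Python: a typical bottleneck.
        total = 0.0
        for x, y in zip(xs, ys):
            total += x * y
        return total

    def fast_dot(xs, ys):
        # "Lower-level" replacement: sum() and zip() run in C inside the
        # interpreter, standing in here for a real FFI call to C code.
        return sum(x * y for x, y in zip(xs, ys))

    if __name__ == "__main__":
        data = list(range(200_000))
        profiler = cProfile.Profile()
        profiler.enable()
        slow_dot(data, data)   # the profiler shows where the time goes...
        fast_dot(data, data)   # ...and the rewritten hot spot barely registers
        profiler.disable()
        pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)

The point isn't this particular code; it's that you only need raw speed in the one place the profiler tells you about.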
If you generalize Moore's law to the cost of computation per second, it isn't failing. And in 100 years, network availability, speed, and capacity will lead to changes we haven't yet imagined.
How will programming languages take advantage of a thousand or a million CPUs?
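Nobody knows yet, but one plausible direction is a language where you write an ordinary map and the runtime decides how many workers to fan it out across. A minimal Python sketch of that idea (local cores only, purely illustrative; a future runtime might back the same call with a cluster):

    # Hedged sketch: the program says "map this over the inputs" and the
    # runtime chooses how to split the work across available CPUs.
    from multiprocessing import Pool, cpu_count

    def expensive(n: int) -> int:
        # Stand-in for a CPU-bound task.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        inputs = [50_000] * 64
        with Pool(processes=cpu_count()) as pool:
            results = pool.map(expensive, inputs)  # runtime picks the split
        print(len(results), "tasks completed on", cpu_count(), "cores")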