http://www.heise-online.co.uk/open/Shut ... ews/111534
In the article, Mark Shuttleworth, Ubuntu Big Cheese, was asking the Python community what they are doing about three big megatrends in computing today: cloud computing, transactional memory, and multicore processors.
I think this is a general question that every programming language, including Lisp, should be asking itself (if a programming language, as such, is capable of asking itself anything).

I'll start by saying that I think Shuttleworth's question, as reported by Heise, is slightly incorrect. I would not say that transactional memory is really a large trend right now, and it may never be. Rather, transactional memory is one technique for dealing with a massively multicore, parallel world. There are other ways to deal with that same fundamental problem, "How do I program a massively multicore system without making a mess of it?" Admittedly, TM is pretty elegant. Clojure takes the STM route to handle parallelism.
So, let me revise the question with what I feel are the big trends:
- Instead of "cloud computing," which seems pretty trendy and buzzword-compliant, let's instead use "Internet-wide computing." The thought here is that things are going to be increasingly distributed. Programming languages need to provide abstractions to help programmers deal with that. This includes not just simple distribution of processing power "into the cloud" but also the use of resources across the Internet (think what are typically called "mashups" now, but on steroids).
- Massively multicore - I think Shuttleworth got that right. I like to use the word "massive" here because I think it helps people internalize that the future is not 8 or even 16 cores, but rather hundreds and eventually thousands of cores on a single die. Intel has already shown 80-core test chips, albeit not with full x86 cores. But I have had personal discussions with Intel architects who say that chips with tens of cores are certainly right around the corner.
- Finally, to replace transactional memory, let me add high availability. In the future, you're going to see lots of application requirements that say things have to work all day, every day, forever. For some applications, the idea of taking the system down for maintenance is going to dwindle to near zero. The ability to develop, debug, and upgrade a running application is going to be more important than ever (see the sketch after this list).
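To make that last point concrete, here is a small Clojure sketch of what "upgrading a running application" looks like in practice. The handler name and bodies are hypothetical, my own example rather than anything from the article; the idea is simply that a new definition sent to a REPL attached to the live process replaces the old one without a restart.

```clojure
;; Version 1 of a request handler, loaded into the running system.
(defn handle-request [req]
  {:status 200 :body "v1"})

;; Later, while the server keeps serving traffic, re-evaluating the
;; defn in the live image swaps in the new behavior; subsequent
;; calls pick it up immediately, with no downtime.
(defn handle-request [req]
  {:status 200 :body "v2 - hot-fixed in place"})
```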
Thoughts?