Grandma, your computer is a garden. Cloud computing is a supermarket.
With your garden, you do all the work. If you want to increase the yield, it takes a lot of effort and time.
With a supermarket, you don't have to maintain the garden. And if you suddenly need more food -- say you're having a party -- you can get it right away.
Both gardens and supermarkets are useful.
Now, did you bake me any cookies?
There's certainly a noticeable trend towards functional programming, or at least certain aspects of it. Popular languages that at some point adopted anonymous functions include C++ (C++11), PHP (PHP 5.3.0), C# (C# 2.0), Delphi (since 2009), and Objective-C (blocks), while Java 8 will bring support for lambdas to the language. And there are popular languages that are generally not considered functional but supported anonymous functions from the start, or at least early on, the shining example being JavaScript.
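To make concrete what all these languages converged on, here's a minimal sketch in JavaScript (the "shining example" above) of anonymous functions being defined inline and passed around as values; the variable names are just for illustration:

```javascript
// An anonymous function is a function value with no name of its own,
// defined inline and handed to other code like any other value.
const numbers = [1, 2, 3, 4];

// Passed directly as an argument to a higher-order function:
const doubled = numbers.map(function (n) { return n * 2; });

// The same idea with the shorter ES2015 arrow syntax:
const squared = numbers.map(n => n * n);

console.log(doubled); // [2, 4, 6, 8]
console.log(squared); // [1, 4, 9, 16]
```

The equivalents in the other languages listed (C++ lambdas, PHP closures, C# delegates/lambdas, Objective-C blocks) all express this same pattern: a piece of behaviour treated as data.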
As with all trends, trying to look for a single event that sparked them is probably a waste of time; it's usually a combination of factors, most of which aren't quantifiable. Practical Common Lisp, published in 2005, may have played an important role in bringing new attention to Lisp as a practical language, as for quite some time Lisp was mostly a language you'd meet in an academic setting or in very specific niche markets. JavaScript's popularity may have also played an important role in bringing new attention to anonymous functions, as munificent explains in his answer.
Beyond the adoption of functional concepts by multi-purpose languages, there's also a noticeable shift towards functional (or mostly functional) languages. Languages like Erlang (1986), Haskell (1990), OCaml (1996), Scala (2003), F# (2005), Clojure (2007), and even domain-specific languages like R (1993) seem to have gained a strong following in the years after they were introduced. The general trend has also brought new attention to older functional languages, like Scheme (1975), and obviously Common Lisp.
I think the single most important event is the adoption of functional programming by industry. I have absolutely no idea why that wasn't the case earlier, but it seems to me that at some point during the early and mid 90s functional programming started to find its place in industry, starting (perhaps) with Erlang's proliferation in telecommunications and Haskell's adoption in aerospace and hardware design.
Joel Spolsky has written a very interesting blog post, The Perils of JavaSchools, where he argues against the (then) trend among universities of favouring Java over other, perhaps harder-to-learn languages. Although the blog post has little to do with functional programming, it identifies a key issue:
Therein lies the debate. Years of whinging by lazy CS undergrads like me, combined with complaints from industry about how few CS majors are graduating from American universities, have taken a toll, and in the last decade a large number of otherwise perfectly good schools have gone 100% Java. It's hip, the recruiters who use "grep" to evaluate resumes seem to like it, and, best of all, there's nothing hard enough about Java to really weed out the programmers without the part of the brain that does pointers or recursion, so the drop-out rates are lower, and the computer science departments have more students, and bigger budgets, and all is well.
I still remember how much I hated Lisp when I first met her during my college years. She's definitely a harsh mistress, and it's not a language where you can be immediately productive (well, at least I couldn't be). Compared to Lisp, Haskell (for example) is a lot friendlier; you can be productive without that much effort and without feeling like a complete idiot, and that might also be an important factor in the shift towards functional programming.
All in all, this is a good thing. Several multi-purpose languages are adopting concepts of a paradigm that might have seemed arcane to most of their users before, and the gap between the main paradigms is narrowing.
Best Answer
It has appeared before. In fact, this was the original model for getting access to computing resources from the 1950s well into the 1980s, when it was called "time sharing"; then in the early 1990s it re-appeared under the name "Client/Server", then in the late 1990s again under the name "Thin Client", and then as "Application Service Provider".
However, the exact form we see today requires high-quality, high-reliability, high-throughput, low-latency, low-price, ubiquitous Internet access, which didn't exist until a few years ago and, in fact, still doesn't exist for the vast majority of people (e.g. almost all of Africa, much of Asia, and parts of Eastern Europe and South America).