- what are the "problem patterns" that call for the solution of
recursion
I wouldn't say there's such a thing as a problem pattern for the use of recursion. Every function that can be implemented with recursion can also be implemented iteratively, often by pushing and popping an explicit stack.
It's a matter of expression and also of performance. Iterative algorithms often perform better and are easier to optimize. Recursive algorithms, however, benefit from a clearer expression and are thus often easier to read, understand and implement.
Some things cannot even be expressed without recursion, infinite lists for example. The so-called functional languages rely heavily on recursion, as it's their natural way of expression. The saying goes: "Recursive programming is functional programming done right".
- is recursion a form of "divide & conquer" strategy or a form of
"code reuse" -- or, is a design pattern in its own right
I would not call it a design pattern. It's a matter of expression. Sometimes a recursive expression is simply more powerful and more expressive and thus leads to better and cleaner code.
- can you give us an example of a real world problem where
recursion comes to mind as an immediate solution
Anything that needs to traverse trees will be properly expressed by a recursive algorithm.
Back in those days, developers worked much closer to the metal. C was essentially a higher-level replacement for assembly, which is almost as close to the hardware as you can get, so it was natural that you needed pointers to solve coding problems efficiently. However, pointers are sharp tools, which can cause great damage if used carelessly. Direct use of pointers also opens up many security problems, which weren't an issue back then (in 1970, the internet consisted of a few dozen machines across a couple of universities, and it was not even called that...), but have become more and more important since. So nowadays higher-level languages are consciously designed to avoid raw memory pointers.
Saying that "advanced things done in VB.Net or Java are not possible in C" shows a very limited point of view, to say the least :-)
First of all, all of these languages (even assembly) are Turing complete, so in theory whatever is possible in one language is possible in all. Just think about what happens when a piece of VB.Net or Java code is compiled and executed: eventually it is translated into (or mapped to) machine code, because that is the only thing the machine understands. In compiled languages like C and C++, you can actually get the full body of machine code equivalent to the original higher-level source code, as one or more executable files/libraries. In VM-based languages, it is more tricky (and may not even be possible) to get the entire equivalent machine code representation of your program, but it is still there eventually, somewhere within the deep recesses of the runtime system and the JIT.
Now, of course, it is an entirely different question whether some solution is feasible in a specific language. No sensible developer would start writing a web app in assembly :-) But it is useful to bear in mind that most or all of those higher level languages are built on top of a huge amount of runtime and class library code, a large chunk of which is implemented in a lower level language, typically in C.
So to get to the question,
Do you think knowledge on pointers to the young people [...] is important?
The concept behind pointers is indirection. This is a very important concept and IMHO every good programmer should grasp it on a certain level. Even if someone is working solely with higher level languages, indirection and references are still important. Failing to understand this means being unable to use a whole class of very potent tools, seriously limiting one's problem solving ability in the long run.
So my answer is yes, if you want to become a truly good programmer, you must understand pointers too (as well as recursion - this is the other typical stumbling block for budding developers). You may not need to start with it - I don't think C is optimal as a first language nowadays. But at some point one should get familiar with indirection. Without it, we can never understand how the tools, libraries and frameworks we are using actually work. And a craftsman who doesn't understand how his/her tools work is a very limited one. Fair enough, one may get a grasp of it in higher level programming languages too. One good litmus test is correctly implementing a doubly linked list - if you can do it in your favourite language, you can claim you understand indirection well enough.
But if not for anything else, we should do it to learn respect for the programmers of old who managed to build unbelievable things using the ridiculously simple tools they had (compared to what we have now). We are all standing on the shoulders of giants, and it does good to us to acknowledge this, rather than pretending we are the giants ourselves.
Best Answer
I'm not sure COBOL does (it certainly didn't at one time), but I can't quite imagine anybody caring much either.
Fortran has since Fortran 90, but requires that you use the "recursive" keyword to tell it that a subroutine is recursive. PL/I was pretty much the same -- recursion was supported, but you had to explicitly tell it which procedures were recursive.
I doubt there are many more than that though. When you get down to it, prohibiting recursion was mostly something IBM did in their language designs, for the simple reason that IBM (360/370/3090/...) mainframes don't support a stack in hardware. When most languages came from IBM, they mostly prohibited recursion. Now that they all come from other places, recursion is always allowed (though I should add that a few other machines, notably the original Cray 1, didn't have hardware support for a stack either).