Algorithms – Difference Between Recursion and Corecursion

algorithms, recursion

What's the difference between these?

On Wikipedia, there is little information and no clear code explaining these terms.

What are some very simple examples explaining these terms?

How is corecursion the dual of recursion?

Are there any classic corecursive algorithms?

Best Answer

There are a number of good ways of looking at this. The easiest thing for me is to think about the relation between "inductive" and "coinductive" definitions.

An inductive definition of a set goes like this.

The set "Nat" is defined as the smallest set such that "Zero" is in Nat, and if n is in Nat "Succ n" is in Nat.

Which corresponds to the following OCaml

type nat = Zero | Succ of nat

One thing to note about this definition is that a number

omega = Succ(omega)

is NOT a member of this set. Why? Assume that it was, and consider the set N that has all the same elements as Nat except that it does not contain omega. Clearly Zero is in N, and if y is in N then Succ(y) is in N, yet N is smaller than Nat, which contradicts Nat being the smallest such set. So, omega is not in Nat.

Or, perhaps more useful for a computer scientist:

Given some set "a", the set "List of a" is defined as the smallest set such that "Nil" is in List of a, and if xs is in List of a and x is in a, then "Cons x xs" is in List of a.

Which corresponds to something like

type 'a list = Nil | Cons of 'a * 'a list

The operative word here is "smallest". If we didn't say "smallest" we would not have any way of telling if the set Nat contained a banana!

Again,

zeros = Cons(Zero,zeros)

is not a valid definition for a list of nats, just like omega was not a valid Nat.

Defining data inductively like this allows us to define functions that work on it using recursion

let rec plus a b = match a with
                   | Zero    -> b
                   | Succ(c) -> let r = plus c b in Succ(r)

We can then prove facts about this, like "plus a Zero = a", using induction (specifically, structural induction).

Our proof proceeds by structural induction on a.
For the base case, let a be Zero. Then plus Zero Zero = match Zero with | Zero -> Zero | Succ(c) -> let r = plus c Zero in Succ(r), which takes the Zero branch, so we know plus Zero Zero = Zero.

For the inductive step, let a be a nat and assume the inductive hypothesis that plus a Zero = a. We now show that plus (Succ(a)) Zero = Succ(a). This holds since plus (Succ(a)) Zero = match Succ(a) with | Zero -> Zero | Succ(c) -> let r = plus c Zero in Succ(r), which takes the Succ branch with c = a, giving let r = plus a Zero in Succ(r) = let r = a in Succ(r) = Succ(a).

Thus, by induction, plus a Zero = a for all a in nat.
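For readers who prefer Haskell (which is used later in this answer anyway), here is a minimal sketch of the same data type and function; the toInt helper is my own addition, purely so the result can be printed.

data Nat = Zero | Succ Nat

plus :: Nat -> Nat -> Nat
plus Zero     b = b
plus (Succ c) b = Succ (plus c b)

-- toInt is a hypothetical helper for display only, not part of the original definitions.
toInt :: Nat -> Int
toInt Zero     = 0
toInt (Succ n) = 1 + toInt n

-- toInt (plus (Succ (Succ Zero)) Zero) evaluates to 2, matching plus a Zero = a.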

We can of course prove more interesting things, but this is the general idea.

So far we have dealt with inductively defined data, which we got by letting it be the "smallest" set. Now we want to work with coinductively defined codata, which we get by letting it be the largest set.

So

Let a be a set. The set "Stream of a" is defined as the largest set such that each x in Stream of a consists of an ordered pair (head, tail) such that head is in a and tail is in Stream of a.

In Haskell we would express this as

data Stream a = Stream a (Stream a) --"data" not "newtype"

Actually, in Haskell we normally use the built-in lists, which can be either an ordered pair or the empty list.

data [a] = [] | a:[a]

Banana is not a member of this type either, since it is neither an ordered pair nor the empty list. But now we can say

ones = 1:ones

and this is a perfectly valid definition. What's more, we can perform co-recursion on this co-data. Actually, it is possible for a function to be both co-recursive and recursive. While recursion is defined by the function having a domain consisting of data, co-recursion just means it has a co-domain (also called the range) that is co-data. Primitive recursion means always "calling oneself" on smaller data until reaching some smallest data, while primitive co-recursion always "calls itself" on data greater than or equal to what you had before.
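Before more examples, a quick sanity check (a sketch, assuming a GHCi session and the ones defined above) that such an infinite definition is still usable: because Haskell is lazy, a consumer only ever forces a finite prefix.

take 5 ones   -- evaluates to [1,1,1,1,1]; the infinite tail is never demanded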

ones = 1:ones

is primitively co-recursive. The function map (kind of like "foreach" in imperative languages) is both primitively recursive (sort of) and primitively co-recursive.

map :: (a -> b) -> [a] -> [b]
map f []     = []
map f (x:xs) = (f x):map f xs
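To see the co-recursive side of map, feed it the infinite list ones from above; only as much of the output is computed as the consumer demands. The nats definition below is my own illustrative sketch of a classic co-recursive definition built with map.

-- In GHCi (a sketch, assuming ones from above):
--   take 5 (map (+1) ones)   ==>  [2,2,2,2,2]

-- A classic co-recursive definition: the natural numbers.
nats :: [Integer]
nats = 0 : map (+1) nats
--   take 5 nats              ==>  [0,1,2,3,4]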

The same goes for the function zipWith, which takes a function and a pair of lists and combines them using that function.

zipWith :: (a -> b -> c) -> [a] -> [b] -> [c]
zipWith f (a:as) (b:bs) = (f a b):zipWith f as bs
zipWith _ _ _           = [] --base case
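Because the wildcard base case fires as soon as either list runs out, zipWith happily mixes finite and infinite arguments (a sketch; sums and mixed are hypothetical names, and ones is the list defined above).

sums  = zipWith (+) [1,2,3] [10,20,30]   -- [11,22,33]
mixed = zipWith (+) [1,2,3] ones         -- [2,3,4]; stops at the end of the shorter list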

The classic example in functional languages is the Fibonacci sequence

fib 0 = 0
fib 1 = 1
fib n = (fib (n-1)) + (fib (n-2))

which is primitively recursive, but can be expressed more elegantly as an infinite list

fibs = 0:1:zipWith (+) fibs (tail fibs)
fib' n = fibs !! n --the !! is Haskell syntax for "index at"
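As a quick check (a sketch in GHCi; firstTen and agree are hypothetical names), the two definitions agree on small indices. The co-recursive definition is productive because each element of zipWith (+) fibs (tail fibs) only needs two elements that have already been produced.

firstTen = take 10 fibs                -- [0,1,1,2,3,5,8,13,21,34]
agree    = map fib [0..9] == firstTen  -- True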

An interesting example of induction/coinduction is proving that these two definitions compute the same thing. This is left as an exercise for the reader.