What are the consequences of HashLife running in O(log n)?

Tags: complexity, turing-completeness

After reading about the HashLife algorithm, I found that it can simulate n generations in O(log n) time. The Game of Life is also Turing complete, so in theory we should be able to run any algorithm on a "computer" constructed in GoL.

As a consequence of HashLife's time complexity, could algorithms run faster? For example, if an algorithm takes 10 seconds to run on a PC, could it run faster inside HashLife on that same PC?

An example: an algorithm takes 1,000 instructions to run. A certain computer can process 1 instruction per second, so the algorithm takes 1,000 seconds to run.

Now suppose we take that same algorithm and run it on the "computer" in GoL. Would it, because HashLife is O(log n), take only 3 seconds? (assuming a log base of 10, since log₁₀(1000) = 3)
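To make the numbers explicit, here is the calculation I have in mind as a minimal sketch; the log₁₀ speed-up applied to wall-clock time is exactly the assumption I'm asking about, not an established property:

```python
import math

instructions = 1000          # instructions the algorithm needs
seconds_per_instruction = 1  # speed of the example computer

# Plain execution: one instruction per second.
naive_seconds = instructions * seconds_per_instruction  # 1000

# Hoped-for behaviour if the O(log n) complexity translated
# directly into wall-clock time (log base 10, as assumed above).
hashlife_seconds = math.log10(instructions)             # 3.0

print(naive_seconds, hashlife_seconds)  # 1000 3.0
```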

I'm probably overlooking something, since this would be a very important discovery, but I still thought I'd ask about it here.

Best Answer

Since HashLife runs the Game of Life, which is Turing complete, it can run any Turing-computable program. If running a program through HashLife reduced its complexity logarithmically (from n to log(n)), that would, for example, take the (NP-complete) SAT problem, for which the best known algorithms run in O(2^n), down to O(n). With multiple HashLife simulations nested inside each other, you could perhaps solve every problem in O(log(n)). That would eventually amount to a proof that P = NP.
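To spell out the arithmetic behind that claim (using base 2, which only changes constant factors):

$$\log_2\left(2^n\right) = n, \qquad \log_2\left(\log_2\left(2^n\right)\right) = \log_2 n$$

The first identity is why a single logarithmic reduction would take O(2^n) down to O(n); the second is why one more nested simulation would take it down to O(log n).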

The problem is that the O(log(n)) behaviour only holds in the average case, while the worst case is still O(n). HashLife has three major optimisations:

  1. A quadtree, so that operations can be generalised across all higher-level nodes of the tree
  2. Canonicalisation of all nodes in a hashtable, so that only one object exists for every distinct node layout
  3. Memoisation of the next generation of each node, to avoid recalculating it

The latter two only work so well because most configurations contain many recurring patterns and comparatively few distinct nodes (see the sketch below). For some of the more complex starting patterns, the Golly simulator starts with the usual very quickly growing speed (while the pattern is still small and regular), but gets slower as the pattern grows, so its complexity is apparently not purely logarithmic.
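To illustrate optimisations 2 and 3, here is a minimal sketch of a canonicalised quadtree with a memoised step function. It only implements the level-2 base case (advancing a 4×4 square one generation and returning its 2×2 centre); the recursive part of HashLife that lets the timestep grow with the node size is omitted, and all names are my own, not taken from any particular implementation:

```python
from functools import lru_cache

class Node:
    """A quadtree node. Level 0 is a single cell; a level-k node holds
    four level-(k-1) children covering a 2^k x 2^k square."""
    __slots__ = ("level", "nw", "ne", "sw", "se", "population")
    def __init__(self, level, nw, ne, sw, se, population):
        self.level, self.population = level, population
        self.nw, self.ne, self.sw, self.se = nw, ne, sw, se

OFF = Node(0, None, None, None, None, 0)  # the two canonical cells
ON  = Node(0, None, None, None, None, 1)

_canonical = {}  # optimisation 2: one object per distinct node layout

def join(nw, ne, sw, se):
    """Canonical constructor (hash-consing). Children are canonical by
    construction, so identity comparison equals structural comparison."""
    key = (nw, ne, sw, se)
    node = _canonical.get(key)
    if node is None:
        node = Node(nw.level + 1, nw, ne, sw, se,
                    nw.population + ne.population
                    + sw.population + se.population)
        _canonical[key] = node
    return node

def to_grid(node):
    """Flatten a node into a list of rows of 0/1 cells."""
    if node.level == 0:
        return [[node.population]]
    a, b, c, d = (to_grid(q) for q in (node.nw, node.ne, node.sw, node.se))
    return [ra + rb for ra, rb in zip(a, b)] + \
           [rc + rd for rc, rd in zip(c, d)]

@lru_cache(maxsize=None)  # optimisation 3: memoise the next generation
def centre_step(node):
    """Advance a level-2 (4x4) node one generation, returning its 2x2
    centre. Each distinct 4x4 layout is computed at most once, no
    matter how often it recurs in the universe."""
    g = to_grid(node)
    def cell(y, x):
        n = sum(g[y + dy][x + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - g[y][x]
        return ON if n == 3 or (g[y][x] and n == 2) else OFF
    return join(cell(1, 1), cell(1, 2), cell(2, 1), cell(2, 2))

# A 4x4 universe with the 2x2 "block" still life in its centre:
block4 = join(join(OFF, OFF, OFF, ON), join(OFF, OFF, ON, OFF),
              join(OFF, ON, OFF, OFF), join(ON, OFF, OFF, OFF))
assert centre_step(block4) is join(ON, ON, ON, ON)  # same object, not just equal
```

That the final assert holds with `is` rather than `==` is the whole trick: because every layout exists exactly once, the memo table can key on object identity, and a pattern that recurs billions of times across space and time costs exactly one computation.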

These techniques are not specific to HashLife: you can use memoisation to speed up any repeated deterministic procedure, trees for efficient data management, and canonicalisation of immutable objects to save a great deal of memory and computation.
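As a tiny, generic illustration of that last point (the helper name is my own invention):

```python
_interned = {}

def intern_value(value):
    """Return the one canonical instance of an immutable value
    (hash-consing). Equal values then share a single object, so
    deep equality checks collapse to cheap identity checks."""
    return _interned.setdefault(value, value)

a = intern_value((1, 2, (3, 4)))
b = intern_value((1, 2, (3, 4)))
assert a == b and a is b  # one object, however often the value occurs
```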
