I am currently reading a book titled "Numerical Recipes in C". In this book, the author explains that certain algorithms inherently work better with indices starting at 1 (I don't entirely follow his argument, and that isn't the point of this post), whereas C always indexes its arrays starting at 0. To get around this, he suggests simply decrementing the pointer after allocation, e.g.:
float *a = malloc(size);
a--;
This, he says, effectively gives you a pointer whose indexing starts at 1, which is then freed with:
free(a + 1);
As far as I'm aware, though, this is undefined behavior by the C standard. This is apparently a highly reputable book within the HPC community, so I don't want to simply disregard what he's saying, but simply decrementing a pointer outside of the allocated range seems highly sketchy to me. Is this "allowed" behavior in C? I have tested it out using both gcc and icc, and both of those results appear to indicate that I'm worrying over nothing, but I want to be absolutely positive.
Best Answer
You are right that code such as `a--` applied to the pointer returned by `malloc` yields undefined behavior, per the ANSI C standard, section 3.3.6 (Additive operators): pointer arithmetic is only defined when both the pointer operand and the result point to elements of the same array object, or to one past its last element. A pointer one *before* the first element is outside that range, so merely evaluating `a - 1` is undefined, even if the result is never dereferenced.
As an aside, the quality of the C code in the book wasn't considered very high even back when I used it in the late 1990s.
The trouble with undefined behavior is that no matter what result the compiler produces, that result is by definition correct (even if it is highly destructive and unpredictable).
Fortunately, very few compilers go out of their way to cause surprising behavior in such cases, and the typical `malloc` implementation on machines used for HPC keeps some bookkeeping data just before the address it returns, so the decrement typically gives you a pointer into that bookkeeping data. Writing there would be a bad idea, but merely forming the pointer is harmless on those systems. Just be aware that the code could break when the runtime environment changes or when the code is ported to a different environment.