Linux – MD5 hash calculates differently on server

Tags: c, linux, macos, md5

I am running some code I have written in C that calls the MD5 hashing functionality from a hashing library someone else wrote (md5.c & md5.h). The odd behavior I have been seeing is:

(By "hashing working perfectly" I mean: I hash a string and it comes out to the exact hash that I have verified against multiple other sources.)

  1. The hashing functionality works perfectly when compiling and running on my OS X machine, and the hash that is computed is exactly as it should be.

  2. The same code, with no changes, is uploaded and compiled on the Linux-based server, and it computes a different (wrong) hash.

Does anyone have any insight into how exactly this is possible? It's been driving me crazy for the past week and I do not understand why this is even possible. I have also tested it on another machine, compiled and executed it, and it works perfectly. It's just when I upload it to the server that the hash is no longer correct.

The hashing functionality file can be found at:
http://people.csail.mit.edu/rivest/Md5.c

SOLVED: Thanks everyone.
It was the 64-bit architecture issue. It's mighty annoying that it slipped my mind to consider that while debugging…

Best Answer

Try replacing (Md5.c, line 41)

typedef unsigned long int UINT4;

with

typedef uint32_t UINT4;

(and include <stdint.h> if needed)
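Putting the two pieces together, the relevant part of Md5.c would end up looking roughly like this (a sketch of the edit, not the author's exact file):

    #include <stdint.h>   /* for uint32_t */

    /* was: typedef unsigned long int UINT4;  (64 bits on an LP64 Linux server) */
    typedef uint32_t UINT4;   /* exactly 32 bits on every platform */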

On a 64-bit machine, a long int is (usually) 64 bits wide instead of 32, so UINT4 is no longer a 32-bit type.
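A quick way to confirm this on both machines is a throwaway check like the following (a minimal sketch, not part of the original post):

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Typical output on an LP64 Linux server: 8 and 4.
           On a 32-bit (ILP32) build, both are 4. */
        printf("sizeof(unsigned long int) = %zu\n", sizeof(unsigned long int));
        printf("sizeof(uint32_t)          = %zu\n", sizeof(uint32_t));
        return 0;
    }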

EDIT:

I tried this on a 64-bit Opteron and it solves the problem.
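For anyone wondering why the wider type actually changes the digest: MD5's round functions and rotations assume values that wrap at 32 bits. A small illustration with an arbitrary value (hypothetical, not taken from Md5.c):

    #include <stdio.h>
    #include <inttypes.h>

    int main(void)
    {
        uint32_t      a32 = 0x12345678u;
        unsigned long a64 = 0x12345678ul;   /* 64 bits wide on an LP64 system */

        /* MD5's I() round function applies bitwise NOT; with a 64-bit type
           the upper 32 bits come out set and get carried into later steps. */
        printf("~a32 = 0x%08" PRIx32 "\n", (uint32_t)(~a32));  /* 0xedcba987 */
        printf("~a64 = 0x%lx\n", ~a64);                        /* 0xffffffffedcba987 on LP64 */

        /* The left shift inside the rotate-left macro also stops wrapping at bit 31. */
        printf("a32 << 8 = 0x%08" PRIx32 "\n", (uint32_t)(a32 << 8));  /* 0x34567800 */
        printf("a64 << 8 = 0x%lx\n", a64 << 8);                        /* 0x1234567800 */

        return 0;
    }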