Java – Why do C# and Java use reference equality as the default for ‘==’

c# java language-design programming-languages

I've been pondering for a while why Java and C# (and I'm sure other languages) default to reference equality for ==.

In the programming I do (admittedly only a small subset of programming problems), I almost always want logical equality when comparing objects rather than reference equality. I've been trying to work out why both of these languages went this route instead of inverting it: making == logical equality and providing something like .ReferenceEquals() for reference equality.

Obviously using reference equality is very simple to implement and it gives very consistent behavior, but it doesn't seem like it fits well with most of the programming practices I see today.

I don't mean to seem ignorant of the difficulties of implementing a logical comparison, or of the fact that it has to be implemented in every class (see the sketch below). I also realize that these languages were designed a long time ago, but the general question stands.

Is there some major benefit of defaulting to reference equality that I am simply missing, or does it seem reasonable that the default behavior should be logical equality, falling back to reference equality when no logical equality is defined for the class?
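To make concrete what I mean, here is a minimal sketch using a hypothetical Point class: == only ever compares references, and the logical comparison exists only because equals() (and, by contract, hashCode()) are written out by hand in every class that wants it.

    import java.util.Objects;

    final class Point {
        private final int x;
        private final int y;

        Point(int x, int y) {
            this.x = x;
            this.y = y;
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;              // same reference
            if (!(o instanceof Point)) return false;
            Point p = (Point) o;
            return x == p.x && y == p.y;             // logical (value) equality
        }

        @Override
        public int hashCode() {
            return Objects.hash(x, y);
        }

        public static void main(String[] args) {
            Point a = new Point(1, 2);
            Point b = new Point(1, 2);
            System.out.println(a == b);      // false: two distinct objects
            System.out.println(a.equals(b)); // true: same coordinates
        }
    }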

Best Answer

C# does it because Java did. Java did because Java does not support operator overloading. Since value equality must be redefined for each class, it could not be an operator, but instead had to be a method. IMO this was a poor decision. It is much easier to both write and read a == b than a.equals(b), and much more natural for programmers with C or C++ experience, but a == b is almost always wrong. Bugs from the use of == where .equals was required have wasted countless thousands of programmer hours.
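A minimal sketch of the kind of bug meant here, using plain JDK classes: == silently compares references, so two logically equal values can still compare unequal.

    public class EqualityPitfall {
        public static void main(String[] args) {
            // Strings constructed at runtime are distinct objects, so == compares references.
            String s1 = new String("hello");
            String s2 = new String("hello");
            System.out.println(s1 == s2);      // false: different objects
            System.out.println(s1.equals(s2)); // true: same characters

            // Boxed integers outside the small cached range hit the same trap.
            Integer i1 = 1000;
            Integer i2 = 1000;
            System.out.println(i1 == i2);      // typically false: distinct Integer objects
            System.out.println(i1.equals(i2)); // true: same value
        }
    }

Replacing the == checks with .equals (or Objects.equals, which also handles null) gives the intended value comparison.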