I think you should stick with the conventions that your co-workers are used to; the principle of least surprise. You don't want to sit and figure out why a simple operation is written in an uncommon manner for no reason, so why should the next person who reads your code have to?
I would take a step back here. You're concentrating on the picky details of the code but missing the larger picture. Let's take a look at one of your example loops:
int offset = 0;
while (true)
{
    Record r = Read(offset);
    if (r == null)
    {
        break;
    }
    // do work
    offset++;
}
What is the meaning of this code? The meaning is "do some work on each record in a file". But that is not what the code looks like. The code looks like "maintain an offset. Open a file. Enter a loop with no end condition. Read a record. Test for nullity." All that before we get to the work! The question you should be asking is "how can I make this code's appearance match its semantics?" This code should be:
foreach (Record record in RecordsFromFile())
    DoWork(record);
Now the code reads like its intention. Separate your mechanisms from your semantics. In your original code you mix up the mechanism (the details of the loop) with the semantics (the work done on each record).
Now we have to implement RecordsFromFile(). What's the best way of implementing that? Who cares? That's not the code that anyone is going to be looking at. It's basic mechanism code and it's ten lines long. Write it however you want. How about this?
public IEnumerable<Record> RecordsFromFile()
{
    int offset = 0;
    while (true)
    {
        Record record = Read(offset);
        if (record == null) yield break;
        yield return record;
        offset += 1;
    }
}
Now that we are manipulating a lazily computed sequence of records, all sorts of scenarios become possible:
foreach (Record record in RecordsFromFile().Take(10))
    DoWork(record);

foreach (Record record in RecordsFromFile().OrderBy(r => r.LastName))
    DoWork(record);

foreach (Record record in RecordsFromFile().Where(r => r.City == "London"))
    DoWork(record);
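Composition falls out for free as well. Since the sequence is lazily computed, Take stops pulling records as soon as it has enough, so a combined query like this one (with System.Linq in scope, same as the examples above) reads only as far into the file as it must; OrderBy, by contrast, has to drain the whole file before yielding its first record:

// Reads the file only until ten London records have been found.
foreach (Record record in RecordsFromFile()
                              .Where(r => r.City == "London")
                              .Take(10))
    DoWork(record);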
And so on.
Any time you write a loop, ask yourself "does this loop read like a mechanism or like the meaning of the code?" If the answer is "like a mechanism", then try to move that mechanism to its own method, and write the code to make the meaning more visible.
Best Answer
Ultimately, the best answer is to actually test it. Make a method which loops over an empty array with and without checking the length first, call each 100,000 times and see which has a faster runtime.
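A minimal sketch of such a test might look like this (assuming an int[] and a no-op loop body; at this scale JIT optimizations and timer noise can easily dominate, so a proper harness such as BenchmarkDotNet would give more trustworthy numbers):

using System;
using System.Diagnostics;

static class EmptyLoopBenchmark
{
    static readonly int[] Empty = new int[0];

    // Version that guards the loop with a length check first.
    static void LoopWithCheck()
    {
        if (Empty.Length > 0)
        {
            foreach (int n in Empty) { /* work would go here */ }
        }
    }

    // Version that just lets foreach handle the empty case.
    static void LoopWithoutCheck()
    {
        foreach (int n in Empty) { /* work would go here */ }
    }

    static void Main()
    {
        const int Calls = 100_000;

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Calls; i++) LoopWithCheck();
        Console.WriteLine($"with check:    {sw.Elapsed}");

        sw.Restart();
        for (int i = 0; i < Calls; i++) LoopWithoutCheck();
        Console.WriteLine($"without check: {sw.Elapsed}");
    }
}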
When I write code like this, I generally ponder it as a trade-off between code readability and perceived or potential performance. In other words, adding an if (count > 0) check negatively impacts the readability, even if only by a very small amount. So what is the gain? I presume there is very little or no advantage, but like you, I have not actually tested it. What I can tell you is that in all my years of profiling, I have never found looping over an empty array to be a bottleneck.

Technically, it depends on how the iterator works. For a foreach loop like you have, you are creating an iterator object behind the scenes. My expectation, and again, this is unverified, is that the iterator object for an array would simply be implemented like this (this is Java, but C# is probably comparable):
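import java.util.Iterator;
import java.util.NoSuchElementException;

// A sketch of what an array iterator plausibly looks like; this is an
// assumption for illustration, not the actual JDK source.
class ArrayIterator<T> implements Iterator<T> {
    private final T[] array;
    private int index = 0;

    ArrayIterator(T[] array) {
        this.array = array;
    }

    @Override
    public boolean hasNext() {
        return index < array.length; // false immediately for an empty array
    }

    @Override
    public T next() {
        if (!hasNext())
            throw new NoSuchElementException();
        return array[index++];
    }
}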
So iterating over such an iterator should exit very quickly, because hasNext() will return false. Notice it's the same check that your if statement would do.

Now if you are iterating over a linked list, the implementation should be different. If you explicitly check the size, you may be forcing it to scan the linked list. In Java, LinkedList.size() checks an explicit size variable, so it doesn't walk the list, but I don't know how .NET implements it.

Another way to look at this is: how likely is the collection to be empty? Even if iterating over an empty collection does take measurable time, an explicit emptiness check only pays off when the collection is usually empty. If not, you are theoretically slowing down your average case just to slightly speed up your worst case.
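To make the scanning concern concrete, here is a deliberately naive C# list (not .NET's LinkedList<T>) whose Count walks every node. Against a type like this, guarding the loop with if (list.Count > 0) would itself be O(n), while foreach would stop after a single hasNext-style test:

// Contrived sketch only; real library lists typically cache their count.
class NaiveNode<T>
{
    public T Value;
    public NaiveNode<T> Next;
}

class NaiveLinkedList<T>
{
    public NaiveNode<T> Head;

    // O(n): walks every node just to produce the count, so checking
    // Count > 0 up front scans the whole list before any work is done.
    public int Count
    {
        get
        {
            int count = 0;
            for (var node = Head; node != null; node = node.Next)
                count++;
            return count;
        }
    }
}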
Again, these are all theories and the timings are on the micro-level, so take it with a grain of salt.
TL;DR
It depends on the type of collection, the runtime platform, and the typical collection size; intuitively, the check seems unlikely to speed anything up in most situations. The only way to know for sure is to test it.