JavaScript Array Length – Why Count by Last Index?

javascript, language-design, language-features

JavaScript seems to calculate an array's length property from the highest index in the array (plus one) rather than by counting the number of items it contains. Example:

var testArray = ['a', 'b', 'c'];
console.log(testArray.length); //3
testArray[8] = 'i';
console.log(testArray.length); //9

Normally I would use push() rather than adding a value with bracket notation, but this was a bit of a surprise. Is there any reason length is implemented this way? The only explanation I can come up with is that using the last index is a performance optimization that avoids counting the array's items.

Best Answer

The length of an array is not a count of the values it actually stores: it is one more than the highest index, and it does not matter whether the slots below that index hold real values or are empty holes that read back as "undefined". If you add an element at position 8 you implicitly create 5 additional empty slots between your first three elements and the new one, as you can see if you run

for (var i = 0; i < testArray.length; i++) {
    console.log(testArray[i]);
}

(Those slots may or may not take real space in memory, depending on the actual implementation.)
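One way to see that the in-between slots are holes rather than stored values is to test for the index with the in operator, which reports whether the array actually has a property at that index (a small sketch; how much memory the holes occupy is engine-specific):

```javascript
var testArray = ['a', 'b', 'c'];
testArray[8] = 'i';

// length is the highest index + 1, not a count of stored values
console.log(testArray.length);   // 9

// the "missing" slots read back as undefined ...
console.log(testArray[4]);       // undefined

// ... but no property was ever created for them
console.log(4 in testArray);     // false
console.log(8 in testArray);     // true
```

This is why such arrays are called "sparse": reading a hole yields undefined, yet the hole is distinguishable from an index that was explicitly assigned undefined.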

Being "equally spaced" is pretty much the definition of an array (as compared to other container types such as lists or dictionaries).
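To illustrate the difference this makes in practice, an index-based loop walks every slot up to length, holes included, while iteration methods such as forEach only visit indices that were actually assigned (a sketch assuming the same sparse testArray as above):

```javascript
var testArray = ['a', 'b', 'c'];
testArray[8] = 'i';

// a for-loop visits every index up to length, including the holes ...
var visited = 0;
for (var i = 0; i < testArray.length; i++) {
    visited++;
}
console.log(visited); // 9

// ... while forEach skips the holes and only sees stored elements
var stored = 0;
testArray.forEach(function () {
    stored++;
});
console.log(stored); // 4
```

So whether the holes "count" depends on how you iterate, but length itself always reflects the highest index plus one.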