AWS Cloudwatch monitoring – what is the difference between Average, Sum, and Maximum

amazon-ec2 amazon-web-services

I'm setting up alarms for my EC2 instance. This may be basic math, but I'm confused about the difference between "average", "sum", and "maximum" as they pertain to a CloudWatch graph, particularly for the Network In/Out graphs.

In the charts below, you can see there was a spike in output traffic (11MB) at the 0800 hour on 11/5.

But how do I reconcile the sum chart with the corresponding charts for "average" and "max"? I think "average" is the average number of bytes per minute. But if that's the case, why are the shape of the graph and the spikes slightly different?

And I'm not certain how to interpret the Max graph. I think each data point represents the single minute of that hour with the highest number of bytes.

If someone has a good and simple explanation for these different values, I would appreciate it.

Best Answer

CloudWatch keeps the data in 5-minute or 1-minute intervals (depending on whether you enabled detailed monitoring or not). You asked for a 3-day view at 1-hour intervals, which means that for every hour it will compute the data one of these ways (depending on the statistic you choose):

Average: calculates the average of the datapoints in each hour (based on your 1- or 5-minute data)

# for 5 minutes intervals data
# the denominator value of 12 comes from the fact that there are 12 5-minute intervals in an hour
( 2 + 3 + 5 + ... ) / 12

Sum: sums up all numbers received in that hour

( 2 + 3 + 5 + ... )

Max: takes the single largest datapoint received in that hour.

max(2, 3, 5, ...)
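The three statistics above can be sketched in a few lines of Python. The datapoint values here are made-up byte counts standing in for one hour of 5-minute basic-monitoring data, not real CloudWatch output:

```python
# Twelve hypothetical 5-minute NetworkOut datapoints (bytes) covering one hour.
datapoints = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]

total = sum(datapoints)                  # the "Sum" statistic for the hour
average = total / len(datapoints)        # the "Average" statistic (Sum / SampleCount)
peak = max(datapoints)                   # the "Maximum" statistic

print(total, average, peak)
```

This also shows why the Sum and Average graphs have the same shape only when every hour contains the same number of samples: Average is just Sum divided by the sample count, so gaps or missing datapoints change the two curves differently.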