Coded UI to measure performance


I have been tasked with using Coded UI to measure performance on a proprietary Windows desktop application. The need is to measure how long it takes for the next page/screen to display after a user clicks on a control.

For example: a user enters their ID and password and clicks sign-in. The need is to measure how long it takes for the next screen to display after the user clicks the sign-in button. I understand the need to define what indicates the screen is loaded and ready for use. One approach is to use control.WaitForControlReady together with BeginTimer/EndTimer.

Is Coded UI a dependable and accurate way of measuring time?
Is WaitForControlReady the best method to determine when a control is ready for use?

Best Answer

Hard: Log Total Time

If you have a <script> section at the end of your HTML page, it won't get executed until the rest of the page is fully loaded and probably rendered (but check me on that). Note that the client and server probably won't have the same clock time, but you could have a <script> tag at the end of your document send an AJAX request back to the server with a built-in unique token telling the server, "Request (request-identifier) is complete and displayed by the browser."

On the server side, the requests you want to time would have to put a hashtable of the original request-identifiers and timestamps in the user's session. The "request-complete" message from that user will come in as a new request which should look up the request-identifier for the request in the session, get the start time-stamp, and write the end-time minus start-time difference to a log or whatever. Then it should remove that request-identifier from the hashtable.

You will also need to clean up the hashtable, especially if the user cancels a request (it will never complete), logs out, or just closes their browser. Stale entries will also pile up for any request that doesn't end in your all-done JavaScript, such as images or pages you forgot to add the tag to. This probably isn't something you want to leave running in production code.
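The bookkeeping above can be sketched as a small registry class (names here are hypothetical; in a real servlet you would keep one instance of this in the user's session and wire start/complete into your request handling):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the per-user registry described above:
// request-identifier -> start timestamp (ms).
class RequestTimer {
    private final Map<String, Long> startTimes = new ConcurrentHashMap<>();

    // Called when the server begins handling a timed request.
    void start(String requestId) {
        startTimes.put(requestId, System.currentTimeMillis());
    }

    // Called when the browser's end-of-page script reports completion.
    // Removes the entry (so the table doesn't leak) and returns the
    // elapsed time in ms, or -1 if the id is unknown (cancelled,
    // already cleaned up, or never started).
    long complete(String requestId) {
        Long start = startTimes.remove(requestId);
        return (start == null) ? -1 : System.currentTimeMillis() - start;
    }

    // Cleanup hook for logout or session expiry: drop everything outstanding.
    void clear() {
        startTimes.clear();
    }
}
```

The "request-complete" AJAX handler would call complete() with the token the page sent back and log the returned difference.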

Simpler: Log Request Time and Rendering Time Separately

Something easier to measure and probably just about as useful from a business perspective is how long it takes the web server to serve the request. The main method of our master-servlet starts with:

// Start timing the request
long startMs = System.currentTimeMillis();

And ends with:

long elapsedMs = System.currentTimeMillis() - startMs;

// LONG_TIME and PRETTY_LONG_TIME are threshold constants (in ms) chosen for your site
if (elapsedMs > LONG_TIME) {
    logger.error("Done REALLY SLOWLY: " + elapsedMs + "ms");
} else if (elapsedMs > PRETTY_LONG_TIME) {
    logger.warn("Done SLOWLY: " + elapsedMs + "ms");
} else {
    logger.info("Done: " + elapsedMs + "ms");
}

Now your application logs on your server have errors for long requests. You could even have it page you if something is slow! To tell the difference between browser time and server time, use the built-in developer tools in Chrome or Firefox that show how long each part of building a page took. There's a timeline graph and other useful detail built in there.

[Screenshot: Chrome network timing graph]

An all-AJAX application complicates this somewhat, but you can build the same timing into your server side and have the client send a timing result back to the server.
