[GIS] Measuring data load and redraw times in ArcMap

Tags: arcgis-9.3, arcgis-desktop, arcmap, performance

I've rephrased the question slightly to address some issues.

Is there a standard way of measuring load times for data, specifically symbolised data, within ArcMap?

In particular, we're comparing load times of different data formats (shapefile, SDE, file and personal geodatabases) across the network for the same (large) sets of data. We're also aiming to test speeds between ArcGIS versions.

At this point I've started hacking together a simple VB script to do some of the timing without having to sit there with a stopwatch as @matt-wilkie suggested. The script (thus far) can be found at http://textsnip.com/8912ac/vb. The script was written in ArcGIS 9.3 but works in 9.2 as well.

To use the script, copy the VB script into your mxd and add two buttons, called "LoadDatasets" and "SymboliseDatasets". The LoadDatasets button lets you load one or more feature classes or layers, and times the load. SymboliseDatasets checks the number of layers in the ActiveView and, if there are none, calls the load dialog (but doesn't time it). Once layers are added, the SymboliseDatasets button will symbolise all layers into 10 quantile groups based on their FIDs.
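For anyone who hasn't looked at the linked script, the basic load-timing idea can be sketched as below. This is a minimal, hypothetical sketch assuming the ArcObjects VBA environment inside ArcMap (IMxDocument, ILayer); the `ChooseLayer` helper is an illustrative stand-in for the Add Data dialog, not part of the actual script.

```vb
' Minimal sketch of timing a layer load in ArcMap VBA (ArcObjects).
' ChooseLayer() is a hypothetical helper wrapping the Add Data dialog.
Public Sub LoadDatasets()
    Dim pMxDoc As IMxDocument
    Set pMxDoc = ThisDocument

    Dim pLayer As ILayer
    Set pLayer = ChooseLayer()        ' hypothetical: prompt the user for a dataset

    Dim startTime As Single
    startTime = Timer                 ' VBA Timer: seconds elapsed since midnight

    pMxDoc.AddLayer pLayer            ' add the dataset to the map
    pMxDoc.ActiveView.Refresh         ' queue a full redraw
    DoEvents                          ' let the redraw be processed before stopping the clock

    MsgBox "Load + draw took " & Format(Timer - startTime, "0.00") & " seconds"
End Sub
```

Because `Timer` wraps at midnight, runs spanning midnight would need handling; for short interactive tests it is fine.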

I have fixed the issue of timing the rendering by adding a DoEvents call after the ActiveView.Refresh.
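To illustrate why the DoEvents matters: ActiveView.Refresh only invalidates the view and returns immediately, so reading the timer straight after it measures almost nothing. A sketch of the pattern (variable names illustrative, assuming an `IMxDocument` reference as above):

```vb
' Refresh returns as soon as the redraw is QUEUED; DoEvents yields to the
' message queue so the queued draw actually runs before we read the timer.
Dim t As Single
t = Timer

pMxDoc.ActiveView.Refresh   ' invalidate the view (returns immediately)
DoEvents                    ' process pending messages, including the redraw

Debug.Print "Redraw took " & Format(Timer - t, "0.00") & " s"
```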

In the meantime, if anyone wants to pick this script up and modify it to make it more useful, I'm happy to set this question to community wiki.

Best Answer

We use a stopwatch and a spreadsheet, and measure (a) time from initial load until the spinny-globe-refresh stops spinning, (b) time to refresh (press the refresh button), (c) zoom to scale N, (d) zoom to scale NN, (e) pan. Repeat at least 3 times for each datastore. Repeat again at different times of day to account for network usage patterns by others.

The results are pretty rough and the testing labour-intensive, but better than nothing. A script to do the same, which could be automated, would be awesome. Some test runs had to be repeated more than 10 times because there was so much variability in the results, which I assume came from network traffic or perhaps intense fileserver disk activity.

The last time I did this, a couple of years ago, indexed shapefiles were the fastest, closely followed by file geodatabases, then SDE, with personal geodatabases dead last. These are averaged results; our SDE datastore was faster at certain scales, for example, but not overall. The fastest raster format was ECW, which unfortunately is lossy. The fastest lossless format was GeoTIFF with pyramids.