[GIS] Slow IE8 responsiveness with OpenLayers and many features

Tags: features, json, openlayers-2, optimization, performance

Background

Adding about 700 sites to an OpenLayers Map. Firefox 11 and Google Chrome display the sites just fine. IE8, which is the organization's standard and will not be upgraded for the foreseeable future, bogs down when all 700 sites are displayed.

Each site has a representative icon, drawn from a set of 15 icons. Clicking a site displays an informative popup, created on a per-click basis: the popup HTML is loaded into memory once (and only once) from a single template, and each click fetches JSON data and fills in the template on the fly (to reduce the memory footprint).

This application uses OpenLayers v2.12 (723 KB).

Code

The sites are being added to the "sites" layer as follows:

/**
 * Adds a site to the list of sites. 
 */
function addSite( sites, uuid, label, lon, lat, icon ) {
  sites.push(
    new OpenLayers.Feature.Vector(
      getPoint( lon, lat ),
      {
        uuid: uuid,
        siteType: icon,
        title: label
      }
    )
  );
}

/**
 * Fetches the list of all sites from the query service and adds them
 * to the site layer.
 * 
 * @param siteLayer - The site is added to this layer. 
 */
function addSites( siteLayer ) {
  $.ajax({
    url: MAP_SERVER_SERVICE + 'sites',
    type: 'GET',
    dataType: "json",
    contentType: "application/json; charset=utf-8",
    success: function( data ) {
      var sites = [];

      $.each( data, function( key, val ) {
        addSite(
          sites,
          val.uuid,
          val.label,
          val.longitude,
          val.latitude,
          val.site_classification );
      });

      siteLayer.addFeatures( sites );
    }
  });
}

About 100 KB of JSON data is returned from the map server service (i.e., by the Ajax call above).
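For reference, the service presumably returns an array shaped like the following. The field names are taken from the `addSites()` callback above; the values are made up for illustration:

```javascript
// Hypothetical sample of the JSON returned by the 'sites' service.
// Field names match those read in addSites(); the values are invented.
var sampleSites = [
  { uuid: "0000-0001", label: "Station A",
    longitude: -113.49, latitude: 53.54, site_classification: "station" },
  { uuid: "0000-0002", label: "Reservoir B",
    longitude: -114.07, latitude: 51.05, site_classification: "reservoir" }
];
```

At roughly 100 KB for ~700 sites, each record averages about 140 bytes, which is consistent with this shape.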

Question

How would you load ~700 features into IE8 through OpenLayers without bogging it down?

The sites cannot be rendered on map tiles because the icon is dynamic (i.e., different search results will produce different tiles at the same location).

Ideas

Some ideas I have considered include:

  • Upgrade – Probably not feasible to ask everyone who will be using the application to upgrade to IE9.
  • Browser – Might not be feasible to ask people to download Chrome (or Firefox) just to view a web application.
  • Empty Map – This would make the map quite responsive initially, but would still bog down when showing a large number of sites. This is the most trivial of solutions.
  • Optimize – There is no guarantee that optimizing OpenLayers would make the site more responsive.
  • Restrict Area – Define a region of “visibility” (like a rectangular section) and only display sites that fall within that region. This significantly reduces the usefulness of the tool.
  • Site Limits – Constrain the number of sites returned for any given search. This would produce misleading results.
  • Hide Overlapping Features – Hide sites that are partially obscured by other sites. This would probably produce optimal results; it is a type of cluster strategy.
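The last idea above can be sketched without any OpenLayers calls: thin the feature list on a pixel grid so that at most one feature survives per grid cell. `thinFeatures` and `toPixel` are hypothetical names; in practice `toPixel` would wrap `map.getPixelFromLonLat()`.

```javascript
// Sketch of the "hide overlapping features" idea: keep at most one feature
// per pixel-grid cell. Pure JS; toPixel is a hypothetical lon/lat -> pixel
// projection that would come from map.getPixelFromLonLat() in practice.
function thinFeatures( features, toPixel, cellSize ) {
  var seen = {};   // occupied grid cells
  var kept = [];
  for ( var i = 0; i < features.length; i++ ) {
    var px = toPixel( features[i] );
    var key = Math.floor( px.x / cellSize ) + ':' + Math.floor( px.y / cellSize );
    if ( !seen[key] ) {          // first feature in this cell wins
      seen[key] = true;
      kept.push( features[i] );
    }
  }
  return kept;   // add only these to the layer
}
```

Only the kept features would be passed to `siteLayer.addFeatures()`, so IE8 never has to render the obscured ones.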

What other possibilities are there to speed up IE8?

For what it's worth, the postDraw and createNode (30.25 seconds) methods of OpenLayers take up most of the application time, according to the IE8 debugger.

Update #1

To illustrate the amount of data (province-wide screenshot omitted):

There are sites that fall outside of the province, but they are literally few and far between.

Zoomed-in view (screenshot omitted).

Update #2

I thought I could "hide" some of the features, but this does not help:

function hideFeatures( layer ) {
  var features = layer.features;

  for( var i = 0; i < features.length; i++ ) {
    // Bug: OpenLayers 2 vector styles have no 'visibility' property;
    // the supported way to hide a feature is style = { display: 'none' }.
    features[i].style = { visibility: 'hidden' };
  }

  layer.redraw();
}

It looks like they'll actually have to be removed from the layer to be performant.
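That removal approach might be sketched as follows, assuming the OpenLayers 2 vector-layer API (`features`, `removeFeatures`); `pruneFeatures` and the `isVisible` predicate are hypothetical names introduced here:

```javascript
// Sketch of the "remove rather than hide" approach: take the features that
// fail a visibility test off the layer entirely, so IE8 never renders them.
// `layer` is assumed to expose the OpenLayers 2 Vector layer API.
function pruneFeatures( layer, isVisible ) {
  var doomed = [];
  for ( var i = 0; i < layer.features.length; i++ ) {
    if ( !isVisible( layer.features[i] ) ) {
      doomed.push( layer.features[i] );
    }
  }
  layer.removeFeatures( doomed );  // OL2 also removes them from the renderer
  return doomed.length;            // number of features removed
}
```

The removed features could be held in a plain array and re-added with `addFeatures()` when they become relevant again.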

Thank you!

Best Answer

Use a Cluster Strategy in combination with a BBOX Strategy.

Perhaps you should come at this from a different angle. You are not far off with your Restrict Area idea; don't be too quick to dismiss it as reducing the usefulness of the tool.

In your current approach you load all sites at once and add them to a vector layer. This means that at any one time the browser may hold far more data than the user can see (depending on zoom level, etc.), since most of it lies outside the viewport bounds. You can exploit this fact to optimise loading by requesting from the server only the data that falls within the bounds of the user's current view.

You can add a BBOX refresh strategy to your vector layer by including an OpenLayers.Strategy.BBOX strategy object in the layer's strategies. When the view is invalidated (by a pan or zoom action), the BBOX strategy causes the layer to request new data from the backend server. To implement this approach you will need to tweak your server-side function to accept the additional bounding-box parameters OpenLayers adds to the request, and to return the data in a format that OpenLayers can read. It is pretty straightforward to implement, and you should notice a massive increase in performance across all browsers.
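Wiring this up might look like the sketch below, written as a factory so the OpenLayers dependency is explicit. It assumes the server has been adapted to honour the bounding-box request and to respond with GeoJSON; `ratio` and `resFactor` are standard OpenLayers.Strategy.BBOX options, while `createSiteLayer` and `serviceUrl` are names introduced here.

```javascript
// Sketch: a vector layer that re-fetches sites for the current view extent.
// Assumes the 'sites' service honours the bbox request and returns GeoJSON.
function createSiteLayer( OpenLayers, serviceUrl ) {
  return new OpenLayers.Layer.Vector( "Sites", {
    strategies: [
      // ratio > 1 fetches a margin around the view; resFactor controls how
      // much the resolution must change before a re-fetch is triggered.
      new OpenLayers.Strategy.BBOX({ ratio: 1.5, resFactor: 2 })
    ],
    protocol: new OpenLayers.Protocol.HTTP({
      url: serviceUrl + 'sites',
      format: new OpenLayers.Format.GeoJSON()
    })
  });
}
```

In the question's code this would replace the manual $.ajax call, with MAP_SERVER_SERVICE passed as `serviceUrl`.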

There is an example on the OpenLayers website of implementing the BBOX strategy. You could also consider adding a Cluster strategy for small scales, where a request can return a large volume of features, or implement some scale-dependent rendering of the layer.
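The Cluster suggestion can be sketched the same way. `distance` and `threshold` are standard OpenLayers.Strategy.Cluster options, and the style context functions derive a symbol radius and label from each cluster's `count` attribute; `createClusteredLayer` and the specific numbers are illustrative choices, not from the question.

```javascript
// Sketch: cluster nearby sites into a single symbol labelled with the count.
function createClusteredLayer( OpenLayers ) {
  return new OpenLayers.Layer.Vector( "Clustered sites", {
    strategies: [
      // Merge features within 20 px; groups smaller than 3 stay unclustered.
      new OpenLayers.Strategy.Cluster({ distance: 20, threshold: 3 })
    ],
    styleMap: new OpenLayers.StyleMap({
      'default': new OpenLayers.Style(
        { pointRadius: "${radius}", label: "${count}" },
        { context: {
            // Grow the symbol with the cluster size, capped for readability.
            radius: function( feature ) {
              return Math.min( feature.attributes.count, 10 ) + 5;
            },
            count: function( feature ) {
              return feature.attributes.count;
            }
        } }
      )
    })
  });
}
```

This keeps the number of nodes IE8 must draw proportional to the number of visible clusters rather than the number of sites.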