[GIS] Creating Series of Rasters that displays Standard Deviation from Mean using ArcGIS Spatial Analyst

arcgis-desktop · cell-statistics · raster-calculator · spatial-analyst

I have 100 rasters of raw rainfall data, one per year of a 100-year period, in which each pixel represents the rainfall received at that location on the landscape for that year. I have also created a raster containing the mean rainfall received in each pixel over those 100 years.

From these raw data rasters, I want to create a new set of 100 rasters (one for each year) in which each pixel shows how far that year's rainfall deviates from that pixel's 100-year mean, expressed in standard deviations.

I have a feeling this will likely be solved with Python, which I have a little experience with, but I am unsure which tool to use. Cell Statistics seems to be on the right track, since it operates on individual cells, but on its own it doesn't achieve what I am looking for.

Any other ideas on how to tackle this?

Best Answer

For each cell in your 100 rasters you know:

  • Current cell value (total rainfall per year?)
  • Mean rainfall per year at that cell

And you also know:

  • Standard deviation based on 100 years (population)
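
If the per-cell standard deviation raster does not exist yet, Cell Statistics can produce it the same way it produced the mean. Below is a minimal sketch, assuming arcpy with the Spatial Analyst extension; the workspace path and output names are placeholders, not taken from the question.

    # Minimal sketch -- paths and output names are placeholders.
    import arcpy
    from arcpy.sa import CellStatistics

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\data\rainfall_rasters"  # folder holding the 100 yearly rasters
    arcpy.env.overwriteOutput = True

    yearly = arcpy.ListRasters()  # the 100 yearly rainfall rasters

    # Per-cell mean and standard deviation across all 100 years
    mean_ras = CellStatistics(yearly, "MEAN", "DATA")
    std_ras = CellStatistics(yearly, "STD", "DATA")
    mean_ras.save("rain_mean.tif")
    std_ras.save("rain_std.tif")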

I would think the simplest approach would be to use the Raster Calculator in conjunction with a ModelBuilder iterator. You can write a short map algebra expression that subtracts the mean from the current raster's cell value and divides by the standard deviation, so each cell gets its deviation from the mean in standard-deviation units.
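
If you prefer scripting over ModelBuilder, the same per-year calculation can be looped in Python with map algebra. A rough sketch under the same assumptions as above (placeholder paths, illustrative file names), where each output cell is (year value - mean) / standard deviation:

    # Rough sketch -- paths, name patterns, and output names are illustrative.
    import os
    import arcpy
    from arcpy.sa import Raster

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\data\rainfall_rasters"  # same input folder as above
    arcpy.env.overwriteOutput = True

    mean_ras = Raster("rain_mean.tif")  # per-cell mean (from the earlier sketch)
    std_ras = Raster("rain_std.tif")    # per-cell standard deviation (from the earlier sketch)
    out_folder = r"C:\data\rainfall_deviation"

    for name in arcpy.ListRasters("rain_year*"):  # the 100 yearly rasters; pattern is illustrative
        # Deviation of this year's value from the 100-year mean, in standard-deviation units
        dev = (Raster(name) - mean_ras) / std_ras
        dev.save(os.path.join(out_folder, "dev_{0}.tif".format(os.path.splitext(name)[0])))

Either way (ModelBuilder iterator or a script), the key point is that the mean and standard deviation rasters are computed once from all 100 inputs, and the per-year equation is simply applied against them in each iteration.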