[GIS] Density analysis producing extreme values

Tags: arcgis-desktop, arcmap, density, point

I have point data of gas flares, and I am trying to create a simple density raster of those flares. Results are either 0 or in the 100,000-1,000,000 range (when there are 5 to 40 or so points in each box).

At least as I see it, I am trying to do the most basic of point density analyses: I'm simply trying to count how many points are in each box! I'm using a reference raster of 15 arcseconds. My Point Density dialog settings are below (an equivalent arcpy call is sketched after the parameter list). Note that I've tried it two different ways with very similar results. Any suggestions as to where I'm going wrong? I'm guessing it has to do with my search radius, but how do I tell Arc that I want my search radius to be the same as the pixel?

Dialog: [screenshot of the Point Density tool dialog]

Results: [screenshot of the output raster values]

Environments: Processing extent same as reference raster

Input: Gas Flares Points

Population: none

Output: …\Flare_Dens.tif

Output cell size: 0.004167 (15 arcseconds in decimal degrees, the same as the raster I'm snapping to)

Neighborhood: Rectangle

Radius: Map, 0.004167 x 0.004167

OR (getting similar results in the >100,000 range)

Radius: Cell, 1 x 1
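
For reference, here is a rough arcpy equivalent of the dialog settings above; the layer name, reference raster, and output name are placeholders rather than my actual files:

```python
# Hypothetical arcpy equivalent of the dialog settings listed above.
# Layer name, reference raster, and output name are placeholders.
import arcpy
from arcpy.sa import PointDensity, NbrRectangle

arcpy.CheckOutExtension("Spatial")
arcpy.env.snapRaster = "reference_15arcsec.tif"   # reference raster (placeholder)
arcpy.env.extent = "reference_15arcsec.tif"       # processing extent same as reference

dens = PointDensity("Gas_Flares_Points",          # input points, geographic CRS
                    "NONE",                       # no population field
                    0.004167,                     # cell size: 15 arcsec in decimal degrees
                    NbrRectangle(0.004167, 0.004167, "MAP"))
dens.save("Flare_Dens.tif")
```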

Best Answer

Sounds like a projection issue to me. Your data and analysis are in decimal degrees, implying a geographic coordinate system. And yes, your map units are decimal degrees, as evidenced by your screenshots. This is part of the problem. Generally, to do calculations related to area and length you want to be in a projected coordinate system, which is why I suggest a UTM zone projection using meters.

The area unit I refer to in my comment is the last parameter of the Point Density tool, which your question doesn't mention and your screenshot doesn't show (you need to scroll down or make the dialog box bigger). It controls the units of your output raster values and performs the unit conversion for the density, i.e. your data could be in meters, but you could set that parameter to square feet and your output raster values would be density per square foot.

The key here is that cell size units and density value units are independent of each other. You can have a one-kilometer-square cell where the density of gas flares is 0.345 per square foot. You could also have a one-foot-square cell where the density is 300 per square kilometer. In your case, your cell is a tiny fraction of a square degree, and the default area unit would be one square degree. Therefore 18 points in your cell translates to a density of 18 per that small fraction of a square degree which, extrapolated out to a full square degree (the chosen/default area unit), gives the extreme values you're seeing. You could set the area unit to square feet or something, but since there's no direct, uniform conversion between a degree and a foot, you still may not get accurate results, which is why it's best to project your data first.
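
As a rough check on the magnitude, here is a back-of-the-envelope sketch in Python, assuming the output is being reported per square degree and using 18 points as the example count:

```python
# Back-of-the-envelope magnitude check, assuming the output is reported
# per square degree (the default area unit described above).
cell_deg = 0.004167              # cell size in decimal degrees (15 arcsec)
cell_area = cell_deg ** 2        # ~1.74e-5 square degrees per cell
points_in_cell = 18              # example point count from above
density = points_in_cell / cell_area
print(round(density))            # roughly 1.04 million points per square degree
```

That lands right in the 100,000-1,000,000 range reported in the question, which is consistent with a unit-conversion problem rather than a miscount.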

The search radius just controls how far out from a given cell to look for points to count toward the density. You've already taken care of setting it to one cell in the Neighborhood section of the dialog box. The issue isn't that the tool is going beyond your cell to count points; it's a unit-conversion/magnitude problem.

Long story short: project your data (to something like the UTM zone appropriate to your area) before running Point Density.
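
A minimal arcpy sketch of that workflow, assuming the Spatial Analyst extension is available; the file names, the UTM zone (32631 here), and the ~463 m cell size are illustrative placeholders to adapt to your own data:

```python
# Sketch of the suggested fix: project first, then run Point Density with
# an explicit area unit. File names, UTM zone, and cell size are placeholders.
import arcpy
from arcpy.sa import PointDensity, NbrRectangle

arcpy.CheckOutExtension("Spatial")

# 1. Project the points from decimal degrees to a projected CRS in meters.
utm = arcpy.SpatialReference(32631)               # e.g. WGS 84 / UTM zone 31N
arcpy.management.Project("Gas_Flares_Points", "GasFlares_UTM.shp", utm)

# 2. Run Point Density with a one-cell neighborhood and an explicit area
#    unit, so the output values are directly interpretable.
cell = 463                                        # ~15 arcsec at the equator, in meters
dens = PointDensity("GasFlares_UTM.shp",
                    "NONE",                       # no population field
                    cell,
                    NbrRectangle(cell, cell, "MAP"),
                    "SQUARE_KILOMETERS")          # density per square kilometer
dens.save("Flare_Dens_UTM.tif")
```

With the data in meters and an explicit area unit, 18 points in a ~463 m cell works out to roughly 84 flares per square kilometer, which is a much more interpretable number.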
