How can I get a confusion matrix using specific columns in two different rasters in ArcGIS? I have one raster of risk classes (1, 2, 3, 4) and another raster of land cover (A, B, C, D). I am looking for a matrix showing the cell count for each land cover in each risk class.
[GIS] Confusion matrix in ArcGIS 10.3
arcgis-10.3, arcgis-desktop, confusion-matrix
Related Solutions
Given that you are trying to measure the accuracy of a supervised classification, you generally also have test samples, usually polygons. In SAGA, these polygons have to be converted into a grid with:
Module: Grid - Gridding -- Shapes to grid
After that, one can use that new grid in the zonal statistic tool found in the module:
Spatial and Geostatistics - Grids -- Zonal Grid Statistics
That leads to a table with (at least) three columns, one of which (count UCU) holds the number of cells/pixels per test class and classified class. This table can then be turned into the classical confusion matrix, from which Kappa etc. can be calculated.
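As a sketch of that last step, assuming the zonal-statistics table has already been pivoted into a square matrix of counts (rows = classified classes, columns = test classes), overall accuracy and Cohen's kappa can be computed with plain Python, with no ArcGIS or SAGA dependency:

```python
def accuracy_and_kappa(matrix):
    """Overall accuracy and Cohen's kappa from a square confusion
    matrix, given as a list of rows of cell counts."""
    n = sum(sum(row) for row in matrix)                    # total cells
    diag = sum(matrix[i][i] for i in range(len(matrix)))   # correctly classified
    overall = diag / n
    # Chance agreement: sum over classes of (row total * column total) / n^2
    expected = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / (n * n)
    kappa = (overall - expected) / (1 - expected)
    return overall, kappa

# Toy 2-class example: 90 of 100 cells on the diagonal
m = [[50, 5],
     [5, 40]]
overall, kappa = accuracy_and_kappa(m)
```

The matrix values here are made up for illustration; in practice they would come from the Zonal Grid Statistics table.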
So the tl;dr answer to your question is: No.
Long answer:
The 33..57 are your row sums; these are your model's results. Notice that your column sums do add up to 50 per class (except the last two, but I assume that you've made a transposition error somewhere; 49 and 51 are close enough).
This implies that, as you stated previously, you took a sample of 50 points of each of your classes at locations with known class identity. So you have 50 units of reference data for each class. You've compared this with your model's prediction for the same units of data. If your model were perfect, with 100% accuracy and precision, your row sums and column sums would all add up to 50, and only the major diagonal would be populated with values. But this is the real world, so your model is confusing some results.
Let's look at how well your model does at predicting class two. Your model predicted that 63 of 300 points were class two, so overall it overestimated the amount of class two you should be finding. However, it only found 28 of the 50 points it would have found if it were a perfect model. This implies that not only is it overestimating class two, it also lacks precision in finding class two.
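In accuracy-assessment terms, those two observations are the producer's accuracy (recall) and user's accuracy (precision) for class two. A minimal sketch, using the numbers quoted above (28 correct out of 63 predicted and 50 reference cells), assuming rows hold model predictions and columns hold the reference data:

```python
def class_accuracies(correct, predicted_total, reference_total):
    """Producer's accuracy (recall) and user's accuracy (precision)
    for a single class of a confusion matrix."""
    producers = correct / reference_total   # share of reference cells found
    users = correct / predicted_total       # share of predictions that are right
    return producers, users

# Class two from the answer: 63 cells predicted, 28 of them correct,
# 50 reference cells in the class.
p, u = class_accuracies(correct=28, predicted_total=63, reference_total=50)
```

With these numbers, producer's accuracy is 0.56 and user's accuracy is about 0.44, which quantifies both the overestimation and the lack of precision described above.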
In summary, your results look fine, you just need a bit of help with the interpretation, and you should probably figure out why you don't have the right number of reference points in classes 5 and 6. Other than that, this looks like exactly what you should expect from the kind of classification you are conducting.
Best Answer
I think the tool you want is not Compute Confusion Matrix, as it is mainly used for accuracy assessment, comparing a classified map with ground-truth data. The following is quoted from the tool's help:
The tool you need is Tabulate Area, which calculates the area of zones within another zone. In your case, use the land cover data as the input data and the risk class as the zone data; the tool will then calculate the area of each land cover within each risk zone. Finally:
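The cross-tabulation that Tabulate Area performs can be sketched in plain Python. Assuming both rasters are aligned and have been read into equally sized 2-D lists (the small `risk` and `cover` arrays below are hypothetical), counting cells per (risk class, land cover) pair looks like this:

```python
from collections import Counter

def crosstab(zone_raster, value_raster):
    """Count cells for each (zone, value) pair across two aligned rasters,
    mimicking Tabulate Area (counts instead of areas; multiply by the
    cell area to get areas)."""
    counts = Counter()
    for zone_row, value_row in zip(zone_raster, value_raster):
        for z, v in zip(zone_row, value_row):
            counts[(z, v)] += 1
    return counts

# Hypothetical 2x3 rasters: risk classes 1-4 and land cover A-D
risk = [[1, 1, 2],
        [2, 3, 3]]
cover = [["A", "B", "A"],
         ["A", "C", "C"]]
table = crosstab(risk, cover)
```

In ArcGIS itself you would run the Tabulate Area tool directly rather than hand-rolling this loop; the sketch just shows what the resulting table contains.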