I am trying to sample the Global Forest Change dataset at randomly drawn points using the reduceRegions function. I construct the call like this:
var sampledPoints = gfc2014_c.reduceRegions({
  collection: points,
  reducer: ee.Reducer.mean()
});
Here gfc2014_c is the GFC dataset cropped to sub-Saharan Africa, and points are my randomly drawn points. The entire code is here.
I was aiming for a large sample of 1 million pixels. When I try to export in chunks (even chunks as small as 5,000 points), I get the following error:
Error: image.reduceRegions: Computed value is too large.
What am I doing wrong? I know the GFC dataset is really large, but it must be possible, right?
Best Answer
Based on Justin's comment above and the section of the Earth Engine debugging guide he pointed me to, I tried the following:
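(The original code block seems to have been lost from this answer. A sketch of the kind of fix the debugging guide suggests for "Computed value is too large", assuming the same gfc2014_c image and points collection from the question: raise the tileScale argument of reduceRegions so each tile of the computation uses less memory, and export the result with Export.table.toDrive rather than printing it. The scale and tileScale values shown are illustrative, not necessarily what was actually used.)

```javascript
// Hedged sketch: same reduction as in the question, but with a higher
// tileScale (trades speed for a smaller memory footprint per tile)
// and an explicit scale matching the ~30 m GFC resolution.
var sampledPoints = gfc2014_c.reduceRegions({
  collection: points,
  reducer: ee.Reducer.mean(),
  scale: 30,       // native resolution of the Hansen GFC layers
  tileScale: 16    // split the work into smaller tiles to avoid memory errors
});

// Export the sampled values as a table instead of printing them,
// so the computation runs as a batch task with relaxed limits.
Export.table.toDrive({
  collection: sampledPoints,
  description: 'gfc_sample',   // illustrative task name
  fileFormat: 'CSV'
});
```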
This seems to have worked: I can now export a chunk of 100,000 observations in 5 hours. Thank you, Justin!