Hi,
Why does the following fail? I can't find any caveat about GPU support of min/max in the documentation, and the error message is really not helpful. Maybe this has been fixed in R2020a?
Is there a way to obtain the linear index of the min/max without fetching the array from the GPU?
>> A = rand(3, 3, 3, 'gpuArray');
>> min(A, [], 'all')

ans =

    0.0342

>> min(A, [], 'linear')
Error using gpuArray/min
Option must be 'all', 'linear', 'omitnan', or 'includenan'.

>> [minimum, index] = min(A, [], 'linear');
Error using gpuArray/min
Option must be 'all', 'linear', 'omitnan', or 'includenan'.
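For context: in the documented syntax, 'linear' is not valid on its own; it must be combined with 'all' (or a vecdim argument), as in min(A, [], 'all', 'linear'). Whether the gpuArray method accepts that combination may depend on the release. As a workaround that I believe stays on the GPU regardless (a sketch, not verified on every release), reshaping to a column vector and using the two-output form of min yields a linear index directly:

```matlab
A = rand(3, 3, 3, 'gpuArray');

% A(:) reshapes to a column vector; the result is still a gpuArray,
% so no data is fetched from the device.
[minimum, index] = min(A(:));   % index is the linear index into A

% If your release supports it, the documented combined form would be:
% [minimum, index] = min(A, [], 'all', 'linear');
```

The index returned is itself a gpuArray scalar; gather(index) brings only that single value back to the host, not the whole array.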
Thanks!