My goal is to calculate the minimum, maximum, and average values of a raster band within a polygon feature, using only PyQGIS and no external libraries:
from qgis.core import QgsPointXY

# Polygon geometry and its bounding box
polygon_geom = polygon.geometry()
extent = polygon_geom.boundingBox()

# Raster band 1, read over the polygon's bounding box
provider = rasterLayer.dataProvider()
raster_x_res = provider.xSize()
raster_y_res = provider.ySize()
pixelWidth = rasterLayer.rasterUnitsPerPixelX()
pixelHeight = rasterLayer.rasterUnitsPerPixelY()
block = provider.block(1, extent, raster_x_res, raster_y_res)
no_data_value = provider.sourceNoDataValue(1)
array = block.as_numpy()

# Collect the values of cells whose sample point falls inside the polygon
validData = []
for row in range(block.height()):
    for col in range(block.width()):
        x = extent.xMinimum() + col * pixelWidth + pixelWidth / 2
        y = extent.yMaximum() - row * pixelHeight + pixelHeight / 2
        point = QgsPointXY(x, y)
        if polygon_geom.contains(point):
            value = array[row, col]
            if value != no_data_value:
                validData.append(value)

print(min(validData))
print(max(validData))
print(sum(validData) / len(validData))
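Incidentally, the numpy conversion isn't strictly required for the stated goal: QgsRasterBlock can be read cell by cell with value() and isNoData(). Here is a minimal sketch of that variant, reusing the objects defined above and keeping the same sample-point arithmetic:

# Same loop, but reading straight from the QgsRasterBlock instead of
# converting it to a numpy array first.
validData = []
for row in range(block.height()):
    for col in range(block.width()):
        x = extent.xMinimum() + col * pixelWidth + pixelWidth / 2
        y = extent.yMaximum() - row * pixelHeight + pixelHeight / 2
        if polygon_geom.contains(QgsPointXY(x, y)):
            # isNoData() avoids comparing floats to the nodata value directly
            if not block.isNoData(row, col):
                validData.append(block.value(row, col))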
However, len(validData) differs by 1 from the QgsZonalStatistics Count statistic, and the average is also slightly off. The minimum, maximum, average, and count reported by the script and by QgsZonalStatistics are shown in the image below:
[screenshot: min, max, mean and count from the script vs. QgsZonalStatistics]
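For reference, the QgsZonalStatistics values in the screenshot can be produced along these lines (a minimal sketch, not my exact call; polygonLayer, the 'zs_' attribute prefix, and band 1 are assumptions):

from qgis.analysis import QgsZonalStatistics

# Writes zs_count / zs_min / zs_max / zs_mean attributes onto the
# polygon layer, computed from band 1 of rasterLayer.
stats = (QgsZonalStatistics.Count | QgsZonalStatistics.Min
         | QgsZonalStatistics.Max | QgsZonalStatistics.Mean)
zonal = QgsZonalStatistics(polygonLayer, rasterLayer, 'zs_', 1, stats)
zonal.calculateStatistics(None)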
Does anyone have any idea what the problem might be?