Looking for Python package for numerical integration over a tessellated domain

This integrates over the triangles directly, not the Voronoi regions, but the result should be close. (Run it with different numbers of points to see how the estimate varies.) It also works in 2d, 3d ...

#!/usr/bin/env python
from __future__ import division
import numpy as np

__date__ = "2011-06-15 jun denis"

#...............................................................................
def sumtriangles( xy, z, triangles ):
    """ integrate scattered data, given a triangulation
    zsum, areasum = sumtriangles( xy, z, triangles )
    In:
        xy: npt, dim data points in 2d, 3d ...
        z: npt data values at the points, scalars or vectors
        triangles: ntri, dim+1 indices of triangles or simplexes, as from
http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.Delaunay.html
    Out:
        zsum: sum over all triangles of (area * z at midpoint).
            Thus z at a point where 5 triangles meet
            enters the sum 5 times, each weighted by that triangle's area / 3.
        areasum: the area or volume of the convex hull of the data points.
            For points over the unit square, zsum outside the hull is 0,
            so zsum / areasum would compensate for that.
            Or, make sure that the corners of the square or cube are in xy.
    """
    # z concave or convex => under or overestimates
    npt, dim = xy.shape
    ntri, dim1 = triangles.shape
    assert npt == len(z), "shape mismatch: xy %s z %s" % (xy.shape, z.shape)
    assert dim1 == dim+1, "triangles ? %s" % triangles.shape
    zsum = np.zeros( z[0].shape )
    areasum = 0
    dimfac = np.prod( np.arange( 1, dim+1 ))
    for tri in triangles:
        corners = xy[tri]
        t = corners[1:] - corners[0]
        if dim == 2:
            area = abs( t[0,0] * t[1,1] - t[0,1] * t[1,0] ) / 2
        else:
            area = abs( np.linalg.det( t )) / dimfac  # v slow
        zsum += area * z[tri].mean(axis=0)
        areasum += area
    return (zsum, areasum)

#...............................................................................
if __name__ == "__main__":
    import sys
    from time import time
    from scipy.spatial import Delaunay

    npt = 500
    dim = 2
    seed = 1

    exec( "\n".join( sys.argv[1:] ))  # run this.py npt= dim= ...
    np.set_printoptions( 2, threshold=100, edgeitems=5, suppress=True )
    np.random.seed(seed)

    points = np.random.uniform( size=(npt,dim) )
    z = points  # vec; zsum should be ~ constant
    # z = points[:,0]
    t0 = time()
    tessellation = Delaunay( points )
    t1 = time()
    triangles = tessellation.simplices  # ntri, dim+1  (called .vertices in old scipy)
    zsum, areasum = sumtriangles( points, z, triangles )
    t2 = time()

    print "%s: %.0f msec Delaunay, %.0f msec sum %d triangles:  zsum %s  areasum %.3g" % (
        points.shape, (t1 - t0) * 1000, (t2 - t1) * 1000,
        len(triangles), zsum, areasum )
# mac ppc, numpy 1.5.1 15jun:
# (500, 2): 25 msec Delaunay, 279 msec sum 983 triangles:  zsum [ 0.48  0.48]  areasum 0.969
# (500, 3): 111 msec Delaunay, 3135 msec sum 3046 triangles:  zsum [ 0.45  0.45  0.44]  areasum 0.892
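
The exec line lets you override the defaults from the command line by passing Python assignments as arguments, e.g. (assuming the script is saved as sumtriangles.py, a name I've made up):

python sumtriangles.py "npt=2000" "dim=3" "seed=2"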

Numerical integration is typically defined over simple elements like triangles or rectangles. That said, you can decompose any polygon into a series of triangles. With any luck, your polygonal mesh has a triangular dual, which would make the integration much easier.
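
As a minimal sketch of that idea (integrate_convex_polygon is a name made up for illustration, not from any package): fan a convex polygon into triangles from its first vertex and apply the same one-point centroid rule used in sumtriangles above.

import numpy as np

def integrate_convex_polygon(f, vertices):
    """Integrate f over a convex polygon given as an (n, 2) array of corners in order."""
    vertices = np.asarray(vertices, dtype=float)
    total = 0.0
    for i in range(1, len(vertices) - 1):
        a, b, c = vertices[0], vertices[i], vertices[i + 1]  # triangle fan from vertex 0
        # triangle area = half the absolute 2d cross product of the two edge vectors
        area = abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])) / 2.0
        total += area * f((a + b + c) / 3.0)  # centroid rule, exact for linear integrands
    return total

# f(x, y) = x over the unit square; the exact value is 0.5
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(integrate_convex_polygon(lambda p: p[0], square))  # 0.5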

quadpy (a project of mine) does numerical integration over many domains, e.g., triangles:

import numpy
import quadpy


sol, error_estimate = quadpy.t2.integrate_adaptive(
    lambda x: numpy.exp(x[0]),
    numpy.array(  # corner coordinates, shape (3, num_triangles, 2): five copies of the unit triangle
        [
            [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0], [0.0, 0.0], [0.0, 0.0]],
            [[1.0, 0.0], [1.0, 0.0], [1.0, 0.0], [1.0, 0.0], [1.0, 0.0]],
            [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
        ]
    ),
    1.0e-10,
)

print(sol)
3.5914091422921017

You can also integrate non-adaptively by picking one of the hundreds of quadrature schemes quadpy provides for the triangle.
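
Something along these lines should work for the non-adaptive case, assuming a quadpy version that provides the t2 namespace used above; get_good_scheme and the degree 10 here are my choices for illustration, so check the quadpy README for the exact names.

import numpy
import quadpy

# pick a scheme that is exact for polynomials up to the requested degree
scheme = quadpy.t2.get_good_scheme(10)
triangle = numpy.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
val = scheme.integrate(lambda x: numpy.exp(x[0]), triangle)
print(val)  # ~0.718281828, i.e. e - 2 for the single unit triangle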