Maximum Weight / Minimum Cost Bipartite Matching Code in Python

When you say "minimum cost matching", I assume you mean the problem of finding the matching with the lowest cost among all maximum-cardinality matchings.

As of version 2.4 (released 2019-10-17), NetworkX does handle the bipartite case specifically with nx.algorithms.bipartite.minimum_weight_full_matching.
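A minimal sketch of how that call looks (the node labels and edge weights here are made up for illustration):

```python
import networkx as nx
from networkx.algorithms.bipartite import minimum_weight_full_matching

# Toy bipartite graph: left nodes "a", "b"; right nodes 1, 2.
G = nx.Graph()
G.add_edge("a", 1, weight=2)
G.add_edge("a", 2, weight=1)
G.add_edge("b", 1, weight=1)
G.add_edge("b", 2, weight=3)

# Returns a dict mapping each matched node to its partner.
# The cheapest full matching here is {a-2, b-1} with total weight 2,
# so the result is {'a': 2, 'b': 1, 2: 'a', 1: 'b'}.
matching = minimum_weight_full_matching(G, top_nodes=["a", "b"], weight="weight")
print(matching)
```

Note that the function raises a ValueError if the graph admits no full matching.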

Under the hood, it relies on scipy.optimize.linear_sum_assignment. As of version 1.6.0, SciPy also ships with scipy.sparse.csgraph.min_weight_full_bipartite_matching, which does the same thing for sparse inputs and can offer noticeable speedups on sparse graphs.
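For the sparse variant, the input is a biadjacency matrix rather than a graph object; a rough sketch (the weights are arbitrary):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import min_weight_full_bipartite_matching

# Biadjacency matrix: rows are one vertex set, columns the other.
# Explicitly stored entries are edge weights; absent entries mean "no edge",
# so zero-weight edges need special care in this representation.
biadjacency = csr_matrix(np.array([[2.0, 1.0],
                                   [1.0, 3.0]]))

# Returns arrays of matched row and column indices, like linear_sum_assignment.
row_ind, col_ind = min_weight_full_bipartite_matching(biadjacency)
print(row_ind, col_ind)                     # [0 1] [1 0]
print(biadjacency[row_ind, col_ind].sum())  # total weight: 1.0 + 1.0 = 2.0
```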


After some further investigation, I've found the following two modules particularly helpful (http://pypi.python.org/pypi/pyLAPJV/0.3 and http://pypi.python.org/pypi/hungarian). Both are C++ implementations with Python bindings, and both run much faster than the NetworkX matching implementation. The pyLAPJV implementation, however, seems a bit too fickle for my needs and doesn't handle identically weighted edges well. The hungarian module (though supposedly slower than the pyLAPJV algorithm) runs about three orders of magnitude faster than the NetworkX implementation on the data sizes I'm currently dealing with. I'm also going to give another look at the code suggested by kunigami, as I believe it can be run through Shedskin fairly easily to give a reasonably fast implementation.


Have you tried the SciPy implementation of the Hungarian algorithm, also known as the Munkres or Kuhn-Munkres algorithm?

scipy.optimize.linear_sum_assignment
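A small sketch of how it's typically called (the cost matrix below is made up):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j] is the cost of assigning row i to column j.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])

row_ind, col_ind = linear_sum_assignment(cost)
print(col_ind)                       # [1 0 2]
print(cost[row_ind, col_ind].sum())  # minimum total cost: 1 + 2 + 2 = 5

# For maximum-weight assignment, pass maximize=True (available since SciPy 1.4).
row_ind, col_ind = linear_sum_assignment(cost, maximize=True)
```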