spaCy - Save custom pipeline
When you save out your model, spaCy will serialise all data and store a reference to your pipeline in the model's meta.json – for example: ["ner", "countries"]. When you load your model back in, spaCy will check the meta and initialise each pipeline component by looking it up in the so-called "factories": functions that tell spaCy how to construct a pipeline component. (The reason for this is that you usually don't want your model to store and eval arbitrary code when you load it back in – at least not by default.)
In your case, spaCy is trying to look up the component name 'countries' in the factories and fails, because it's not built-in. Language.factories is a simple dictionary, though, so you can customise it and add your own entries:
from spacy.language import Language

# RESTCountriesComponent is your custom component class, defined elsewhere
Language.factories['countries'] = lambda nlp, **cfg: RESTCountriesComponent(nlp, **cfg)
A factory is a function that receives the shared nlp object and optional keyword arguments (config parameters). It then initialises the component and returns it. If you add the above code before you load your model, it should load as expected.
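
For reference, here's a minimal sketch of what a component created by such a factory might look like. The class name matches the example above, but the body is an illustrative placeholder, not the real REST Countries implementation:

class RESTCountriesComponent(object):
    # A pipeline component is any callable that takes a Doc, modifies it
    # in place and returns it
    name = 'countries'

    def __init__(self, nlp, label='GPE'):
        self.label = label  # example config parameter, passed in via **cfg

    def __call__(self, doc):
        # look up country names and set annotations on the doc here
        return doc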
More advanced approaches
If you want this taken care of automatically, you could also ship your component with your model. This requires wrapping it as a Python package using the spacy package command, which creates all required Python files. By default, the __init__.py only includes a function to load your model – but you can also add custom functions to it, or use it to add entries to spaCy's factories.
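
For example, the generated __init__.py could be extended like this. The module and component names are assumptions for the sake of the sketch; load_model_from_init_py is what the generated file already uses:

# __init__.py of the packaged model (module/component names are illustrative)
from spacy.language import Language
from spacy.util import load_model_from_init_py

from .countries import RESTCountriesComponent  # the module shipping your component

# Register the factory so spaCy can resolve 'countries' from meta.json
Language.factories['countries'] = lambda nlp, **cfg: RESTCountriesComponent(nlp, **cfg)

def load(**overrides):
    # Generated by `spacy package`: loads the model data shipped in this package
    return load_model_from_init_py(__file__, **overrides)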
As of v2.1.0 (currently available as a nightly version for testing), spaCy will also support providing pipeline component factories via Python entry points. This is especially useful for production setups, and/or if you want to modularise your individual components and split them into their own packages. For example, you could create a Python package for your countries component and its factory, upload it to PyPI, version it and test it separately. In its setup.py, your package can define the spaCy factories it exposes and where to find them. spaCy will be able to detect them automatically – all you need to do is install the package in the same environment. Your model package could even require your component package as a dependency, so it's installed automatically when you install your model.
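
A sketch of what that setup.py could look like – the package and function names here are made up, but 'spacy_factories' is the entry point group spaCy scans for:

from setuptools import setup

setup(
    name='countries_component',  # hypothetical standalone component package
    packages=['countries_component'],
    entry_points={
        # 'component_name = module:factory_function'
        'spacy_factories': [
            'countries = countries_component:create_countries_component'
        ]
    }
)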
This same issue came up for me and these are the steps I used:
- 1) Save the pipeline after running the notebook containing all the different nlp pipeline components, e.g. nlp.to_disk('pipeline_model_name')
- 2) Build a package from the saved pipeline with spaCy: run the spacy package command, then python setup.py sdist in the generated package directory.
- 3) Pip install the created package.
- 4) Put your custom components in the __init__.py file of the package, as instructed above.
- 5) Load the pipeline with:

import spacy_package  # the name of your installed package
nlp = spacy_package.load()
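
As a quick sanity check (assuming the package installed cleanly and spacy_package stands in for its actual name), you can verify that the custom component made it into the loaded pipeline:

import spacy_package

nlp = spacy_package.load()
print(nlp.pipe_names)  # the custom 'countries' component should appear in this list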