How to pickle and unpickle to portable string in Python 3
pickle.dumps() produces a bytes object. Expecting these arbitrary bytes to be valid UTF-8 text (the assumption you are making by trying to decode it to a string from UTF-8) is pretty optimistic. It'd be a coincidence if it worked!

One solution is to use the older pickling protocol, protocol 0, which uses entirely ASCII characters. The result still comes out as bytes, but since those bytes contain only ASCII code points, it can be converted to a string without stress:
pickled = pickle.dumps(obj, 0).decode()
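To get the object back, encode the string to bytes again and hand it to pickle.loads(), the reverse of the line above (a minimal sketch; obj stands for whatever you pickled):

unpickled = pickle.loads(pickled.encode())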
You could also use some other encoding method to encode a binary-pickled object to text, such as base64:
import codecs
pickled = codecs.encode(pickle.dumps(obj), "base64").decode()
Decoding would then be:
unpickled = pickle.loads(codecs.decode(pickled.encode(), "base64"))
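If the codecs detour feels indirect, the base64 module does the same job; here's a sketch under the same assumptions (obj is whatever picklable object you have):

import base64
import pickle

pickled = base64.b64encode(pickle.dumps(obj)).decode("ascii")
unpickled = pickle.loads(base64.b64decode(pickled))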
Using pickle with protocol 0 seems to result in shorter strings than base64-encoding binary pickles (and abarnert's suggestion of hex-encoding is going to be even larger than base64), but I haven't tested it rigorously or anything. Test it with your data and see.
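A quick-and-dirty way to check for yourself might look like this (a rough sketch; sample is just an arbitrary placeholder object, and the relative sizes will depend on your real data):

import codecs
import pickle

sample = {"key": list(range(100))}   # stand-in for your real data
p0 = pickle.dumps(sample, 0).decode()
b64 = codecs.encode(pickle.dumps(sample), "base64").decode()
hexed = codecs.encode(pickle.dumps(sample), "hex").decode()
print(len(p0), len(b64), len(hexed))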
If you want to store bytes in the environment, instead of encoded text, that's what environb is for.
This doesn't work on Windows. (As the docs imply, you should check os.supports_bytes_environ if you're on 3.2+ instead of just assuming that Unix does and Windows doesn't…) So for that, you'll need to smuggle the bytes into something that can be encoded no matter what your system encoding is, e.g., using backslash-escape, or even hex. So, for example:
import codecs
import os

if os.supports_bytes_environ:
    os.environb[b'pickled'] = pickled        # bytes-valued environment, available on Unix
else:
    os.environ['pickled'] = codecs.encode(pickled, 'hex').decode('ascii')
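Reading it back follows the same branch; here's a sketch assuming the same 'pickled' key and the hex encoding used above:

import codecs
import os
import pickle

if os.supports_bytes_environ:
    raw = os.environb[b'pickled']
else:
    raw = codecs.decode(os.environ['pickled'].encode('ascii'), 'hex')
obj = pickle.loads(raw)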
I think the simplest answer, especially if you don't care about Windows, is to just store the bytes in the environment, as suggested in my other answer.
But if you want something clean and debuggable, you might be happier using something designed as a text-based format.
pickle does have a "plain text" protocol 0, as explained in kindall's answer. It's certainly more readable than protocol 3 or 4, but it's still not something I'd actually want to read.

JSON is much nicer, but it can't handle datetime out of the box. You can come up with your own encoding (the stdlib's json module is extensible) for the handful of types you need to encode, or use something like jsonpickle. It's generally safer, more efficient, and more readable to come up with custom encodings for each type you care about than a general "pack arbitrary types in a Turing-complete protocol" scheme like pickle or jsonpickle, but of course it's also more work, especially if you have a lot of extra types.
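For example, a custom datetime encoding built on the json module's default= and object_hook= hooks could look roughly like this (a sketch; the "__datetime__" marker key and the function names are just conventions I'm inventing for illustration):

import json
from datetime import datetime

def encode_datetime(o):
    # json.dumps calls this only for objects it can't serialize itself
    if isinstance(o, datetime):
        return {"__datetime__": o.isoformat()}
    raise TypeError(f"{type(o).__name__} is not JSON serializable")

def decode_datetime(d):
    # json.loads calls this for every decoded JSON object
    if "__datetime__" in d:
        return datetime.fromisoformat(d["__datetime__"])  # Python 3.7+
    return d

text = json.dumps({"when": datetime(2015, 1, 1, 12, 30)}, default=encode_datetime)
data = json.loads(text, object_hook=decode_datetime)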
JSON Schema lets you define languages in JSON, similar to what you'd do in XML. It comes with a built-in date-time format for strings, and the jsonschema library for Python knows how to use it.
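For instance, declaring and checking a date-time field with jsonschema looks roughly like this (a sketch; depending on the jsonschema version, actually enforcing the format check may need an extra package such as rfc3339-validator installed):

import jsonschema

schema = {
    "type": "object",
    "properties": {"when": {"type": "string", "format": "date-time"}},
}
jsonschema.validate(
    {"when": "2015-01-01T12:30:00Z"},
    schema,
    format_checker=jsonschema.FormatChecker(),
)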
YAML has a standard extension repository that includes many types JSON doesn't, including a timestamp. Most of the zillion yaml modules for Python already know how to encode datetime objects to and from this type. If you need additional types beyond what YAML includes, it was designed to be extensible declaratively. And there are libraries that do the equivalent of jsonpickle, defining new types on the fly, if you really need that.
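With PyYAML, for example, a datetime round-trips with no extra code (a minimal sketch):

import yaml
from datetime import datetime

text = yaml.safe_dump({"when": datetime(2015, 1, 1, 12, 30)})
data = yaml.safe_load(text)
assert isinstance(data["when"], datetime)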
And finally, you can always write an XML language.