Convert list of single key dictionaries into a single dictionary

You can use reduce:

reduce(lambda r, d: r.update(d) or r, lst, {})

Demo:

>>> lst = [
...     {'1': 'A'},
...     {'2': 'B'},
...     {'3': 'C'}
... ]
>>> reduce(lambda r, d: r.update(d) or r, lst, {})
{'1': 'A', '3': 'C', '2': 'B'}

or you could chain the items calls (Python 2):

from itertools import chain, imap
from operator import methodcaller

dict(chain.from_iterable(imap(methodcaller('iteritems'), lst)))

Python 3 version:

from itertools import chain
from operator import methodcaller

dict(chain.from_iterable(map(methodcaller('items'), lst)))

Demo:

>>> from itertools import chain, imap
>>> from operator import methodcaller
>>> 
>>> dict(chain.from_iterable(imap(methodcaller('iteritems'), lst)))
{'1': 'A', '3': 'C', '2': 'B'}
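
If you'd rather not use methodcaller, the same chaining can be written with a generator expression; a sketch that behaves the same for this input (shown with the Python 3 items()):

from itertools import chain

dict(chain.from_iterable(d.items() for d in lst))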

Or use a dict comprehension (Python 2):

{k: v for d in lst for k, v in d.iteritems()}

Demo:

>>> {k: v for d in lst for k, v in d.iteritems()}
{'1': 'A', '3': 'C', '2': 'B'}
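
In Python 3, where dictionaries have no iteritems() method, the equivalent comprehension uses items():

{k: v for d in lst for k, v in d.items()}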

Of the three, for the simple 3-dictionary input, the dict comprehension is fastest:

>>> import timeit
>>> def d_reduce(lst):
...     reduce(lambda r, d: r.update(d) or r, lst, {})
... 
>>> def d_chain(lst):
...     dict(chain.from_iterable(imap(methodcaller('iteritems'), lst)))
... 
>>> def d_comp(lst):
...     {k: v for d in lst for k, v in d.iteritems()}
... 
>>> timeit.timeit('f(lst)', 'from __main__ import lst, d_reduce as f')
2.4552760124206543
>>> timeit.timeit('f(lst)', 'from __main__ import lst, d_chain as f')
3.9764280319213867
>>> timeit.timeit('f(lst)', 'from __main__ import lst, d_comp as f')
1.8335261344909668

When you increase the number of items in the input list to 1000, the chain method catches up:

>>> import string, random
>>> lst = [{random.choice(string.printable): random.randrange(100)} for _ in range(1000)]
>>> timeit.timeit('f(lst)', 'from __main__ import lst, d_reduce as f', number=10000)
5.420135974884033
>>> timeit.timeit('f(lst)', 'from __main__ import lst, d_chain as f', number=10000)
3.464245080947876
>>> timeit.timeit('f(lst)', 'from __main__ import lst, d_comp as f', number=10000)
3.877490997314453

Increasing the input list further doesn't change the picture much from here on out; the chain() approach stays a small percentage faster but never pulls decisively ahead.
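
If you want to re-run these timings yourself, here is a rough standalone version of the benchmark (Python 3; the absolute numbers will of course vary with machine and interpreter):

import random
import string
import timeit
from functools import reduce
from itertools import chain
from operator import methodcaller

# 1000 single-key dictionaries; keys are drawn from string.printable, so many repeat
lst = [{random.choice(string.printable): random.randrange(100)} for _ in range(1000)]

def d_reduce(lst):
    return reduce(lambda r, d: r.update(d) or r, lst, {})

def d_chain(lst):
    return dict(chain.from_iterable(map(methodcaller('items'), lst)))

def d_comp(lst):
    return {k: v for d in lst for k, v in d.items()}

for f in (d_reduce, d_chain, d_comp):
    print(f.__name__, timeit.timeit(lambda: f(lst), number=10000))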


You can use a dictionary comprehension:

>>> lst = [
...     {'1': 'A'},
...     {'2': 'B'},
...     {'3': 'C'}
... ]
>>> {k: v for x in lst for k, v in x.items()}
{'2': 'B', '3': 'C', '1': 'A'}
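
On newer Python versions (3.9+), dictionaries also support the | merge operator, so another option (a sketch, not included in the timings above) is:

from functools import reduce
from operator import or_

reduce(or_, lst, {})

Note that this builds a new dictionary at each step, so for long lists the update()-based reduce or the comprehension will be cheaper.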