How to instantiate a table object to bulk_insert rows using alembic / SQLAlchemy
If you have your tables as models in code, you can also use the __table__
attribute:
from src.models.services import Service

op.bulk_insert(
    Service.__table__,
    [
        {
            'id': 88,
            'name': 'Test 1',
            'is_active': True,
            'include_in_broker_fee': True,
            'is_domestic': True,
            'is_international': True,
        },
        {
            'id': 89,
            'name': 'Test 2',
            'is_active': True,
            'include_in_broker_fee': True,
            'is_domestic': True,
            'is_international': True,
        },
    ],
)
In order to update the table as you've shown above, you'll need to define it so SQLAlchemy knows what to update. Doing this with SQLAlchemy's MetaData() object is pretty straightforward; in fact, you almost have it. Try something like this:
from sqlalchemy import Table, MetaData

# note: MetaData(bind=...) works in SQLAlchemy 1.x; the bind argument was removed in 2.0
meta = MetaData(bind=op.get_bind())
services = Table('services', meta)
Now that the table is defined, you can leverage SQLAlchemy's bulk update methods. For those, see the examples of bulk_insert_mappings() and bulk_save_objects() in the performance FAQ: http://docs.sqlalchemy.org/en/latest/faq/performance.html
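To make the bulk_insert_mappings() idea concrete, here is a minimal runnable sketch. Assumptions: a throwaway in-memory SQLite engine stands in for Alembic's op.get_bind() (inside a real migration you would bind the Session to that connection instead), and the Service model is reduced to two columns for brevity.

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Service(Base):
    __tablename__ = 'services'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

# stand-in for op.get_bind(); in a migration you would reuse that connection
engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

with Session(engine) as session:
    # insert many rows from plain dicts, without building ORM objects
    session.bulk_insert_mappings(
        Service,
        [{'id': 88, 'name': 'Test 1'}, {'id': 89, 'name': 'Test 2'}],
    )
    session.commit()
```

bulk_insert_mappings() skips ORM object construction entirely, which is why the FAQ recommends it for large batches.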
In case somebody stumbles upon this like I did: currently, for this to work you need to reflect the specific tables into your MetaData() object. The underlying database was MySQL.
Working code:
from alembic import op
from sqlalchemy import Table, MetaData

def upgrade():
    # get metadata from current connection
    meta = MetaData(bind=op.get_bind())

    # pass in tuple with tables we want to reflect, otherwise whole database will get reflected
    meta.reflect(only=('some_table',))

    # define table representation
    some_table_tbl = Table('some_table', meta)

    # insert records
    op.bulk_insert(
        some_table_tbl,
        [
            {
                # data here...
            },  # (...)
        ],
    )
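An equivalent way to reflect a single table is Table(..., autoload_with=...), which loads the column definitions directly from the database. A minimal sketch, assuming an in-memory SQLite engine in place of the real migration connection and a hypothetical some_table:

```python
from sqlalchemy import MetaData, Table, create_engine, text

# stand-in for the migration connection (assumption: SQLite instead of MySQL)
engine = create_engine('sqlite://')
with engine.begin() as conn:
    conn.execute(text('CREATE TABLE some_table (id INTEGER PRIMARY KEY, name TEXT)'))

meta = MetaData()
# reflect just this one table; columns are read from the database
some_table_tbl = Table('some_table', meta, autoload_with=engine)

# the reflected table now carries its schema, so inserts work
with engine.begin() as conn:
    conn.execute(some_table_tbl.insert(), [{'id': 1, 'name': 'reflected'}])
```

Either approach works; meta.reflect(only=(...,)) is convenient when you need several tables at once, while autoload_with keeps the reflection scoped to one Table object.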