How to escape single quotes in Unload
You can also use Postgres-style dollar quoting:
unload
($$
select * from table where id='ABC'
$$)
to 's3://bucket/queries_results/20150324/table_dump/'
credentials 'aws_access_key_id=;aws_secret_access_key='
;
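If you assemble such dollar-quoted statements from Python, a small helper can pick a delimiter that does not collide with the query body. This is only a sketch; the function name and tagging scheme are my own:

```python
def dollar_quote(query, tag="q"):
    # Wrap a SQL string in PostgreSQL dollar quoting so embedded
    # single quotes need no escaping. If the query itself happens
    # to contain the delimiter, extend the tag until it is unique.
    delim = "$%s$" % tag
    while delim in query:
        tag += "x"
        delim = "$%s$" % tag
    return "%s%s%s" % (delim, query, delim)

print(dollar_quote("select * from table where id='ABC'"))
# $q$select * from table where id='ABC'$q$
```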
You would want to use two single quotes to enclose the value.
If your query contains quotes (for example, to enclose literal values), put the literal between two sets of single quotation marks; you must also enclose the query between single quotation marks:
Example:
UNLOAD ('select * from venue where venuestate=''NV''')
Taken from the Redshift documentation: https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html
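The doubling rule is mechanical, so when building the UNLOAD string from client code you can apply it programmatically. A minimal Python sketch (the function name is my own) that reproduces the example above:

```python
def quote_for_unload(query):
    # Escape a SELECT statement for embedding in UNLOAD ('...'):
    # double every single quote inside the query, then wrap the
    # whole thing in single quotes.
    return "'" + query.replace("'", "''") + "'"

print(quote_for_unload("select * from venue where venuestate='NV'"))
# 'select * from venue where venuestate=''NV'''
```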
As Sarang says, simply replacing the single quotes with double quotes around the col4 and col5 values of your query should do the trick.
However, I would suggest breaking your string into smaller chunks that are easier to read and maintain. That way, you can pass the parameters to execute,
as chepner suggests (and as the MySQL documentation describes):
import psycopg2

# Create the inner SQL statement. Note the single quotes around the
# whole Python string and the double quotes around the col4 and col5 values
sql_stmt = ('SELECT col1, col2 '
            'FROM %s.visitation_hourly_summary_us '
            'WHERE col4 = "2018-07-10" AND col5 = "1";' % schema_name)

# Format the S3 target path (s3_bucket_name and table are defined elsewhere)
s3_target = 's3://%s/%s/%s.csv' % (s3_bucket_name, schema_name, table)

# Format the credentials string
s3_credentials = 'aws_access_key_id=%s;aws_secret_access_key=%s' % (
    aws_access_key_id, aws_secret_access_key)

# Bundle the preformatted strings as query parameters
data = (sql_stmt, s3_target, s3_credentials)

# UNLOAD skeleton. Leave the %s placeholders unquoted: psycopg2 quotes
# and escapes each parameter itself, so writing '%s' would double-quote
# the values. Commit through the connection rather than with inline SQL.
s3_stmt = ("UNLOAD (%s) TO %s "
           "CREDENTIALS %s "
           "MANIFEST GZIP ALLOWOVERWRITE;")

con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(s3_stmt, data)
con.commit()
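To sanity-check the quoting without a live Redshift connection, you can roughly imitate psycopg2's string adaptation locally: each parameter gets its embedded single quotes doubled and is then wrapped in single quotes. This is only a sketch of the idea, not psycopg2's actual adapter, and the skeleton and placeholder values below are illustrative:

```python
def preview(skeleton, params):
    # Rough local preview of the statement psycopg2 would send:
    # double embedded single quotes, then wrap each value in quotes.
    # (psycopg2's real adaptation is more thorough; sketch only.)
    quoted = tuple("'" + p.replace("'", "''") + "'" for p in params)
    return skeleton % quoted

skeleton = "UNLOAD (%s) TO %s CREDENTIALS %s MANIFEST GZIP ALLOWOVERWRITE;"
params = ("select * from venue where venuestate='NV'",
          "s3://bucket/path/part_",
          "aws_access_key_id=KEY;aws_secret_access_key=SECRET")
print(preview(skeleton, params))
```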