Is the time-complexity of iterative string append actually O(n^2), or O(n)?
In CPython, the standard implementation of Python, there's an implementation detail that makes this usually O(n). It lives in the code the bytecode evaluation loop calls for `+` or `+=` with two string operands: if Python detects that the left argument has no other references, it calls `realloc` to attempt to avoid a copy by resizing the string in place. This is not something you should ever rely on, because it's an implementation detail and because if `realloc` ends up needing to move the string frequently, performance degrades to O(n^2) anyway.
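You can watch the optimization appear and disappear with a small benchmark. This is a minimal sketch (the function names and timings are mine, and the exact behavior depends on the CPython version): keeping a second reference to the string defeats the reference-count check, forcing a full copy on every `+=`:

```python
import timeit

def concat_unique(n):
    # s is the only reference to the string, so CPython can
    # often resize it in place via realloc.
    s = ''
    for _ in range(n):
        s += 'x'
    return s

def concat_aliased(n):
    # alias keeps a second reference alive, so every += must
    # allocate and copy a brand-new string: O(n^2) overall.
    s = ''
    for _ in range(n):
        alias = s
        s += 'x'
    return s

for n in (10_000, 100_000):
    t_fast = timeit.timeit(lambda: concat_unique(n), number=1)
    t_slow = timeit.timeit(lambda: concat_aliased(n), number=1)
    print(f'n={n}: unique={t_fast:.4f}s aliased={t_slow:.4f}s')
```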
Without the weird implementation detail, the algorithm is O(n^2) due to the quadratic amount of copying involved. Code like this would only make sense in a language with mutable strings, like C++, and even in C++ you'd want to use `+=`.
The author relies on an optimization that happens to be there but is not explicitly dependable. `strA = strB + strC` is typically O(n), making the function O(n^2). However, it is pretty easy to make sure the whole process is O(n): use an array:
```python
def replace_spaces(s):
    # Hypothetical wrapper: the original snippet shows only the loop body.
    output = []
    for char in s:
        if char == ' ':
            output.append('%20')   # escape the space
        else:
            output.append(char)    # keep every other character as-is
    return ''.join(output)
```
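Used like so (assuming the `replace_spaces` wrapper above):

```python
print(replace_spaces('Mr John Smith'))  # -> Mr%20John%20Smith
```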
In a nutshell, the `append` operation is amortized O(1) (although you can make it worst-case O(1) by pre-allocating the array to the right size, as sketched below), making the loop O(n).
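Here is a minimal sketch of that pre-allocated variant (the function name is mine), assuming the only edit is expanding each space to `'%20'`, so the final length is known up front:

```python
def replace_spaces_prealloc(s):
    # Every space grows by two characters, so the exact output
    # size can be computed before the loop.
    out = [''] * (len(s) + 2 * s.count(' '))
    i = 0
    for char in s:
        if char == ' ':
            out[i], out[i + 1], out[i + 2] = '%', '2', '0'
            i += 3
        else:
            out[i] = char
            i += 1
    return ''.join(out)
```

Each write is a plain index assignment into a list that never resizes, so no amortization is involved.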
And then the `join` is also O(n), but that's okay because it is outside the loop.
I found this snippet of text on Python Speed > Use the best algorithms and fastest tools:
> String concatenation is best done with `''.join(seq)`, which is an O(n) process. In contrast, using the `'+'` or `'+='` operators can result in an O(n^2) process because new strings may be built for each intermediate step. The CPython 2.4 interpreter mitigates this issue somewhat; however, `''.join(seq)` remains the best practice.
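Tying that advice back to the example above, the whole transformation can be written as a single `join` over a generator (a sketch; `replace_spaces_join` is my own name):

```python
def replace_spaces_join(s):
    # One pass over the input, one final allocation by join.
    return ''.join('%20' if char == ' ' else char for char in s)
```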