What is the difference between Θ(n) and O(n)?

Short explanation:

If an algorithm is Θ(g(n)), it means that the running time of the algorithm as n (the input size) gets larger is proportional to g(n).

If an algorithm is O(g(n)), it means that the running time of the algorithm as n gets larger is at most proportional to g(n).

Normally, even when people talk about O(g(n)), they actually mean Θ(g(n)), but technically there is a difference.
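
For a concrete illustration, consider linear search. The sketch below is a minimal Python example (the function and inputs are mine, chosen only to illustrate): its worst-case running time is Θ(n), but on a lucky input it returns after a single comparison, so "O(n)" alone only promises it is never worse than linear.

    def linear_search(items, target):
        # Scan left to right; stop as soon as the target is found.
        for i, value in enumerate(items):
            if value == target:
                return i   # best case: one comparison
        return -1          # worst case: n comparisons, so the worst case is Theta(n)

    # Target absent: all 10 elements are examined (the worst case).
    print(linear_search(list(range(10)), -1))   # -1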


More technically:

O(n) represents an upper bound. Θ(n) means a tight bound. Ω(n) represents a lower bound.

f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))

Basically, when we say an algorithm is O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2).
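
To see this with actual numbers (a small sketch; f(n) = 2n is just an arbitrary Θ(n) function picked for the example): 2n ≤ 2n^2 for all n ≥ 1, so f(n) = O(n^2); but f(n)/n^2 shrinks toward 0, so no positive constant c keeps c·n^2 ≤ f(n) for large n, and f is not Θ(n^2).

    f = lambda n: 2 * n      # a Theta(n) function
    g = lambda n: n * n      # the candidate bound n^2

    for n in (10, 100, 1000, 10000):
        # The ratio stays bounded above (O(n^2) holds) but drops toward 0,
        # so Omega(n^2), and hence Theta(n^2), fails.
        print(n, f(n) / g(n))   # 0.2, 0.02, 0.002, 0.0002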

In fact, since f(n) = Θ(g(n)) means that for sufficiently large values of n, f(n) can be bounded between c1·g(n) and c2·g(n) for some constants c1 and c2, i.e. the growth rate of f is asymptotically equal to that of g: g can serve as both a lower bound and an upper bound of f. This directly implies that f can serve as both a lower bound and an upper bound of g as well. Consequently,

f(x) = Θ(g(x)) iff g(x) = Θ(f(x))
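
As a quick sanity check of the sandwich definition (a sketch; f(n) = 3n + 100 and the constants c1 = 3, c2 = 4 are assumptions chosen for the example), c1·g(n) ≤ f(n) ≤ c2·g(n) holds for g(n) = n once n ≥ 100:

    f = lambda n: 3 * n + 100
    g = lambda n: n
    c1, c2 = 3, 4   # 3n <= 3n + 100 <= 4n whenever n >= 100

    for n in (100, 1000, 10**6):
        assert c1 * g(n) <= f(n) <= c2 * g(n)
        print(n, f(n) / g(n))   # the ratio settles near 3, so f = Theta(g)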

Similarly, to show f(n) = Θ(g(n)), it's enough to show that g is an upper bound of f (i.e. f(n) = O(g(n))) and that g is a lower bound of f (i.e. f(n) = Ω(g(n)), which is exactly the same thing as g(n) = O(f(n))). Concisely,

f(x) = Θ(g(x)) iff f(x) = O(g(x)) and g(x) = O(f(x))
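
One convenient test when the limit of f(x)/g(x) exists (it need not in general, so this is sufficient but not necessary): a finite nonzero limit implies f(x) = Θ(g(x)). A sketch using sympy, with the same example function as above:

    from sympy import symbols, limit, oo

    n = symbols('n', positive=True)
    # A finite, nonzero limit of f/g as n -> infinity implies Theta.
    print(limit((3*n + 100) / n, n, oo))   # 3, so 3n + 100 = Theta(n)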


There are also little-oh (o) and little-omega (ω) notations, representing loose upper and loose lower bounds of a function.

To summarize:

f(x) = O(g(x)) (big-oh) means that the growth rate of f(x) is asymptotically less than or equal to the growth rate of g(x).

f(x) = Ω(g(x)) (big-omega) means that the growth rate of f(x) is asymptotically greater than or equal to the growth rate of g(x).

f(x) = o(g(x)) (little-oh) means that the growth rate of f(x) is asymptotically less than the growth rate of g(x).

f(x) = ω(g(x)) (little-omega) means that the growth rate of f(x) is asymptotically greater than the growth rate of g(x).

f(x) = Θ(g(x)) (theta) means that the growth rate of f(x) is asymptotically equal to the growth rate of g(x).
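
When the limit of f(x)/g(x) exists, these five relations line up neatly with the limit's value: 0 gives little-oh, a finite nonzero constant gives theta, and infinity gives little-omega. A small sympy sketch (the classify helper is my own illustration, not a standard API):

    from sympy import symbols, limit, oo

    x = symbols('x', positive=True)

    def classify(f, g):
        # Classify f against g by lim f/g, assuming the limit exists.
        L = limit(f / g, x, oo)
        if L == 0:
            return 'f = o(g), hence also O(g)'
        if L == oo:
            return 'f = omega(g), hence also Omega(g)'
        return 'f = Theta(g), hence also O(g) and Omega(g)'

    print(classify(x, x**2))            # f = o(g)
    print(classify(5*x**2 + x, x**2))   # f = Theta(g)
    print(classify(x**3, x**2))         # f = omega(g)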

For a more detailed discussion, you can read the definition on Wikipedia or consult a classic textbook like Introduction to Algorithms by Cormen et al.


There's a simple way (a trick, I guess) to remember which notation means what.

All of the Big-O notations can be considered to have a bar.

When looking at an Ω, the bar is at the bottom, so it is an (asymptotic) lower bound.

When looking at a Θ, the bar is obviously in the middle. So it is an (asymptotic) tight bound.

When handwriting an O, you usually finish at the top and draw a squiggle, so the bar ends up on top. Therefore O(n) is an (asymptotic) upper bound of the function. To be fair, this one doesn't work with most fonts, but it is the original justification of the names.


One is Big "O", the other is Big Theta.

http://en.wikipedia.org/wiki/Big_O_notation

Big O means your algorithm will execute in no more steps than (a constant multiple of) the given expression, e.g. n^2.

Big Omega means your algorithm will execute in no fewer steps than (a constant multiple of) the given expression, e.g. n^2.

When both conditions are true for the same expression, you can use big theta notation.
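
To make "steps" concrete (a toy sketch; the step counter is instrumentation I added), the doubly nested loop below performs exactly n^2 inner steps, so the count is both at most 1·n^2 and at least 1·n^2, and the loop is Θ(n^2):

    def count_steps(n):
        steps = 0
        for i in range(n):
            for j in range(n):
                steps += 1   # one unit of work per inner iteration
        return steps

    for n in (10, 100):
        s = count_steps(n)
        # Exactly n^2 steps, so both O(n^2) and Omega(n^2) hold.
        print(n, s, s == n * n)   # prints True for each n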