O(log n) algorithm for finding max of array?

This question is asked a lot (is this a popular CS homework question or something?) and the answer is always the same: no.

Think about it mathematically. Unless the array is sorted, there is nothing to "cut in half" to give you the log(n) behavior.
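To make that concrete, here is a minimal Python sketch (my own illustration, assuming a plain list of numbers): if the array happens to be sorted ascending, the maximum is just the last element, an O(1) lookup; without that guarantee there is no structure to exploit and you are back to examining everything.

    def max_of_sorted(arr):
        # Sorted ascending: the maximum is simply the last element, O(1).
        return arr[-1]

    def max_of_unsorted(arr):
        # No ordering to exploit: every element must be examined, O(N).
        best = arr[0]
        for x in arr[1:]:
            if x > best:
                best = x
        return best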

Read the question comments for a more in-depth discussion (which is probably way out of the question's scope anyhow).


Consider this: without visiting every element, how do you know that some element you haven't visited isn't larger than the largest you have found so far?
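Here is a toy Python sketch of that adversary argument (my own illustration, not from the question): whatever fixed set of indices an algorithm inspects, you can hand it an input that agrees on those indices but hides a larger value in a position it skipped, so any correct algorithm must look at all N positions.

    def fool(inspected_indices, arr):
        """Return an array that agrees with arr on every inspected index
        but hides a larger maximum at some index that was skipped."""
        fooled = list(arr)
        current_max = max(arr)
        for i in range(len(arr)):
            if i not in inspected_indices:
                fooled[i] = current_max + 1  # bigger value in a skipped slot
                return fooled
        return None  # only possible if every index was inspected

    # An algorithm that inspects only O(log N) indices leaves most positions
    # unseen, so `fool` always succeeds against it.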


It's not possible to do this in O(log(N)). It is O(N) in the best/worst/average case, because you need to visit every item in the array to determine whether it is the largest one or not. The array is unsorted, which means you cannot cut corners.

Even with parallelisation, this cannot be done in less than O(N), because Big-O notation doesn't care how many CPUs you have or what the frequency of each CPU is. It abstracts away from those details specifically to give a crude estimate of the problem's difficulty.

Parallelisation can be neglected because it only changes constant factors, and constants are disregarded in Big-O analysis: splitting the work across a fixed number of CPUs divides the running time by at most a constant. The following are all the same:

O(N) = O(Const * N) = O(N / Const) = O(N + Const) = O(N - Const)

On the other hand, in practice, divide-and-conquer parallel algorithms can give you some performance benefit, so the code may run a little faster. Big-O, however, doesn't deal with this kind of fine-grained, constant-factor analysis.
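As a rough sketch of that point (my own example using Python's concurrent.futures; the chunking and worker count are arbitrary choices): splitting the scan across P workers still performs about N comparisons in total, so the asymptotic cost stays O(N), even though the wall-clock time may drop by roughly a factor of P.

    from concurrent.futures import ThreadPoolExecutor

    def parallel_max(arr, workers=4):
        # Split the array into `workers` chunks and scan each chunk in parallel.
        # Total comparisons are still ~N, so the complexity remains O(N);
        # only the constant factor (wall-clock time) changes.
        # Note: in CPython a ProcessPoolExecutor would be needed for a real
        # speed-up on CPU-bound work because of the GIL; the asymptotic
        # argument is identical either way.
        if not arr:
            raise ValueError("empty array")
        chunk = (len(arr) + workers - 1) // workers
        parts = [arr[i:i + chunk] for i in range(0, len(arr), chunk)]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            partial = list(pool.map(max, parts))
        return max(partial)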