Finding the Big O of recursive algorithms can be difficult, especially when the algorithms are complex. The Master Theorem provides a relatively simple way to solve recurrence relations, as long as the recurrence fits the theorem's constraints.

**Basic definition:**

T(n) = aT(n/b) + Θ(n^{c}) **where** a >= 1, b > 1, and c >= 0

T(n) = Θ(n^{c}) **if** c > log_{b}a

T(n) = Θ(n^{c} log n) **if** c = log_{b}a

T(n) = Θ(n^{log_{b}a}) **if** c < log_{b}a
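The three cases above amount to comparing the exponent c against the critical exponent log_{b}a. As a minimal sketch (the function name and string output are my own, not part of the theorem), the case analysis can be mechanized like this:

```python
import math

def master_theorem(a: int, b: int, c: float) -> str:
    """Asymptotic class of T(n) = a*T(n/b) + Θ(n^c) per the Master Theorem."""
    crit = math.log(a, b)            # critical exponent: log_b(a)
    if math.isclose(c, crit):
        return f"Θ(n^{c} log n)"     # case: c = log_b(a)
    elif c > crit:
        return f"Θ(n^{c})"           # case: c > log_b(a), f(n) dominates
    else:
        return f"Θ(n^{crit})"        # case: c < log_b(a), recursion dominates
```

For instance, merge sort's recurrence T(n) = 2T(n/2) + Θ(n) hits the middle case and yields Θ(n log n).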

We can think of "a" as the number of recursive calls (branches) made at each level and "b" as the factor by which the input shrinks on each call; let's take a look at an example.

**If we have a function:**

T(n) = T(n/2) + Θ(n^{0}) : a = 1, b = 2, c = 0

log_{2}1 = 0, **therefore** c = log_{b}a, which gives Θ(n^{c} log n) = Θ(log n)
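A recurrence of exactly this shape (one recursive call on half the input, plus constant work) arises in binary search, which is one way to see why binary search runs in Θ(log n). A minimal sketch:

```python
def binary_search(arr, target, lo=0, hi=None):
    """Recursive binary search: T(n) = T(n/2) + Θ(1), hence Θ(log n)."""
    if hi is None:
        hi = len(arr)
    if lo >= hi:
        return -1                                        # not found
    mid = (lo + hi) // 2                                 # constant work per call
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)   # recurse on right half
    else:
        return binary_search(arr, target, lo, mid)       # recurse on left half
```

Each call does constant work and makes a single recursive call on half the range, matching a = 1, b = 2, c = 0.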
