For many people, finding the Big O of recursive algorithms can be difficult, especially when the algorithms are complex. The Master Theorem provides a relatively simple way to solve recurrence relations, as long as the recurrence fits the theorem's constraints.

**Basic definition:**

T(n) = aT(n/b) + Θ(n^{c}) **where** a >= 1, b > 1, and c >= 0

T(n) = Θ(n^{c}) **if** c > log_{b}a

T(n) = Θ(n^{c} log n) **if** c = log_{b}a

T(n) = Θ(n^{log_{b}a}) **if** c < log_{b}a
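The three cases above can be sketched as a small helper that classifies a recurrence from its parameters. This is a minimal sketch; the function name `master_theorem` is my own, and it assumes the non-recursive work is exactly Θ(n^c):

```python
import math

def master_theorem(a, b, c):
    """Classify T(n) = a*T(n/b) + Theta(n^c) by the Master Theorem.

    Assumes a >= 1, b > 1, c >= 0. Returns a string naming the bound.
    """
    crit = math.log(a, b)  # the critical exponent log_b(a)
    if c > crit:
        return f"Theta(n^{c})"            # case 1: the work term dominates
    if math.isclose(c, crit):
        # case 2: work and recursion balance, contributing a log factor
        return "Theta(log n)" if c == 0 else f"Theta(n^{c} log n)"
    return f"Theta(n^{crit})"             # case 3: the recursion dominates
```

For example, `master_theorem(1, 2, 0)` reports `Theta(log n)`, while `master_theorem(2, 2, 1)` (merge sort's recurrence) reports a Θ(n log n)-style bound.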

We can think of “a” as the number of recursive calls (branches) made at each level and “b” as the factor by which the problem size shrinks in each call; let's take a look at an example.
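To make “a” and “b” concrete, here is a minimal merge sort sketch: it makes a = 2 recursive calls, each on half the input (b = 2), and does Θ(n^1) merge work, so T(n) = 2T(n/2) + Θ(n), which the second case resolves to Θ(n log n):

```python
def merge_sort(xs):
    """T(n) = 2T(n/2) + Theta(n): a = 2, b = 2, c = 1, log_2(2) = 1 = c,
    so the Master Theorem gives Theta(n log n)."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # first of a = 2 subproblems of size n/2
    right = merge_sort(xs[mid:])  # second subproblem of size n/2
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # the Theta(n) merge step
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```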

**If we have a function:**

a = 1, b = 2, c = 0, so log_{b}a = log_{2}1 = 0

T(n) = T(n/2) + Θ(n^{0})

Here c = 0 = log_{b}a, which satisfies the second case, so T(n) = Θ(n^{c} log n) = Θ(log n)
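A familiar algorithm with exactly this recurrence is binary search: one recursive call (a = 1) on half the input (b = 2) with constant extra work (c = 0). A minimal sketch, written in the equivalent iterative form:

```python
def binary_search(xs, target):
    """T(n) = T(n/2) + Theta(1): a = 1, b = 2, c = 0, so Theta(log n).

    Returns the index of target in the sorted list xs, or -1 if absent.
    """
    lo, hi = 0, len(xs)
    while lo < hi:                # each iteration halves the search range
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid              # discard the upper half
    return -1
```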



## Published by Romaine Carter

Interests: optimization algorithms, Neural Nets, MATLAB, MASM programming, Visual C++, Python, C#.Net, Haskell, software design patterns, TDD and ASP.NET MVC x.