Why the master method? - The master method helps us analyze recursive algorithms and determine their running time.


So, let's consider integer multiplication here.

Now we get equation 1.

The way we reason about recursive algorithms like these is with the help of recurrences.

Recurrences

**What is a recurrence?** T(n) = the maximum number of operations this algorithm needs to multiply two n-digit numbers. A recurrence expresses T(n) in terms of the running time of its recursive calls.

**Ingredients of a recurrence** There are 2 ingredients of a recurrence - a base case and a general case.

So, for integer multiplication, we have following:

- Base case: T(1) <= a constant.
- General case (for all n > 1): T(n) <= 4T(n/2) + O(n)

Here we have 4 copies of T(n/2) because 4 is the number of products we have to compute recursively, and n/2 is the size of each subproblem.
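To make the recursion concrete, here is a minimal Python sketch of the 4-product algorithm (illustrative only: the name `rec_multiply` and the assumption that n is a power of two are mine, not from the notes):

```python
def rec_multiply(x, y, n):
    """Multiply two n-digit numbers with 4 recursive products.

    Writes x = 10^(n/2)*a + b and y = 10^(n/2)*c + d, then
    x*y = 10^n*ac + 10^(n/2)*(ad + bc) + bd.
    Assumes n is a power of 2.
    """
    if n == 1:
        return x * y          # base case: single-digit multiply
    half = n // 2
    p = 10 ** half
    a, b = divmod(x, p)       # high and low halves of x
    c, d = divmod(y, p)       # high and low halves of y
    ac = rec_multiply(a, c, half)
    ad = rec_multiply(a, d, half)
    bc = rec_multiply(b, c, half)
    bd = rec_multiply(b, d, half)
    # combining step: O(n) work (additions and shifts)
    return (10 ** n) * ac + p * (ad + bc) + bd
```

The four recursive calls on size-n/2 subproblems plus the O(n) combining step are exactly where the recurrence T(n) <= 4T(n/2) + O(n) comes from.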

*Algorithm 2: 3 products rather than 4 for integer multiplication*

New recurrence:

- Base case: T(1) <= a constant.
- General case: T(n) <= 3T(n/2) + O(n)
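As an illustrative sketch (not from the notes), the trick is that ad + bc can be recovered from a single extra product, (a + b)(c + d) - ac - bd, so only 3 recursive multiplications are needed:

```python
def karatsuba(x, y, n):
    """Multiply two n-digit numbers with 3 recursive products
    (Gauss's trick). Assumes n is a power of 2."""
    if n == 1:
        return x * y
    half = n // 2
    p = 10 ** half
    a, b = divmod(x, p)
    c, d = divmod(y, p)
    ac = karatsuba(a, c, half)
    bd = karatsuba(b, d, half)
    # third product: (a+b)(c+d) = ac + ad + bc + bd,
    # hence ad + bc = (a+b)(c+d) - ac - bd
    abcd = karatsuba(a + b, c + d, half)
    return (10 ** n) * ac + p * (abcd - ac - bd) + bd
```

Three recursive calls on size-n/2 subproblems plus O(n) combining work give the new recurrence T(n) <= 3T(n/2) + O(n).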

*Recurrence of merge sort* - T(n) <= 2T(n/2) + O(n)

Master method

**Formal definition of the master method** What makes the master method so awesome is that it can be thought of as a "black box" for solving recurrences: you give it a few inputs and it tells you the running time of the algorithm.

**Assumption** - The master method assumes that all subproblems have the same size. For example, in MergeSort each recursive call works on a subproblem of size n/2. (There are other, more general versions of the master method that relax this.)

**Base Case:**
T(n) ≤ a constant c, for all sufficiently small n

**For larger n i.e. general case:**
T(n) ≤ aT(n/b) + O(n^{d})

- a is the number of recursive calls (a ≥ 1)
- b is the factor by which the problem size shrinks. It must be greater than one or the problem will never get any smaller
- d is the exponent of the combining step's running time. It can be as small as 0, which means the combining work takes constant time.

The constants a, b, and d must be independent of the input size n. So, given such a recurrence, the master method gives the following upper bound on T(n):

- Case 1: if a = b^{d}, then T(n) = O(n^{d} log n)
- Case 2: if a < b^{d}, then T(n) = O(n^{d})
- Case 3: if a > b^{d}, then T(n) = O(n^{log_{b}a})
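The black-box flavor can be sketched as a tiny function (illustrative; the name `master_method` and the string output format are mine) that maps (a, b, d) straight to the bound:

```python
import math

def master_method(a, b, d):
    """Return the big-O bound for T(n) = a*T(n/b) + O(n^d).

    a >= 1 recursive calls, b > 1 shrink factor, d >= 0 exponent.
    """
    if a == b ** d:                          # case 1
        return f"O(n^{d} log n)"
    if a < b ** d:                           # case 2
        return f"O(n^{d})"
    return f"O(n^{math.log(a, b):.2f})"      # case 3: exponent log_b(a)
```

For example, `master_method(2, 2, 1)` (merge sort) returns "O(n^1 log n)", while `master_method(3, 2, 1)` (the 3-product multiplication) returns "O(n^1.58)".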

**Notice the logarithm** In the first case, the base of the logarithm doesn't matter, because changing the base only changes the value by a constant factor, i.e. log_{2}n = constant * log_{e}n. However, in the third case the base matters a lot, since the logarithm sits in the exponent, where it could make the difference between linear and quadratic time.
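A quick numeric check of the constant-factor claim (illustrative snippet):

```python
import math

n = 10 ** 6
# changing the base only multiplies by the constant 1/ln(2):
assert math.isclose(math.log2(n), math.log(n) / math.log(2))
```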

**Examples**

*Example 1 - Merge Sort*

Let's look at MergeSort as an example. We know T(n) = 2T(n/2) + O(n).
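As a reminder of where this recurrence comes from, here is a minimal merge sort sketch (illustrative only):

```python
def merge_sort(arr):
    """Two recursive calls on halves (a=2, b=2)
    plus an O(n) merge step (d=1)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # O(n) merge of the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```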

We have a = 2, b = 2, and d = 1. Since a = b^{d}, we are in case 1, so the running time is O(n^{d} log n), which is just O(n log n).

*Example 2 - Binary Search*

For binary search, we split the problem into two halves of size n/2 each, BUT we only recurse into one of them, comparing the element we're searching for with the middle element, which takes constant time.

T(n) = T(n/2) + O(1)

We make one recursive call on a subproblem of size n/2, and the only other work is comparing against the middle element, i.e. O(1) time.
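A sketch of binary search matching this recurrence (illustrative; the recursive form makes the single subproblem explicit):

```python
def binary_search(arr, target, lo=0, hi=None):
    """Recursive binary search on a sorted list: one recursive call
    (a=1) on half the range (b=2), plus O(1) comparison work (d=0).
    Returns the index of target, or -1 if absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1                     # empty range: not found
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)  # right half
    return binary_search(arr, target, lo, mid - 1)      # left half
```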

Thus we have a = 1, b = 2, d = 0, and a = b^{d} (1 = 2^{0}), so we are in case 1 again, and the overall runtime is O(n^{d} log n) = O(log n).

*Example 3 - Integer multiplication*

First, the naive algorithm:

a = 4, b = 2, d = 1

T(n) = 4T(n/2) + O (n)

Since a > b^{d} (4 > 2^{1}), we are in case 3. Hence T(n) = O(n^{log_{2}4}), i.e. O(n^{2}).

Second, Gauss's method:

T(n) = 3T(n/2)+O(n)

a = 3, b = 2, d = 1

Since a > b^{d} (3 > 2^{1}), we are in case 3 again. Hence T(n) = O(n^{log_{2}3}), i.e. O(n^{1.59}).

*Example 4 - Matrix multiplication using Strassen's method*

T(n) = 7T(n/2) + O(n^{2})

a = 7, b = 2, d = 2

Since a > b^{d} (7 > 2^{2} = 4), case 3 applies, so T(n) = O(n^{log_{2}7}) = O(n^{2.81}).

*Example 5 - Fictitious recurrence (just to cover case 2)*

T(n) = 2T(n/2) + O(n^{2})

a = 2, b = 2, d = 2

Since a < b^{d} (2 < 2^{2} = 4), we are in case 2, and the time complexity is O(n^{d}) = O(n^{2}).