Master theorem

The master theorem concerns recurrence relations of the form:

$T(n) = a\,T\!\left(\frac{n}{b}\right) + f(n)$, where $a \in \mathbb{N}$ and $1 < b \in \mathbb{R}$

In the application to the analysis of a recursive algorithm, the constants and function take on the following significance:

  • n is the size of the problem.
  • a is the number of subproblems in the recursion.
  • n/b is the size of each subproblem. (Here it is assumed that all subproblems are essentially the same size.)
  • f(n) is the cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions to the subproblems (see the merge-sort sketch below).
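
As a concrete illustration of these quantities, here is a minimal merge-sort sketch of my own (merge sort is not discussed in the text above; it is only a familiar example): it makes a = 2 recursive calls on halves of the input (b = 2), and the merge step is the f(n) = Θ(n) work done outside the recursive calls.

    # Minimal merge sort sketch: T(n) = 2*T(n/2) + Theta(n), i.e. a = 2, b = 2, f(n) ~ c*n.
    def merge_sort(items):
        if len(items) <= 1:               # base case: problem of size 1
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])    # first of the a = 2 subproblems, size n/b = n/2
        right = merge_sort(items[mid:])   # second subproblem
        # f(n): the work outside the recursive calls -- merging the two halves in O(n)
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]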

It is possible to determine an asymptotically tight bound in these three cases:

Case 1

Generic form

$f(n) = O\!\left(n^{c}\right)$ where $c < \log_{b} a$ (using big O notation)

then:

$T(n) = \Theta\!\left(n^{\log_{b} a}\right)$

Example

$T(n) = 8T\!\left(\frac{n}{2}\right) + 1000n^{2}$

As one can see from the formula above:

$a = 8,\ b = 2,\ f(n) = 1000n^{2}$, so
$f(n) = O\!\left(n^{c}\right)$, where $c = 2$

Next, we check whether the case 1 condition is satisfied:

$\log_{b} a = \log_{2} 8 = 3 > c$.

It follows from the first case of the master theorem that

$T(n) = \Theta\!\left(n^{\log_{b} a}\right) = \Theta\!\left(n^{3}\right)$

(Indeed, the exact solution of the recurrence relation is $T(n) = 1001n^{3} - 1000n^{2}$, assuming $T(1) = 1$.)
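
The closed form can also be checked numerically; the following is a small sketch of my own (not part of the Wikipedia text) that evaluates the recurrence for powers of two and compares it with $1001n^{3} - 1000n^{2}$.

    # Evaluate T(n) = 8*T(n/2) + 1000*n^2 with T(1) = 1 and compare with the claimed
    # exact solution 1001*n^3 - 1000*n^2 (checked here only for powers of two).
    def T(n):
        if n == 1:
            return 1
        return 8 * T(n // 2) + 1000 * n * n

    for k in range(8):
        n = 2 ** k
        assert T(n) == 1001 * n ** 3 - 1000 * n ** 2
    print("closed form matches for n = 1, 2, 4, ..., 128")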

Case 2

Generic form

If it is true, for some constant k ≥ 0, that:

$f(n) = \Theta\!\left(n^{c}\log^{k}n\right)$ where $c = \log_{b} a$

then:

$T(n) = \Theta\!\left(n^{c}\log^{k+1}n\right)$

Example

$T(n) = 2T\!\left(\frac{n}{2}\right) + 10n$

As we can see from the formula above, the variables take the following values:

$a = 2,\ b = 2,\ c = 1,\ f(n) = 10n$
$f(n) = \Theta\!\left(n^{c}\log^{k}n\right)$ where $c = 1,\ k = 0$

Next, we check whether the case 2 condition is satisfied:

$\log_{b} a = \log_{2} 2 = 1$, and therefore $c = \log_{b} a$.

So it follows from the second case of the master theorem:

$T(n) = \Theta\!\left(n^{\log_{b} a}\log^{k+1}n\right) = \Theta\!\left(n^{1}\log^{1}n\right) = \Theta\!\left(n\log n\right)$

Thus the given recurrence relation T(n) is in $\Theta(n\log n)$.

(This result is confirmed by the exact solution of the recurrence relation, which is $T(n) = n + 10n\log_{2}n$, assuming $T(1) = 1$.)
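
Again, the closed form can be checked numerically with a small sketch of my own (not part of the Wikipedia text):

    # Evaluate T(n) = 2*T(n/2) + 10*n with T(1) = 1 and compare with the claimed
    # exact solution n + 10*n*log2(n), using n = 2**k so that log2(n) = k exactly.
    def T(n):
        if n == 1:
            return 1
        return 2 * T(n // 2) + 10 * n

    for k in range(11):
        n = 2 ** k
        assert T(n) == n + 10 * n * k
    print("closed form matches for n = 1, 2, 4, ..., 1024")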

Case 3

Generic form

If it is true that:

$f(n) = \Omega\!\left(n^{c}\right)$ where $c > \log_{b} a$

and if it is also true that:

$af\!\left(\frac{n}{b}\right) \leq kf(n)$ for some constant $k < 1$ and sufficiently large $n$ (often called the regularity condition)

then:

$T(n) = \Theta\!\left(f(n)\right)$
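
With all three generic forms now stated, the case analysis can be carried out mechanically whenever f(n) has the shape $\Theta(n^{c}\log^{k}n)$; for such driving functions the regularity condition of case 3 holds automatically, since $af(n/b) \approx (a/b^{c})f(n)$ with $a/b^{c} < 1$. The helper below is a small sketch of my own (the function name and the floating-point comparison of c against $\log_{b}a$ are my choices, not from the text).

    import math

    # Classify T(n) = a*T(n/b) + Theta(n^c * log^k n) by comparing c with log_b(a).
    def master_bound(a, b, c, k=0):
        crit = math.log(a, b)                          # critical exponent log_b(a)
        if math.isclose(c, crit):
            return f"Theta(n^{c:g} log^{k + 1} n)"     # case 2: all levels contribute equally
        if c < crit:
            return f"Theta(n^{crit:g})"                # case 1: the leaves dominate
        return f"Theta(n^{c:g} log^{k} n)"             # case 3: the root dominates

    print(master_bound(8, 2, 2))    # the case 1 example above: Theta(n^3)
    print(master_bound(2, 2, 1))    # the case 2 example above: Theta(n^1 log^1 n)
    print(master_bound(2, 2, 2))    # the case 3 example below: Theta(n^2 log^0 n)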

Example

$T(n) = 2T\!\left(\frac{n}{2}\right) + n^{2}$

As we can see from the formula above, the variables take the following values:

$a = 2,\ b = 2,\ f(n) = n^{2}$
$f(n) = \Omega\!\left(n^{c}\right)$, where $c = 2$

Next, we check whether the case 3 condition is satisfied:

$\log_{b} a = \log_{2} 2 = 1$, and therefore, yes, $c > \log_{b} a$.

The regularity condition also holds:

$2\left(\frac{n^{2}}{4}\right) \leq kn^{2}$, choosing $k = 1/2$

So it follows from the third case of the master theorem:

$T(n) = \Theta\!\left(f(n)\right) = \Theta\!\left(n^{2}\right)$.

Thus the given recurrence relation T(n) is in $\Theta(n^{2})$, which matches the f(n) term of the original recurrence.

(This result is confirmed by the exact solution of the recurrence relation, which is $T(n) = 2n^{2} - n$, assuming $T(1) = 1$.)
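
As with the other cases, a quick numerical sketch of my own (not part of the Wikipedia text) confirms the closed form:

    # Evaluate T(n) = 2*T(n/2) + n^2 with T(1) = 1 and compare with the claimed
    # exact solution 2*n^2 - n (checked here only for powers of two).
    def T(n):
        if n == 1:
            return 1
        return 2 * T(n // 2) + n * n

    for k in range(11):
        n = 2 ** k
        assert T(n) == 2 * n * n - n
    print("closed form matches for n = 1, 2, 4, ..., 1024")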

from Wikipedia

---------------------------------------------------------------------------------------------------------------------------------------------

In other words, for case 3 the ratio $f(n)/n^{\log_{b} a}$ should 1. tend to infinity as $n \to \infty$, and 2. be polynomially large, i.e. grow at least as fast as $n^{x}$ for some $x > 0$.

If condition 1 holds but condition 2 does not, this method cannot be used; for example, with $f(n) = n\lg n$ and $n^{\log_{b} a} = n$, the ratio is $f(n)/n^{\log_{b} a} = n\lg n / n = \lg n$, which tends to infinity but more slowly than any $n^{x}$.
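
A concrete instance of this gap (my own illustration, not from the original note): take $T(n) = 2T(n/2) + n\lg n$, so that $n^{\log_{b} a} = n$.

    % a = 2, b = 2, so n^{\log_b a} = n^{\log_2 2} = n, and f(n) = n \lg n.
    % Condition 1 holds:
    \lim_{n\to\infty} \frac{f(n)}{n^{\log_b a}}
        = \lim_{n\to\infty} \frac{n \lg n}{n}
        = \lim_{n\to\infty} \lg n = \infty ,
    % but condition 2 fails: \lg n = o(n^{x}) for every x > 0, so f(n) is not
    % \Omega(n^{1+\varepsilon}) for any \varepsilon > 0 and case 3 does not apply.
    % (The extended case 2 stated above does cover this recurrence: with c = 1, k = 1
    % it gives T(n) = \Theta(n \log^{2} n).)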

Original article: https://www.cnblogs.com/wujunde/p/6938869.html