big o - Complexity of factorial recursive algorithm


Today in class my teacher wrote this recursive factorial algorithm on the blackboard:

int factorial(int n) {
    if (n == 1)
        return 1;
    else
        return n * factorial(n-1);
}

She said it has a cost of T(n-1) + 1.

Then, using the iteration method, she said that T(n-1) + 1 = T(n-2) + 2 = T(n-3) + 3 = ... = T(n-j) + j, and that the algorithm stops when n - j = 1, that is, when j = n - 1.

After that, she substituted for j and obtained T(1) + n - 1.

Then she said directly that n - 1 = 2^(log2(n-1)), so the cost of the algorithm is exponential.

I got lost at the last two steps. Can you please explain them to me?

Let's start off with an analysis of the algorithm. We can write a recurrence relation for the total amount of work done. In the base case, the algorithm does 1 unit of work when run on an input of size 1, so

T(1) = 1

For an input of size n + 1, the algorithm does 1 unit of work within the function itself, plus the work done by the call it makes to the same function on an input of size n. Therefore

T(n + 1) = T(n) + 1

If we expand out the terms of the recurrence, we get that

  • T(1) = 1
  • T(2) = T(1) + 1 = 2
  • T(3) = T(2) + 1 = 3
  • T(4) = T(3) + 1 = 4
  • ...
  • T(n) = n

So in general, the algorithm requires n units of work to complete (i.e. T(n) = n).
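If you'd like to check this empirically, here is a minimal C sketch (my own addition, not part of the derivation above) that adds a global counter to the function so it tallies how many times it is entered; for every n >= 1 it should print a count equal to n:

#include <stdio.h>

static long calls = 0;  /* counts how many times factorial() is entered */

int factorial(int n) {
    calls++;                          /* 1 unit of work per invocation */
    if (n == 1)
        return 1;
    else
        return n * factorial(n - 1);
}

int main(void) {
    for (int n = 1; n <= 12; n++) {   /* 12! still fits in a 32-bit int */
        calls = 0;
        factorial(n);
        printf("n = %2d  ->  %ld calls\n", n, calls);
    }
    return 0;
}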

The next thing your teacher said was that

T(n) = n = 2^(log2 n)

This statement is true, because 2^(log2 x) = x for any positive x, since exponentiation and logarithms are inverse operations of one another.
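If you want to see that identity numerically rather than take it on faith, a tiny C check (again my own illustration, using the standard log2 and exp2 functions from math.h) looks like this:

#include <stdio.h>
#include <math.h>   /* compile with: cc demo.c -lm */

int main(void) {
    double xs[] = {1.0, 2.0, 8.0, 100.0, 1000000.0};
    for (int i = 0; i < 5; i++) {
        /* exp2(log2(x)) recovers x, up to floating-point rounding */
        printf("x = %12.1f   2^(log2 x) = %12.1f\n", xs[i], exp2(log2(xs[i])));
    }
    return 0;
}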

So the question is: is this polynomial time or exponential time? I'd classify it as pseudopolynomial time. If you treat the input x as a number, then the runtime is indeed polynomial in x. However, polynomial time is formally defined so that the runtime of the algorithm must be polynomial with respect to the number of bits used to specify the input to the problem. Here, the number x can be specified in Θ(log x) bits, so a runtime of 2^(log x) is technically considered exponential time. I wrote about this at length in this earlier answer, and I'd recommend looking at it for a more thorough explanation.
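To make "exponential in the number of bits" concrete, here is a small sketch (my own example, not from the linked answer; bit_length is a helper I am defining just for this illustration). It prints the number of bits needed to write n next to the T(n) = n units of work, and the work roughly doubles every time one more bit is added:

#include <stdio.h>

/* number of bits needed to write n in binary */
static int bit_length(unsigned int n) {
    int bits = 0;
    while (n > 0) {
        bits++;
        n >>= 1;
    }
    return bits;
}

int main(void) {
    /* T(n) = n, so the work grows like 2^b, where b is the bit length of n */
    for (unsigned int n = 1; n <= 1024; n *= 2) {
        printf("n = %5u   bits = %2d   work T(n) = %5u\n", n, bit_length(n), n);
    }
    return 0;
}

In other words, if the input is written with b bits, then n can be as large as 2^b - 1, so a running time of Θ(n) is Θ(2^b) when measured against the length of the input; that is exactly why it counts as exponential (and pseudopolynomial, since it is polynomial in the numeric value of n).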

Hope this helps!

