Explore recursive algorithm time complexity, call stack depth, and Big-O analysis with focused questions on recursion patterns, branching factors, and performance. Ideal for those reviewing computational complexity concepts and recursive function behavior.
Given a recursive function that only calls itself once per call, such as calculating the sum of an array using 'sum(arr, n) = arr[n-1] + sum(arr, n-1)', what is the overall time complexity using Big-O notation?
Explanation: Linear recursion, as demonstrated in the sum example, processes each element one by one, leading to O(n) time complexity. O(log n) appears in cases like binary search where the problem size halves each time. O(n^2) and O(2^n) occur in more complex recursive patterns with multiple calls in each step or overlapping subproblems, neither of which happens in this scenario.
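The linear-recursive sum described above can be sketched as follows (a minimal illustration; the function name `rec_sum` is chosen here for clarity and is not from the question):

```python
def rec_sum(arr, n):
    """Sum the first n elements of arr: sum(arr, n) = arr[n-1] + sum(arr, n-1)."""
    if n == 0:                               # base case: empty prefix sums to 0
        return 0
    return arr[n - 1] + rec_sum(arr, n - 1)  # exactly one recursive call per step

print(rec_sum([1, 2, 3, 4], 4))  # 10
```

Because each call does O(1) work and spawns exactly one further call, there are n calls in total, giving O(n) time and O(n) stack depth.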
If a recursive function calls itself twice per call, like in calculating Fibonacci numbers recursively, what is the time complexity in Big-O notation when computing fib(n)?
Explanation: In the Fibonacci example, each call generates two more calls, forming a binary tree of calls and resulting in exponential growth, which is O(2^n). O(n) applies when there is only one call per recursion. O(n^2) arises when each of n recursion levels performs linear work, not when calls branch. O(log n) is related to reductions by half per call, not present here.
For a tail-recursive function that sums numbers from 1 to n using an accumulator, such as 'sum(n, acc)', what is the call stack depth during execution?
Explanation: Tail-recursive functions permit tail-call optimization, in which the compiler or interpreter transforms the recursion into iteration, yielding O(1) stack depth when the optimization is applied; without it, the depth remains O(n). O(log n) would apply only if the problem size halved each call, and O(n^2) is not relevant to call stack depth in standard recursion scenarios. O(1) holds only for fully optimized cases, not by default.
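A tail-recursive accumulator version might look like the sketch below (note that CPython does not perform tail-call optimization, so the stack depth here stays O(n); languages such as Scheme would run it in O(1) stack space):

```python
def sum_acc(n, acc=0):
    """Tail-recursive sum of 1..n using an accumulator."""
    if n == 0:
        return acc                   # base case: accumulator holds the answer
    return sum_acc(n - 1, acc + n)   # tail call: the result is returned directly

print(sum_acc(100))  # 5050
```

Because the recursive call is the last operation and its result is returned unchanged, no pending work is left on each frame, which is exactly the property tail-call optimization exploits.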
Given a recursive function that solves the Towers of Hanoi problem for n disks by making two recursive calls and one move per call, what is the Big-O time complexity?
Explanation: Towers of Hanoi makes two recursive calls per disk, giving the recurrence T(n) = 2T(n-1) + 1, which solves to 2^n - 1 moves, i.e. O(2^n). O(n) and O(n^2) would indicate polynomial relationships, not matching the exponential growth here. O(n log n) typically comes up in efficient sorting algorithms, but not in recursive problems like Hanoi.
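The 2^n - 1 move count can be verified with a short sketch (peg names and the `moves` list are illustrative):

```python
def hanoi(n, src, aux, dst, moves):
    """Move n disks from src to dst via aux, recording each move."""
    if n == 0:
        return
    hanoi(n - 1, src, dst, aux, moves)  # first recursive call: clear the top n-1 disks
    moves.append((src, dst))            # one move of the largest disk
    hanoi(n - 1, aux, src, dst, moves)  # second recursive call: restack the n-1 disks

moves = []
hanoi(3, 'A', 'B', 'C', moves)
print(len(moves))  # 7 = 2^3 - 1
```

The two recursive calls bracketing one move per level are what produce the doubling recurrence and hence the exponential total.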
What is the call stack depth in a divide-and-conquer recursive algorithm such as merge sort on an array of n elements?
Explanation: In merge sort, the array is repeatedly halved until individual elements are reached, so the recursion tree has O(log n) levels and the maximum stack depth is O(log n). O(n) depth occurs in simple linear recursion, but merge sort's divide-and-conquer halving avoids it. O(1) is not realistic for this recursion type, and O(n^2) does not describe the stack depth of any standard recursive sorting algorithm.
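The logarithmic depth can be observed by threading a depth counter through a merge sort sketch (the depth bookkeeping is added here for demonstration, not part of the standard algorithm):

```python
def merge_sort(arr, depth=0):
    """Return (sorted copy of arr, deepest recursion level reached)."""
    if len(arr) <= 1:
        return list(arr), depth            # base case: a single element is sorted
    mid = len(arr) // 2
    left, dl = merge_sort(arr[:mid], depth + 1)
    right, dr = merge_sort(arr[mid:], depth + 1)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, max(dl, dr)

result, depth = merge_sort([5, 3, 8, 1, 9, 2, 7, 4])
print(result, depth)  # sorted list, depth 3 = log2(8)
```

For 8 elements the deepest frame sits at level 3, matching log2(8), which illustrates why the stack depth grows as O(log n) even though the total work is O(n log n).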