Sorting & Complexity Fundamentals Quiz

Assess your understanding of essential sorting algorithms, asymptotic complexities, and fundamental data-structure principles. Sharpen your problem-solving skills with easy, foundational questions on programming basics.

  1. Stable Sorting Algorithm

    Which of the following sorting algorithms is stable under standard implementation?

    1. Selection Sort
    2. Heap Sort
    3. Merge Sort
    4. QuickSort

    Explanation: Merge Sort is stable since it preserves the relative order of equal elements. Heap Sort, QuickSort, and Selection Sort are generally not stable in their conventional implementations because they may swap elements in a manner that changes the order of equal elements.
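
    Stability can be checked directly. A minimal Python sketch (sample records invented for illustration) using the standard library's sorted(), whose Timsort algorithm is a stable merge-sort hybrid:

```python
# Stability check: sort records by grade and verify that records with
# equal grades keep their original relative order.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]

# sorted() is stable (Timsort, a merge-sort hybrid).
by_grade = sorted(records, key=lambda r: r[1])

# bob stays before dave (both grade 1), alice before carol (both grade 2).
print(by_grade)
```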

  2. Average Time Complexity of QuickSort

    What is the average case time complexity of the QuickSort algorithm?

    1. O(n log n)
    2. O(1)
    3. O(log n)
    4. O(n^2)

    Explanation: QuickSort averages O(n log n) time, which makes it efficient in practice. O(n^2) is its worst case (e.g. consistently poor pivot choices), O(1) is impossible since a sort must at least examine all n elements, and O(log n) is too low even to read the input.
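
    As a sketch (not a production implementation), a randomized QuickSort in Python; choosing the pivot at random makes the O(n^2) worst case unlikely, keeping the expected cost at O(n log n):

```python
import random

def quicksort(a):
    """Randomized QuickSort sketch: partition around a random pivot,
    then recurse on the strictly-smaller and strictly-larger parts."""
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)
    left = [x for x in a if x < pivot]
    mid = [x for x in a if x == pivot]
    right = [x for x in a if x > pivot]
    return quicksort(left) + mid + quicksort(right)

print(quicksort([5, 3, 8, 1, 9, 2]))  # -> [1, 2, 3, 5, 8, 9]
```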

  3. Best Case for Insertion Sort

    When does Insertion Sort achieve its best case time complexity?

    1. Array is already sorted
    2. Random input
    3. All elements are identical
    4. Array is in reverse order

    Explanation: Insertion Sort is fastest on an already sorted array: each element needs only one comparison before the inner loop stops, giving O(n) overall. Reverse order is the worst case (O(n^2)) and random input is the average case (also O(n^2)). Note that an array of identical elements is itself already sorted, so it too runs in O(n); the defining best-case condition is sorted input.
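
    A small Python sketch that counts comparisons makes the best case visible: on sorted input the inner loop exits after one comparison per element (n - 1 total), while reverse order forces the full O(n^2) work:

```python
def insertion_sort(a):
    """Insertion sort on a copy of a; also returns the comparison count
    so the best-case behavior can be observed directly."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger element right
                j -= 1
            else:
                break             # sorted prefix: stop after one comparison
        a[j + 1] = key
    return a, comparisons

print(insertion_sort([1, 2, 3, 4]))  # sorted input: only 3 comparisons
print(insertion_sort([4, 3, 2, 1]))  # reverse input: 6 comparisons
```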

  4. Space Complexity of Merge Sort

    What is the space complexity of standard Merge Sort?

    1. O(log n)
    2. O(1)
    3. O(n)
    4. O(n^2)

    Explanation: Merge Sort requires O(n) extra space for the temporary arrays used during merging. O(1) is not achievable in the standard algorithm because of those arrays, O(log n) accounts only for the recursion stack (which the O(n) merge buffers dominate), and O(n^2) is far more than merging ever needs.
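
    A top-down merge sort sketch in Python; the merged buffer built at each level is the O(n) auxiliary space the explanation refers to:

```python
def merge_sort(a):
    """Standard top-down merge sort. The 'merged' list is the O(n)
    auxiliary buffer; the recursion stack adds only O(log n)."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged = []                       # temporary O(n) buffer for this merge
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps the sort stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 1, 4, 2, 3]))  # -> [1, 2, 3, 4, 5]
```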

  5. Heap Sort Data Structure

    Which data structure is used to implement Heap Sort?

    1. Queue
    2. Stack
    3. AVL Tree
    4. Binary Heap

    Explanation: Heap Sort uses a Binary Heap to retrieve the next smallest (or largest) element efficiently. An AVL tree is a balanced search tree, and Stack and Queue are linear structures with no heap-ordering property.
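
    A heap-sort sketch using Python's standard-library binary min-heap (heapq): heapify the list in O(n), then pop the minimum n times at O(log n) each, for O(n log n) total:

```python
import heapq

def heap_sort(a):
    """Heap sort via the heapq binary min-heap: build the heap, then
    repeatedly pop the smallest remaining element."""
    heap = list(a)
    heapq.heapify(heap)               # arrange the list as a binary min-heap
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heap_sort([7, 2, 9, 4]))  # -> [2, 4, 7, 9]
```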

  6. Non-comparison-based Sorting

    Which sorting algorithm does NOT rely on element-to-element comparisons?

    1. Bubble Sort
    2. Shell Sort
    3. Selection Sort
    4. Counting Sort

    Explanation: Counting Sort is not comparison-based; it uses counting and indexing. The other algorithms sort by comparing elements with each other.
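
    A Counting Sort sketch in Python: tally each value, then rebuild the output by index. There is no element-to-element comparison anywhere, and the running time is O(n + k) for values in 0..max_value:

```python
def counting_sort(a, max_value):
    """Counting sort for small non-negative integers: count occurrences
    per value, then emit each value count times, in value order."""
    counts = [0] * (max_value + 1)
    for x in a:
        counts[x] += 1                 # tally by value, no comparisons
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)
    return out

print(counting_sort([4, 2, 2, 8, 3], max_value=8))  # -> [2, 2, 3, 4, 8]
```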

  7. Binary Search Time Complexity

    What is the Big-O time complexity of binary search on a sorted array?

    1. O(log n)
    2. O(n)
    3. O(n log n)
    4. O(1)

    Explanation: Binary search halves the search interval each time, resulting in O(log n) time. O(n) applies to linear search, O(n log n) is typical for sorting, and O(1) suggests one-step lookup.
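
    The classic iterative binary search in Python; each iteration halves the remaining interval, so at most about log2(n) + 1 probes are needed:

```python
def binary_search(a, target):
    """Binary search on a sorted list: return the index of target,
    or -1 if it is absent. Each step halves the search interval."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1

data = [1, 3, 5, 7, 9, 11]
print(binary_search(data, 7))   # -> 3
print(binary_search(data, 4))   # -> -1
```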

  8. Master Theorem Application

    The master theorem is primarily used for solving what type of mathematical expressions?

    1. Differential equations
    2. Matrix multiplications
    3. Probability distributions
    4. Recurrence relations

    Explanation: The master theorem addresses recurrence relations common in recursive algorithm analysis. Differential equations, probability, and matrices are beyond its scope.
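
    As a worked example, the merge-sort recurrence T(n) = 2T(n/2) + n has a = 2, b = 2, f(n) = n, and n^(log_b a) = n, so master theorem case 2 gives T(n) = Θ(n log n). A small Python sketch checks this numerically by evaluating the recurrence and watching T(n) / (n log2 n) settle toward a constant:

```python
import math

def T(n):
    """Evaluate the recurrence T(n) = 2*T(n/2) + n with T(1) = 1,
    the shape of merge sort's cost."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# Theta(n log n) predicts this ratio approaches a constant as n grows.
for n in [2**10, 2**14, 2**18]:
    print(n, T(n) / (n * math.log2(n)))
```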

  9. Big-Theta Notation Meaning

    What does Big-Θ (Theta) notation represent in algorithm analysis?

    1. Tight bound (upper and lower)
    2. Lower bound only
    3. Worst-case bound
    4. Loose upper bound

    Explanation: Big-Theta gives a tight bound: the function is sandwiched between constant multiples of the same growth rate from above and below. (It bounds a specific function; it is not the same thing as average-case analysis.) An upper bound alone is O-notation, a lower bound alone is Omega, and a loose upper bound is O, not Θ.
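
    A concrete illustration with a made-up function f(n) = 3n^2 + 2n: for all n >= 2 it satisfies 3n^2 <= f(n) <= 4n^2, so f(n) = Θ(n^2) with constants c1 = 3, c2 = 4, n0 = 2 in the formal definition. A quick Python check:

```python
# Theta sandwich check for the example function f(n) = 3n^2 + 2n.
def f(n):
    return 3 * n * n + 2 * n

# 2n <= n^2 holds for every n >= 2, so both bounds hold from n0 = 2 on.
for n in range(2, 1000):
    assert 3 * n * n <= f(n) <= 4 * n * n

print("f(n) = Theta(n^2): bounds hold for n in [2, 1000)")
```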

  10. Effect of Doubling Input in Exponential Algorithm

    For an algorithm running in O(2ⁿ), what happens to running time when n is doubled?

    1. Grows exponentially (the time is squared)
    2. Triples
    3. Doubles
    4. Remains constant

    Explanation: Doubling n squares the running time, since 2^(2n) = (2^n)^2; even a single +1 in n already doubles it, so the growth is exponential. "Doubles" or "triples" would describe only a constant-factor change, and constant time does not apply.
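
    The arithmetic behind this can be verified in a couple of lines of Python:

```python
# For T(n) = 2**n: doubling n squares the time, and n+1 merely doubles it.
for n in [4, 8, 16]:
    assert 2 ** (2 * n) == (2 ** n) ** 2   # doubling n -> time squared
    assert 2 ** (n + 1) == 2 * 2 ** n      # +1 in n -> time doubled

print(2 ** 10, 2 ** 20)  # n = 10 vs n = 20: 1024 vs 1048576
```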