Big-O in Real Tasks: Time and Space Efficiency Quiz

Deepen your understanding of time and space complexity with this quiz focused on practical Big-O implications in real-world programming tasks. Analyze efficiency, spot costly algorithms, and distinguish subtle complexity differences to write robust, efficient code.

  1. Searching an Unsorted Array

    When searching for an element in an unsorted list of 1,000,000 items using a linear search, what is the worst-case time complexity?

    1. O(log n)
    2. O(1)
    3. O(n^2)
    4. O(n)

    Explanation: In linear search, every element may need to be checked, resulting in O(n) time complexity where n is the list size. O(log n) applies to algorithms like binary search, which require sorted data. O(1) suggests constant time, which no search over unsorted data can guarantee in the worst case. O(n^2) appears in algorithms with nested loops, not simple linear scans.
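For illustration, a minimal linear search in Python (the helper name `linear_search` is our own). In the worst case the target is absent or in the last position, so all n elements are examined:

```python
def linear_search(items, target):
    # Worst case: target is absent or last, so every one of
    # the n elements is checked -> O(n) time.
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1  # not found after scanning all n elements

print(linear_search([7, 3, 9, 3, 5], 9))   # found at index 2
print(linear_search([7, 3, 9, 3, 5], 42))  # absent: full scan, returns -1
```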

  2. Choosing a Sorting Algorithm for Space Efficiency

    Given limited memory, which sorting algorithm is best suited for sorting a large array in-place, minimizing extra space usage?

    1. Counting Sort
    2. Merge Sort
    3. Bubble Sort
    4. Selection Sort

    Explanation: Selection sort uses constant extra space (O(1)) because it operates in-place, needing only a few index variables rather than additional storage. Merge sort typically uses O(n) extra space, which is undesirable when memory is limited. Bubble sort is also in-place, but it typically performs far more swaps than selection sort. Counting sort is not in-place; it allocates extra space for its counting array.
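A minimal in-place selection sort sketch in Python. Note that only index variables are allocated, never a second array, so the extra space is O(1):

```python
def selection_sort(arr):
    # Sorts arr in place: repeatedly select the minimum of the
    # unsorted suffix and swap it into position. Extra space is
    # O(1) -- just the loop indices, no auxiliary array.
    n = len(arr)
    for i in range(n):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```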

  3. Analyzing Nested Loops for Efficiency

    If a function uses two nested loops, each going from 1 to n, what is the time complexity of this function?

    1. O(2n)
    2. O(log n)
    3. O(n)
    4. O(n^2)

    Explanation: Each of the n iterations of the outer loop triggers n iterations of the inner loop, resulting in O(n*n), or O(n^2), time complexity. O(n) describes a single loop, not nested loops. O(log n) would indicate much slower growth in the number of operations, far fewer than nested loops perform. O(2n) simplifies to O(n), which is linear and does not apply to nested loops.
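A quick sketch that counts the iterations of two nested loops (the counter function `count_iterations` is our own name for illustration). For input n, the body runs exactly n * n times:

```python
def count_iterations(n):
    # Outer loop runs n times; for each outer pass the inner
    # loop also runs n times, so the body executes n * n times
    # in total -> O(n^2).
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

print(count_iterations(10))  # 100 iterations, i.e. 10 * 10
```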

  4. Space Complexity of Recursive Algorithms

    What is the space complexity of a recursive algorithm that makes a single recursive call per invocation for a problem size of n?

    1. O(1)
    2. O(log n)
    3. O(n)
    4. O(n^2)

    Explanation: Each recursive call adds a new layer to the call stack, leading to O(n) space usage for n calls. O(1) applies to algorithms that use constant space regardless of input size. O(n^2) is too high for a linear recursion. O(log n) appears in divide-and-conquer approaches where the recursion depth is minimized, which is not the case here.
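A minimal example of such a recursion in Python (the function name `recursion_depth` is illustrative). Each call waits on one further call, so n frames accumulate on the call stack before any of them return, giving O(n) space:

```python
def recursion_depth(n):
    # One recursive call per invocation: for input n, n stack
    # frames pile up before the recursion unwinds -> O(n) space.
    if n == 0:
        return 0
    return 1 + recursion_depth(n - 1)

print(recursion_depth(50))  # 50 frames were live at the deepest point
```

By contrast, an iterative loop computing the same result would use O(1) space, since no call stack grows.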

  5. Hash Table Lookup Operations

    What is the average-case time complexity for searching an element in a hash table with efficient hashing and few collisions?

    1. O(log n)
    2. O(n)
    3. O(n^2)
    4. O(1)

    Explanation: With a good hash function and few collisions, hash table lookup is a constant-time operation, or O(1) on average. O(n) would indicate scanning the entire table, which occurs only under severe collisions. O(n^2) is not typical for lookups. O(log n) can appear in balanced trees, not hash tables.
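As a sketch, Python's built-in dict is a hash table: a lookup hashes the key once and probes its bucket, so the average cost does not grow with the table size (the key names below are our own illustration):

```python
# Build a hash table with 100,000 entries; a lookup hashes the
# key and probes its bucket -- O(1) on average, no linear scan.
table = {f"key{i}": i for i in range(100_000)}

print(table["key99999"])       # direct lookup by key
print("key99999" in table)     # membership test, also O(1) on average
print(table.get("missing"))    # absent key: returns None, still no scan
```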