Time and Space Complexity Analysis Quiz

  1. Hash Table Operation Complexity

    What is the average-case time complexity for inserting, deleting, or searching an element in a well-designed hash table?

    1. O(1)
    2. O(n)
    3. O(log n)
    4. O(n^2)
    5. O(n log n)
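
    As a concrete illustration for question 1, here is a minimal sketch using Python's built-in dict, which is implemented as a hash table; each operation shown averages O(1) time.

    ```python
    # Python's dict is a hash table: insert, search, and delete each
    # average O(1) time, independent of how many items are stored.
    table = {}

    table["alice"] = 42     # insert: hash the key, write its slot
    value = table["alice"]  # search: hash the key, read its slot
    del table["alice"]      # delete: hash the key, clear its slot
    ```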
  2. Big O Notation Fundamentals

    Which Big O notation describes an algorithm whose running time doubles for every additional element in the input?

    1. O(2^n)
    2. O(n log n)
    3. O(n^2)
    4. O(n)
    5. O(log n)
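
    To make question 2's doubling behaviour concrete, the sketch below enumerates every subset of its input, a textbook O(2^n) computation: each additional element doubles the number of subsets, and hence the work.

    ```python
    # Enumerating all subsets is O(2^n): each new element doubles
    # the number of subsets produced so far.
    def all_subsets(items):
        subsets = [[]]
        for item in items:
            # every existing subset spawns a copy that includes `item`
            subsets += [s + [item] for s in subsets]
        return subsets

    print(len(all_subsets([1, 2, 3])))     # 8  = 2^3
    print(len(all_subsets([1, 2, 3, 4])))  # 16 = 2^4: one more element, double the work
    ```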
  3. Worst-Case Hash Table Behavior

    What is the worst-case time complexity for searching in a poorly designed hash table due to many collisions?

    1. O(n)
    2. O(1)
    3. O(n log n)
    4. O(log n)
    5. O(n^2)
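
    Question 3's worst case is easy to reproduce with a deliberately bad hash function. The sketch below is a hypothetical chained table (BadHashTable is invented for illustration, not any real library) whose hash sends every key to the same bucket, so search degenerates into a linear scan.

    ```python
    # A pathological hash table: every key lands in bucket 0, so one
    # chain grows to length n and search degrades to O(n).
    class BadHashTable:
        def __init__(self):
            self.buckets = [[] for _ in range(8)]

        def _hash(self, key):
            return 0  # all keys collide

        def insert(self, key, value):
            self.buckets[self._hash(key)].append((key, value))

        def search(self, key):
            # linear scan of the one overloaded chain
            for k, v in self.buckets[self._hash(key)]:
                if k == key:
                    return v
            return None

    t = BadHashTable()
    for i in range(1000):
        t.insert(i, i * i)
    print(t.search(999))  # found only after scanning the 999 earlier entries
    ```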
  4. Space Complexity of Hash Tables

    Why do hash tables typically use more memory than simple arrays?

    1. They keep extra space for empty slots to reduce collisions
    2. They need to store every key twice
    3. They require logarithmic space for each element
    4. All elements are sorted, which takes more memory
    5. They only store keys, not values
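
    The trade-off behind question 4 can be observed directly in Python: a dict keeps slack slots to hold its load factor down, so it occupies several times the memory of a list with the same contents (exact byte counts vary across interpreter versions).

    ```python
    import sys

    items = list(range(1000))
    as_list = items                     # compact pointer array
    as_dict = {i: None for i in items}  # hash table over the same 1000 keys

    print(sys.getsizeof(as_list))  # array storage only
    print(sys.getsizeof(as_dict))  # noticeably larger: empty slots are reserved
    ```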
  5. Calculating Time Complexity in Loops

    Given a loop that runs 'n' times and performs a simple operation each time, what is its time complexity?

    1. O(n)
    2. O(1)
    3. O(n^2)
    4. O(log n)
    5. O(n log n)
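
    Question 5 can be checked by counting operations directly; the sketch below contrasts a single pass over n elements with a nested pass, which performs n times more work.

    ```python
    # Counting operations: one loop over n is linear, a nested loop quadratic.
    def single_loop(n):
        ops = 0
        for _ in range(n):  # n iterations, constant work each
            ops += 1
        return ops

    def nested_loop(n):
        ops = 0
        for _ in range(n):
            for _ in range(n):  # n inner iterations per outer iteration
                ops += 1
        return ops

    print(single_loop(100))  # 100    -> O(n)
    print(nested_loop(100))  # 10000  -> O(n^2)
    ```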
  6. Improving Dynamic Programming Space Usage

    Which technique helps reduce the space complexity of a dynamic programming table in tree reconciliation problems?

    1. Storing only necessary mapping sites using filters
    2. Sorting the input data beforehand
    3. Duplicating all internal nodes
    4. Using recursion instead of iteration
    5. Increasing tree height artificially
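
    As a generic illustration of question 6's filtering idea only (a hypothetical sketch, not the reconciliation literature's actual algorithm): a dense n-by-n dynamic programming table is replaced by a dictionary that stores just the cells passing a feasibility test. The `feasible` predicate and its band width below are invented for demonstration.

    ```python
    # Sparse DP storage: keep only cells that pass a filter, instead of
    # allocating the full n*n table. The filter here is hypothetical.
    def feasible(i, j):
        return abs(i - j) <= 2  # pretend mapping sites cluster near the diagonal

    n = 1000
    sparse_table = {}
    for i in range(n):
        for j in range(n):
            if feasible(i, j):            # store only necessary mapping sites
                sparse_table[(i, j)] = 0  # placeholder cost value

    # The scan above is still O(n^2) time; the saving is in space:
    print(len(sparse_table))  # ~5n cells kept instead of n^2 = 1,000,000
    ```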
  7. Tree Model and Complexity

    For trees generated under the Yule model, what is the expected worst-case space complexity for the improved node mapping algorithm?

    1. O(n^{1.42})
    2. O(n^2)
    3. O(n^{0.42})
    4. O(n^{1.58})
    5. O(n log n)
  8. Practical Time Complexity Reduction

    Why is it beneficial to achieve sub-quadratic time complexity when reconciling very large trees?

    1. It allows analysis of much larger datasets within available memory and reasonable time
    2. It always guarantees correctness regardless of input
    3. It increases the number of evolutionary events found
    4. It makes the tree taller and more balanced
    5. It eliminates the need for any preprocessing
  9. Average Number of Mapping Sites

    In improved node mapping algorithms, what is the average number of mapping sites per parasite node for trees generated under the Yule model?

    1. n^{0.42}
    2. n^{1.42}
    3. n^{2.17}
    4. n^{0.58}
    5. n
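
    Questions 7 and 9 are two views of one quantity: if each of the n parasite nodes keeps on average n^{0.42} mapping sites, as the quiz suggests for the Yule model, the total expected storage follows from a one-line product:

        n × n^{0.42} = n^{1 + 0.42} = n^{1.42}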
  10. Identifying Algorithmic Bottlenecks

    As datasets grow, which resource is typically the biggest bottleneck for coevolutionary analysis algorithms with quadratic space complexity?

    1. Main memory (RAM)
    2. CPU clock speed
    3. Disk space
    4. Network bandwidth
    5. Number of recursive calls
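
    A back-of-the-envelope calculation for question 10 (assuming a hypothetical 8 bytes per table cell) shows why memory, not CPU time, gives out first under quadratic space.

    ```python
    # Rough memory estimate for an n-by-n table versus a sub-quadratic one.
    n = 100_000            # e.g., trees with 100k nodes each
    bytes_per_cell = 8     # assume one 64-bit value per cell

    quadratic = n ** 2 * bytes_per_cell
    subquadratic = int(n ** 1.42) * bytes_per_cell  # the n^{1.42} bound above

    print(f"O(n^2)    table: {quadratic / 1e9:.0f} GB")     # 80 GB: beyond most RAM
    print(f"O(n^1.42) table: {subquadratic / 1e6:.0f} MB")  # ~101 MB: easily fits
    ```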