Big-O Analysis of Arrays & Strings with Hash Maps: Efficiency in Frequency, Deduplication, and Two-Sum Quiz

Test your knowledge of applying Big-O notation to arrays and strings with hash maps, focusing on frequency counting, deduplication, the two-sum problem, and understanding time and space complexities. This quiz helps you master performance trade-offs and optimization strategies for common algorithmic tasks.

  1. Time Complexity of Hash Map Frequency Count

    What is the typical time complexity of counting the frequency of each character in a string of length n using a hash map?

    1. O(log n)
    2. O(n^2)
    3. O(n)
    4. O(1)

    Explanation: Using a hash map to count character frequencies requires a single pass through the string, giving O(n) time complexity. O(log n) and O(1) are incorrect because they underestimate the need to examine every character. O(n^2) would apply only if a nested loop were used, which is unnecessary here.
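
    To make the single-pass idea concrete, here is a minimal Python sketch; the helper name char_frequencies is illustrative rather than part of the quiz.

    ```python
    def char_frequencies(s: str) -> dict[str, int]:
        """Count how often each character appears: one pass over the string, O(n) time."""
        counts: dict[str, int] = {}
        for ch in s:                             # every character is visited exactly once
            counts[ch] = counts.get(ch, 0) + 1   # average O(1) hash-map update
        return counts

    # char_frequencies("banana") -> {'b': 1, 'a': 3, 'n': 2}
    # collections.Counter("banana") yields the same counts via the standard library.
    ```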

  2. Space Complexity of Deduplication with Hash Sets

    After deduplicating an array of n elements using a hash set, what is the space complexity in terms of n?

    1. O(n^2)
    2. O(1)
    3. O(n)
    4. O(log n)

    Explanation: Storing each unique element from the array in a hash set requires up to O(n) additional space. O(1) is unrealistic since the set could grow linearly with the input. O(n^2) is excessive and not needed, while O(log n) does not account for potentially n distinct elements.
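
    A minimal Python sketch of hash-set deduplication; the helper name dedupe is illustrative.

    ```python
    def dedupe(items: list[int]) -> list[int]:
        """Remove duplicates while preserving first-seen order.
        Time: O(n) on average; space: up to O(n) when all elements are unique."""
        seen: set[int] = set()
        result: list[int] = []
        for x in items:
            if x not in seen:        # average O(1) membership check
                seen.add(x)          # the set (and result) can grow to n entries
                result.append(x)
        return result

    # dedupe([3, 1, 3, 2, 1]) -> [3, 1, 2]
    ```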

  3. Two-Sum Problem Time Complexity

    When solving the two-sum problem for an array using a hash map, what is the most efficient achievable time complexity?

    1. O(n)
    2. O(n log n)
    3. O(log n)
    4. O(n^2)

    Explanation: A hash map enables you to check for complements in one pass, yielding O(n) time. O(n^2) arises from a brute-force approach, which is slower. O(log n) and O(n log n) are incorrect as they reflect binary search and sorting complexities, not hash map lookups.
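
    A minimal Python sketch of the one-pass hash-map approach; the helper name two_sum is illustrative.

    ```python
    def two_sum(nums: list[int], target: int) -> tuple[int, int] | None:
        """Return indices of two numbers that sum to target, or None.
        One pass with a value -> index map: O(n) time, O(n) space."""
        seen: dict[int, int] = {}
        for i, x in enumerate(nums):
            complement = target - x
            if complement in seen:        # average O(1) lookup
                return seen[complement], i
            seen[x] = i
        return None

    # two_sum([2, 7, 11, 15], 9) -> (0, 1)
    ```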

  4. Frequency Counting Space Complexity

    What is the worst-case space complexity when counting element frequencies in an array of n integers using a hash map?

    1. O(1)
    2. O(log n)
    3. O(n^2)
    4. O(n)

    Explanation: The hash map may need to store up to n distinct keys if all elements are unique, giving O(n) space. O(1) is only true if the set of possible keys is constant-sized, which is not always the case. O(n^2) and O(log n) do not correspond to the storage pattern of this technique.

  5. Deduplication In-Place Trade-off

    What is the primary trade-off when deduplicating an array in-place (without extra space) compared to using a hash set?

    1. In-place uses more space but less time
    2. In-place sorting always has better performance
    3. In-place increases time complexity but saves space
    4. In-place reduces both time and space

    Explanation: Removing duplicates in-place typically requires sorting or nested loops, which increases time complexity but avoids extra space, as the sketch below shows. Reducing both time and space is generally not possible; there is usually a trade-off. An in-place approach that uses more space contradicts the meaning of in-place, and in-place sorting does not always outperform other methods.
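
    One common in-place strategy, sketched below in Python, is to sort first and then compact with a write pointer; note that it sacrifices the original element order. The helper name dedupe_in_place is illustrative.

    ```python
    def dedupe_in_place(nums: list[int]) -> int:
        """Deduplicate without an auxiliary hash set: sort, then compact.
        Time: O(n log n) because of the sort; extra space: O(1) beyond the input."""
        nums.sort()                        # sorting dominates the running time
        write = 0
        for read in range(len(nums)):
            if read == 0 or nums[read] != nums[write - 1]:
                nums[write] = nums[read]   # keep the first copy of each run of duplicates
                write += 1
        return write                       # the first `write` slots hold the unique values

    # nums = [3, 1, 3, 2, 1]; k = dedupe_in_place(nums); nums[:k] -> [1, 2, 3]
    ```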

  6. Impact of Hash Collisions

    How do frequent hash collisions affect the average-case time complexity of hash map operations in algorithms like two-sum?

    1. Collisions can lead to slower lookups, approaching O(n) per operation
    2. Collisions guarantee better performance
    3. Collisions make operations constant time
    4. Collisions have no effect on complexity

    Explanation: If many elements hash to the same location, operations degrade toward O(n) per lookup. Collisions do affect complexity and can slow things down. Constant time is only typical when collisions are rare or well-handled. Performance does not improve with more collisions.
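
    The degradation can be demonstrated in Python by forcing every key into the same bucket; the BadKey class below is a deliberately pathological illustration, not something to use in practice.

    ```python
    class BadKey:
        """A key type whose hash is constant, so every key collides."""
        def __init__(self, value: int):
            self.value = value
        def __hash__(self) -> int:
            return 0                       # all instances land in the same bucket
        def __eq__(self, other) -> bool:
            return isinstance(other, BadKey) and self.value == other.value

    # Each insert must be compared against every key already in the bucket,
    # so building this dict does roughly O(n^2) work overall, i.e. the
    # per-operation cost degrades from average O(1) toward O(n).
    d = {BadKey(i): i for i in range(10_000)}   # much slower than plain int keys
    ```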

  7. Using Hash Maps vs. Sorting for Deduplication

    When deduplicating a large unsorted array, why might a hash map approach be preferred over sorting and removing duplicates?

    1. Sorting is always faster than hash maps
    2. Hash maps require nested loops
    3. Sorting always uses less space
    4. Hash maps usually achieve better time complexity for unsorted data

    Explanation: A hash map can deduplicate unsorted data in O(n) average time, while comparison-based sorting requires O(n log n). Sorting in place may save space in some languages, but not always, and the hash-map approach needs no nested loops. For this problem, sorting is generally not faster than a well-implemented hash map.

  8. Best Space Complexity for Frequency Counting in Small Alphabets

    For a string containing only lowercase English letters, what is the best possible space complexity of frequency counting?

    1. O(n^2)
    2. O(log n)
    3. O(1)
    4. O(n)

    Explanation: The number of possible keys (26 letters) is constant, so the space needed is O(1) regardless of the string length. O(n) applies only when the set of possible keys can grow with the input, such as an unbounded alphabet. O(log n) and O(n^2) do not reflect the true space for such a limited alphabet.
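
    A minimal Python sketch that replaces the hash map with a fixed array of 26 counters; the helper name letter_frequencies is illustrative.

    ```python
    def letter_frequencies(s: str) -> list[int]:
        """Count lowercase English letters with a fixed-size array: O(1) space.
        Assumes s contains only 'a'..'z'; the array never grows with the input."""
        counts = [0] * 26                  # one slot per letter of the alphabet
        for ch in s:
            counts[ord(ch) - ord('a')] += 1
        return counts

    # letter_frequencies("banana")[0] -> 3   (index 0 is the count for 'a')
    ```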

  9. Optimizing Two-Sum for Sorted Arrays

    What is an efficient approach for the two-sum problem on a sorted array, and its typical time complexity?

    1. Hash map in O(n log n) time
    2. Binary search on each element in O(n^2) time
    3. Two pointers in O(n) time
    4. Brute-force in O(n^2) time

    Explanation: With a sorted array, two pointers can find a valid pair in O(n) time and O(1) extra space. A hash map also achieves O(n) time but needs O(n) extra space, so it offers no advantage here. Brute force requires O(n^2) comparisons, and binary search on each element takes O(n log n), so both are less efficient than the linear two-pointer scan.
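
    A minimal Python sketch of the two-pointer scan; the helper name two_sum_sorted is illustrative.

    ```python
    def two_sum_sorted(nums: list[int], target: int) -> tuple[int, int] | None:
        """Two-pointer scan over a sorted array: O(n) time, O(1) extra space."""
        lo, hi = 0, len(nums) - 1
        while lo < hi:
            s = nums[lo] + nums[hi]
            if s == target:
                return lo, hi
            if s < target:
                lo += 1                    # need a larger sum: advance the low pointer
            else:
                hi -= 1                    # need a smaller sum: retreat the high pointer
        return None

    # two_sum_sorted([1, 3, 4, 6, 8], 10) -> (2, 3)   since 4 + 6 == 10
    ```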

  10. Choosing Hash Maps: When is Space Most Costly?

    In which scenario does using a hash map for frequency counting come with the greatest space cost?

    1. When all elements are the same
    2. When the input is very small
    3. When there are only two possible unique values
    4. When the number of unique elements is close to the input size

    Explanation: If each element is unique, the hash map may require O(n) space, making it more costly. When all elements are the same or there are only two unique values, the space remains small. For tiny inputs, space is not a concern regardless of uniqueness.