Challenge your understanding of search algorithm time complexities with this focused quiz, designed to assess your ability to distinguish between various algorithmic approaches and their performance. Learn about linear, binary, and specialized search strategies by analyzing real-world examples and theoretical concepts.
What is the worst-case time complexity of linear search on an unsorted array containing n elements?
Explanation: The worst-case time complexity of linear search on an unsorted array is O(n) because each element must be checked one by one until the target is found or the end of the array is reached. O(log n) applies to binary search on sorted arrays, not linear search. O(1) would require locating the target in constant time, which a sequential scan cannot guarantee. O(n log n) is typical of comparison-based sorting algorithms, not simple search methods.
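To make the linear scan concrete, here is a minimal sketch in Python (the `linear_search` name is illustrative, not from any particular library):

```python
def linear_search(arr, target):
    """Return the index of target, or -1 if absent. O(n) worst case."""
    for i, value in enumerate(arr):  # may have to touch every element
        if value == target:
            return i
    return -1

# Worst case: the target is last or missing, so all n elements are checked.
print(linear_search([7, 3, 9, 1], 1))   # found only at the final index
print(linear_search([7, 3, 9, 1], 5))   # absent: full scan, returns -1
```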
Which condition must be met for binary search to have its optimal O(log n) time complexity on an array?
Explanation: Binary search requires that the array is sorted so it can correctly halve the search interval at each step, which guarantees O(log n) performance. Unique elements are not required for the algorithm to work. Circular or sparse arrays are not preconditions for binary search's optimal performance and may actually complicate a standard binary search.
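A minimal iterative sketch (assuming a Python list sorted in ascending order) shows why sortedness matters: each comparison safely discards half of the remaining interval.

```python
def binary_search(sorted_arr, target):
    """Return the index of target in a sorted list, or -1. O(log n)."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        if sorted_arr[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 9))  # halving is valid: array is sorted
```

If the array were unsorted, the "discard half" step would be unsound and the result meaningless, which is exactly why sortedness is the required condition.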
Suppose you are searching for a key in a hash table without any collisions. What is the expected time complexity for a successful search?
Explanation: When there are no collisions in a hash table, searching is performed in constant time, O(1), because the index calculation leads directly to the key's location. O(n) complexity typically appears in linear searches or worst-case hash table scenarios with many collisions. O(log n) is common for balanced tree structures, while O(n^2) does not apply to hash table search.
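Python's built-in dict is a hash table, so the collision-free case can be sketched directly: the lookup hashes the key and jumps straight to its slot.

```python
# A dict lookup computes hash(key) and indexes directly to the stored value,
# giving expected O(1) time for a successful search.
table = {"alice": 1, "bob": 2, "carol": 3}

print(table["bob"])      # direct index calculation, no scan over other keys
print("carol" in table)  # membership testing is likewise expected O(1)
```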
What is the worst-case time complexity of searching for a value in a balanced binary search tree containing n nodes?
Explanation: A balanced binary search tree (BST) ensures that the height is minimized, so each search operation takes at most O(log n) time by halving the search space at each step. O(n) would be correct for unbalanced trees that degrade into lists. O(n^2) does not describe BST search, and O(1) is only achievable with a data structure like a perfect hash table, not a tree.
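The halving behavior can be sketched with a tiny hand-built tree (the `Node` class and `bst_search` helper here are illustrative, not from a library):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def bst_search(node, target):
    """Each comparison descends one level, discarding half the remaining nodes."""
    while node is not None:
        if target == node.key:
            return True
        # go left for smaller keys, right for larger ones
        node = node.left if target < node.key else node.right
    return False

# A balanced tree of 3 nodes:  4
#                             / \
#                            2   6
root = Node(4, Node(2), Node(6))
print(bst_search(root, 6))  # True: found within the tree's height
print(bst_search(root, 5))  # False: reached a leaf without a match
```

Because the tree is balanced, its height is about log2(n), so the loop runs at most O(log n) times; a degenerate (list-like) tree would make the same loop O(n).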
Which search algorithm commonly exhibits O(2^n) exponential time complexity in its worst case, particularly on large search spaces without pruning?
Explanation: Brute-force search often checks every possible combination, leading to O(2^n) complexity for problems with binary choices, such as subset or combinatorial problems. Breadth-first and depth-first search usually run in time polynomial in the size of the graph, not exponential, unless the state space they explore is itself exponentially large. Jump search is used for sorted lists and has O(√n) complexity, far better than exponential.
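As an illustration, a brute-force subset-sum check (the `subset_sum_exists` helper is hypothetical) enumerates all 2^n subsets in the worst case:

```python
from itertools import combinations

def subset_sum_exists(nums, target):
    """Try every subset of nums: 2^n candidates for n elements, so O(2^n)."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):  # all subsets of size r
            if sum(combo) == target:
                return True
    return False

print(subset_sum_exists([3, 34, 4, 12, 5], 9))  # 4 + 5 = 9, so True
print(subset_sum_exists([3, 34, 4], 100))       # no subset works: False
```

A negative answer forces the function to examine every one of the 2^n subsets, which is the exponential blow-up the question describes.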