Test your understanding of Big-O notation, time complexity, and space complexity with practical scenarios. This beginner-friendly quiz covers foundational concepts and helps clarify how computational efficiency impacts real-world tasks.
If an algorithm returns the first element of a list of any size, what is its time complexity?
Explanation: Accessing the first element of a list is a constant-time operation, so the time complexity is O(1). O(n) and O(n^2) would indicate linear or quadratic time, which arise only when the work grows with the number of elements processed. O(log n) is typical of divide-and-conquer operations like binary search, not direct access.
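A minimal Python sketch of this idea (the helper name first_element is illustrative): the single indexing step costs the same whether the list has three items or a million.

```python
def first_element(items):
    """Return the first element; one operation regardless of list size."""
    return items[0]  # O(1): no loop, no dependence on len(items)

print(first_element([42, 7, 99]))          # 42
print(first_element(list(range(10**6))))   # 0 -- still a single step
```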
What is the time complexity of checking each item in a list of 1000 items once?
Explanation: Checking each item once involves a single operation per element, so the complexity is O(n). O(n^2) would involve nested loops, O(1) is for a single step, and O(log n) applies to reducing the set size each step, not full traversal.
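A small sketch of a single pass (the function contains_negative is a made-up example): the loop body runs once per element, so the cost scales linearly with the input.

```python
def contains_negative(items):
    """Visit every element exactly once: O(n) time."""
    for value in items:   # n iterations for a list of n items
        if value < 0:
            return True
    return False

print(contains_negative(list(range(1000))))  # up to 1000 checks: O(n)
```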
Given two nested loops, each iterating over a list of n items, what is the time complexity?
Explanation: Two nested loops each running n times will cause a total of n * n operations, giving O(n^2) complexity. O(n) is for a single loop, O(log n) reflects logarithmic growth, and O(1) is constant time, none of which fit this nested structure.
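A sketch of the nested-loop pattern (count_equal_pairs is a hypothetical example): the inner loop runs n times for each of the outer loop's n iterations.

```python
def count_equal_pairs(items):
    """Compare every element against every element: n * n = O(n^2) comparisons."""
    count = 0
    for a in items:        # outer loop: n iterations
        for b in items:    # inner loop: n iterations per outer pass
            if a == b:
                count += 1
    return count

print(count_equal_pairs([1, 2, 2, 3]))  # 6 matches, found via 4 * 4 = 16 comparisons
```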
When performing a linear search on an array and storing only the search index, what is the space complexity?
Explanation: A linear search only requires a constant amount of storage, usually for the index and possible target value, so the extra space is O(1). O(n) would be required if we stored all or a portion of the array. O(n^2) and O(log n) do not apply here, as there is no nested or logarithmic storage requirement.
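A sketch of linear search illustrating the space claim: apart from the input itself, the only storage is the loop index, which is O(1) extra space.

```python
def linear_search(items, target):
    """O(1) extra space: only the index (and target) live outside the input."""
    for index in range(len(items)):  # the index is the only bookkeeping stored
        if items[index] == target:
            return index
    return -1

print(linear_search([5, 3, 8, 1], 8))  # 2
```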
What is the space complexity when making a full copy of a list with n items?
Explanation: Copying n items requires new space proportional to their number, resulting in O(n). O(1) would apply only if no extra space were used, O(log n) understates the space required, and O(n^2) is an overestimate for a single copy.
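A quick illustration in Python: a slice copy allocates a second list the same size as the first.

```python
original = list(range(1000))

copy = original[:]   # allocates a new 1000-element list: O(n) extra space
copy[0] = -1         # mutating the copy leaves the original intact

print(original[0], copy[0])  # 0 -1
```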
For linear search, what is the best-case time complexity when the target value is the first item?
Explanation: If the first element matches the target in linear search, only one comparison is needed, making it O(1). O(n) is the overall worst case for searching through the entire list. O(log n) and O(n^2) are incorrect for a single comparison in this context.
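A sketch that makes the best case visible by counting comparisons (the counting wrapper is illustrative, not part of any standard library):

```python
def linear_search_counting(items, target):
    """Return (index, comparisons) so best and worst cases are observable."""
    comparisons = 0
    for index, value in enumerate(items):
        comparisons += 1
        if value == target:
            return index, comparisons
    return -1, comparisons

print(linear_search_counting([7, 2, 9], 7))  # (0, 1): one comparison, O(1) best case
print(linear_search_counting([2, 9, 7], 7))  # (2, 3): full scan, O(n) worst case
```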
Why does a binary search require the data to be sorted to achieve O(log n) time complexity?
Explanation: Binary search divides the search range in half with each step, which only works if the data is sorted; otherwise there is no way to know which half can safely be discarded. Sorted data doesn't inherently use less memory. Unsorted data doesn't make scanning slower as such; it simply forces an O(n) linear search instead. The remaining distractor describes linear search, not binary search.
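A minimal sketch of binary search showing why order matters: each branch discards half the range, and that discard is only valid on sorted input.

```python
def binary_search(sorted_items, target):
    """Halve the search range each step: O(log n), but only on sorted input."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target can only be in the upper half
        else:
            hi = mid - 1   # target can only be in the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 9))  # 4
# On unsorted data the halving step may discard the half that holds the
# target, so the result is unreliable -- sorting makes the discard safe.
```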
Which scenario is most likely to use O(n^2) space complexity?
Explanation: A two-dimensional n by n grid needs storage for each cell, resulting in O(n^2) space. Processing one element at a time uses only O(1) or O(n), binary search uses O(1) or O(log n) space, and in-place list reversal uses constant space.
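A short illustration of the grid case: an n-by-n structure stores one value per cell, so the allocation is quadratic in n.

```python
n = 4
# An n-by-n grid stores one value per cell: n * n cells = O(n^2) space.
grid = [[0] * n for _ in range(n)]

total_cells = sum(len(row) for row in grid)
print(total_cells)  # 16, i.e. n squared
```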
If a recursive function calls itself n times before finishing, what is the space complexity due to the call stack?
Explanation: Each recursive call adds a new frame to the call stack, resulting in O(n) space if there are n calls. O(1) applies to iterative tasks, O(n^2) is too high, and O(log n) would occur with divide-and-conquer recursion like binary search but not simple recursion through all elements.
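A sketch of simple linear recursion (countdown is an illustrative name): every call waits on the next, so all n frames are alive at the deepest point.

```python
def countdown(n):
    """Each call waits on the next, so n frames sit on the stack: O(n) space."""
    if n == 0:
        return 0
    return 1 + countdown(n - 1)  # the frame for n stays alive until n-1 finishes

print(countdown(100))  # 100, using roughly 100 stack frames at peak depth
```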
What is the average-case time complexity for searching for a key in a hash table with good hash distribution?
Explanation: With a well-distributed hash function, searching for a key in a hash table is O(1) on average. O(n) could occur only with poor hash functions or many collisions. O(n^2) and O(log n) do not accurately describe typical hash table search.
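A quick demonstration using Python's built-in dict, which is a hash table: lookups go straight to a bucket rather than scanning every key.

```python
# Python's dict is a hash table; with a good hash, lookups average O(1).
ages = {"ada": 36, "alan": 41, "grace": 85}

print(ages["grace"])   # 85 -- direct bucket lookup, no scan of all keys
print("ada" in ages)   # True -- membership testing is also O(1) on average
```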
What is the worst-case time complexity of the bubble sort algorithm on a list of size n?
Explanation: Bubble sort repeatedly compares adjacent pairs of elements, resulting in O(n^2) time in the worst case. O(n) is possible only in the best case, when the list is already sorted and an early-exit check stops the passes. O(log n) is unrelated to sorting by repeated comparison, and O(1) cannot describe sorting an arbitrary list.
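A minimal bubble sort sketch: in the worst case (a reverse-sorted list) it makes about n passes of up to n-1 adjacent comparisons each.

```python
def bubble_sort(items):
    """Worst case: ~n passes of up to n-1 adjacent comparisons -> O(n^2)."""
    items = list(items)  # sort a copy
    n = len(items)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):            # compare adjacent pairs
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:                       # early exit gives the O(n) best case
            break
    return items

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```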
If you concatenate n strings, each of length 1, into a new string, what is the space complexity?
Explanation: Storing the resulting concatenated string requires O(n) space. O(1) is too small since every character needs memory. O(n^2) would arise only if the concatenation were nested, producing a result of quadratic length. O(log n) does not fit storage that grows in proportion to the input size.
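A short illustration: joining n one-character strings produces a single string of n characters.

```python
pieces = ["a"] * 1000        # n strings, each of length 1

result = "".join(pieces)     # the final string holds n characters: O(n) space
print(len(result))           # 1000
```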
What is the average or amortized time complexity of appending n elements to a dynamic array?
Explanation: Dynamic arrays use O(1) amortized time for each append, as resizing happens infrequently and the total cost is spread out. O(n) would be the worst case for individual resizes, but over many appends the average is O(1). O(log n) and O(n^2) are too high for basic appending.
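A rough way to observe this with CPython's list, which is a dynamic array (sys.getsizeof reports the list's current allocation; the exact resize points are an implementation detail and may vary by version):

```python
import sys

data = []
last_size = sys.getsizeof(data)
for i in range(32):
    data.append(i)                        # O(1) amortized per append
    size = sys.getsizeof(data)
    if size != last_size:                 # a resize: the occasional O(n) step
        print(f"resized at length {len(data)} ({last_size} -> {size} bytes)")
        last_size = size
```

Because the capacity grows geometrically, resizes become rarer as the list grows, which is why the total cost of n appends stays O(n), or O(1) each on average.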
Which search algorithm is most efficient on a large, sorted list in terms of time complexity?
Explanation: Binary search has O(log n) time complexity, making it the most efficient choice for a large sorted list. Linear search is O(n), which is slower at scale. 'Bubble search' is not a real algorithm; it is a distractor. 'Logarithmic search' sounds plausible, but the standard name for the O(log n) algorithm is binary search.
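In practice you rarely hand-roll this in Python; the standard-library bisect module performs the binary search for you, as in this small sketch:

```python
import bisect

sorted_data = list(range(0, 1_000_000, 2))  # a large sorted list of even numbers

# bisect_left performs a binary search in O(log n) comparisons.
index = bisect.bisect_left(sorted_data, 123_456)
found = index < len(sorted_data) and sorted_data[index] == 123_456
print(index, found)  # 61728 True
```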
What is the typical extra space complexity of an in-place sorting algorithm like selection sort?
Explanation: In-place sorting algorithms like selection sort swap elements within the original array, using only a few extra variables for temporary values, so the extra space is O(1). O(n) would be needed to create a new array, O(log n) arises in some divide-and-conquer sorts from recursion depth, and O(n^2) would imply building large auxiliary structures, which in-place sorts avoid.
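A minimal selection sort sketch: it rearranges the input array directly, and the only extra storage is a pair of index variables.

```python
def selection_sort(items):
    """Sort in place: only index variables are extra, so O(1) auxiliary space."""
    n = len(items)
    for i in range(n):
        smallest = i
        for j in range(i + 1, n):          # find the minimum of the unsorted tail
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]  # swap in place

arr = [64, 25, 12, 22, 11]
selection_sort(arr)
print(arr)  # [11, 12, 22, 25, 64]
```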
Why is it important to consider both time and space complexity when designing an algorithm for large datasets?
Explanation: Considering both time and space complexity helps ensure your solution works efficiently without exceeding the memory and time limits of real systems. Code typos are not addressed by Big-O analysis. O(1) is ideal but not always attainable, and considering Big-O can't guarantee a program never crashes.