Explore essential concepts of algorithmic growth by identifying time and space complexities, interpreting Big-O notation, and understanding their practical implications for performance. This quiz challenges your understanding of common Big-O cases and helps you analyze efficiency in computing.
Given a function where an outer loop runs n times and, for each iteration, an inner loop also runs n times, what is the overall time complexity?
Explanation: When both the outer and inner loops run n times, their combined number of operations is n times n, resulting in O(n^2) complexity. O(2n) is incorrect because it suggests linear growth, not the quadratic increase caused by nested loops. O(log n) is for logarithmic scenarios, such as binary search. O(n + log n) implies a linear loop with an additional logarithmic operation, which does not fit this case.
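As an illustration (a minimal sketch, not part of the quiz itself), counting the operations performed by two nested loops shows the quadratic pattern directly:

```python
def count_nested_ops(n):
    # Count one unit of work per inner-loop iteration.
    ops = 0
    for _ in range(n):       # outer loop: n iterations
        for _ in range(n):   # inner loop: n iterations per outer pass
            ops += 1         # constant-time work
    return ops

# n = 10 performs 10 * 10 = 100 operations; doubling n quadruples the count.
```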
If a function creates a new array of size n from input, what is the space complexity?
Explanation: Creating an array of size n requires storage that grows linearly with n, giving O(n) space complexity. O(1) describes constant space usage, which does not apply when storing n items. O(n^2) indicates quadratic growth, which would arise, for example, from allocating an n-by-n two-dimensional array. O(n log n) describes space usage that scales faster than linear but slower than quadratic, which a single array of size n does not require.
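A quick sketch (hypothetical, for illustration only) of a function whose auxiliary space is O(n) because it allocates a new list as long as its input:

```python
def doubled_copy(values):
    # Allocates a new list with the same length n as the input,
    # so the extra space used grows linearly with n: O(n).
    return [v * 2 for v in values]

# The output list always has exactly len(values) elements.
```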
For linear search in an unsorted array of n elements, what Big-O notation correctly describes the worst-case time complexity?
Explanation: The worst-case for linear search requires checking every element, resulting in O(n) time complexity, as all n elements might have to be examined. O(1) describes constant time, which applies only if the element is found immediately. O(log n) fits algorithms like binary search in sorted data. O(n^2) is excessive for a single scan and applies to nested operations.
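For reference, a minimal linear search sketch (not taken from the quiz) that makes the best-case and worst-case behavior visible:

```python
def linear_search(items, target):
    # Scan left to right; the worst case examines all n elements: O(n) time.
    for i, item in enumerate(items):
        if item == target:
            return i    # best case: found at the front, effectively O(1)
    return -1           # worst case: target absent after n comparisons
```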
Which of the following time complexities grows fastest as n becomes very large?
Explanation: O(2^n) is exponential growth, and it outpaces polynomial and logarithmic complexities as n increases, making it the fastest growing of these choices. O(n^2) is quadratic and grows slower than exponential but faster than linear. O(n log n) is common in efficient sorting algorithms and grows faster than linear but slower than quadratic. O(n) is linear and the slowest among these.
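The ordering of these growth rates can be checked numerically; the snippet below (an illustrative sketch, with n = 30 chosen arbitrarily) evaluates each expression at the same input size:

```python
import math

# Evaluate each complexity class at a moderately large n.
n = 30
growth = {
    "O(n)": n,                        # 30
    "O(n log n)": n * math.log2(n),   # ~147
    "O(n^2)": n ** 2,                 # 900
    "O(2^n)": 2 ** n,                 # ~1.07 billion
}
# The exponential term dwarfs every polynomial and linearithmic term.
```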
If an algorithm’s runtime doubles whenever the input size doubles, which Big-O time complexity best describes this behavior?
Explanation: Runtime that doubles as the input size doubles grows in direct proportion to n, which is the hallmark of linear growth, O(n). O(log n) would show only a small constant increase when the input doubles. O(n^2) would quadruple the runtime, not double it, when the input doubles. O(2^n) implies exponential growth, where doubling the input squares the runtime, escalating far faster than what is described.
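This proportional behavior can be sketched by counting the operations of a simple linear pass (a hypothetical illustration, not part of the quiz):

```python
def count_linear_ops(n):
    # One constant-time operation per element: linear, O(n).
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

# Doubling the input size doubles the operation count exactly.
```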