Assess your understanding of randomized algorithms, the Monte Carlo method, expected outcomes, and error analysis. This quiz provides practical examples and core concepts for learners interested in probabilistic computing techniques.
In simulating the value of π using the Monte Carlo method, why does increasing the number of random points lead to a more accurate estimate?
Explanation: Increasing the number of random points in a Monte Carlo simulation relies on the law of large numbers: as the sample grows, the fraction of points landing inside the quarter circle converges to π/4, so the sample average approaches the true expected value and the estimate of π improves. More points do not decrease randomness (option B), the algorithm remains probabilistic rather than deterministic (option C), and the points are uniformly distributed over the square rather than clustering near the circle's center (option D).
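The idea can be sketched in a few lines of Python (the function name `estimate_pi` and the fixed seed are illustrative choices, not part of the quiz):

```python
import random

def estimate_pi(num_points, seed=0):
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that fall inside the quarter circle."""
    rng = random.Random(seed)  # fixed seed only for reproducibility
    inside = 0
    for _ in range(num_points):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # (area of quarter circle) / (area of square) = pi / 4
    return 4 * inside / num_points

# More points -> estimate closer to pi, per the law of large numbers
for n in (100, 10_000, 1_000_000):
    print(n, estimate_pi(n))
```

Running it shows the estimate tightening around 3.14159 as `n` grows, while individual runs with few points fluctuate widely.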
Which statement best distinguishes a Las Vegas algorithm from a Monte Carlo algorithm?
Explanation: Las Vegas algorithms are randomized algorithms that produce correct results with varying runtime, but never incorrect answers. Option B is wrong because Las Vegas algorithms do not always run in constant time. Option C incorrectly describes Monte Carlo algorithms, which may give wrong answers but with bounded runtime. Option D is unrelated and not a property of Las Vegas algorithms.
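A toy contrast in Python may make the distinction concrete (both functions and their names are hypothetical examples): the Las Vegas version keeps probing until it finds the target, so its answer is always correct but its runtime is random, while the Monte Carlo version caps its probes, bounding runtime at the cost of possibly missing the target.

```python
import random

def las_vegas_find(arr, target, seed=0):
    """Las Vegas: probe random indices until the target is found.
    The answer is always correct; only the number of probes varies.
    Assumes the target is present (otherwise it would loop forever)."""
    rng = random.Random(seed)
    probes = 0
    while True:
        i = rng.randrange(len(arr))
        probes += 1
        if arr[i] == target:
            return i, probes

def monte_carlo_find(arr, target, max_probes, seed=0):
    """Monte Carlo: probe a fixed number of times (bounded runtime);
    may return None even when the target is actually present."""
    rng = random.Random(seed)
    for _ in range(max_probes):
        i = rng.randrange(len(arr))
        if arr[i] == target:
            return i
    return None  # possibly a wrong answer, but runtime is bounded
```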
When using a randomized pivot in Quicksort, how does this impact the algorithm's expected runtime on a sorted array as compared to using a fixed pivot?
Explanation: Randomly choosing the pivot in Quicksort avoids the worst-case behavior triggered by sorted input, bringing the expected runtime down to O(n log n). Option B is wrong because the expected space complexity remains O(log n) due to recursion. Randomizing the pivot does not affect stability (option C); stability depends on the partitioning scheme, not the pivot choice. Option D is incorrect, as randomness ensures good expected performance but does not guarantee the best case every time.
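A minimal in-place sketch with a random pivot (Lomuto partitioning; the fixed seed is only for reproducibility and is not part of the technique) might look like:

```python
import random

def randomized_quicksort(arr, lo=0, hi=None, rng=None):
    """In-place quicksort with a uniformly random pivot.
    Randomizing the pivot makes the expected runtime O(n log n) even
    on sorted input, where a fixed first/last pivot degrades to O(n^2)."""
    if hi is None:
        hi = len(arr) - 1
    if rng is None:
        rng = random.Random(0)
    if lo < hi:
        # Swap a randomly chosen pivot into the last position
        p = rng.randrange(lo, hi + 1)
        arr[p], arr[hi] = arr[hi], arr[p]
        pivot = arr[hi]
        i = lo - 1
        for j in range(lo, hi):      # Lomuto partition
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[hi] = arr[hi], arr[i + 1]
        randomized_quicksort(arr, lo, i, rng)       # left of pivot
        randomized_quicksort(arr, i + 2, hi, rng)   # right of pivot
    return arr
```

With a fixed first-element pivot, a sorted input would produce maximally unbalanced partitions at every level; the random swap makes that adversarial case vanishingly unlikely.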
Suppose a Monte Carlo algorithm has an individual error probability of 0.2. If the algorithm is run independently five times and outputs the majority result, what is the main benefit of this approach?
Explanation: Running the Monte Carlo algorithm multiple times and taking the majority output concentrates the result around the correct answer, statistically lowering the chance of an overall wrong answer; with a per-run error of 0.2, the probability that three or more of five independent runs err is only about 0.058. Option B is incorrect because the error is reduced, not eliminated. Option C is wrong because each run remains probabilistic. Option D is the opposite of the actual outcome.
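The error reduction can be computed exactly as a binomial tail; this short sketch (the helper name `majority_error` is illustrative) evaluates it for the quiz's numbers:

```python
from math import comb

def majority_error(p_err, runs):
    """Probability that a majority of independent runs are wrong,
    given each run errs independently with probability p_err."""
    need = runs // 2 + 1  # wrong answers needed for a wrong majority
    return sum(comb(runs, k) * p_err**k * (1 - p_err)**(runs - k)
               for k in range(need, runs + 1))

print(majority_error(0.2, 1))  # 0.2: a single run errs 20% of the time
print(majority_error(0.2, 5))  # ~ 0.0579: five-run majority vote
```

So five independent runs shrink the error from 20% to under 6%, and more repetitions drive it down exponentially without ever reaching zero.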
In the randomized algorithm for finding the minimum cut in a connected undirected graph, what is the main risk associated with the contraction step?
Explanation: Contracting an edge from the minimum cut merges nodes that should stay in separate partitions, possibly causing the optimal cut to be missed. Option B is incorrect because the algorithm never eliminates all vertices. Option C is false as it does not find the maximum cut. Option D is also incorrect since the contraction operation is efficient, not exponential in time.
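A compact sketch of the contraction algorithm (Karger's algorithm) in Python, using a simple union-find to track merged super-nodes (the function names, test graph, and seeded repetition count are illustrative):

```python
import random

def karger_min_cut(edges, seed=0):
    """One run of Karger's contraction: repeatedly contract a random
    crossing edge until two super-nodes remain; the surviving crossing
    edges form a cut. Risk: contracting an edge of the minimum cut
    merges nodes that should stay separated, so a single run may
    return a larger cut than the optimum."""
    rng = random.Random(seed)
    edges = list(edges)
    parent = {}  # union-find parent pointers for super-nodes

    def find(v):
        while parent.setdefault(v, v) != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    remaining = len({v for e in edges for v in e})
    while remaining > 2:
        u, v = edges[rng.randrange(len(edges))]
        ru, rv = find(u), find(v)
        if ru != rv:               # skip self-loops inside a super-node
            parent[ru] = rv        # contract: merge the two super-nodes
            remaining -= 1
    # Cut size = number of edges crossing the two remaining super-nodes
    return sum(1 for u, v in edges if find(u) != find(v))

def min_cut(edges, runs=50):
    """Repeat independent contractions and keep the smallest cut,
    boosting the probability of hitting the true minimum."""
    return min(karger_min_cut(edges, seed=s) for s in range(runs))
```

For example, on two triangles joined by a single bridge edge, any one run may merge across the bridge and report a cut of 2 or 3, but the minimum over many runs recovers the true cut of 1 with high probability.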