Q1
Which of the following best describes the core principle behind a greedy algorithm?
- Making the locally optimal choice at each step with the hope of finding a global optimum.
- Exploring all possible solutions to find the absolute best answer.
- Dividing the problem into smaller subproblems and solving them recursively.
- Randomly selecting a solution and iteratively improving it.
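A minimal sketch of that locally-optimal-choice idea, using change-making with standard US coin denominations; the function name and sample amount are illustrative, not taken from any particular source:

```python
def greedy_change(amount, denominations=(25, 10, 5, 1)):
    """Make change by always taking the largest coin that still fits.

    Each step commits to the locally optimal choice (the biggest coin)
    without revisiting earlier decisions. For canonical denominations
    like US coins this also happens to be globally optimal.
    """
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:
            amount -= coin
            coins.append(coin)
    return coins

print(greedy_change(63))  # [25, 25, 10, 1, 1, 1]
```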
Q2
In the Activity Selection Problem, what criterion is typically used to select the next activity?
- Earliest start time
- Shortest duration
- Earliest finish time
- Maximum profit
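A short sketch of the earliest-finish-time rule in practice; the activity tuples are made-up sample data:

```python
def select_activities(activities):
    """Greedy activity selection: sort by finish time, then repeatedly
    take the first activity that starts after the last chosen one ends."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# (start, finish) pairs; the greedy picks (1, 4), (5, 7), (8, 11)
print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (8, 11)]))
```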
Q3
What is the primary advantage of using Huffman Coding for data compression?
- Minimizing the average code length for frequently occurring characters.
- Providing strong encryption for sensitive data.
- Guaranteeing lossless compression for all types of data.
- Simplifying the encoding and decoding process significantly.
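A compact Huffman-construction sketch, assuming symbol frequencies are known up front; the frequency table and helper names are illustrative:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman tree by repeatedly merging the two least frequent
    nodes; frequent symbols end up with shorter codes."""
    tick = count()  # tie-breaker so the heap never compares tree nodes
    heap = [(f, next(tick), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
    codes = {}
    def walk(node, prefix=""):
        if isinstance(node, tuple):        # internal node: recurse both ways
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                              # leaf symbol: record its code
            codes[node] = prefix or "0"
    walk(heap[0][2])
    return codes

print(huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
```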
Q4
For the Fractional Knapsack Problem, how is the greedy approach applied?
- Selecting items in decreasing order of weight.
- Selecting items in increasing order of value.
- Selecting items in decreasing order of value-to-weight ratio.
- Selecting items randomly until the knapsack is full.
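A sketch of the value-to-weight-ratio rule; the initial sort dominates the running time, which is why O(n log n) is the usual complexity quoted for this algorithm. Item values and weights are made up:

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: sort items by value/weight ratio
    (the O(n log n) step that dominates), then take whole items while
    they fit and a fraction of the first item that does not."""
    total_value = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item, or the fraction that fits
        total_value += value * (take / weight)
        capacity -= take
    return total_value

# (value, weight) pairs with knapsack capacity 50 -> expected 240.0
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], capacity=50))
```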
Q5
In Job Sequencing with Deadlines, what factor determines the selection of jobs?
- Jobs with the earliest deadlines are prioritized.
- Jobs with the highest profits are prioritized, respecting deadlines.
- Jobs with the shortest processing times are prioritized.
- Jobs are selected randomly.
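A sketch of the profit-first rule using a simple slot array (a disjoint-set version is faster, but this quadratic form shows the idea); job names, deadlines, and profits are illustrative:

```python
def job_sequencing(jobs):
    """Greedy job sequencing: consider jobs in decreasing profit order
    and place each in the latest free slot at or before its deadline."""
    max_deadline = max(deadline for _, deadline, _ in jobs)
    slots = [None] * (max_deadline + 1)     # slots[1..max_deadline]
    total_profit = 0
    for name, deadline, profit in sorted(jobs, key=lambda j: j[2], reverse=True):
        for t in range(deadline, 0, -1):    # try the latest free slot first
            if slots[t] is None:
                slots[t] = name
                total_profit += profit
                break
    return [s for s in slots[1:] if s is not None], total_profit

# (name, deadline, profit); expected schedule ['d', 'c'] with profit 70
print(job_sequencing([("a", 2, 20), ("b", 1, 10), ("c", 2, 40), ("d", 1, 30)]))
```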
Q6
Which data structure is commonly used to efficiently implement Kruskal’s algorithm for finding the minimum spanning tree?
- Heap
- Binary Search Tree
- Disjoint Set Union (DSU)
- Linked List
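A minimal Disjoint Set Union sketch in the role it plays inside Kruskal's algorithm; the edge list is a toy example:

```python
def kruskal(num_vertices, edges):
    """Kruskal's MST: scan edges in increasing weight order and keep an
    edge only if DSU says its endpoints are still in different trees."""
    parent = list(range(num_vertices))

    def find(x):                      # path-compressing find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                  # no cycle: union the two trees
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

# (weight, u, v) edges on 4 vertices; expected total weight 6
print(kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]))
```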
Q7
What is the primary difference between Kruskal’s and Prim’s algorithms for finding minimum spanning trees?
- Kruskal’s is faster than Prim’s.
- Prim’s grows the MST from a single starting vertex, while Kruskal’s adds edges in increasing order of weight.
- Kruskal’s can handle negative edge weights, while Prim’s cannot.
- Prim’s is easier to implement than Kruskal’s.
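For contrast with the edge-sorting sketch above, a heap-based Prim's sketch that grows the tree from a single start vertex; it uses the same toy graph, expressed as an adjacency list:

```python
import heapq

def prim(adj, start=0):
    """Prim's MST: grow one tree, always adding the cheapest edge that
    connects a visited vertex to an unvisited one."""
    visited = {start}
    heap = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(heap)
    mst, total = [], 0
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        mst.append((u, v, w))
        total += w
        for nxt, nw in adj[v]:
            if nxt not in visited:
                heapq.heappush(heap, (nw, v, nxt))
    return mst, total

# Same 4-vertex graph as the Kruskal example; expected total weight 6
adj = {0: [(1, 1), (2, 4)], 1: [(0, 1), (2, 3), (3, 2)],
       2: [(0, 4), (1, 3), (3, 5)], 3: [(1, 2), (2, 5)]}
print(prim(adj))
```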
Q8
What is a common real-world application of Huffman Coding?
- Database indexing
- Image compression (JPEG)
- Network routing
- Operating system scheduling
Q9
What is the typical time complexity of the greedy Fractional Knapsack algorithm, including the initial sort?
- O(n)
- O(n log n)
- O(n^2)
- O(2^n)
Q10
Under what condition is a greedy algorithm most likely to fail in finding the optimal solution?
- When the problem has optimal substructure.
- When the problem has overlapping subproblems.
- When the locally optimal choice does not lead to a globally optimal solution.
- When the input size is very small.
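A small illustration of such a failure: applying the ratio-greedy rule from the fractional knapsack to the 0/1 knapsack, where items cannot be split. The data is contrived for the demonstration:

```python
def greedy_01_knapsack(items, capacity):
    """Ratio-greedy heuristic applied (incorrectly) to the 0/1 knapsack."""
    total = 0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if weight <= capacity:
            total += value
            capacity -= weight
    return total

# Capacity 50: greedy grabs the ratio-best item (60, 10) and then (100, 20),
# giving 160, but taking (100, 20) + (120, 30) instead would give 220.
items = [(60, 10), (100, 20), (120, 30)]
print(greedy_01_knapsack(items, 50))   # 160 -- locally optimal, globally suboptimal
```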
Q11
Why might dynamic programming be preferred over a greedy approach for certain optimization problems?
- Dynamic programming is always faster than greedy algorithms.
- Dynamic programming guarantees an optimal solution by exploring all possibilities.
- Greedy algorithms always require more memory.
- Dynamic programming is simpler to implement.
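A concrete case where exploring all possibilities pays off: minimum-coin change with the non-canonical denominations {1, 3, 4}. The largest-coin greedy rule is suboptimal here, while a simple DP table finds the true minimum; function names are illustrative:

```python
def dp_min_coins(amount, denominations):
    """Bottom-up DP: best[a] = fewest coins needed to make amount a,
    considering every denomination rather than just the largest."""
    INF = float("inf")
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for coin in denominations:
            if coin <= a and best[a - coin] + 1 < best[a]:
                best[a] = best[a - coin] + 1
    return best[amount]

def greedy_min_coins(amount, denominations):
    """Largest-coin-first heuristic; not optimal for these denominations."""
    coins = 0
    for coin in sorted(denominations, reverse=True):
        coins += amount // coin
        amount %= coin
    return coins

# Amount 6 with coins {1, 3, 4}: greedy takes 4+1+1 (3 coins), DP finds 3+3 (2 coins)
print(greedy_min_coins(6, [1, 3, 4]), dp_min_coins(6, [1, 3, 4]))
```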
Q12
What is the 'greedy choice property'?
- The property that an optimal solution to the problem can always be found by making a series of locally optimal choices.
- The property that the problem can be divided into smaller, overlapping subproblems.
- The property that the problem has many possible solutions.
- The property that the problem is NP-complete.
Q13
Which of the following problems is LEAST suitable for solving with a greedy algorithm?
- Fractional Knapsack
- Activity Selection
- Traveling Salesperson Problem (TSP)
- Huffman Coding
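A small demonstration of why TSP resists a greedy attack: on this contrived 4-city distance matrix, the nearest-neighbor heuristic returns a tour of length 13, while brute force finds one of length 6:

```python
from itertools import permutations

# Contrived symmetric distance matrix on cities 0..3
DIST = [
    [0, 1, 2, 10],
    [1, 0, 1, 2],
    [2, 1, 0, 1],
    [10, 2, 1, 0],
]

def nearest_neighbor_tour(start=0):
    """Greedy: always visit the closest unvisited city next."""
    tour, unvisited = [start], set(range(len(DIST))) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: DIST[tour[-1]][c])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour, sum(DIST[a][b] for a, b in zip(tour, tour[1:] + [start]))

def brute_force_tour(start=0):
    """Check every ordering of the remaining cities."""
    rest = [c for c in range(len(DIST)) if c != start]
    best = min(permutations(rest),
               key=lambda p: sum(DIST[a][b] for a, b in zip((start,) + p, p + (start,))))
    tour = [start, *best]
    return tour, sum(DIST[a][b] for a, b in zip(tour, tour[1:] + [start]))

print(nearest_neighbor_tour())  # ([0, 1, 2, 3], 13)
print(brute_force_tour())       # ([0, 1, 3, 2], 6)
```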
Q14
What is a common strategy for proving the correctness of a greedy algorithm?
- Exhaustive search
- Mathematical induction, showing that the greedy choice always leads to an optimal partial solution.
- Empirical testing with large datasets
- Randomized simulations
Q15
Suppose a new problem seems suitable for a greedy approach. What is a critical first step in determining whether this is truly viable?
- Immediately implement the greedy algorithm and test it.
- Focus on optimizing the data structures for speed.
- Attempt to prove that the greedy choice property holds and that an optimal substructure exists.
- Compare the problem to known dynamic programming problems.