Which sorting algorithm repeatedly compares adjacent pairs and swaps them if they are out of order until the array is sorted, as would happen with [4, 2, 3] becoming [2, 3, 4] after multiple passes?
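
For reference, a minimal sketch of the adjacent-compare-and-swap behavior this question describes, written in plain Python for illustration (the function name and pass structure are this note's own, not taken from any particular textbook):

```python
def bubble_sort(arr):
    """Repeatedly compare adjacent pairs and swap any that are out of order,
    pass after pass, until the list is sorted."""
    a = list(arr)
    for end in range(len(a) - 1, 0, -1):   # each pass bubbles one largest element into place
        for i in range(end):
            if a[i] > a[i + 1]:            # adjacent pair out of order -> swap
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(bubble_sort([4, 2, 3]))  # [2, 3, 4]
```
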
After the first pass of selection sort on the array [5, 3, 4, 1], which element ends up in the first position (index 0)?
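
A sketch of what a single selection sort pass does (plain Python, illustrative only): scan the whole list for its minimum and swap it into index 0.

```python
def selection_sort_first_pass(arr):
    """One pass of selection sort: find the smallest element anywhere in the
    list and swap it into the first position (index 0)."""
    a = list(arr)
    min_idx = 0
    for i in range(1, len(a)):
        if a[i] < a[min_idx]:
            min_idx = i
    a[0], a[min_idx] = a[min_idx], a[0]
    return a

print(selection_sort_first_pass([5, 3, 4, 1]))  # [1, 3, 4, 5]
```
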
In quick sort, what is the primary role of the pivot element during the partition step, for example when partitioning an array such as [7, 2, 9, 4]?
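
One common way to implement the partition step is the Lomuto scheme sketched below (chosen here for illustration; other partition schemes exist). It shows the pivot's role: after partitioning, the pivot sits at its final sorted position, with smaller elements to its left and larger ones to its right.

```python
def partition(a, lo, hi):
    """Lomuto partition: use a[hi] as the pivot, move smaller elements to the
    left side, then place the pivot at its final sorted position."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i  # index where the pivot now sits

data = [7, 2, 9, 4]
p = partition(data, 0, len(data) - 1)
print(data, "pivot at index", p)  # [2, 4, 9, 7] pivot at index 1
```
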
When given an already sorted array like [1, 2, 3, 4, 5], which algorithm can finish after a single full pass because it detects that no swaps are needed?
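
A sketch of the early-exit (no-swap) variant, assuming plain Python: a boolean flag records whether the pass swapped anything, and a pass with no swaps means the list is already sorted.

```python
def bubble_sort_early_exit(arr):
    """Bubble sort with a no-swap check: a pass that performs no swaps proves
    the list is sorted, so the algorithm stops right away."""
    a = list(arr)
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:   # already sorted: finish after this single full pass
            break
    return a

print(bubble_sort_early_exit([1, 2, 3, 4, 5]))  # one pass, zero swaps, done
```
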
Which sorting algorithm typically uses recursion and a divide-and-conquer approach by splitting the array around a chosen element called the pivot?
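
A compact out-of-place version of that divide-and-conquer idea (illustrative; production implementations usually partition in place, as in the partition sketch above): pick a pivot, split into smaller/equal/larger, and recurse on each side.

```python
def quick_sort(a):
    """Divide and conquer: split the list around a pivot, then recursively
    sort the two sides and glue the results back together."""
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    smaller = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    larger  = [x for x in a if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)

print(quick_sort([7, 2, 9, 4]))  # [2, 4, 7, 9]
```
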
Which of the three algorithms is stable in its standard implementation, meaning that when sorting a list of records, those with equal keys keep their original relative order?
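
A small stability check using an illustrative list of (key, tag) records: the standard bubble sort compares with a strict greater-than, so it never swaps equal keys past each other and records with the same key keep their original order. (Standard selection sort, by contrast, can jump an element over an equal-keyed one during its long-distance swap.)

```python
records = [(2, "x"), (1, "a"), (1, "b"), (0, "y")]   # (key, tag); two records share key 1

def bubble_sort_by_key(items):
    a = list(items)
    for end in range(len(a) - 1, 0, -1):
        for i in range(end):
            if a[i][0] > a[i + 1][0]:   # strict '>' means equal keys are never swapped
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(bubble_sort_by_key(records))
# [(0, 'y'), (1, 'a'), (1, 'b'), (2, 'x')] -- 'a' still precedes 'b'
```
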
For an array of length n, what is the maximum number of swaps that basic selection sort performs over the entire sort?
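
An instrumented sketch (illustrative Python) that counts the swaps: selection sort performs at most one swap per outer pass, so a list of length n needs at most n - 1 swaps in total.

```python
def selection_sort_count_swaps(arr):
    """Selection sort that also counts swaps: at most one swap per outer pass,
    hence at most n - 1 swaps for n elements."""
    a = list(arr)
    swaps = 0
    for i in range(len(a) - 1):
        min_idx = i
        for j in range(i + 1, len(a)):
            if a[j] < a[min_idx]:
                min_idx = j
        if min_idx != i:
            a[i], a[min_idx] = a[min_idx], a[i]
            swaps += 1
    return a, swaps

print(selection_sort_count_swaps([5, 3, 4, 1]))  # ([1, 3, 4, 5], 1)
```
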
What is the worst-case time complexity of quick sort when poor pivot choices lead to extremely unbalanced partitions, such as always picking the smallest element in a sorted array?
What is the best-case time complexity of bubble sort with an early-exit (no-swap) check when the input list of length n is already sorted?
For large random arrays, which of the three algorithms generally offers the best average-case performance?
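
The three complexity questions above can be sanity-checked empirically. The sketch below (plain Python; the helper names, input sizes, and the simplified comparison counting are this note's own assumptions) counts comparisons on a random input and on already sorted inputs: the counts grow roughly quadratically for bubble and selection sort, like n log n for quick sort on a random input, linearly for early-exit bubble sort on sorted input, and quadratically for quick sort with a poor (first-element) pivot on sorted input.

```python
import random

def bubble_comparisons(data):
    """Early-exit bubble sort; returns how many comparisons it made."""
    a, comps = list(data), 0
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(end):
            comps += 1
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:                      # sorted input: stop after one pass
            break
    return comps

def selection_comparisons(data):
    """Selection sort; always makes n(n-1)/2 comparisons regardless of order."""
    a, comps = list(data), 0
    for i in range(len(a) - 1):
        m = i
        for j in range(i + 1, len(a)):
            comps += 1
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return comps

def quick_comparisons(data, first_pivot=False):
    """Out-of-place quick sort, counted as roughly one pivot comparison per
    element per recursion level; first_pivot=True always picks data[0]."""
    if len(data) <= 1:
        return 0
    pivot = data[0] if first_pivot else data[len(data) // 2]
    smaller = [x for x in data if x < pivot]
    larger = [x for x in data if x > pivot]
    return (len(data)
            + quick_comparisons(smaller, first_pivot)
            + quick_comparisons(larger, first_pivot))

n = 2000
rand = [random.random() for _ in range(n)]
print("random input, n =", n)
print("  bubble   :", bubble_comparisons(rand))     # ~ n^2 / 2   -> O(n^2) average
print("  selection:", selection_comparisons(rand))  # n(n-1)/2    -> O(n^2) always
print("  quick    :", quick_comparisons(rand))      # ~ n log2 n  -> O(n log n) average
print("sorted input, early-exit bubble:", bubble_comparisons(range(n)))  # n - 1 -> O(n) best case
m = 400  # kept small: this worst case recurses ~m levels deep
print("sorted input, first-element-pivot quick sort:",
      quick_comparisons(list(range(m)), first_pivot=True))  # ~ m^2 / 2 -> O(n^2) worst case
```
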