Explore essential concepts of concurrency and parallelism in Haskell, including lightweight threads, synchronization, parallel strategies, and common libraries. This quiz helps you assess your understanding of safe concurrent patterns, practical API usage, and the distinction between concurrency and parallelism in functional programming.
Which abstraction allows multiple computations to make progress independently in Haskell, often using lightweight threads?
Explanation: Concurrency allows multiple computations to progress independently, which is fundamental in Haskell for handling tasks such as network servers or user interactions. Recursion is a technique for repeating computations, not coordinating multiple actions. Memoization refers to caching results of computations, and backtracking is useful for exploring multiple possibilities but isn’t specifically concerned with simultaneous computation.
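For instance, two independent IO actions can make progress at the same time with the widely used `async` package (an assumed dependency here, since it is not part of base); a minimal sketch:

```haskell
import Control.Concurrent.Async (concurrently)
import Control.Exception (evaluate)

-- Run two independent computations concurrently and collect both results.
main :: IO ()
main = do
  (x, y) <- concurrently
    (evaluate (sum [1 .. 1000000 :: Int]))
    (evaluate (length (show (product [1 .. 2000 :: Integer]))))
  print (x, y)
```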
Which statement best describes parallelism as opposed to concurrency in the context of Haskell?
Explanation: Parallelism in Haskell aims to improve performance by carrying out many calculations at the same time, often by taking advantage of multiple CPU cores. Unlike concurrency, which is about structuring programs to manage numerous tasks, parallelism is specifically about actual simultaneous execution. Suspending calculations describes lazy evaluation, and parallelism is not restricted to network programming.
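As a sketch of the difference, the `Eval` monad from the `parallel` package (an assumed dependency) lets two independent sums be evaluated on separate cores when the program is compiled with `-threaded` and run with `+RTS -N`:

```haskell
import Control.Parallel.Strategies (runEval, rpar, rseq)

-- The two half-sums are independent, so the runtime may compute them
-- simultaneously; the result is the same either way, only faster.
halves :: (Int, Int)
halves = runEval $ do
  a <- rpar (sum [1 .. 5000000])
  b <- rpar (sum [5000001 .. 10000000])
  _ <- rseq a
  _ <- rseq b
  pure (a, b)
```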
In Haskell, what term refers to the lightweight, user-level threads commonly created for concurrent activities?
Explanation: Green threads are lightweight, managed in user space by the Haskell runtime, and allow for efficient concurrent programming. Blue threads are not a recognized Haskell term. Fiber nodes and link cells are unrelated to thread abstractions in Haskell. Green threads let Haskell achieve effective concurrency without creating one operating-system thread per task; the runtime multiplexes many green threads over a small pool of OS threads.
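A quick illustration of how cheap they are (a sketch; the exact per-thread cost depends on the GHC version):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (replicateM_)

-- Spawning 100,000 green threads is routine in GHC; each is scheduled
-- by the runtime rather than mapped to its own OS thread.
main :: IO ()
main = do
  replicateM_ 100000 (forkIO (threadDelay 1000000))
  putStrLn "spawned 100,000 lightweight threads"
```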
Which Haskell strategy function allows a list to be evaluated in parallel, such as when summing a large list of numbers?
Explanation: The 'parList' strategy is used in Haskell to evaluate list elements in parallel, distributing the workload across cores. 'catList' and 'threadList' do not exist, while 'forkList' is not a standard strategy function for parallel evaluation. 'parList' takes a strategy for each element (such as 'rseq') and helps process large lists efficiently.
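A minimal sketch of a parallel sum (the 'costly' function is a hypothetical stand-in for real per-element work; assumes the `parallel` package):

```haskell
import Control.Parallel.Strategies (parList, rseq, using)

-- Evaluate each mapped element in its own spark, then sum the results.
parSum :: [Int] -> Int
parSum xs = sum (map costly xs `using` parList rseq)
  where
    costly n = n * n  -- hypothetical placeholder for an expensive computation
```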
Which Haskell primitive ensures safe communication between threads by providing an initially empty box that multiple threads can read from or write to?
Explanation: MVar is a mutable variable in Haskell that allows for safe communication and synchronization between threads. CVar, LVar, and RVar are not the standard primitives for this purpose in Haskell, with MVar being the most common for protecting shared data. Only MVar provides the classic 'full or empty' box for inter-thread communication.
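The 'full or empty' behavior in a minimal sketch:

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

main :: IO ()
main = do
  box <- newEmptyMVar                -- the box starts empty
  _ <- forkIO (putMVar box "hello")  -- a worker thread fills it
  msg <- takeMVar box                -- blocks until the box is full
  putStrLn msg
```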
What Haskell function is most commonly used to spawn a new lightweight thread to execute an IO action concurrently?
Explanation: The 'forkIO' function creates a new lightweight thread for concurrent execution of IO actions in Haskell. 'parIO', 'spawnIO', and 'runThread' are not standard functions for this purpose. Using 'forkIO' allows concurrent tasks like handling multiple clients in a server program.
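A sketch of spawning workers with 'forkIO' (the final 'threadDelay' is a crude pause for illustration; real code would synchronize with an MVar or the `async` package):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (forM_)

main :: IO ()
main = do
  -- Spawn three lightweight threads; their output may interleave.
  forM_ [1 .. 3 :: Int] $ \n ->
    forkIO (putStrLn ("handling client " ++ show n))
  threadDelay 100000  -- crude wait so workers finish before main exits
```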
When two Haskell threads access the same mutable value, what mechanism can help prevent data races?
Explanation: MVar provides a safe way to synchronize access to shared mutable state and prevent data races. Ignoring synchronization, or sharing mutable state without restriction, leads to unsafe behavior. Using only pure functions avoids the need for mutable state entirely, but when mutation is necessary, proper synchronization (such as with MVar) is essential.
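A sketch of a race-free shared counter using 'modifyMVar_' (again, the trailing 'threadDelay' is only an illustrative stand-in for proper completion tracking):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.MVar (newMVar, modifyMVar_, readMVar)
import Control.Monad (replicateM_)

main :: IO ()
main = do
  counter <- newMVar (0 :: Int)
  -- Each thread takes the MVar, updates it, and puts it back, so no
  -- two increments can interleave.
  replicateM_ 10 (forkIO (modifyMVar_ counter (pure . (+ 1))))
  threadDelay 100000          -- crude wait, for illustration only
  readMVar counter >>= print  -- prints 10
```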
Which combinator is commonly used to suggest parallel evaluation of two expressions in pure Haskell code?
Explanation: The 'par' combinator hints to the runtime to evaluate expressions in parallel. 'seq' forces evaluation of its first argument but does not suggest parallelism. 'fork' and 'wait' are not combinators for pure parallel evaluation in Haskell, making 'par' the correct choice here.
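The classic sketch is a naive parallel Fibonacci (assumes the `parallel` package, `-threaded`, and `+RTS -N`):

```haskell
import Control.Parallel (par, pseq)

-- `par` sparks `a` for possible parallel evaluation, while `pseq`
-- forces `b` first so both halves are ready when they are combined.
parFib :: Int -> Int
parFib n
  | n < 2     = n
  | otherwise = a `par` (b `pseq` (a + b))
  where
    a = parFib (n - 1)
    b = parFib (n - 2)
```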
What is a common bug that can occur when mutable state is shared without proper synchronization in concurrent Haskell programs?
Explanation: A race condition occurs when multiple threads access or modify shared mutable state without adequate synchronization, leading to unpredictable results. Infinite loops and stack overflows are separate programming issues, and type errors are caught by the compiler and not specific to concurrency or parallelism.
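A deliberately unsafe sketch that can exhibit the race (an atomic update such as atomicModifyIORef', or an MVar, would fix it):

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (replicateM_)
import Data.IORef (newIORef, readIORef, writeIORef)

-- UNSAFE on purpose: the read and the write are separate steps, so two
-- threads can read the same value and one increment is then lost.
main :: IO ()
main = do
  ref <- newIORef (0 :: Int)
  replicateM_ 1000 $ forkIO $ do
    n <- readIORef ref
    writeIORef ref (n + 1)
  threadDelay 100000
  readIORef ref >>= print  -- may print less than 1000
```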
Which of the following Haskell libraries is commonly used to implement parallelism in pure computations, for example, mapping functions over large data structures?
Explanation: Control.Parallel provides combinators for suggesting parallel evaluation of pure computations, making it suitable for processing large lists or arrays. Data.List is for list manipulation and offers no parallelism features. System.IO handles input and output operations, and Control.Concurrent deals with concurrency, not pure parallelism.
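For example, 'parMap' from Control.Parallel.Strategies (in the same `parallel` package) maps a pure function over a list while fully evaluating each result in its own spark:

```haskell
import Control.Parallel.Strategies (parMap, rdeepseq)

-- Each square is computed in a separate spark and forced to normal form.
squares :: [Int]
squares = parMap rdeepseq (\n -> n * n) [1 .. 100000]
```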