Concurrency and Parallelism in Haskell Quiz

Explore essential concepts of concurrency and parallelism in Haskell, including lightweight threads, synchronization, parallel strategies, and common libraries. This quiz helps you assess your understanding of safe concurrent patterns, practical API usage, and distinctions between concurrency and parallelism in functional programming.

  1. Understanding Concurrency Basics

    Which abstraction allows multiple computations to make progress independently in Haskell, often using lightweight threads?

    1. Memoization
    2. Backtracking
    3. Concurrency
    4. Recursion

    Explanation: Concurrency allows multiple computations to progress independently, which is fundamental in Haskell for handling tasks such as network servers or user interactions. Recursion repeats a computation rather than coordinating several at once. Memoization caches the results of computations, and backtracking explores alternative solutions sequentially rather than running computations independently.
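
    For illustration, here is a minimal sketch of two computations progressing independently on lightweight threads (the worker names, step counts, and delays are illustrative choices, not part of any fixed pattern):

      import Control.Concurrent (forkIO, threadDelay)
      import Control.Monad (forM_)

      main :: IO ()
      main = do
        _ <- forkIO $ forM_ [1 .. 3 :: Int] $ \i -> do
          putStrLn ("worker A, step " ++ show i)
          threadDelay 100000            -- 0.1 s pause lets the other thread run
        _ <- forkIO $ forM_ [1 .. 3 :: Int] $ \i -> do
          putStrLn ("worker B, step " ++ show i)
          threadDelay 100000
        threadDelay 1000000             -- crude wait; GHC ends child threads when main exits

    The interleaved output differs from run to run, which is exactly the point: each thread makes progress independently.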

  2. Distinguishing Concurrency and Parallelism

    Which statement best describes parallelism as opposed to concurrency in the context of Haskell?

    1. Parallelism is limited to network programming only.
    2. Parallelism refers to managing many tasks but not necessarily at the same time.
    3. Parallelism means suspending calculations to save resources.
    4. Parallelism is about performing many calculations simultaneously to speed up computation.

    Explanation: Parallelism in Haskell aims to improve performance by carrying out many calculations at the same time, often by taking advantage of multiple CPU cores. Unlike concurrency, which is about structuring programs to manage numerous tasks, parallelism is specifically about actual simultaneous execution. Suspending calculations describes lazy evaluation, and parallelism is not restricted to network programming.

  3. Lightweight Threads in Haskell

    In Haskell, what term refers to the lightweight, user-level threads commonly created for concurrent activities?

    1. Fiber nodes
    2. Blue threads
    3. Link cells
    4. Green threads

    Explanation: Green threads are lightweight threads managed in user space by the Haskell runtime, allowing efficient concurrent programming. 'Blue threads' is not a recognized Haskell term, and fiber nodes and link cells are unrelated to thread abstractions in Haskell. Green threads let Haskell achieve effective concurrency without dedicating an operating-system thread to each task.
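
    As a sketch of how cheap these green threads are, spawning ten thousand of them is routine (the thread count and the MVar-based completion signal are illustrative choices):

      import Control.Concurrent (forkIO)
      import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
      import Control.Monad (replicateM)

      main :: IO ()
      main = do
        dones <- replicateM 10000 $ do
          done <- newEmptyMVar
          _ <- forkIO (putMVar done ())   -- each thread just signals completion
          pure done
        mapM_ takeMVar dones              -- wait for all 10,000 threads
        putStrLn "all green threads finished"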

  4. Parallel List Evaluation

    Which Haskell strategy function allows a list to be evaluated in parallel, such as when summing a large list of numbers?

    1. forkList
    2. threadList
    3. parList
    4. catList

    Explanation: The 'parList' strategy, from Control.Parallel.Strategies, is used in Haskell to evaluate the elements of a list in parallel, distributing the workload across cores. 'catList' and 'threadList' do not exist, while 'forkList' is not a standard strategy function for parallel evaluation. Using 'parList' helps process large lists efficiently.
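
    A minimal sketch using parList from Control.Parallel.Strategies (the 'expensive' helper is a stand-in for real work; compile with -threaded and run with +RTS -N to actually use multiple cores):

      import Control.Parallel.Strategies (parList, rseq, using)

      expensive :: Int -> Int
      expensive n = sum [1 .. n]          -- stand-in for a costly computation

      main :: IO ()
      main = do
        let xs = map expensive [100000 .. 100100] `using` parList rseq
        print (sum xs)                    -- elements are evaluated in parallel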

  5. Safe Communication between Threads

    Which Haskell primitive ensures safe communication between threads by providing an initially empty box that multiple threads can read from or write to?

    1. RVar
    2. LVar
    3. MVar
    4. CVar

    Explanation: MVar is a mutable variable in Haskell that allows for safe communication and synchronization between threads. CVar, LVar, and RVar are not the standard primitives for this purpose in Haskell, with MVar being the most common for protecting shared data. Only MVar provides the classic 'full or empty' box for inter-thread communication.
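
    A minimal sketch of the 'full or empty' box in action (the delay and message are illustrative):

      import Control.Concurrent (forkIO, threadDelay)
      import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

      main :: IO ()
      main = do
        box <- newEmptyMVar               -- the box starts empty
        _ <- forkIO $ do
          threadDelay 500000              -- simulate work in another thread
          putMVar box "result"            -- fill the box
        msg <- takeMVar box               -- blocks until the box is full
        putStrLn ("received: " ++ msg)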

  6. Spawning Concurrent Computations

    What Haskell function is most commonly used to spawn a new lightweight thread to execute an IO action concurrently?

    1. runThread
    2. parIO
    3. forkIO
    4. spawnIO

    Explanation: The 'forkIO' function creates a new lightweight thread for concurrent execution of IO actions in Haskell. 'parIO', 'spawnIO', and 'runThread' are not standard functions for this purpose. Using 'forkIO' allows concurrent tasks like handling multiple clients in a server program.
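
    Here is a sketch of forkIO spawning one lightweight thread per task, loosely mimicking a server handling several clients at once (the client names and handler are hypothetical):

      import Control.Concurrent (forkIO, threadDelay)

      handleClient :: String -> IO ()
      handleClient name = do
        threadDelay 200000                -- pretend to serve a request
        putStrLn ("served " ++ name)

      main :: IO ()
      main = do
        mapM_ (forkIO . handleClient) ["alice", "bob", "carol"]
        threadDelay 1000000               -- crude wait so main doesn't exit first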

  7. Ensuring Data Race Safety

    When two Haskell threads access the same mutable value, what mechanism can help prevent data races?

    1. Using MVar for synchronization
    2. Unrestricted sharing of variables
    3. Ignoring synchronization
    4. Using only pure functions

    Explanation: MVar provides a safe way to synchronize access to shared mutable state and prevent data races. Ignoring synchronization or unrestricted sharing leads to unsafe behaviors. Using only pure functions avoids the need for mutable state entirely, but when mutation is necessary, proper synchronization (like with MVar) is essential.
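
    A sketch of MVar-based synchronization: many threads increment one shared counter, and modifyMVar_ makes each read-modify-write atomic (the thread count is an illustrative choice):

      import Control.Concurrent (forkIO)
      import Control.Concurrent.MVar
        (modifyMVar_, newEmptyMVar, newMVar, putMVar, readMVar, takeMVar)
      import Control.Monad (replicateM)

      main :: IO ()
      main = do
        counter <- newMVar (0 :: Int)
        dones <- replicateM 100 $ do
          done <- newEmptyMVar
          _ <- forkIO $ do
            modifyMVar_ counter (pure . (+ 1))   -- atomic increment
            putMVar done ()
          pure done
        mapM_ takeMVar dones
        readMVar counter >>= print               -- reliably prints 100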

  8. Parallel Computations in Pure Code

    Which combinator is commonly used to suggest parallel evaluation of two expressions in pure Haskell code?

    1. fork
    2. wait
    3. seq
    4. par

    Explanation: The 'par' combinator, exported by Control.Parallel, hints to the runtime that an expression may be worth evaluating in parallel. 'seq' is used to enforce evaluation order but does not suggest parallelism. 'fork' and 'wait' are not combinators for pure parallel evaluation in Haskell, making 'par' the correct choice here.
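
    A minimal sketch of par (paired with pseq from the same module) sparking one half of a pure computation while forcing the other (fib and the argument sizes are illustrative):

      import Control.Parallel (par, pseq)

      fib :: Int -> Integer
      fib n | n < 2     = fromIntegral n
            | otherwise = fib (n - 1) + fib (n - 2)

      parSum :: Int -> Int -> Integer
      parSum x y = a `par` (b `pseq` (a + b))   -- spark a, force b, then combine
        where
          a = fib x
          b = fib y

      main :: IO ()
      main = print (parSum 30 31)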

  9. Common Concurrency Mistakes

    What is a common bug that can occur when mutable state is shared without proper synchronization in concurrent Haskell programs?

    1. Stack overflow
    2. Infinite loop
    3. Type error
    4. Race condition

    Explanation: A race condition occurs when multiple threads access or modify shared mutable state without adequate synchronization, leading to unpredictable results. Infinite loops and stack overflows are separate programming issues, and type errors are caught by the compiler and not specific to concurrency or parallelism.
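
    A sketch of the bug itself: a non-atomic read-modify-write on an IORef can lose increments under contention, while an atomic update does not (compile with -threaded and run with +RTS -N to make the race observable; the exact count may vary by run):

      import Control.Concurrent (forkIO)
      import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)
      import Control.Monad (replicateM)
      import Data.IORef (newIORef, readIORef, writeIORef)

      main :: IO ()
      main = do
        ref <- newIORef (0 :: Int)
        dones <- replicateM 1000 $ do
          done <- newEmptyMVar
          _ <- forkIO $ do
            n <- readIORef ref            -- racy: another thread can run
            writeIORef ref (n + 1)        -- between this read and this write
            -- safe alternative: Data.IORef.atomicModifyIORef' ref (\x -> (x + 1, ()))
            putMVar done ()
          pure done
        mapM_ takeMVar dones
        readIORef ref >>= print           -- may print less than 1000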

  10. Choosing the Right Library

    Which of the following Haskell libraries is commonly used to implement parallelism in pure computations, for example, mapping functions over large data structures?

    1. System.IO
    2. Control.Concurrent
    3. Data.List
    4. Control.Parallel

    Explanation: Control.Parallel (together with its companion module Control.Parallel.Strategies) provides combinators for suggesting parallel evaluation of pure computations, making it suitable for processing large lists or arrays. Data.List is for list manipulation without any parallelism features. System.IO handles input and output operations, and Control.Concurrent deals with concurrency, not pure parallelism.
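
    For illustration, a sketch using parMap from Control.Parallel.Strategies, the strategies layer that ships in the same 'parallel' package as Control.Parallel (the 'slowSquare' helper is a stand-in for real work; compile with -threaded and run with +RTS -N):

      import Control.Parallel.Strategies (parMap, rdeepseq)

      slowSquare :: Int -> Int
      slowSquare n = sum (replicate n n)  -- deliberately heavy way to compute n * n

      main :: IO ()
      main = print (sum (parMap rdeepseq slowSquare [1000 .. 2000]))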