OS Concurrency Quiz: Processes and Threads Fundamentals

Challenge your understanding of operating system concurrency, focusing on the core concepts of processes, threads, context switching, and synchronization. This quiz is designed to help learners review key concurrency principles commonly covered in introductory operating systems courses.

  1. Difference Between Process and Thread

    Which statement best describes a primary difference between a process and a thread in operating systems?

    1. A process has its own memory space, while threads within the same process share memory.
    2. A process cannot communicate with another process.
    3. A thread always runs faster than a process.
    4. A thread cannot be scheduled by the operating system.

    Explanation: Processes are independent execution units with separate memory, but threads within the same process share the same memory region. Processes can communicate with each other using inter-process communication, so option 2 is false. Threads do not always run faster than processes, as speed depends on many factors, making option 3 incorrect. Threads are schedulable entities, so option 4 is incorrect.
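The shared-memory half of this answer can be sketched with Python's standard `threading` module (the `counter` and `worker` names are invented for illustration): a write made by one thread to a module-level global is immediately visible to the rest of the process, because all threads share one address space.

```python
import threading

counter = 0  # global variable, shared by every thread in this process

def worker():
    global counter
    counter += 1  # this update happens in the shared address space

t = threading.Thread(target=worker)
t.start()
t.join()  # wait for the worker thread to finish

print(counter)  # prints 1: the worker's write is visible to the main thread
```

Running the same `worker` in a separate process instead would leave the parent's `counter` at 0, since each process keeps its own copy of memory.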

  2. Definition of Concurrency

    What does concurrency mean in the context of operating systems?

    1. Running only system processes and not user programs.
    2. Ensuring that only one program runs at a time.
    3. Executing multiple programs one after the other.
    4. Allowing multiple tasks to make progress at the same time.

    Explanation: Concurrency allows several tasks or threads to be in progress at overlapping periods, even if not simultaneously on a single-core CPU. Option 3 describes sequential execution, option 2 describes serial execution rather than concurrency, and option 1 confuses which programs are running with the concept itself.

  3. Process Creation Example

    If a word processor launches a spell-checker in a separate process, how are their memory spaces related?

    1. They do not share memory space by default.
    2. The spell-checker cannot access files.
    3. They automatically share all data.
    4. They merge into one process after starting.

    Explanation: Separate processes each have their own memory, keeping their data isolated unless specifically set up for sharing. The spell-checker can access files if allowed, so option 2 is wrong. Automatic sharing does not occur, so option 3 is incorrect. Processes do not merge, making option 4 incorrect.
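This isolation can be demonstrated with a minimal Python sketch using the standard `multiprocessing` module (the `data` list and `child` function are illustrative): the child process works on its own copy of the parent's memory, so its modification never reaches the parent.

```python
import multiprocessing

data = ["original"]  # lives in the parent process's memory

def child():
    # runs in a separate process with its own copy of memory
    data.append("child-only")

if __name__ == "__main__":
    p = multiprocessing.Process(target=child)
    p.start()
    p.join()  # wait for the child process to exit

    print(data)  # still ["original"]: the child's change stayed in its memory
```

Sharing between the two would require an explicit IPC mechanism such as a `multiprocessing.Queue`, a pipe, or shared memory.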

  4. Thread Example Scenario

    In a web browser, when each tab runs as a separate thread within the same process, what does this allow them to share?

    1. The entire process memory including global variables.
    2. Direct network connections only.
    3. No access to input devices.
    4. Completely separate memory with no shared data.

    Explanation: Threads within a process can access the same global variables and heap memory, making shared data possible. Option 2 is too narrow, since sharing is not limited to network connections. Input devices are not restricted at the thread level, disqualifying option 3, and option 4 is wrong because threads do not have completely separate memory.

  5. Context Switching

    What best describes a context switch in an operating system?

    1. Automatically merging two threads into one.
    2. Executing programs only during system startup.
    3. Saving and loading information to switch CPU execution from one process or thread to another.
    4. Copying files between memory locations.

    Explanation: A context switch involves the operating system temporarily stopping one process or thread and resuming another, saving and restoring their state. It does not merge threads, occur only at startup, or copy files, making options 1, 2, and 4 incorrect.

  6. Synchronization Tools

    Which of the following is commonly used to prevent race conditions when multiple threads access shared data?

    1. Network cable
    2. Firewall rule
    3. Mutex lock
    4. Read-only memory

    Explanation: A mutex lock is designed to coordinate access to shared resources, preventing race conditions. Network cables and firewall rules relate to connectivity and security, not synchronization. Read-only memory cannot be used by threads to coordinate changes because it cannot be modified.
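A short Python sketch of this idea (the `counter` name and loop sizes are arbitrary): four threads increment a shared counter, and the mutex guarantees each increment completes before another thread starts its own. Removing the `with lock:` line would allow interleaved read-modify-write sequences to lose updates.

```python
import threading

counter = 0
lock = threading.Lock()  # mutex protecting the shared counter

def increment(n):
    global counter
    for _ in range(n):
        with lock:        # only one thread at a time runs this block
            counter += 1  # read-modify-write is now effectively atomic

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 on every run; unsynchronized, it could be lower
```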

  7. Thread Advantages

    What is a key advantage of using threads over creating multiple processes for concurrency?

    1. Threads always run on separate machines.
    2. Threads share resources, allowing faster context switches.
    3. Processes never terminate.
    4. Only threads can access hardware devices.

    Explanation: Threads are lightweight and share memory, making switching between them relatively fast. Threads do not run on separate machines by definition, making option 1 incorrect. Option 3 is false, as processes do terminate. Both threads and processes can access hardware when permitted, so option 4 is wrong.

  8. Critical Section Problem

    Which statement best explains the critical section problem in concurrent programming?

    1. It refers to deleting operating system files.
    2. It is about compressing code for faster execution.
    3. It involves sections of code that must not be accessed by more than one process or thread at a time.
    4. It means increasing the number of CPU cores.

    Explanation: A critical section is part of code where shared resources are accessed, and it must be protected to avoid simultaneous access. Deleting system files, increasing CPU cores, and compressing code are not related to the critical section problem.

  9. Deadlock Scenario

    In a situation with two threads each waiting for a resource held by the other, what is this situation called?

    1. Buffer overflow
    2. Pipeline
    3. Scheduler jump
    4. Deadlock

    Explanation: Deadlock occurs when two or more threads are waiting indefinitely for resources locked by each other, halting further progress. A pipeline refers to sequential execution stages, buffer overflow is an excess data issue, and scheduler jump is not a standard concurrency term.

  10. Preemptive Scheduling

    What does preemptive scheduling allow the operating system to do with threads or processes?

    1. Prevent the use of multiple threads entirely.
    2. Allocate memory in a random order.
    3. Interrupt and switch execution from one thread or process to another.
    4. Guarantee that each process finishes before the next starts.

    Explanation: Preemptive scheduling lets the OS interrupt running tasks to ensure fair CPU use among processes or threads. It does not prevent threading, randomly allocate memory, or guarantee that each process finishes before the next starts, making the other options inaccurate.