Lambda Integration with DynamoDB Streams Quiz

Challenge your understanding of integrating serverless functions with DynamoDB Streams, focusing on configuration, security, event sources, and common processing patterns. Hone your knowledge of triggers, permissions, and best practices for connecting Lambda functions to database stream events.

  1. Lambda Event Source Fundamentals

    Which event source must be enabled on a DynamoDB table to allow it to trigger a Lambda function when new items are added?

    1. DynamoDB Streams
    2. Global Index
    3. Trigger Mode
    4. Event Processor

    Explanation: DynamoDB Streams must be enabled on the table to capture changes and trigger Lambda functions. Trigger Mode is not a configuration in this context, and Event Processor is a general term, not a DynamoDB feature. Global Index relates to database indexing, not event streaming.
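
    As a concrete sketch, the stream can be switched on with a single UpdateTable call via boto3; the table name "Orders" and the chosen view type below are placeholder assumptions:

    ```python
    import boto3

    dynamodb = boto3.client("dynamodb")

    # Enable a stream on an existing table ("Orders" is a placeholder name).
    # StreamViewType controls what each stream record contains; see question 2.
    response = dynamodb.update_table(
        TableName="Orders",
        StreamSpecification={
            "StreamEnabled": True,
            "StreamViewType": "NEW_AND_OLD_IMAGES",
        },
    )

    # The stream ARN is what an event source mapping later points at.
    print(response["TableDescription"]["LatestStreamArn"])
    ```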

  2. Types of Stream Records

    When configuring DynamoDB Streams, which option provides both the new and old images of an updated item for Lambda processing?

    1. Stream view off
    2. Keys only
    3. New image only
    4. New and old images

    Explanation: Selecting 'New and old images' for the stream view allows Lambda to access both the previous and updated states of a record. 'Keys only' provides only the keys, 'New image only' gives just the updated values, and 'Stream view off' disables this detail entirely.
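
    Inside the function, both images surface on each stream record. A minimal handler sketch, assuming a hypothetical string attribute named 'status':

    ```python
    def handler(event, context):
        # Each invocation receives a batch of stream records under "Records".
        for record in event["Records"]:
            images = record["dynamodb"]
            # OldImage/NewImage appear only because the stream view type is
            # NEW_AND_OLD_IMAGES; with KEYS_ONLY or NEW_IMAGE they would be
            # absent or partial (INSERTs also carry no OldImage).
            old_image = images.get("OldImage", {})
            new_image = images.get("NewImage", {})
            # Attributes arrive in DynamoDB's typed JSON, e.g. {"S": "shipped"}.
            old_status = old_image.get("status", {}).get("S")
            new_status = new_image.get("status", {}).get("S")
            if old_status != new_status:
                print(f"status changed: {old_status} -> {new_status}")
    ```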

  3. Permissions Configuration

    To read from a DynamoDB stream, what permission must a Lambda function’s execution role include?

    1. dynamodb:Scan
    2. dynamodb:DescribeStream
    3. dynamodb:PutItem
    4. dynamodb:CreateTable

    Explanation: The 'dynamodb:DescribeStream' permission lets Lambda read the stream metadata it needs before fetching records; in practice the execution role also needs 'dynamodb:GetRecords', 'dynamodb:GetShardIterator', and 'dynamodb:ListStreams', all bundled in the AWS-managed AWSLambdaDynamoDBExecutionRole policy. 'dynamodb:PutItem' and 'dynamodb:CreateTable' modify the table, while 'dynamodb:Scan' reads table contents, not streams.
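
    A minimal inline policy granting those stream-read actions could be attached with boto3 as sketched below; the role name, account ID, and stream ARN are hypothetical:

    ```python
    import json

    import boto3

    iam = boto3.client("iam")

    # The four actions Lambda needs to poll a DynamoDB stream. The role name
    # and stream ARN are placeholders for your own resources.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "dynamodb:DescribeStream",
                    "dynamodb:GetRecords",
                    "dynamodb:GetShardIterator",
                    "dynamodb:ListStreams",
                ],
                "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/*",
            }
        ],
    }

    iam.put_role_policy(
        RoleName="orders-stream-lambda-role",
        PolicyName="dynamodb-stream-read",
        PolicyDocument=json.dumps(policy),
    )
    ```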

  4. Event Processing Modes

    When integrating Lambda with DynamoDB Streams, how are records delivered to the Lambda function by default?

    1. One at a time
    2. In batches
    3. Every hour
    4. After table backup

    Explanation: By default, Lambda polls the stream and delivers records to the function in batches, up to the configured batch size, which is far more efficient than invoking once per record. Delivery happens continuously as changes occur; it does not depend on backup routines or hourly intervals.
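
    Batch behavior is set on the event source mapping that connects the stream to the function. In this sketch the stream ARN and function name are placeholders:

    ```python
    import boto3

    lambda_client = boto3.client("lambda")

    # Wire the stream to the function; ARN and function name are placeholders.
    mapping = lambda_client.create_event_source_mapping(
        EventSourceArn="arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/2024-01-01T00:00:00.000",
        FunctionName="process-order-events",
        StartingPosition="LATEST",         # only new changes; TRIM_HORIZON replays the retained window
        BatchSize=100,                     # up to this many records per invocation
        MaximumBatchingWindowInSeconds=5,  # or wait up to 5 s to fill a batch
    )
    # The mapping UUID identifies this configuration for later updates.
    print(mapping["UUID"])
    ```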

  5. Failure Handling

    What happens if a Lambda function fails to process a DynamoDB stream event record?

    1. Nothing happens; the record is ignored
    2. A new table is created
    3. The record is retried later
    4. The record is permanently deleted

    Explanation: If the function returns an error, Lambda retries the failing batch until it is processed successfully, a configured retry limit is reached, or the records expire from the stream (after 24 hours). The record is not deleted, nor is a new table created. Simply ignoring the record would risk data loss, which the retry behavior avoids.
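
    Those retry semantics are tunable on the event source mapping. The sketch below assumes an existing mapping UUID and an SQS queue ARN for batches that still fail after the retries; both values are placeholders:

    ```python
    import boto3

    lambda_client = boto3.client("lambda")

    lambda_client.update_event_source_mapping(
        UUID="00000000-0000-0000-0000-000000000000",
        MaximumRetryAttempts=3,            # stop retrying a failing batch after 3 attempts
        BisectBatchOnFunctionError=True,   # split the batch to isolate the bad record
        DestinationConfig={
            # Metadata about discarded records is sent here instead of being lost.
            "OnFailure": {
                "Destination": "arn:aws:sqs:us-east-1:123456789012:stream-failures"
            }
        },
    )
    ```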

  6. Filtering Events

    Which feature allows you to specify which DynamoDB Stream events should trigger your Lambda function based on conditions?

    1. Query indexing
    2. Polling throttle
    3. Data sharding
    4. Event filtering

    Explanation: Event filtering lets you define patterns that determine which stream records invoke the Lambda function, enabling selective processing without code changes. Query indexing and data sharding are database-optimization concepts, while a polling throttle concerns rate limiting, not filtering.
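
    Filters are attached to the event source mapping as JSON patterns. This sketch, with a placeholder UUID, forwards only INSERT events to the function:

    ```python
    import json

    import boto3

    lambda_client = boto3.client("lambda")

    # Only INSERT stream records will invoke the function; MODIFY and REMOVE
    # events are dropped before Lambda is ever called.
    lambda_client.update_event_source_mapping(
        UUID="00000000-0000-0000-0000-000000000000",
        FilterCriteria={
            "Filters": [
                {"Pattern": json.dumps({"eventName": ["INSERT"]})}
            ]
        },
    )
    ```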

  7. Latency Considerations

    Typically, what is the expected latency between a DynamoDB table change and a Lambda function being triggered via streams?

    1. Immediate with no delay
    2. Several minutes
    3. Over an hour
    4. A few seconds

    Explanation: Typical end-to-end latency is a few seconds, covering stream propagation plus Lambda's polling and invocation. Processing is near real time but not instantaneous, so 'immediate with no delay' is inaccurate, while several minutes or over an hour would be far outside typical performance.

  8. Use Cases

    Which scenario best fits using a Lambda function triggered by a DynamoDB Stream?

    1. Manually updating inventory records
    2. Automatically sending notifications when an order status changes
    3. Printing table schema
    4. Backing up data to an external drive daily

    Explanation: Reacting to changes in the table, like status updates triggering notifications, is well-suited for stream-triggered Lambda functions. Backing up data daily and manually updating records are unrelated, and printing schema is an administrative task, not an event-driven process.
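
    A sketch of that notification pattern, assuming hypothetical 'status' and 'orderId' attributes on the table and an SNS topic of your own:

    ```python
    import boto3

    sns = boto3.client("sns")

    # Hypothetical topic for order notifications.
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:order-status-changes"

    def handler(event, context):
        for record in event["Records"]:
            if record["eventName"] != "MODIFY":
                continue  # only updates can represent a status change
            old = record["dynamodb"].get("OldImage", {})
            new = record["dynamodb"].get("NewImage", {})
            old_status = old.get("status", {}).get("S")
            new_status = new.get("status", {}).get("S")
            if old_status != new_status:
                order_id = new.get("orderId", {}).get("S", "unknown")
                sns.publish(
                    TopicArn=TOPIC_ARN,
                    Subject=f"Order {order_id} status changed",
                    Message=f"{old_status} -> {new_status}",
                )
    ```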

  9. Stream Retention

    How long are DynamoDB stream records typically retained for Lambda processing?

    1. 7 days
    2. 1 hour
    3. 12 hours
    4. 24 hours

    Explanation: Records remain available in a DynamoDB stream for 24 hours, giving Lambda ample time to process them. 12 hours and 1 hour are too short, and 7 days is a retention option for Kinesis Data Streams, not DynamoDB Streams, whose 24-hour window is fixed.

  10. Parallel Processing

    What determines the number of parallel Lambda invocations for processing DynamoDB stream shards?

    1. Table name length
    2. Key schema complexity
    3. The number of shards
    4. The table read throughput

    Explanation: Each DynamoDB stream shard is processed concurrently by its own Lambda invocation, so the number of shards sets the baseline degree of parallelism. Table read throughput and key schema do not directly set stream-processing concurrency, and table name length is irrelevant.
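
    Per-shard concurrency itself is adjustable via the mapping's parallelization factor; the UUID below is a placeholder:

    ```python
    import boto3

    lambda_client = boto3.client("lambda")

    # Default is 1 concurrent batch per shard; values up to 10 let Lambda
    # process several batches from the same shard in parallel (records for a
    # given partition key still arrive in order).
    lambda_client.update_event_source_mapping(
        UUID="00000000-0000-0000-0000-000000000000",
        ParallelizationFactor=4,
    )
    ```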