Challenge your understanding of integrating serverless functions with DynamoDB Streams, focusing on configuration, security, event sources, and common processing patterns. Hone your knowledge of triggers, permissions, and best practices for connecting Lambda functions to database stream events.
Which event source must be enabled on a DynamoDB table to allow it to trigger a Lambda function when new items are added?
Explanation: DynamoDB Streams must be enabled on the table to capture changes and trigger Lambda functions. Trigger Mode is not a configuration in this context, and Event Processor is a general term, not a DynamoDB feature. Global Index relates to database indexing, not event streaming.
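Enabling the stream is a one-time table setting. As a minimal sketch (the table name "Orders" is an example, not from the source), the parameters passed to boto3's `update_table` call would look like this:

```python
def stream_enable_params(table_name):
    """Build the update_table request body that turns on DynamoDB Streams."""
    return {
        "TableName": table_name,
        "StreamSpecification": {
            "StreamEnabled": True,
            # Capture both the old and new item images for each change.
            "StreamViewType": "NEW_AND_OLD_IMAGES",
        },
    }

params = stream_enable_params("Orders")
# A real call would be: boto3.client("dynamodb").update_table(**params)
print(params["StreamSpecification"]["StreamEnabled"])
```

The same `StreamSpecification` block can also be supplied at table creation time.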
When configuring DynamoDB Streams, which option provides both the new and old images of an updated item for Lambda processing?
Explanation: Selecting 'New and old images' for the stream view allows Lambda to access both the previous and updated states of a record. 'Keys only' provides only the keys, 'New image only' gives just the updated values, and 'Stream view off' disables this detail entirely.
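With `NEW_AND_OLD_IMAGES` selected, each stream record carries both item states. A minimal sketch of pulling them out inside a handler (the sample record below is illustrative, not from the source):

```python
# Illustrative NEW_AND_OLD_IMAGES stream record for a MODIFY event.
sample_record = {
    "eventName": "MODIFY",
    "dynamodb": {
        "Keys": {"OrderId": {"S": "123"}},
        "OldImage": {"OrderId": {"S": "123"}, "Status": {"S": "PENDING"}},
        "NewImage": {"OrderId": {"S": "123"}, "Status": {"S": "SHIPPED"}},
    },
}

def images(record):
    """Return (old, new) item images; either may be absent
    depending on the table's StreamViewType."""
    data = record["dynamodb"]
    return data.get("OldImage"), data.get("NewImage")

old, new = images(sample_record)
print(old["Status"]["S"], "->", new["Status"]["S"])  # PENDING -> SHIPPED
```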
To read from a DynamoDB stream, what permission must a Lambda function’s execution role include?
Explanation: The 'dynamodb:DescribeStream' permission allows Lambda to read the stream metadata necessary for processing events; in practice the execution role also needs 'dynamodb:GetRecords', 'dynamodb:GetShardIterator', and 'dynamodb:ListStreams', which the AWSLambdaDynamoDBExecutionRole managed policy bundles together. 'dynamodb:WriteItem' and 'dynamodb:CreateTable' relate to modifying the table, while 'dynamodb:ScanTable' is for reading table contents, not streams.
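A sketch of the stream-read statement such a role would contain, expressed as a Python dict for readability (the account ID, region, and table name in the ARN are placeholders):

```python
# Equivalent in scope to the AWSLambdaDynamoDBExecutionRole managed policy's
# stream permissions; the Resource ARN below is a placeholder.
STREAM_READ_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:DescribeStream",
                "dynamodb:GetRecords",
                "dynamodb:GetShardIterator",
                "dynamodb:ListStreams",
            ],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/*",
        }
    ],
}
```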
When integrating Lambda with DynamoDB Streams, how are records delivered to the Lambda function by default?
Explanation: By default, records from DynamoDB Streams are delivered to Lambda in batches for efficient processing. Delivering records one at a time is less efficient and not the standard behavior. Delivery does not depend on backup routines or hourly intervals.
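Batch delivery is controlled on the event source mapping. A hedged sketch of the relevant parameters (names follow Lambda's CreateEventSourceMapping API; the specific values and ARNs are examples):

```python
def mapping_params(stream_arn, function_name):
    """Build event source mapping parameters controlling batch delivery."""
    return {
        "EventSourceArn": stream_arn,
        "FunctionName": function_name,
        "StartingPosition": "LATEST",
        "BatchSize": 100,                     # up to 100 records per invocation
        "MaximumBatchingWindowInSeconds": 5,  # wait up to 5 s to fill a batch
    }

params = mapping_params(
    "arn:aws:dynamodb:us-east-1:123456789012:table/Orders/stream/2024-01-01T00:00:00.000",
    "process-order-changes",
)
```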
What happens if a Lambda function fails to process a DynamoDB stream event record?
Explanation: If a Lambda invocation fails, the batch containing the record is retried until it is processed successfully, the configured retry limit is reached, or the record expires from the stream. The record is not immediately deleted, nor is a new table created. Ignoring the record would risk data loss, which retries avoid.
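The retry behavior is tunable on the event source mapping. A sketch of the relevant settings (the on-failure SQS queue ARN is hypothetical):

```python
# Retry-related event source mapping settings for a stream consumer.
RETRY_CONFIG = {
    "MaximumRetryAttempts": 3,           # stop retrying a failing batch after 3 tries
    "BisectBatchOnFunctionError": True,  # split the batch to isolate the bad record
    "DestinationConfig": {
        # Failed records' metadata can be routed to an on-failure destination
        # instead of being retried forever; this queue ARN is a placeholder.
        "OnFailure": {
            "Destination": "arn:aws:sqs:us-east-1:123456789012:stream-dlq"
        }
    },
}
```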
Which feature allows you to specify which DynamoDB Stream events should trigger your Lambda function based on conditions?
Explanation: Event filtering enables you to control which stream records invoke the Lambda, providing selective processing. Query indexing and data sharding are functions of database optimization, while polling throttle relates to rate limits, not filtering.
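A minimal sketch of such a filter, following Lambda's event filtering syntax: this pattern would invoke the function only for INSERT events (new items), skipping MODIFY and REMOVE records entirely.

```python
import json

# FilterCriteria attached to the event source mapping; each Pattern is a
# JSON-encoded string matched against incoming stream records.
FILTER_CRITERIA = {
    "Filters": [
        {"Pattern": json.dumps({"eventName": ["INSERT"]})}
    ]
}
```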
Typically, what is the expected latency between a DynamoDB table change and a Lambda function being triggered via streams?
Explanation: The usual latency is a few seconds due to stream propagation and Lambda invocation speeds. Assuming immediate triggering is inaccurate, while several minutes or over an hour is far outside typical performance expectations.
Which scenario best fits using a Lambda function triggered by a DynamoDB Stream?
Explanation: Reacting to changes in the table, like status updates triggering notifications, is well-suited for stream-triggered Lambda functions. Backing up data daily and manually updating records are unrelated, and printing schema is an administrative task, not an event-driven process.
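The notification scenario can be sketched as a stream-triggered handler. Everything below is illustrative: `send_notification` is a hypothetical helper (standing in for SNS, SES, or similar) that here just collects messages so the sketch stays self-contained.

```python
notifications = []

def send_notification(message):
    """Hypothetical stand-in for a real notification service."""
    notifications.append(message)

def handler(event, context=None):
    """React to status changes captured by a NEW_AND_OLD_IMAGES stream."""
    for record in event.get("Records", []):
        if record["eventName"] != "MODIFY":
            continue
        old = record["dynamodb"].get("OldImage", {})
        new = record["dynamodb"].get("NewImage", {})
        if old.get("Status") != new.get("Status"):
            send_notification(
                f"Order {new['OrderId']['S']} is now {new['Status']['S']}"
            )

# Illustrative invocation with a single MODIFY record.
event = {"Records": [{
    "eventName": "MODIFY",
    "dynamodb": {
        "OldImage": {"OrderId": {"S": "42"}, "Status": {"S": "PENDING"}},
        "NewImage": {"OrderId": {"S": "42"}, "Status": {"S": "SHIPPED"}},
    },
}]}
handler(event)
print(notifications[0])  # Order 42 is now SHIPPED
```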
How long are DynamoDB stream records typically retained for Lambda processing?
Explanation: Records remain available in DynamoDB streams for 24 hours, giving Lambda sufficient time to process them. 12 hours and 1 hour are too short, while 7 days exceeds the fixed 24-hour retention that DynamoDB Streams provides.
What determines the number of parallel Lambda invocations for processing DynamoDB stream shards?
Explanation: Each DynamoDB stream shard may be processed concurrently by separate Lambda invocations, thereby increasing parallelization. Table read throughput and key schema do not directly affect stream processing concurrency, and table name length is irrelevant.