AWS Lambda

Runtimes · Triggers · Limits · Cold Starts · IAM · Observability · Best Practices

Core Concepts

Function

Unit of deployment. Code + config + IAM role. Stateless by design.

Handler

Entry point. exports.handler = async (event, context) => {}

Event

JSON payload from the trigger (API GW, SQS, S3, etc.)

Context

Runtime info: requestId, remainingTime, functionName, memoryLimit.

Execution environment

Isolated micro-VM. Reused across warm invocations (init code runs once).

🔢 Limits & Quotas

Max timeout: 15 minutes
Memory: 128 MB – 10,240 MB
vCPU: Proportional to memory (~1 full vCPU at 1,769 MB)
Deployment package: 50 MB (zipped) / 250 MB (unzipped)
Container image: Up to 10 GB
Ephemeral storage (/tmp): 512 MB – 10,240 MB
Env vars: 4 KB total
Concurrent executions: 1,000 (default, soft limit)
Burst concurrency: 500–3,000 (region-dependent)
Payload (sync): 6 MB request / 6 MB response
Payload (async): 256 KB
🔌 Trigger Sources

API Gateway / ALB: Sync
SQS: Poll (event source mapping)
SNS: Async
S3: Async
DynamoDB Streams: Stream (poll)
Kinesis: Stream (poll)
EventBridge: Async
Step Functions: Sync
CloudFront (Lambda@Edge): Sync
🥶 Cold Starts
Cold start: First invocation (or after idle) initializes a new execution environment. Adds 100ms–1s+ latency depending on runtime and package size.
1. Download code: From S3 or ECR. Larger packages = slower.
2. Start execution environment: Micro-VM init. JVM/Node/.NET differ significantly.
3. Run init code: Code outside the handler (DB connections, SDK clients).
4. Run handler: Your actual function logic.

Mitigation Strategies

Provisioned Concurrency: Pre-warm N instances. Eliminates cold starts. Costs extra.
SnapStart (Java): Snapshots the initialized environment; restores cut cold starts by up to 10× for Java 11+.
Minimize package size: Tree-shake, use layers, avoid heavy SDKs.
Choose fast runtimes: Node.js / Python cold start faster than Java / .NET.
Init code outside handler: DB connections, clients — reused on warm invocations.
🔐 IAM & Permissions

Execution Role

IAM role Lambda assumes at runtime. Grants access to AWS services (DynamoDB, S3, etc.). Principle of least privilege.

Resource Policy

Controls who can invoke the function. Required for cross-account or service triggers (S3, SNS).

# Minimum execution role
AWSLambdaBasicExecutionRole
# + VPC access
AWSLambdaVPCAccessExecutionRole
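Beyond the managed policies, least privilege usually means adding a narrowly scoped inline statement to the execution role. A hedged sketch; the table name, account ID, and region are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:PutItem"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders"
    }
  ]
}
```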

📊 Runtimes Comparison

Runtime · Cold Start · Best for
Node.js 20.x · ~100ms · APIs, event processing, general
Python 3.12 · ~100ms · ML, data, scripting
Java 21 · ~500ms+ · Enterprise, SnapStart
.NET 8 · ~200ms · Windows workloads
Go (provided.al2023) · ~50ms · High-perf, low latency
Container · Varies · Custom deps, large packages
🔍 Observability

CloudWatch Logs

Auto-created log group: /aws/lambda/function-name. Every invocation logged.

Key Metrics

Invocations: Total calls
Duration: Execution time (p50/p99)
Errors: Unhandled exceptions
Throttles: Concurrency limit hits
ConcurrentExecutions: Live instances

X-Ray tracing · Lambda Insights · Structured logs · Custom metrics · Powertools
Best Practices
Init outside handler: DB connections, SDK clients — reused on warm invocations.
Set timeout conservatively: Default is 3s. Set it to the expected max plus a buffer; don't use 15 min for simple tasks.
Right-size memory: More memory = more CPU. Profile with Lambda Power Tuning.
Use environment variables: Never hardcode config. Use SSM Parameter Store for secrets.
Idempotent handlers: SQS/SNS can deliver duplicates. Design handlers to be safe to retry.
Use DLQ / on-failure: Capture failed async invocations. Never silently drop events.
Avoid VPC unless needed: VPC adds cold start latency. Use only for private resources.
Use Lambda Layers: Share common dependencies (SDKs, utils) across functions.