Vladimir Chavkov

LocalStack: Complete AWS Local Development and Testing Guide


LocalStack is a cloud service emulator that runs in a single container on your laptop or CI environment, providing a fully functional local AWS cloud stack. It enables developers to develop and test cloud applications offline, reducing costs and dramatically speeding up the development feedback loop. This guide covers LocalStack setup, service configuration, and production workflow patterns.

What is LocalStack?

LocalStack emulates AWS cloud services locally for development and testing:

Key Features

  1. AWS Service Emulation: 80+ AWS services including S3, Lambda, DynamoDB, SQS, SNS
  2. Docker-Based: Single container deployment with docker-compose
  3. AWS CLI Compatible: Use standard AWS CLI and SDKs
  4. Terraform/CDK/CloudFormation: Full IaC support
  5. CI/CD Integration: GitHub Actions, GitLab CI, Jenkins
  6. Cloud Pods: Save and share state across environments
  7. Hot Reloading: Lambda code changes without redeployment

LocalStack Community vs Pro

| Feature | Community (Free) | Pro |
|---|---|---|
| Core Services (S3, SQS, SNS, DynamoDB, Lambda) | Yes | Yes |
| API Gateway | Basic | Full (REST, HTTP, WebSocket) |
| IAM Enforcement | No | Yes |
| RDS/Aurora | No | Yes |
| ECS/EKS | No | Yes |
| Cognito | No | Yes |
| CloudWatch | Basic | Full |
| Step Functions | No | Yes |
| Cloud Pods | No | Yes |
| CI Analytics | No | Yes |
| Persistence | Basic | Advanced |

LocalStack vs Alternatives

| Feature | LocalStack | Moto | ElasticMQ | DynamoDB Local | AWS SAM Local |
|---|---|---|---|---|---|
| Multi-Service | 80+ services | 100+ mocks | SQS only | DynamoDB only | Lambda + API GW |
| Infrastructure | Docker container | Python library | Docker | Docker/JAR | Docker |
| IaC Support | Full | No | No | No | SAM templates |
| Real HTTP Endpoints | Yes | Test-only | Yes | Yes | Yes |
| AWS CLI Compatible | Yes | No | Partial | Yes | Yes |
| Language Agnostic | Yes | Python only | Yes | Yes | Limited |

Installation

Docker Compose

docker-compose.yml

```yaml
version: '3.8'
services:
  localstack:
    image: localstack/localstack:3.4
    container_name: localstack
    ports:
      - "4566:4566"             # Gateway (all services share this port)
      - "4510-4559:4510-4559"   # External service port range
    environment:
      - SERVICES=s3,sqs,sns,lambda,dynamodb,apigateway,iam,sts,cloudformation,ssm,secretsmanager,logs,events,stepfunctions
      - DEBUG=0
      # LAMBDA_EXECUTOR was removed in LocalStack 2.x; the current Lambda
      # provider runs functions in containers via the mounted Docker socket
      - DOCKER_HOST=unix:///var/run/docker.sock
      - PERSISTENCE=1
    volumes:
      - "./volume:/var/lib/localstack"
      - "/var/run/docker.sock:/var/run/docker.sock"
```

```shell
docker compose up -d
```

AWS CLI Configuration

```shell
# Install the awslocal wrapper
pip install awscli-local

# Or configure a dedicated profile
aws configure --profile localstack
# AWS Access Key ID: test
# AWS Secret Access Key: test
# Default region: us-east-1
# Default output format: json

# Use with --endpoint-url
aws --endpoint-url=http://localhost:4566 s3 ls

# Or use awslocal (auto-configures the endpoint)
awslocal s3 ls
```
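The same dummy credentials work from application code. As a minimal sketch (the `client_kwargs` helper is mine, not part of LocalStack or boto3), one factory function can target LocalStack when `AWS_ENDPOINT_URL` is set and fall back to real AWS defaults otherwise:

```python
import os


def client_kwargs(region="us-east-1"):
    """Build boto3 client kwargs: LocalStack endpoint when AWS_ENDPOINT_URL
    is set, plain defaults (real AWS credential chain) otherwise."""
    kwargs = {"region_name": region}
    endpoint = os.environ.get("AWS_ENDPOINT_URL")
    if endpoint:
        # LocalStack accepts any non-empty credentials; "test" is conventional
        kwargs.update(
            endpoint_url=endpoint,
            aws_access_key_id="test",
            aws_secret_access_key="test",
        )
    return kwargs


# Usage: boto3.client("s3", **client_kwargs())
```

This keeps the endpoint decision in one place, so the same code path runs locally and in production.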

Core Services

S3

```shell
# Create a bucket
awslocal s3 mb s3://my-bucket

# Upload files
awslocal s3 cp ./data.json s3://my-bucket/data.json

# List buckets
awslocal s3 ls

# Enable versioning
awslocal s3api put-bucket-versioning \
  --bucket my-bucket \
  --versioning-configuration Status=Enabled

# Configure bucket notifications to SQS
awslocal s3api put-bucket-notification-configuration \
  --bucket my-bucket \
  --notification-configuration '{
    "QueueConfigurations": [{
      "QueueArn": "arn:aws:sqs:us-east-1:000000000000:s3-events",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```
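LocalStack delivers the same notification JSON shape that real S3 sends, so consumer code can be developed against it unchanged. A small parser (the helper name is hypothetical; the record shape follows the S3 event notification format, where object keys arrive URL-encoded) pulls bucket and key out of a message body:

```python
import json
from urllib.parse import unquote_plus


def parse_s3_event(body):
    """Extract (bucket, key, event_name) tuples from an S3 notification
    message body as delivered to SQS."""
    event = json.loads(body)
    return [
        (
            r["s3"]["bucket"]["name"],
            unquote_plus(r["s3"]["object"]["key"]),  # keys are URL-encoded
            r["eventName"],
        )
        for r in event.get("Records", [])
    ]


sample = json.dumps({
    "Records": [{
        "eventName": "ObjectCreated:Put",
        "s3": {
            "bucket": {"name": "my-bucket"},
            "object": {"key": "data.json"},
        },
    }]
})
# parse_s3_event(sample) → [("my-bucket", "data.json", "ObjectCreated:Put")]
```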

Lambda

lambda_function.py

```python
import json


def handler(event, context):
    body = json.loads(event.get('body', '{}'))
    name = body.get('name', 'World')
    return {
        'statusCode': 200,
        'body': json.dumps({'message': f'Hello, {name}!'})
    }
```
```shell
# Package and deploy the Lambda
zip function.zip lambda_function.py
awslocal lambda create-function \
  --function-name hello-function \
  --runtime python3.12 \
  --handler lambda_function.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::000000000000:role/lambda-role

# Invoke it (AWS CLI v2 needs the raw payload format flag)
awslocal lambda invoke \
  --function-name hello-function \
  --cli-binary-format raw-in-base64-out \
  --payload '{"body": "{\"name\": \"LocalStack\"}"}' \
  output.json
cat output.json
```
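Because the handler is plain Python, it is worth exercising locally before zipping and deploying. The sketch below repeats the handler from above and simulates the same payload used in the `awslocal lambda invoke` call:

```python
import json


# Same handler as lambda_function.py above
def handler(event, context):
    body = json.loads(event.get('body', '{}'))
    name = body.get('name', 'World')
    return {
        'statusCode': 200,
        'body': json.dumps({'message': f'Hello, {name}!'})
    }


# Simulate the API-Gateway-style event used in the invoke command
event = {"body": json.dumps({"name": "LocalStack"})}
result = handler(event, None)
assert result["statusCode"] == 200
assert json.loads(result["body"]) == {"message": "Hello, LocalStack!"}
```

A plain unit test like this catches handler bugs in milliseconds; the LocalStack invoke then verifies packaging and runtime configuration.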

DynamoDB

```shell
# Create a table with a GSI
awslocal dynamodb create-table \
  --table-name Orders \
  --attribute-definitions \
    AttributeName=orderId,AttributeType=S \
    AttributeName=customerId,AttributeType=S \
  --key-schema \
    AttributeName=orderId,KeyType=HASH \
  --global-secondary-indexes \
    'IndexName=CustomerIndex,KeySchema=[{AttributeName=customerId,KeyType=HASH}],Projection={ProjectionType=ALL}' \
  --billing-mode PAY_PER_REQUEST

# Put an item
awslocal dynamodb put-item \
  --table-name Orders \
  --item '{
    "orderId": {"S": "order-001"},
    "customerId": {"S": "cust-123"},
    "amount": {"N": "99.99"},
    "status": {"S": "pending"}
  }'

# Query items
awslocal dynamodb query \
  --table-name Orders \
  --key-condition-expression "orderId = :id" \
  --expression-attribute-values '{":id": {"S": "order-001"}}'
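Note that the low-level API wraps every value in a type descriptor (`{"S": ...}`, `{"N": ...}`). A minimal unmarshaller (illustrative only; in real code boto3's `TypeDeserializer` or the higher-level `dynamodb` resource does this for you) shows the mapping:

```python
from decimal import Decimal


def unmarshal(item):
    """Convert a DynamoDB attribute-value map into a plain dict.
    Numbers become Decimal, matching boto3's behavior."""
    def convert(av):
        (tag, value), = av.items()  # each attribute has exactly one descriptor
        if tag == "S":
            return value
        if tag == "N":
            return Decimal(value)
        if tag == "BOOL":
            return value
        if tag == "L":
            return [convert(v) for v in value]
        if tag == "M":
            return {k: convert(v) for k, v in value.items()}
        raise ValueError(f"unhandled type descriptor: {tag}")
    return {k: convert(v) for k, v in item.items()}


item = {
    "orderId": {"S": "order-001"},
    "amount": {"N": "99.99"},
    "status": {"S": "pending"},
}
# unmarshal(item) → {"orderId": "order-001", "amount": Decimal("99.99"), "status": "pending"}
```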

SQS and SNS

```shell
# Create an SQS queue
awslocal sqs create-queue --queue-name order-events

# Create an SNS topic
awslocal sns create-topic --name order-notifications

# Subscribe the queue to the topic
awslocal sns subscribe \
  --topic-arn arn:aws:sns:us-east-1:000000000000:order-notifications \
  --protocol sqs \
  --notification-endpoint arn:aws:sqs:us-east-1:000000000000:order-events

# Publish a message
awslocal sns publish \
  --topic-arn arn:aws:sns:us-east-1:000000000000:order-notifications \
  --message '{"orderId": "order-001", "event": "created"}'

# Receive messages
awslocal sqs receive-message \
  --queue-url http://localhost:4566/000000000000/order-events
```
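Unless raw message delivery is enabled on the subscription, the SQS body is an SNS `Notification` envelope rather than the bare message. A small helper (the name is mine) unwraps it either way:

```python
import json


def unwrap_sns(sqs_body):
    """Return the originally published message from an SQS body.
    SNS wraps messages in a Notification envelope unless
    RawMessageDelivery is enabled on the subscription."""
    try:
        payload = json.loads(sqs_body)
    except json.JSONDecodeError:
        return sqs_body  # not JSON, so certainly not an envelope
    if isinstance(payload, dict) and payload.get("Type") == "Notification":
        return payload.get("Message", sqs_body)
    return sqs_body


envelope = json.dumps({
    "Type": "Notification",
    "TopicArn": "arn:aws:sns:us-east-1:000000000000:order-notifications",
    "Message": '{"orderId": "order-001", "event": "created"}',
})
# unwrap_sns(envelope) → '{"orderId": "order-001", "event": "created"}'
```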

Terraform with LocalStack

providers.tf

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  access_key = "test"
  secret_key = "test"
  region     = "us-east-1"

  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  endpoints {
    s3             = "http://localhost:4566"
    sqs            = "http://localhost:4566"
    sns            = "http://localhost:4566"
    lambda         = "http://localhost:4566"
    dynamodb       = "http://localhost:4566"
    iam            = "http://localhost:4566"
    apigateway     = "http://localhost:4566"
    cloudformation = "http://localhost:4566"
    ssm            = "http://localhost:4566"
  }
}
```
main.tf

```hcl
resource "aws_s3_bucket" "data" {
  bucket = "data-bucket"
}

resource "aws_sqs_queue" "events" {
  name                       = "order-events"
  visibility_timeout_seconds = 30
  message_retention_seconds  = 86400
}

resource "aws_dynamodb_table" "orders" {
  name         = "Orders"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "orderId"

  attribute {
    name = "orderId"
    type = "S"
  }
}

# Execution role referenced by the Lambda below (LocalStack Community does
# not enforce IAM, but the reference must still resolve)
resource "aws_iam_role" "lambda_role" {
  name = "lambda-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}

resource "aws_lambda_function" "processor" {
  function_name = "order-processor"
  runtime       = "python3.12"
  handler       = "handler.main"
  filename      = "lambda.zip"
  role          = aws_iam_role.lambda_role.arn

  environment {
    variables = {
      TABLE_NAME = aws_dynamodb_table.orders.name
      QUEUE_URL  = aws_sqs_queue.events.url
    }
  }
}
```
```shell
# Using the tflocal wrapper (rewrites endpoints automatically)
pip install terraform-local
tflocal init
tflocal plan
tflocal apply -auto-approve
```
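`tflocal` injects the endpoint overrides for you; if you maintain the provider block by hand instead, the repetitive `endpoints` map can be generated with a throwaway script (purely illustrative, service list from the provider block above):

```python
SERVICES = [
    "s3", "sqs", "sns", "lambda", "dynamodb",
    "iam", "apigateway", "cloudformation", "ssm",
]


def endpoints_block(url="http://localhost:4566"):
    """Render an HCL endpoints block pointing every service at one gateway,
    which is all LocalStack needs: one port serves every API."""
    lines = ["endpoints {"]
    lines += [f'  {svc} = "{url}"' for svc in SERVICES]
    lines.append("}")
    return "\n".join(lines)


print(endpoints_block())
```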

CI/CD Integration

GitHub Actions

.github/workflows/test.yml

```yaml
name: Integration Tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    services:
      localstack:
        image: localstack/localstack:3.4
        ports:
          - 4566:4566
        env:
          SERVICES: s3,sqs,dynamodb,lambda
          DEBUG: 0
    steps:
      - uses: actions/checkout@v4
      - name: Wait for LocalStack
        run: |
          pip install awscli-local
          timeout 60 bash -c 'until awslocal s3 ls 2>/dev/null; do sleep 2; done'
      - name: Setup infrastructure
        run: |
          awslocal s3 mb s3://test-bucket
          awslocal dynamodb create-table \
            --table-name TestTable \
            --attribute-definitions AttributeName=id,AttributeType=S \
            --key-schema AttributeName=id,KeyType=HASH \
            --billing-mode PAY_PER_REQUEST
      - name: Run tests
        run: pytest tests/integration/ -v
        env:
          AWS_ENDPOINT_URL: http://localhost:4566
          AWS_ACCESS_KEY_ID: test
          AWS_SECRET_ACCESS_KEY: test
          AWS_DEFAULT_REGION: us-east-1
```
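The wait step above polls with `awslocal`; LocalStack also exposes a JSON health endpoint at `/_localstack/health` listing per-service status. A readiness check can parse that response. The sketch below works on a sample body only, and the exact status strings ("running", "available") are an assumption to verify against your LocalStack version:

```python
import json


def services_ready(health_json, required):
    """Given the JSON body of GET /_localstack/health, check that every
    required service reports a ready status."""
    status = json.loads(health_json).get("services", {})
    return all(status.get(svc) in ("running", "available") for svc in required)


sample = json.dumps({
    "services": {"s3": "running", "sqs": "available", "dynamodb": "running"}
})
# services_ready(sample, ["s3", "sqs"]) → True
# services_ready(sample, ["lambda"]) → False
```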

Testing Patterns

Python Integration Tests

```python
import os

import boto3
import pytest


@pytest.fixture
def aws_clients():
    endpoint = os.environ.get('AWS_ENDPOINT_URL', 'http://localhost:4566')
    kwargs = {
        'endpoint_url': endpoint,
        'region_name': 'us-east-1',
        'aws_access_key_id': 'test',
        'aws_secret_access_key': 'test',
    }
    return {
        's3': boto3.client('s3', **kwargs),
        'sqs': boto3.client('sqs', **kwargs),
        'dynamodb': boto3.resource('dynamodb', **kwargs),
    }


def test_order_processing(aws_clients):
    # Setup
    table = aws_clients['dynamodb'].create_table(
        TableName='Orders',
        KeySchema=[{'AttributeName': 'orderId', 'KeyType': 'HASH'}],
        AttributeDefinitions=[{'AttributeName': 'orderId', 'AttributeType': 'S'}],
        BillingMode='PAY_PER_REQUEST',
    )
    table.wait_until_exists()

    # Test
    table.put_item(Item={'orderId': 'test-001', 'amount': 99})
    response = table.get_item(Key={'orderId': 'test-001'})
    assert response['Item']['orderId'] == 'test-001'
    assert response['Item']['amount'] == 99
```

Production Best Practices

Checklist

  - Pin the LocalStack image version (e.g. localstack/localstack:3.4) so local and CI behavior match
  - Limit SERVICES to what you actually use to keep startup fast
  - Read the endpoint from AWS_ENDPOINT_URL so the same code runs against LocalStack and real AWS
  - Wait for readiness before running tests instead of sleeping a fixed interval
  - Mount a persistence volume (or use Cloud Pods) when state must survive restarts
  - Keep IaC definitions identical for local and cloud; only the provider endpoints should differ
Accelerate Your Cloud Development

Building efficient local development workflows with LocalStack requires a solid understanding of AWS services and testing patterns. At chavkov.com, I deliver hands-on AWS and DevOps training that includes local development best practices.

Contact me to discuss training options for your team.

