AWS SAM with Python: Real-World Examples
Learn to build serverless applications using AWS SAM and Python through practical, step-by-step examples.
Why AWS SAM with Python?
The AWS Serverless Application Model (SAM) is an open-source framework for building serverless applications. Combined with Python, it gives you a powerful environment for developing scalable, cost-effective applications: Python's simplicity and extensive library ecosystem make it ideal for serverless functions, while SAM simplifies deployment and management.
🧩 Simple Analogy
Think of AWS SAM as a magical toolbox that helps you build treehouses (serverless apps). Python is like your favorite hammer – comfortable and powerful. SAM organizes your tools so you can build faster, while Python lets you build exactly what you imagine!
Getting Started with AWS SAM and Python
Before diving into examples, let’s set up your environment:
```bash
# Install the AWS SAM CLI
pip install aws-sam-cli

# Initialize a new SAM project
sam init --runtime python3.12 --name my-sam-app

# Navigate to the project directory
cd my-sam-app

# Build your application
sam build

# Test locally
sam local start-api
```
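The scaffolded project is driven by a `template.yaml` file at the project root, which declares your functions and their triggers. The block below is a minimal sketch of such a template, not the exact `sam init` output; the `HelloWorldFunction` name and `/hello` path are illustrative:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Minimal SAM application

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: hello_world/        # Directory containing app.py
      Handler: app.lambda_handler
      Runtime: python3.12
      Events:
        HelloWorld:
          Type: Api                # Creates an API Gateway endpoint
          Properties:
            Path: /hello
            Method: GET
```

Running `sam local start-api` reads this template and serves the declared routes locally, so you can hit `http://127.0.0.1:3000/hello` before deploying anything.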
Real-World Python SAM Examples
📧 1. Email Processing Service
This service automatically processes incoming emails, extracts their attachments, and stores them in S3. It is triggered whenever an email object lands in the inbox bucket.
```yaml
Resources:
  EmailProcessorFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: email_processor/
      Handler: app.lambda_handler
      Runtime: python3.12
      Environment:
        Variables:
          ATTACHMENTS_BUCKET: !Ref AttachmentsBucket
      Policies:
        # Write access to the attachments bucket; the function also needs
        # read access to the inbox bucket that triggers it.
        - S3CrudPolicy:
            BucketName: !Ref AttachmentsBucket
      Events:
        S3Event:
          Type: S3
          Properties:
            Bucket: !Ref InboxBucket
            Events: s3:ObjectCreated:*

  InboxBucket:
    Type: AWS::S3::Bucket

  AttachmentsBucket:
    Type: AWS::S3::Bucket
```
```python
import email
import os
from email import policy

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Get the uploaded email object from S3
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']
    response = s3.get_object(Bucket=bucket, Key=key)
    email_content = response['Body'].read()

    # Parse the raw email
    msg = email.message_from_bytes(email_content, policy=policy.default)

    # Walk the MIME tree and extract attachments
    for part in msg.walk():
        if part.get_content_maintype() == 'multipart':
            continue
        if part.get('Content-Disposition') is None:
            continue

        filename = part.get_filename()
        if filename:
            # Save the attachment to the bucket configured in the template
            s3.put_object(
                Bucket=os.environ['ATTACHMENTS_BUCKET'],
                Key=filename,
                Body=part.get_payload(decode=True)
            )

    return {'statusCode': 200, 'body': 'Email processed successfully'}
```
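To exercise the deployed service end to end, upload a raw email file to the inbox bucket; the `s3:ObjectCreated:*` event then invokes the function. A minimal sketch with boto3, where `my-inbox-bucket` and `sample.eml` are placeholders for your deployed bucket's physical name and a local test file:

```python
import boto3

s3 = boto3.client('s3')

# Uploading a raw RFC 822 email file fires the S3 event that invokes
# EmailProcessorFunction. Bucket and file names are placeholders.
with open('sample.eml', 'rb') as f:
    s3.put_object(
        Bucket='my-inbox-bucket',
        Key='incoming/sample.eml',
        Body=f.read()
    )
```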
🧒 6-Year-Old Explanation
Imagine you have a magical mailbox (S3 bucket). When someone sends you a letter with pictures (email with attachments), a friendly robot (Lambda function) automatically takes the pictures and puts them in your photo album (another S3 bucket). SAM is like the instructions that tell the robot what to do!
📊 2. Real-Time Data Processing Pipeline
Process streaming data from Kinesis, transform it, and store results in DynamoDB.
*Figure: Data processing pipeline with Kinesis, Lambda, and DynamoDB*
```yaml
Resources:
  DataProcessorFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: data_processor/
      Handler: app.lambda_handler
      Runtime: python3.12
      Environment:
        Variables:
          TABLE_NAME: !Ref ProcessedDataTable
      Policies:
        - DynamoDBCrudPolicy:
            TableName: !Ref ProcessedDataTable
      Events:
        StreamEvent:
          Type: Kinesis
          Properties:
            Stream: !GetAtt DataStream.Arn
            BatchSize: 100
            StartingPosition: LATEST

  DataStream:
    Type: AWS::Kinesis::Stream
    Properties:
      ShardCount: 1

  ProcessedDataTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: id
          AttributeType: S
      KeySchema:
        - AttributeName: id
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST
```
```python
import base64
import json
import os
from decimal import Decimal

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table(os.environ['TABLE_NAME'])

def lambda_handler(event, context):
    for record in event['Records']:
        # Kinesis record data is base64 encoded
        payload = base64.b64decode(record['kinesis']['data'])
        data = json.loads(payload)

        # Transform the record (example transformation).
        # DynamoDB does not accept Python floats, so use Decimal.
        processed_data = {
            'id': data['id'],
            'processed_value': Decimal(str(data['value'])) * Decimal('1.1'),
            'timestamp': data['timestamp']
        }

        # Persist the result
        table.put_item(Item=processed_data)

    return {'status': 'success', 'processed': len(event['Records'])}
```
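To feed the pipeline, a producer writes JSON records to the stream. A minimal test-producer sketch, where `my-data-stream` is a placeholder for the deployed stream's physical name and the record fields match what the handler above expects:

```python
import json
import time

import boto3

kinesis = boto3.client('kinesis')

# Send one test record; the handler will multiply 'value' by 1.1
# and write the result to DynamoDB.
kinesis.put_record(
    StreamName='my-data-stream',
    Data=json.dumps({
        'id': 'sensor-001',
        'value': 42.0,
        'timestamp': int(time.time())
    }),
    PartitionKey='sensor-001'  # Records with the same key go to the same shard
)
```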
🤖 3. Serverless API with AI Processing
Build a REST API that analyzes uploaded images with Amazon Rekognition.
```yaml
Resources:
  ImageAnalysisAPI:
    Type: AWS::Serverless::Api
    Properties:
      StageName: prod

  ImageAnalysisFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: image_analysis/
      Handler: app.lambda_handler
      Runtime: python3.12
      # IAM permissions for Rekognition
      Policies:
        - RekognitionDetectOnlyPolicy: {}
      Events:
        ApiEvent:
          Type: Api
          Properties:
            Path: /analyze
            Method: POST
            RestApiId: !Ref ImageAnalysisAPI
```
```python
import base64
import json

import boto3

rekognition = boto3.client('rekognition')

def lambda_handler(event, context):
    # Decode the image from the request body
    body = json.loads(event['body'])
    image_bytes = base64.b64decode(body['image'])

    # Analyze the image with Rekognition
    response = rekognition.detect_labels(
        Image={'Bytes': image_bytes},
        MaxLabels=10,
        MinConfidence=80
    )

    # Extract the label names and confidence scores
    labels = [{
        'Name': label['Name'],
        'Confidence': label['Confidence']
    } for label in response['Labels']]

    return {
        'statusCode': 200,
        'body': json.dumps({'labels': labels})
    }
```
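Once deployed, clients call the endpoint with a base64-encoded image in the JSON body. A sketch using the `requests` library, where the URL is a placeholder for the invoke URL that `sam deploy` prints for the `prod` stage:

```python
import base64

import requests

# Placeholder invoke URL for the deployed prod stage
API_URL = 'https://abc123.execute-api.us-east-1.amazonaws.com/prod/analyze'

# Base64-encode a local image and POST it to the /analyze route
with open('photo.jpg', 'rb') as f:
    encoded = base64.b64encode(f.read()).decode('utf-8')

response = requests.post(API_URL, json={'image': encoded})
print(response.json())  # e.g. {'labels': [{'Name': 'Dog', 'Confidence': 97.1}, ...]}
```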
🧒 6-Year-Old Explanation
Imagine you have a magic camera that can tell you what’s in a picture. You send a photo to a special mailbox (API), and a smart robot (Lambda function) uses the magic camera (Rekognition) to identify everything in the photo. Then it sends you back a list of what it found!
AWS SAM Best Practices for Python
Optimizing Python Serverless Applications
- Use Layers for Dependencies: Package dependencies separately for faster deployment
- Enable X-Ray Tracing: Monitor and debug your applications effectively
- Set Memory Appropriately: Balance memory and CPU allocation for cost efficiency
- Implement Proper Error Handling: Use dead-letter queues for failed invocations
- Optimize Cold Starts: Use provisioned concurrency for critical functions
```yaml
Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: function/
      Handler: app.handler
      Runtime: python3.12
      MemorySize: 1024        # Optimized for this workload
      Timeout: 15             # Avoid excessively long timeouts
      Tracing: Active         # Enable X-Ray
      Layers:
        - !Ref MyDependencyLayer
      DeadLetterQueue:
        Type: SQS
        TargetArn: !GetAtt MyDLQ.Arn
      AutoPublishAlias: live  # Required for provisioned concurrency
      ProvisionedConcurrencyConfig:
        ProvisionedConcurrentExecutions: 5

  MyDependencyLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      ContentUri: layer/
      CompatibleRuntimes:
        - python3.12
      RetentionPolicy: Retain

  MyDLQ:
    Type: AWS::SQS::Queue
```
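On the code side, tracing and error handling follow the same pattern in every handler. A minimal sketch using the `aws-xray-sdk` package (which could ship in the dependency layer above); the failure path logs and re-raises so that, for asynchronous invocations, the event lands in the dead-letter queue once Lambda's retries are exhausted:

```python
import json
import logging

import boto3
from aws_xray_sdk.core import patch_all

# Instrument boto3 so its calls appear as X-Ray subsegments
# (requires Tracing: Active on the function).
patch_all()

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.client('s3')

def handler(event, context):
    try:
        # ... business logic goes here ...
        return {'statusCode': 200, 'body': json.dumps({'ok': True})}
    except Exception:
        # Log, then re-raise: the invocation is marked failed, and for
        # async events Lambda delivers it to the configured DLQ after retries.
        logger.exception('Unhandled error processing event')
        raise
```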
Conclusion
AWS SAM with Python provides a powerful combination for building scalable serverless applications. The examples we’ve covered demonstrate practical implementations for common use cases, from email processing to real-time data pipelines and AI-powered APIs. By following best practices and leveraging SAM’s capabilities, you can create efficient, maintainable serverless applications.
As serverless architectures continue to evolve, AWS SAM remains an essential tool for Python developers. Its local testing capabilities, simplified deployment process, and infrastructure-as-code approach make it ideal for projects of any size. Start with these examples and explore how you can adapt them to your specific needs.