How to Upload Files to an S3 Bucket Using AWS CloudFormation Templates: Complete YAML Example


When automating AWS infrastructure deployments, developers often need to provision S3 buckets and pre-populate them with files during stack creation. The standard AWS::S3::Bucket resource alone doesn't handle file uploads - we need additional components.

Here's the implementation, which combines three AWS services (S3, Lambda, and IAM) tied together by a CloudFormation custom resource:

AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyS3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-app-config-bucket
      VersioningConfiguration:
        Status: Enabled
      
  UploadLambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: lambda.amazonaws.com
            Action: sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/AmazonS3FullAccess

  FileUploadFunction:
    Type: AWS::Lambda::Function
    Properties:
      Handler: index.handler
      Role: !GetAtt UploadLambdaRole.Arn
      Runtime: python3.12
      Timeout: 30
      Code:
        ZipFile: |
          import boto3
          import json
          import cfnresponse  # helper CloudFormation bundles with inline (ZipFile) code

          def handler(event, context):
              try:
                  # Upload only when the stack is created or updated
                  if event['RequestType'] in ('Create', 'Update'):
                      s3 = boto3.client('s3')
                      s3.put_object(
                          Bucket='my-app-config-bucket',
                          Key='config/settings.json',
                          Body=json.dumps({'appSettings': {'timeout': 30}})
                      )
                  # Signal the result so CloudFormation doesn't hang waiting on the custom resource
                  cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
              except Exception as e:
                  cfnresponse.send(event, context, cfnresponse.FAILED, {'Error': str(e)})

  LambdaInvoker:
    Type: AWS::CloudFormation::CustomResource
    # Explicit dependency: the bucket name is hardcoded in the function code,
    # so CloudFormation can't infer that the bucket must exist first
    DependsOn: MyS3Bucket
    Properties:
      ServiceToken: !GetAtt FileUploadFunction.Arn
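
Because the template creates an IAM role, stack creation must acknowledge an IAM capability. A minimal deployment sketch with boto3 (the template.yaml file name and the stack name are placeholders, not part of the template above):

import boto3

cfn = boto3.client('cloudformation')

# Read the template shown above (file name is an assumption)
with open('template.yaml') as f:
    template_body = f.read()

# CAPABILITY_IAM is required because the template creates an IAM role
cfn.create_stack(
    StackName='s3-file-upload-demo',
    TemplateBody=template_body,
    Capabilities=['CAPABILITY_IAM'],
)

# Optionally block until creation finishes
cfn.get_waiter('stack_create_complete').wait(StackName='s3-file-upload-demo')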

This solution works through the following workflow:

  1. The template creates an S3 bucket with versioning enabled
  2. A Lambda execution role with S3 permissions is provisioned
  3. A Python Lambda function contains the actual file upload logic
  4. A Custom Resource triggers the Lambda during stack creation, and the Lambda reports success or failure back to CloudFormation (see the sketch after this list)
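
The cfnresponse.send call is what completes the custom resource lifecycle: CloudFormation passes a pre-signed S3 URL in the event, and the function must PUT a JSON status document to it before the stack can proceed. A rough sketch of what the helper does under the hood (simplified; the bundled cfnresponse module also handles logging and the NoEcho flag):

import json
import urllib.request

def send_cfn_response(event, context, status, data=None):
    # Response document CloudFormation expects from a custom resource
    body = json.dumps({
        'Status': status,  # 'SUCCESS' or 'FAILED'
        'Reason': f'See CloudWatch log stream: {context.log_stream_name}',
        'PhysicalResourceId': context.log_stream_name,
        'StackId': event['StackId'],
        'RequestId': event['RequestId'],
        'LogicalResourceId': event['LogicalResourceId'],
        'Data': data or {},
    }).encode('utf-8')

    # CloudFormation supplies a pre-signed URL; the response is an HTTP PUT
    req = urllib.request.Request(event['ResponseURL'], data=body, method='PUT',
                                 headers={'Content-Type': ''})
    urllib.request.urlopen(req)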

For simpler use cases, you can embed AWS CLI commands in EC2 UserData; the instance needs an IAM instance profile with S3 write permissions, and the source file must already exist on the instance:

Resources:
  EC2Instance:
    Type: AWS::EC2::Instance
    Properties:
      # ImageId, InstanceType, and an IamInstanceProfile are also required
      # but omitted here for brevity
      UserData:
        Fn::Base64: |
          #!/bin/bash
          aws s3 cp ./local-file.txt s3://my-bucket/path/ --region us-east-1

Best practices:

  • Always encrypt sensitive files using ServerSideEncryptionConfiguration
  • For large files, use multipart uploads in the Lambda (see the sketch after this list)
  • Set appropriate CannedACL values for public/private access
  • Use DeletionPolicy: Retain for production data buckets
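
As an illustration of the multipart point, boto3's transfer manager switches to multipart uploads automatically once an object crosses a size threshold. A minimal sketch (the bucket name, key, and file path are placeholders):

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client('s3')

# Use multipart uploads for objects larger than 8 MB,
# sending four 8 MB parts concurrently
config = TransferConfig(
    multipart_threshold=8 * 1024 * 1024,
    multipart_chunksize=8 * 1024 * 1024,
    max_concurrency=4,
)

# upload_file drives the multipart API (initiate, upload parts, complete)
s3.upload_file(
    Filename='/tmp/large-asset.bin',
    Bucket='my-app-config-bucket',
    Key='assets/large-asset.bin',
    Config=config,
)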

A refined version of the same pattern tightens the IAM permissions to the single bucket and passes the bucket name into the Lambda through the custom resource properties instead of hardcoding it. The native AWS::S3::Bucket resource still doesn't support uploads directly, so the approach again has two phases: create the bucket, then populate it with a Lambda-backed custom resource:

AWSTemplateFormatVersion: '2010-09-09'
Resources:
  MyS3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-app-config-bucket
      VersioningConfiguration:
        Status: Enabled

  UploadLambda:
    Type: AWS::Lambda::Function
    Properties:
      Handler: index.handler
      Runtime: python3.12
      Timeout: 30
      Code:
        ZipFile: |
          import boto3
          import cfnresponse

          def handler(event, context):
              try:
                  # Upload only on Create/Update; nothing to do on Delete
                  if event['RequestType'] in ('Create', 'Update'):
                      s3 = boto3.client('s3')
                      s3.put_object(
                          Bucket=event['ResourceProperties']['BucketName'],
                          Key='config.json',
                          Body=b'{"appSettings": {"timeout": 30}}',
                      )
                  cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
              except Exception as e:
                  cfnresponse.send(event, context, cfnresponse.FAILED, {'Error': str(e)})
      Role: !GetAtt LambdaExecutionRole.Arn

  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: [lambda.amazonaws.com]
            Action: ['sts:AssumeRole']
      Policies:
        - PolicyName: S3UploadPolicy
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action: ['s3:PutObject']
                Resource: !Sub 'arn:aws:s3:::${MyS3Bucket}/*'

  CustomResource:
    Type: Custom::UploadInitialFiles
    Properties:
      ServiceToken: !GetAtt UploadLambda.Arn
      BucketName: !Ref MyS3Bucket

The solution leverages several AWS services working in concert:

  • Lambda-backed Custom Resource: Triggers file upload after bucket creation
  • IAM Permissions: Grants least-privilege access for the upload operation
  • Bootstrap File Handling: The Lambda can either generate files dynamically (as above) or package them with the function (a sketch of the packaged approach follows this list)
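
If the files are packaged with the function, the Code property has to point at a zipped deployment package in S3 (inline ZipFile code can only hold the handler source), and the cfnresponse helper must be vendored into that package since it is only bundled automatically for inline code. A sketch of such a handler, assuming a seed/ directory shipped alongside it:

import os
import boto3
import cfnresponse  # vendored into the deployment package

# Directory bundled next to the handler (hypothetical layout)
SEED_DIR = os.path.join(os.path.dirname(__file__), 'seed')

def handler(event, context):
    try:
        if event['RequestType'] in ('Create', 'Update'):
            s3 = boto3.client('s3')
            bucket = event['ResourceProperties']['BucketName']
            # Upload every bundled file to the newly created bucket
            for name in os.listdir(SEED_DIR):
                s3.upload_file(os.path.join(SEED_DIR, name), bucket, name)
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception as e:
        cfnresponse.send(event, context, cfnresponse.FAILED, {'Error': str(e)})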

For simpler use cases, consider these variations:

Using AWS CLI in UserData

UserData:
  Fn::Base64: !Sub |
    #!/bin/bash
    # Requires an instance profile that allows s3:PutObject on the bucket,
    # and config.json must already exist on the instance
    aws s3 cp ./config.json s3://${MyS3Bucket}/config.json

S3 Batch Operations

For very large-scale initial loads (for example, copying or transforming millions of existing objects), S3 Batch Operations can invoke a Lambda once per object listed in a manifest. Note that batch jobs are not a native CloudFormation resource type, so they are created through the S3 Control API (aws s3control create-job or boto3), or wrapped in their own custom resource.
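
A rough boto3 sketch of such a job: it invokes UploadLambda for each manifest entry at priority 1 and writes a completion report to the bucket under batch-reports/. The account ID, ARNs, and the manifest object (including its ETag) are placeholders, the manifest CSV must already exist in S3, and the role must be assumable by S3 Batch Operations:

import boto3

s3control = boto3.client('s3control')

# Account ID, ARNs, and the manifest ETag below are placeholders
response = s3control.create_job(
    AccountId='123456789012',
    ConfirmationRequired=False,
    Priority=1,
    RoleArn='arn:aws:iam::123456789012:role/BatchOperationsRole',
    # Invoke the upload/processing Lambda once per object in the manifest
    Operation={
        'LambdaInvoke': {
            'FunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:UploadLambda',
        },
    },
    # CSV manifest listing the objects to process (bucket,key per line)
    Manifest={
        'Spec': {
            'Format': 'S3BatchOperations_CSV_20180820',
            'Fields': ['Bucket', 'Key'],
        },
        'Location': {
            'ObjectArn': 'arn:aws:s3:::my-app-config-bucket/manifests/manifest.csv',
            'ETag': 'etag-of-the-manifest-object',
        },
    },
    # Completion report written back to the bucket
    Report={
        'Bucket': 'arn:aws:s3:::my-app-config-bucket',
        'Format': 'Report_CSV_20180820',
        'Enabled': True,
        'Prefix': 'batch-reports',
        'ReportScope': 'AllTasks',
    },
)
print(response['JobId'])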