Use Case: Import the values of a DynamoDB table, previously exported to an S3 bucket in a source account (AWS_AccountA), into a DynamoDB table in a different account (AWS_AccountB).
Approach: Create an AWS Lambda function in the destination account, where the DynamoDB table exists, that reads the exported file from the S3 bucket in the source account and writes the items into the table.
Reference: https://dev.to/aws-builders/export-aws-dynamodb-values-to-s3-bucket-1733/
- Create a Lambda - dynamodb-import
- Create environment variables attached to the Lambda, and change the values to match the S3 bucket name and file name in the source account and the DynamoDB table in the destination account.
- Create an IAM role with all the necessary permissions and attach it to the AWS Lambda.
1. Lambda Name: dynamodb-import
import boto3
import json
import os


def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    s3 = boto3.client('s3')

    # Read configuration from the Lambda environment variables
    source_s3_bucket = os.environ['source_s3_bucket']
    file_key = os.environ['file_key']
    table_name = os.environ['destination_dynamodb_table']
    table = dynamodb.Table(table_name)

    try:
        # Download the exported file from the source account's S3 bucket
        response = s3.get_object(Bucket=source_s3_bucket, Key=file_key)
        file_content = response['Body'].read().decode('utf-8')
        items = json.loads(file_content)

        # Write all items to the destination DynamoDB table in batches
        with table.batch_writer() as batch:
            for item in items:
                batch.put_item(Item=item)

        return {
            'statusCode': 200,
            'body': json.dumps(f'Imported {len(items)} items to DynamoDB')
        }
    except Exception as e:
        print(f"An error occurred:\n{e}")
        return {
            'statusCode': 500,
            'body': f'Error: {str(e)}'
        }
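The handler expects the S3 object to be a plain JSON array of items whose attributes match the destination table's schema (not DynamoDB-typed JSON). A minimal sketch of what AccountA_File.json could look like - the attribute names and values here are assumptions for illustration only:

[
  {"id": "1", "name": "item-one", "quantity": 10},
  {"id": "2", "name": "item-two", "quantity": 25}
]

One thing to keep in mind: the boto3 DynamoDB resource API does not accept Python floats, so numeric attributes in the file should be integers or be converted to Decimal before calling put_item.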
2. Environment variables for the AWS Lambda
destination_dynamodb_table : AccountB_import_table
file_key : AccountA_File.json
source_s3_bucket : AccountA_S3_bucket_Name
Note: file_key is the name of the file in the source account's S3 bucket whose contents we are importing into the destination DynamoDB table.
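The variables can be set in the Lambda console, or scripted. A minimal boto3 sketch, assuming the function name and the placeholder values used in this post:

import boto3

# Set the three environment variables on the import Lambda
lambda_client = boto3.client('lambda')
lambda_client.update_function_configuration(
    FunctionName='dynamodb-import',
    Environment={
        'Variables': {
            'source_s3_bucket': 'AccountA_S3_bucket_Name',
            'file_key': 'AccountA_File.json',
            'destination_dynamodb_table': 'AccountB_import_table'
        }
    }
)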
3. IAM Role Attached to the Lambda
3.1. IAM Role Policy
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::Source_Account_S3Bucket_Name", "arn:aws:s3:::Source_Account_S3Bucket_Name/*" ] }, { "Effect": "Allow", "Action": [ "dynamodb:BatchWriteItem", "dynamodb:PutItem", "dynamodb:UpdateItem" ], "Resource": "arn:aws:dynamodb:AWS_Region:Destination_AWS_Account_Id:table/*" }, { "Effect": "Allow", "Action": "logs:CreateLogGroup", "Resource": "arn:aws:logs:AWS_Region:Destination_AWS_Account_Id:*" }, { "Effect": "Allow", "Action": [ "logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource": [ "arn:aws:logs:eu-west-1:Destination_AWS_Account_Id:log-group:/aws/lambda/dynamodb-import:*" ] } ] }
3.2. IAM Role Trust Relationship
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "lambda.amazonaws.com" }, "Action": "sts:AssumeRole" } ] }
Conclusion: With the Lambda, environment variables, and IAM role in place, the values in the file in the source account's S3 bucket are imported into the AWS DynamoDB table in the destination account.
💬 If you enjoyed reading this blog post and found it informative, please take a moment to share your thoughts by leaving a review and liking it 😀, and follow me on dev.to and LinkedIn.
Reference:
1. My earlier blog post on exporting AWS DynamoDB table values to an S3 bucket using Python Boto3: https://dev.to/aws-builders/export-aws-dynamodb-values-to-s3-bucket-1733