
dd-aws-lambda-functions

Repository of Lambda functions that process AWS log streams and send data to Datadog.

Overview

This project contains Lambda functions that process AWS log streams and send data to Datadog, along with small tools to easily update these functions in a development environment.

The development workflow is to back each Lambda function with a zip file hosted on Amazon S3. To publish a new version of a function, update the zip file, push it to S3, and update the Lambda function's code.

Each Lambda function retrieves the Datadog API keys by decrypting a KMS-encrypted secret at runtime.
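
A minimal sketch of that retrieval using boto3; the variable name and ciphertext value below are placeholders for illustration, not the repository's actual code:

    import base64
    import boto3

    # Hypothetical sketch: decrypt the Datadog API key at function start-up.
    # ENCRYPTED_API_KEY is a placeholder for the base64-encoded KMS ciphertext
    # pasted into main.py.
    ENCRYPTED_API_KEY = "<base64-encoded KMS ciphertext>"

    kms = boto3.client("kms")
    datadog_api_key = kms.decrypt(
        CiphertextBlob=base64.b64decode(ENCRYPTED_API_KEY)
    )["Plaintext"]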

Getting started

  1. Install awscli

    pip install awscli 

    You'll need write access to an S3 bucket and permission to call lambda:UpdateFunctionCode.

  2. Generate base.zip

    rake build-base 

    base.zip contains datadogpy and its dependencies.
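
Because base.zip bundles datadogpy, a packaged function can report to Datadog through that library once the keys are decrypted. A minimal, hypothetical sketch; the key values, metric name, and tags are placeholders:

    from datadog import initialize, api

    # Hypothetical sketch: submit a metric with datadogpy.
    # The key values, metric name, and tags are placeholders.
    initialize(api_key="<decrypted Datadog API key>",
               app_key="<decrypted Datadog app key>")
    api.Metric.send(metric="aws.lambda.example", points=42, tags=["source:lambda"])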

How to use an existing function (e.g. rds_enhanced_monitoring)

  1. Pick an S3 bucket in which to store the packaged Lambda function

  2. Initialize the function in the AWS console (see below)

  3. Update the KMS secret in main.py (a sketch of producing the encrypted value follows this list)

  4. Package and push the function

    rake push[functionname,bucket] 
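
The encrypted value pasted into main.py can be produced with the same KMS key. A minimal sketch; the key alias and plaintext value below are hypothetical placeholders:

    import base64
    import boto3

    # Hypothetical sketch: encrypt the Datadog API key so the ciphertext can be
    # pasted into main.py. The key alias and plaintext value are placeholders.
    kms = boto3.client("kms")
    ciphertext = kms.encrypt(
        KeyId="alias/datadog",
        Plaintext="<your Datadog API key>",
    )["CiphertextBlob"]
    print(base64.b64encode(ciphertext))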

How to create a new function

  1. Initialize the function locally

    rake init[functionname] 

    This creates a "hello" Lambda function locally (a sketch of the handler follows this list)

  2. Use this function (see above)
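
For reference, the scaffolded "hello" function amounts to little more than a handler named main.lambda_handler. The body below is a hypothetical sketch, not the template's exact contents:

    # Hypothetical sketch of the scaffolded main.py; the real template may differ.
    def lambda_handler(event, context):
        print("hello lambda")
        return "hello lambda"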

How to update an existing function

  1. Update the function's code

  2. Double-check that the KMS secret in main.py is up to date

  3. Package and push the function

    rake push[functionname,bucket] 

How to initialize a lambda function in the AWS console

  1. Create a KMS key for the Datadog API key and application key

  2. Create and configure a lambda function

    • In the AWS Console, create a lambda_execution policy with the following policy document:

      { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents" ], "Resource": "arn:aws:logs:*:*:*" }, { "Effect": "Allow", "Action": [ "kms:Decrypt" ], "Resource": [ "<KMS ARN>" ] } ] } 
    • Create a lambda_execution role and attach this policy

    • Create a Lambda function: skip the blueprint, name it functionname, set the Runtime to Python 2.7, the handler to main.lambda_handler, and the role to lambda_execution. The actual function code can be anything at this step (like print 'hello lambda'), since the code entry type will be a zip file from S3.

    • Subscribe the function to the appropriate log stream (a scripted sketch follows below)
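
Subscribing a log group to the function can also be scripted with boto3. A hedged sketch; the log group name, filter name, and destination ARN are placeholders, and the function must already allow CloudWatch Logs to invoke it:

    import boto3

    # Hypothetical sketch: subscribe a CloudWatch log group to the Lambda function.
    # The log group name, filter name, and destination ARN are placeholders.
    logs = boto3.client("logs")
    logs.put_subscription_filter(
        logGroupName="<log group to forward>",
        filterName="datadog-forwarder",
        filterPattern="",  # an empty pattern forwards every log event
        destinationArn="arn:aws:lambda:<region>:<account-id>:function:functionname",
    )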
