<!-- aws/logs_monitoring/README.md -->

# Datadog Forwarder

AWS Lambda function to ship logs and metrics from ELB, S3, CloudTrail, VPC, CloudFront, and CloudWatch logs to Datadog.

## Features

- Forward logs through HTTPS (defaulted to port 443)
- Use AWS Lambda to re-route triggered S3 events to Datadog
- Multiline Log Support (S3 Only)
- Forward custom metrics from logs

## Quick Start

The provided Python script must be deployed into your AWS Lambda service to collect your logs and send them to Datadog.

### 1. Create a new Lambda function

1. [Navigate to the Lambda console](https://console.aws.amazon.com/lambda/home) and create a new function.
2. Select `Author from scratch` and give the function a unique name: `datadog-log-monitoring-function`.
3. For `Role`, select `Create new role from template(s)` and give the role a unique name: `datadog-log-monitoring-function-role`.
4. Under Policy templates, select `s3 object read-only permissions`.

### 2. Provide the code

1. Copy and paste the code of the Lambda function from the `lambda_function.py` file.
2. Set the runtime to `Python 2.7`, `Python 3.6`, or `Python 3.7`.
3. Set the handler to `lambda_function.lambda_handler`.

### 3. Set your Parameters

At the top of the script you'll find a section called `PARAMETERS`; that's where you edit your code. The available parameters are:

#### DD_API_KEY

Set the Datadog API key for your Datadog platform. It can be found here:

* Datadog US Site: https://app.datadoghq.com/account/settings#api
* Datadog EU Site: https://app.datadoghq.eu/account/settings#api

There are three possibilities to set your Datadog API key:

1. **KMS Encrypted key (recommended)**: Use the `DD_KMS_API_KEY` environment variable to use a KMS encrypted key. Make sure that the Lambda execution role is listed in the KMS Key users in https://console.aws.amazon.com/iam/home#encryptionKeys.
2. **Environment Variable**: Use the `DD_API_KEY` environment variable for the Lambda function.
3. **Manual**: Replace `<YOUR_DATADOG_API_KEY>` in the code:

```python
## The Datadog API key associated with your Datadog Account
## It can be found here:
##
##  * Datadog US Site: https://app.datadoghq.com/account/settings#api
##  * Datadog EU Site: https://app.datadoghq.eu/account/settings#api
#
DD_API_KEY = "<YOUR_DATADOG_API_KEY>"
```
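
For illustration, the resolution order of these three options can be sketched in plain Python. This is a hypothetical sketch: `resolve_api_key` is not part of the actual script, and the KMS decryption step is elided to keep the example self-contained.

```python
import os

# Hypothetical sketch of resolving the API key in the priority order
# described above; not the forwarder's actual code.
DD_API_KEY = "<YOUR_DATADOG_API_KEY>"  # the manual fallback from the snippet above

def resolve_api_key():
    if "DD_KMS_API_KEY" in os.environ:
        # Option 1: a KMS-encrypted key would be decrypted here with boto3
        # (e.g. a kms.decrypt call) -- omitted to keep this runnable.
        return os.environ["DD_KMS_API_KEY"]
    if "DD_API_KEY" in os.environ:
        # Option 2: plain environment variable.
        return os.environ["DD_API_KEY"]
    # Option 3: the value hard-coded in the script.
    return DD_API_KEY
```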

#### Custom Tags

Add custom tags to all data forwarded by your function, either:

* Use the `DD_TAGS` environment variable. Your tags must be a comma-separated list of strings with no trailing comma.
* Edit the lambda code directly:

```python
## @param DD_TAGS - list of comma separated strings - optional - default: none
## Pass custom tags as environment variable or through this variable.
## Ensure your tags are a comma separated list of strings with no trailing comma in the envvar!
#
DD_TAGS = os.environ.get("DD_TAGS", "")
```
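
As an illustration, a `DD_TAGS`-style value can be split into a tag list as follows. This is a hedged sketch: `parse_tags` is a hypothetical helper, not the forwarder's actual parsing code.

```python
def parse_tags(raw_tags):
    """Split a comma-separated DD_TAGS-style string into a clean tag list.

    Illustrative only; the forwarder's real parsing may differ.
    """
    return [t.strip() for t in raw_tags.split(",") if t.strip()]

print(parse_tags("env:prod,team:acme"))
# -> ['env:prod', 'team:acme']
```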

#### Datadog Site

Define the Datadog Site to send your data to: `datadoghq.com` for the Datadog US site or `datadoghq.eu` for the Datadog EU site.

#### Proxy

Two environment variables can be used to forward logs through a proxy:

* `DD_URL`: Define the proxy endpoint to forward the logs to.
* `DD_PORT`: Define the proxy port to forward the logs to.
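
A minimal sketch of how these two variables could be read, assuming sensible fallbacks. The default host and port shown here are assumptions for illustration, and the forwarder's actual connection logic may differ.

```python
import os

# Illustrative sketch: read proxy settings with assumed defaults.
# The fallback host and port below are assumptions, not documented values.
host = os.environ.get("DD_URL", "lambda-intake.logs.datadoghq.com")
port = int(os.environ.get("DD_PORT", "10516"))
print((host, port))
```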
## 3. Configure your function

To configure your function:

1. Set the memory to the highest possible value.
2. Set the timeout limit as well; 120 seconds is recommended to deal with big files.
3. Hit the `Save and Test` button.

## 4. Test it

If the test "succeeded", you are all set! The test log doesn't show up in the platform.

**Note**: For S3 logs, there may be some latency between the time the first S3 log file is posted and the Lambda function wakes up.

## 5. (optional) Scrubbing / Redaction rules

Multiple scrubbing options are available. `REDACT_IP` and `REDACT_EMAIL` match against hard-coded patterns, while `DD_SCRUBBING_RULE` allows users to supply a regular expression.

- To use `REDACT_IP`, add it as an environment variable and set the value to `true`.
  - Text matching `\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}` is replaced with `xxx.xxx.xxx.xxx`.
- To use `REDACT_EMAIL`, add it as an environment variable and set the value to `true`.
  - Text matching `[a-zA-Z0-9_.+-]+@[a-zA-Z0-9-]+\.[a-zA-Z0-9-.]+` is replaced with `xxxxx@xxxxx.com`.
- To use `DD_SCRUBBING_RULE`, add it as an environment variable, and supply a regular expression as the value.
  - Text matching the user-supplied regular expression is replaced with `xxxxx` by default.
  - Use the `DD_SCRUBBING_RULE_REPLACEMENT` environment variable to supply a replacement value instead of `xxxxx`.
- Scrubbing rules are applied to the full JSON-formatted log, including any metadata that is automatically added by the Lambda function.
- Each instance of a pattern match is replaced until no more matches are found in each log.
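
The behavior described above can be sketched with `re.sub`, which replaces every non-overlapping match in a string. This is an illustrative example using the `REDACT_IP` pattern quoted above, not the forwarder's actual code.

```python
import re

# Illustrative scrubbing sketch using the hard-coded REDACT_IP pattern
# quoted above; the forwarder's actual implementation may differ.
IP_PATTERN = r"\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}"

def scrub(log_line, pattern=IP_PATTERN, replacement="xxx.xxx.xxx.xxx"):
    # re.sub replaces each instance until no more matches remain.
    return re.sub(pattern, replacement, log_line)

print(scrub("client 10.0.0.1 connected to 192.168.1.5"))
# -> client xxx.xxx.xxx.xxx connected to xxx.xxx.xxx.xxx
```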

## 6. (optional) Filtering rules

Use the `EXCLUDE_AT_MATCH` or `INCLUDE_AT_MATCH` environment variable to filter logs based on a regular expression match:

- To use `EXCLUDE_AT_MATCH`, add it as an environment variable and set its value to a regular expression. Logs matching the regular expression are excluded.
- To use `INCLUDE_AT_MATCH`, add it as an environment variable and set its value to a regular expression. If not excluded by `EXCLUDE_AT_MATCH`, logs matching the regular expression are included.
- If a log matches both the inclusion and exclusion criteria, it is excluded.
- Filtering rules are applied to the full JSON-formatted log, including any metadata that is automatically added by the function.
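
The filtering semantics above can be sketched as follows. `should_forward` is a hypothetical helper, not the forwarder's actual implementation.

```python
import re

# Illustrative sketch of the filtering semantics described above.
# Returns True if the log should be forwarded.
def should_forward(log_line, exclude_at_match=None, include_at_match=None):
    # Exclusion wins when both rules match.
    if exclude_at_match and re.search(exclude_at_match, log_line):
        return False
    if include_at_match:
        return bool(re.search(include_at_match, log_line))
    return True

print(should_forward('{"status": "debug"}', exclude_at_match="debug"))  # -> False
print(should_forward('{"status": "error"}', include_at_match="error"))  # -> True
```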

## 7. (optional) Multiline Log support for S3

If there are multiline logs in S3, set the `DD_MULTILINE_LOG_REGEX_PATTERN` environment variable to the regex pattern that detects the start of a new log line.

- Example: for multiline logs beginning with the pattern `11/10/2014`: `DD_MULTILINE_LOG_REGEX_PATTERN="\d{2}\/\d{2}\/\d{4}"`
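
As an illustration, splitting a multiline S3 payload on such a start-of-line pattern could look like this. This is a hedged sketch (Python 3.7+ for zero-width splits); `split_multiline` is hypothetical, not the forwarder's actual code.

```python
import re

# Illustrative sketch: split a multiline S3 payload into individual logs
# using a "start of new log" pattern like DD_MULTILINE_LOG_REGEX_PATTERN.
def split_multiline(payload, start_pattern=r"\d{2}\/\d{2}\/\d{4}"):
    # A lookahead split keeps the matched date prefix with each entry.
    # Requires Python 3.7+ (zero-width pattern splits).
    entries = re.split(r"(?={})".format(start_pattern), payload)
    return [e.strip() for e in entries if e.strip()]

payload = "11/10/2014 first event\ncontinued line\n12/10/2014 second event"
print(split_multiline(payload))
# -> ['11/10/2014 first event\ncontinued line', '12/10/2014 second event']
```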

## 8. (optional) Forward Metrics from Logs

For example, if you have a Lambda function that powers a performance-critical task (e.g., a consumer-facing API), you can avoid the added latency of submitting metrics via API calls by writing custom metrics to CloudWatch Logs using the appropriate Datadog Lambda Layer (e.g., the [Lambda Layer for Python](https://github.com/DataDog/datadog-lambda-layer-python)). The log forwarder automatically detects log entries that contain metrics and forwards them to the Datadog metric intake.

The [Datadog Lambda Layer for Python 2.7, 3.6, or 3.7](https://github.com/DataDog/datadog-lambda-layer-python) **MUST** be added to the log forwarder Lambda function to enable metric forwarding. Use the Lambda layer ARN below, and replace `us-east-1` with the actual AWS region where your log forwarder operates and replace `Python27` with the Python runtime your function uses (`Python27`, `Python36`, or `Python37`).
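
For a rough idea of the mechanism, a custom metric can be written to CloudWatch Logs as a one-line JSON entry that the forwarder then picks up. The sketch below is hypothetical: the exact field schema is defined by the Datadog Lambda Layer, and the key names and helper here are illustrative only.

```python
import json
import time

# Hypothetical sketch of emitting a custom metric as a one-line JSON log
# entry. The field names below are illustrative; consult the Datadog
# Lambda Layer documentation for the actual schema it writes.
def log_custom_metric(name, value, tags=None):
    entry = {
        "metric": name,
        "value": value,
        "timestamp": int(time.time()),
        "tags": tags or [],
    }
    line = json.dumps(entry)
    print(line)  # CloudWatch Logs captures stdout from a Lambda function
    return line

log_custom_metric("api.request.duration", 27.5, tags=["env:prod"])
```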