Commit 8c06521

Update README.md

1 parent db5db86 commit 8c06521

1 file changed: 23 additions & 30 deletions

README.md
# saf-lambda-function
This code uses the Serverless Framework to deploy an AWS Lambda function that, when triggered by a file upload to an S3 bucket, runs the [SAF CLI](https://github.com/mitre/saf) with the given input command (`COMMAND_STRING_INPUT`).

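When a file lands in the bucket, the service effectively runs `saf <COMMAND_STRING_INPUT> -i <latest_file_from_bucket>`, and the output is determined by that command. A minimal sketch of the resulting invocation, assuming the `hdf2splunk` example command shown later in this README and an uploaded file named `hdf_file.json`:

```bash
# Conceptual sketch: the trigger appends the input flag for the uploaded file,
# so with the hdf2splunk example COMMAND_STRING_INPUT the service runs:
saf convert hdf2splunk -H 127.0.0.1 -u admin -p Valid_password! -I your_index_name -i hdf_file.json
# For hdf2splunk, this converts the uploaded HDF file and sends the data to your Splunk instance.
```
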
## Prerequisites
1. Clone this repository: `git clone https://github.com/mitre/saf-lambda-function.git`
2. Install the Serverless Framework: `npm install -g serverless`
3. Install the latest dependencies: `npm install`
4. Set your AWS credentials profile and verify your access:
```bash
export AWS_PROFILE=<your_creds_profile_name>
# To ensure your access to AWS, run:
aws s3 ls
```
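
Before continuing, it can help to confirm that the toolchain and credentials are in place. A quick sanity check using standard CLI commands:

```bash
# Confirm the Serverless Framework is installed
serverless --version
# Confirm your AWS credentials resolve to the expected account
aws sts get-caller-identity
```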

## Inputs
### Environment Variables
5. Set the S3 bucket name that you would like to upload your HDF file to:
```bash
export BUCKET=<bucket-name>
```
6. Set the SAF CLI command that the function should run (`COMMAND_STRING_INPUT`). Example:
```bash
export COMMAND_STRING_INPUT="convert hdf2splunk -H 127.0.0.1 -u admin -p Valid_password! -I your_index_name"
```
- NOTE: Do not include the input flag in the command string; it is appended automatically from the S3 bucket trigger (e.g., `-i hdf_file.json`).
- NOTE: Do not include the output flag in the command string. Instead, set the desired output information in `config.json`.
- NOTE: This function does not support `view heimdall`.
- More examples can be found at [SAF CLI Usage](https://github.com/mitre/saf#usage).
- You can verify that the environment variables are set properly by running `env`.

### Config variables
7. Modify any config values you may want to change. These are found in `config.json` and have the following default values:
```
{
  "service-name": "saf-lambda-function",
  ...
}
```

EXAMPLE:
input file: `<BUCKET>/unprocessed/burpsuite_scan.xml`
output file: `<BUCKET>/processed/burpsuite_scan_output.json`

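Once the function has run, you can confirm the result from the command line. A small sketch, assuming the default folder names shown in the EXAMPLE above:

```bash
# List converted output under the default "processed" folder
aws s3 ls "s3://$BUCKET/processed/"
```
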
## Test and Deploy your SAF CLI Lambda function
### Test by invoking locally
8. Create an S3 bucket with the bucket name that you previously set in the `BUCKET` environment variable.
9. Load a file into the "bucket-input-folder" specified in `config.json`.
10. If testing for the first time, run `npm run make-event`. This generates an S3 test event by running `serverless generate-event -t aws:s3 > test/event.json`.
11. Edit the bucket name and key in `test/event.json`:
```
"bucket": {
  "name": "your-bucket-name",
  ...
"object": {
  "key": "your-input-folder/your-file-name.json",
```
12. Run `npm test`.

You should see logging in the terminal and, if `config.json` specifies that the function should upload an output file, an output file in your S3 bucket.

Here, `npm test` runs the command `serverless invoke local --function saf-lambda-function --path test/event.json`.
You can customize the invocation further if needed; see the documentation for [serverless invoke local](https://www.serverless.com/framework/docs/providers/aws/cli-reference/invoke-local).

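If you want to vary the event without editing `test/event.json`, `serverless invoke local` also accepts inline event data via `--data`. A sketch using the same placeholder bucket and key as above:

```bash
# Same as `npm test`, but with the S3 event passed inline instead of read from test/event.json
serverless invoke local --function saf-lambda-function \
  --data '{"Records":[{"s3":{"bucket":{"name":"your-bucket-name"},"object":{"key":"your-input-folder/your-file-name.json"}}}]}'
```
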
### Deploy the service
13. Deploy the service: `sls deploy --verbose`. This may take several minutes.

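To confirm what was deployed, the Serverless Framework's `info` command summarizes the stack:

```bash
# Summarize the deployed service (stage, region, stack, functions)
sls info --verbose
```
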
### Test by invoking via AWS
14. When the service is deployed successfully, log into the AWS console, go to the "Lambda" interface, and set the S3 bucket as the trigger if it is not already shown.
![Screenshot 2022-04-20 at 09-30-41 Functions - Lambda](https://user-images.githubusercontent.com/32680215/164255328-782346f3-689f-458d-8ebe-b3f9af67964a.png)

15. You can test the service by uploading your input file into the bucket (`BUCKET`) that you exported in step 5.

![Screenshot 2022-04-20 at 09-32-39 sls-attempt-three-emcrod - S3 bucket](https://user-images.githubusercontent.com/32680215/164255397-a6b68b51-31da-4228-83eb-bcd5928f315e.png)

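For an end-to-end check from the terminal, you can upload a file and stream the function's logs. A sketch assuming the default `unprocessed` input folder from the EXAMPLE above and a local file named `hdf_file.json`:

```bash
# Trigger the function by uploading an input file...
aws s3 cp hdf_file.json "s3://$BUCKET/unprocessed/"
# ...then stream the function's CloudWatch logs
sls logs --function saf-lambda-function --tail
```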

### Contributing
