# saf-lambda-function
This code uses the Serverless Framework to deploy an AWS Lambda function that, when triggered by a file uploaded to an S3 bucket, runs the [SAF CLI](https://github.com/mitre/saf) with the given input command (`COMMAND_STRING_INPUT`).
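For instance, a hypothetical command string might look like the following (`convert hdf2csv` is assumed here purely for illustration; use whichever SAF CLI subcommand fits your pipeline):

```shell
# Hypothetical example: COMMAND_STRING_INPUT holds everything that goes
# between `saf` and the `-i <file>` argument the function appends.
export COMMAND_STRING_INPUT="convert hdf2csv"

# The Lambda would then effectively run:
#   saf convert hdf2csv -i <latest_file_from_bucket>
echo "$COMMAND_STRING_INPUT"
```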
## Prerequisites
1. Clone this repository: `git clone https://github.com/mitre/saf-lambda-function.git`
2. Install the Serverless Framework: `npm install -g serverless`
3. Install the latest dependencies: `npm install`.
## Test and Deploy your SAF CLI Lambda function
### Test by invoking locally

8. Create an AWS S3 bucket with the bucket name that you previously specified as an environment variable.
9. Load a file into the "bucket-input-folder" specified in `config.json`.
10. If testing for the first time, run `npm run make-event`. This generates an S3 test event by running the command `serverless generate-event -t aws:s3 > test/event.json`.
11. Edit the bucket name and key in `test/event.json`.
12. Run `npm test`.

You should see logging in the terminal and, if `config.json` specifies that the function should upload an output file, an output file uploaded to your S3 bucket.

Here, `npm test` runs the command: `serverless invoke local --function saf-lambda-function --path test/event.json`.
You can adjust the invocation further if needed; see the documentation for [serverless invoke local](https://www.serverless.com/framework/docs/providers/aws/cli-reference/invoke-local).
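The edits in step 11 target the bucket and object fields of the generated event. A sketch of the relevant portion of `test/event.json`, trimmed to those fields (the bucket name and key shown are hypothetical placeholders):

```json
{
  "Records": [
    {
      "eventSource": "aws:s3",
      "s3": {
        "bucket": {
          "name": "my-saf-bucket",
          "arn": "arn:aws:s3:::my-saf-bucket"
        },
        "object": {
          "key": "bucket-input-folder/sample-hdf.json"
        }
      }
    }
  ]
}
```

The generated file contains additional event metadata; only the bucket name and object key need to match your setup.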
### Deploy the service
13. Run `sls deploy --verbose`. This may take several minutes.
### Test by invoking via AWS
14. When the service is deployed successfully, log into the AWS console, go to the "Lambda" interface, and set the S3 bucket as the trigger if it is not already shown.

15. You can test the service by uploading your input file into the `bucket-name` that you exported in step 2.

### Expected Output

The service runs `saf <COMMAND_STRING_INPUT> -i <latest_file_from_bucket>`, so the output is determined by that command. For example, with the `convert hdf2splunk` command, the service converts the uploaded HDF file and sends the data to your Splunk instance.
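As an alternative to wiring up the trigger in the AWS console (step 14), the Serverless Framework can declare the S3 trigger in `serverless.yml`. A minimal sketch, assuming the function name from this README; the handler path and bucket name are hypothetical, and `existing: true` attaches to a bucket that was created outside this stack:

```yaml
functions:
  saf-lambda-function:
    handler: handler.run          # hypothetical handler module/function
    events:
      - s3:
          bucket: my-saf-bucket   # hypothetical pre-existing bucket
          event: s3:ObjectCreated:*
          existing: true
```

With this in place, redeploying via `sls deploy` configures the trigger, so no manual console step is needed.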