This PR adds the issue data stream and an associated dashboard. Cyera fields are mapped to their corresponding ECS fields where possible. Test samples were derived from live data samples, which were subsequently sanitized.
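For illustration of what "mapped to their corresponding ECS fields" typically means in an Elastic integration: the package's ingest pipeline renames or copies vendor fields onto ECS equivalents. The sketch below is hypothetical (the field names `json.issueId` and `json.createdDate` are illustrative assumptions, not taken from this PR), showing the general shape of such a pipeline body:

```json
{
  "processors": [
    {
      "rename": {
        "field": "json.issueId",
        "target_field": "event.id",
        "ignore_missing": true
      }
    },
    {
      "date": {
        "field": "json.createdDate",
        "target_field": "@timestamp",
        "formats": ["ISO8601"],
        "if": "ctx.json?.createdDate != null"
      }
    }
  ]
}
```

The `rename` and `date` processors are standard Elasticsearch ingest processors; the actual pipeline shipped in the package will differ in its field list.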
**packages/cyera/_dev/build/docs/README.md** (14 additions, 6 deletions)
```diff
@@ -8,27 +8,27 @@ The Cyera integration for Elastic allows you to collect logs and visualize the d
 
 ### Compatibility
 
-The Cyera integration supports the following versions of Cyera APIs.
+This integration is compatible with different versions of the Cyera APIs for the respective data streams, as shown below:
 
 | Data streams   | Version |
 |----------------|---------|
 | Classification | v1      |
 | Issue          | v3      |
-| Datastore      | v2      |
-| Event          | v1      |
 
 ### How it works
 
-This integration periodically queries the Cyera API to retrieve classifications.
+This integration periodically queries the Cyera API to retrieve classifications and issues.
 
 ## What data does this integration collect?
 
 This integration collects log messages of the following types:
 
 - `Classification`: Collects classifications that have been identified by the Cyera system.
 
+- `Issue`: Collects issues that have been identified by the Cyera system.
+
 ### Supported use cases
 
-Integrating the Cyera Classification data stream with Elastic SIEM provides visibility into sensitive data classification across cloud and SaaS environments. By correlating Cyera’s classification intelligence with Elastic analytics, security teams can strengthen data security posture and simplify compliance. Dashboards in Kibana present breakdowns by sensitivity, category, and trends over time, enabling faster investigations and improved accountability.
+Integrating the Cyera Classification and Issue data streams with Elastic SIEM provides visibility into both sensitive data classification and the risks associated with that data across cloud and SaaS environments. By correlating Cyera’s classification intelligence with issue context in Elastic analytics, security teams can strengthen data security posture, accelerate incident response, and simplify compliance. Dashboards in Kibana present breakdowns by sensitivity, category, severity, status, risk status, and trends over time, enabling faster investigations, better prioritization, and improved accountability.
 
 ## What do I need to use this integration?
 
```
```diff
@@ -106,6 +106,14 @@ For more information on architectures that can be used for scaling this integrat
 
 {{event "classification"}}
 
+### Issue
+
+{{fields "issue"}}
+
+#### Example event
+
+{{event "issue"}}
+
 ### Inputs used
 
 These inputs can be used in this integration:
```
```diff
@@ -114,4 +122,4 @@ These inputs can be used in this integration:
 
 #### ILM Policy
 
-To facilitate classification data, source data stream-backed indices `.ds-logs-cyera.<data_stream_name>-*` are allowed to contain duplicates from each polling interval. The ILM policy `logs-cyera.<data_stream_name>-default_policy` is applied to these source indices so that they do not grow unbounded; data in these indices is deleted `30 days` after its ingestion date.
+To facilitate classification and issues data, source data stream-backed indices `.ds-logs-cyera.<data_stream_name>-*` are allowed to contain duplicates from each polling interval. The ILM policy `logs-cyera.<data_stream_name>-default_policy` is applied to these source indices so that they do not grow unbounded; data in these indices is deleted `30 days` after its ingestion date.
```
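The 30-day retention described in the ILM section corresponds to a standard Elasticsearch ILM delete phase. As a sketch only (the actual `logs-cyera.<data_stream_name>-default_policy` shipped with the package may differ in detail), such a policy body looks roughly like:

```json
{
  "policy": {
    "phases": {
      "delete": {
        "min_age": "30d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
```

A body like this would be installed via `PUT _ilm/policy/<policy_name>` and attached to the backing indices through the data stream's index template.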