thulab/iotdb-kairosdb

Overview

iotdb-kairosdb (IKR) is a RESTful service that lets you use IoTDB through the KairosDB REST API.

Usage

Configuration options are in conf/config.properties.

To start the RESTful service:

$ ./start-rest-service.sh

Doc

IKR Deployment and Test Cases

Contents

1 Test Environment Deployment

1.1 Test Environment Requirements

  • Ubuntu 16.04
  • Java 8
  • Maven
  • Git

1.2 IoTDB Deployment

  1. Download IoTDB
$ git clone https://github.com/apache/incubator-iotdb.git 
  2. Install IoTDB
$ cd incubator-iotdb
$ mvn clean install -Dmaven.test.skip=true 
  3. Start IoTDB in the background
$ nohup ./iotdb/iotdb/bin/start-server.sh & 
  4. Stop IoTDB (only needed when you want to shut IoTDB down)
$ ./iotdb/iotdb/bin/stop-server.sh 

1.3 IKR Deployment

If IKR and IoTDB are on the same machine, steps 1 and 2 can be skipped.

  1. In the IKR working directory, download IoTDB
$ git clone https://github.com/apache/incubator-iotdb.git 
  2. Install IoTDB
$ mvn clean install -Dmaven.test.skip=true 
  3. Download IKR
$ git clone https://github.com/thulab/iotdb-kairosdb.git
$ cd iotdb-kairosdb 
  4. Configure IKR
$ vim conf/config.properties 

Set HOST and PORT to the IP address and port of the machine where IoTDB runs.

  5. Start IKR in the background
$ nohup ./start-rest-service.sh & 
  6. Stop IKR (only needed when you want to shut IKR down)
$ ./stop-rest-service-daemon.sh 
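For orientation, the relevant part of conf/config.properties might look like the fragment below. Only the HOST and PORT keys are mentioned in this document; the values shown here are placeholders (6667 is IoTDB's usual default RPC port), and the shipped default file may contain additional keys.

```properties
# Address and port of the IoTDB server that IKR connects to (placeholder values)
HOST=127.0.0.1
PORT=6667
```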

2 Test Cases

2.1 Write Test Case

  1. Write a test JSON file to serve as the body of the write request
$ vim insert.json 

Enter the following JSON:

[ { "name": "archive_file_tracked", "datapoints": [ [1359788400000, 123.3], [1359788300000, 13.2 ], [1359788410000, 23.1 ] ], "tags": { "host": "server1", "data_center": "DC1" } }, { "name": "archive_file_search", "timestamp": 1359786400000, "value": 321, "tags": { "host": "server2" } } ] 
  2. Send the write request to the IKR service
$ curl -XPOST -H'Content-Type: application/json' -d @insert.json http://[host]:[port]/api/v1/datapoints 
  3. Query the written data over HTTP to check that it was written correctly

Write the query JSON file

$ vim query.json 

Enter the following JSON:

{
  "start_absolute": 1,
  "end_relative": {
    "value": "5",
    "unit": "days"
  },
  "time_zone": "Asia/Kabul",
  "metrics": [
    {
      "name": "archive_file_tracked"
    },
    {
      "name": "archive_file_search"
    }
  ]
}

Send the query request to the IKR service

curl -XPOST -H'Content-Type: application/json' -d @query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":3,"results":[{"name":"archive_file_tracked","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1"],"data_center":["DC1"]},"values":[[1359788300000,13.2],[1359788400000,123.3],[1359788410000,23.1]]}]},{"sample_size":1,"results":[{"name":"archive_file_search","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server2"]},"values":[[1359786400000,"321"]]}]}]} 

For readability, the JSON string above formats as:

{ "queries": [ { "sample_size": 3, "results": [ { "name": "archive_file_tracked", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1" ], "data_center": [ "DC1" ] }, "values": [ [ 1359788300000, 13.2 ], [ 1359788400000, 123.3 ], [ 1359788410000, 23.1 ] ] } ] }, { "sample_size": 1, "results": [ { "name": "archive_file_search", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server2" ] }, "values": [ [ 1359786400000, "321" ] ] } ] } ] } 

The query result matches the written data, so the write succeeded.
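The consistency check in the last step can be done mechanically: IKR returns the data points ordered by timestamp, so sorting the written points should reproduce the returned value list. A minimal sketch, with the datapoints copied from insert.json and the response above:

```python
# Datapoints sent for archive_file_tracked (copied from insert.json)
written = [[1359788400000, 123.3], [1359788300000, 13.2], [1359788410000, 23.1]]

# Values returned by the query for the same metric (copied from the response)
returned = [[1359788300000, 13.2], [1359788400000, 123.3], [1359788410000, 23.1]]

# The service returns points ordered by timestamp, so sort the written data first
assert sorted(written) == returned
print("write verified")
```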

2.2 Query Tests

2.2.1 Basic Query Test Case

  1. Prepare the test data for the query tests; the JSON to write is
[
  {
    "name": "test_query",
    "datapoints": [
      [1400000000000, 12.3],
      [1400000001000, 13.2],
      [1400000002000, 23.1],
      [1400000003000, 24.0],
      [1400000004000, 24.1],
      [1400000009000, 24.6],
      [1400000010000, 24.7],
      [1400000011000, 24.8],
      [1400000012000, 24.9],
      [1400000013000, 25.0],
      [1400000014000, 25.1],
      [1400000015000, 25.2],
      [1400000016000, 25.3],
      [1400000017000, 25.4],
      [1400000023000, 26.0],
      [1400000024000, 26.1],
      [1400000025000, 26.2],
      [1400000026000, 26.3],
      [1400000027000, 26.4]
    ],
    "tags": {
      "host": "server1",
      "data_center": "DC1"
    }
  },
  {
    "name": "test_query",
    "datapoints": [
      [1400000005000, 24.2],
      [1400000006000, 24.3],
      [1400000007000, 24.4],
      [1400000008000, 24.5],
      [1400000018000, 25.5],
      [1400000019000, 25.6],
      [1400000020000, 25.7],
      [1400000021000, 25.8],
      [1400000022000, 25.9]
    ],
    "tags": {
      "host": "server2",
      "data_center": "DC1"
    }
  }
]

The write procedure is the same as in the write test:

$ curl -XPOST -H'Content-Type: application/json' -d @insert.json http://[host]:[port]/api/v1/datapoints 
  2. Basic query test case

A basic query (also called a simple query) specifies the time range, the metric, and the tags to query in its JSON. The result returns the raw data points.

Write the test JSON file

$ vim query.json 

Enter the following JSON:

{
  "start_absolute": 1,
  "end_relative": {
    "value": "5",
    "unit": "days"
  },
  "time_zone": "Asia/Kabul",
  "metrics": [
    {
      "name": "test_query"
    }
  ]
}

Send the query request to the IKR service

curl -XPOST -H'Content-Type: application/json' -d @query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000002000,23.1],[1400000003000,24.0],[1400000004000,24.1],[1400000005000,24.2],[1400000006000,24.3],[1400000007000,24.4],[1400000008000,24.5],[1400000009000,24.6],[1400000010000,24.7],[1400000011000,24.8],[1400000012000,24.9],[1400000013000,25.0],[1400000014000,25.1],[1400000015000,25.2],[1400000016000,25.3],[1400000017000,25.4],[1400000018000,25.5],[1400000019000,25.6],[1400000020000,25.7],[1400000021000,25.8],[1400000022000,25.9],[1400000023000,26.0],[1400000024000,26.1],[1400000025000,26.2],[1400000026000,26.3],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size":28, "results":[ { "name":"test_query", "group_by":[ { "name":"type", "type":"number" }], "tags":{ "host":["server1","server2"], "data_center":["DC1"] }, "values":[ [1400000000000,12.3], [1400000001000,13.2], [1400000002000,23.1], [1400000003000,24.0], [1400000004000,24.1], [1400000005000,24.2], [1400000006000,24.3], [1400000007000,24.4], [1400000008000,24.5], [1400000009000,24.6], [1400000010000,24.7], [1400000011000,24.8], [1400000012000,24.9], [1400000013000,25.0], [1400000014000,25.1], [1400000015000,25.2], [1400000016000,25.3], [1400000017000,25.4], [1400000018000,25.5], [1400000019000,25.6], [1400000020000,25.7], [1400000021000,25.8], [1400000022000,25.9], [1400000023000,26.0], [1400000024000,26.1], [1400000025000,26.2], [1400000026000,26.3], [1400000027000,26.4] ] }] }] } 
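Note that the two written series share the metric name test_query and differ only in their tags, so the basic query merges them into one value list. A quick sketch of the expected merge, using the second offsets of the timestamps from the two write bodies above:

```python
# Second offsets (from 1400000000000) of the points written per series
server1 = [0, 1, 2, 3, 4, 9, 10, 11, 12, 13, 14, 15, 16, 17, 23, 24, 25, 26, 27]
server2 = [5, 6, 7, 8, 18, 19, 20, 21, 22]

# The query response interleaves both series into one timestamp-ordered list
merged = sorted(server1 + server2)

assert merged == list(range(28))   # offsets 0..27, no gaps
assert len(merged) == 28           # matches "sample_size": 28
print("merged", len(merged), "points")
```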

2.2.2 Aggregation Query Test Cases

An aggregation query is an analytical query built on a basic query by adding an aggregators field. The aggregation test cases below reuse the test data written in the basic query test.

2.2.2.1 Average Aggregation Query Test Case (avg)

Create avg_query.json

$ vim avg_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "tags": { "host": [ "server2" ] }, "aggregators": [ { "name": "avg", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @avg_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":9,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server2"],"data_center":["DC1"]},"values":[[1400000005000,24.25],[1400000007000,24.45],[1400000018000,25.5],[1400000019000,25.65],[1400000021000,25.85]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 9, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000005000, 24.25 ], [ 1400000007000, 24.45 ], [ 1400000018000, 25.5 ], [ 1400000019000, 25.65 ], [ 1400000021000, 25.85 ] ] } ] } ] } 
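The avg values above can be reproduced locally. Judging from the responses in this document (not from any IKR specification), buckets are 2-second windows aligned to start_absolute = 1, and each output row is labelled with the earliest timestamp that falls in its window. A sketch under that assumption:

```python
# server2 datapoints from the test data (timestamp in ms, value)
points = [(1400000005000, 24.2), (1400000006000, 24.3), (1400000007000, 24.4),
          (1400000008000, 24.5), (1400000018000, 25.5), (1400000019000, 25.6),
          (1400000020000, 25.7), (1400000021000, 25.8), (1400000022000, 25.9)]

START, WIDTH = 1, 2000   # start_absolute and the 2-second sampling, in ms

buckets = {}
for t, v in points:
    buckets.setdefault((t - START) // WIDTH, []).append((t, v))

# One output row per bucket: earliest timestamp, mean of the values
avg = [(min(t for t, _ in b), sum(v for _, v in b) / len(b))
       for b in (buckets[k] for k in sorted(buckets))]

expected = [(1400000005000, 24.25), (1400000007000, 24.45),
            (1400000018000, 25.5), (1400000019000, 25.65), (1400000021000, 25.85)]
assert all(t == te and abs(v - ve) < 1e-9 for (t, v), (te, ve) in zip(avg, expected))
```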

2.2.2.2 Standard Deviation Aggregation Query Test Case (dev)

Create dev_query.json

$ vim dev_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "dev", "sampling": { "value": 2, "unit": "seconds" }, "return_type":"value" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @dev_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,0.0],[1400000001000,7.000357133746822],[1400000003000,0.07071067811865576],[1400000005000,0.07071067811865576],[1400000007000,0.07071067811865576],[1400000009000,0.07071067811865325],[1400000011000,0.07071067811865325],[1400000013000,0.07071067811865576],[1400000015000,0.07071067811865576],[1400000017000,0.07071067811865576],[1400000019000,0.07071067811865325],[1400000021000,0.07071067811865325],[1400000023000,0.07071067811865576],[1400000025000,0.07071067811865576],[1400000027000,0.0]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 0 ], [ 1400000001000, 7.000357133746822 ], [ 1400000003000, 0.07071067811865576 ], [ 1400000005000, 0.07071067811865576 ], [ 1400000007000, 0.07071067811865576 ], [ 1400000009000, 0.07071067811865325 ], [ 1400000011000, 0.07071067811865325 ], [ 1400000013000, 0.07071067811865576 ], [ 1400000015000, 0.07071067811865576 ], [ 1400000017000, 0.07071067811865576 ], [ 1400000019000, 0.07071067811865325 ], [ 1400000021000, 0.07071067811865325 ], [ 1400000023000, 0.07071067811865576 ], [ 1400000025000, 0.07071067811865576 ], [ 1400000027000, 0 ] ] } ] } ] } 
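The returned numbers show that dev computes the sample standard deviation of each 2-second bucket, not the variance: the second value above comes from the bucket {13.2, 23.1}, and single-point buckets report 0.0. A sketch:

```python
import statistics

def dev(values):
    # Sample standard deviation; a single point has no spread
    return statistics.stdev(values) if len(values) > 1 else 0.0

bucket = [13.2, 23.1]   # values at 1400000001000 and 1400000002000

assert abs(dev(bucket) - 7.000357133746822) < 1e-9   # matches the response
assert dev([12.3]) == 0.0                            # first (single-point) bucket
```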

2.2.2.3 Count Aggregation Query Test Case (count)

Create count_query.json

$ vim count_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "count", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @count_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,1],[1400000001000,2],[1400000003000,2],[1400000005000,2],[1400000007000,2],[1400000009000,2],[1400000011000,2],[1400000013000,2],[1400000015000,2],[1400000017000,2],[1400000019000,2],[1400000021000,2],[1400000023000,2],[1400000025000,2],[1400000027000,1]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 1 ], [ 1400000001000, 2 ], [ 1400000003000, 2 ], [ 1400000005000, 2 ], [ 1400000007000, 2 ], [ 1400000009000, 2 ], [ 1400000011000, 2 ], [ 1400000013000, 2 ], [ 1400000015000, 2 ], [ 1400000017000, 2 ], [ 1400000019000, 2 ], [ 1400000021000, 2 ], [ 1400000023000, 2 ], [ 1400000025000, 2 ], [ 1400000027000, 1 ] ] } ] } ] } 
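The count pattern [1, 2, 2, ..., 2, 1] reveals the bucket alignment: 2-second windows starting at start_absolute = 1, so 1400000001000 and 1400000002000 share a window while 1400000000000 sits alone. A sketch that reproduces the counts (the alignment is inferred from this response, not taken from IKR documentation):

```python
timestamps = [1400000000000 + k * 1000 for k in range(28)]  # all 28 test points

START, WIDTH = 1, 2000
buckets = {}
for t in timestamps:
    buckets.setdefault((t - START) // WIDTH, []).append(t)

counts = [(min(b), len(b)) for b in (buckets[k] for k in sorted(buckets))]

assert len(counts) == 15
assert counts[0] == (1400000000000, 1)    # lone first point
assert counts[1] == (1400000001000, 2)    # 1 s and 2 s share a window
assert counts[-1] == (1400000027000, 1)   # lone last point
```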

2.2.2.4 First-Value Aggregation Query Test Case (first)

Create first_query.json

$ vim first_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "first", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @first_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000003000,24.0],[1400000005000,24.2],[1400000007000,24.4],[1400000009000,24.6],[1400000011000,24.8],[1400000013000,25.0],[1400000015000,25.2],[1400000017000,25.4],[1400000019000,25.6],[1400000021000,25.8],[1400000023000,26.0],[1400000025000,26.2],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 13.2 ], [ 1400000003000, 24 ], [ 1400000005000, 24.2 ], [ 1400000007000, 24.4 ], [ 1400000009000, 24.6 ], [ 1400000011000, 24.8 ], [ 1400000013000, 25 ], [ 1400000015000, 25.2 ], [ 1400000017000, 25.4 ], [ 1400000019000, 25.6 ], [ 1400000021000, 25.8 ], [ 1400000023000, 26 ], [ 1400000025000, 26.2 ], [ 1400000027000, 26.4 ] ] } ] } ] } 

2.2.2.5 Last-Value Aggregation Query Test Case (last)

Create last_query.json

$ vim last_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "last", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @last_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000002000,23.1],[1400000004000,24.1],[1400000006000,24.3],[1400000008000,24.5],[1400000010000,24.7],[1400000012000,24.9],[1400000014000,25.1],[1400000016000,25.3],[1400000018000,25.5],[1400000020000,25.7],[1400000022000,25.9],[1400000024000,26.1],[1400000026000,26.3],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000002000, 23.1 ], [ 1400000004000, 24.1 ], [ 1400000006000, 24.3 ], [ 1400000008000, 24.5 ], [ 1400000010000, 24.7 ], [ 1400000012000, 24.9 ], [ 1400000014000, 25.1 ], [ 1400000016000, 25.3 ], [ 1400000018000, 25.5 ], [ 1400000020000, 25.7 ], [ 1400000022000, 25.9 ], [ 1400000024000, 26.1 ], [ 1400000026000, 26.3 ], [ 1400000027000, 26.4 ] ] } ] } ] } 

2.2.2.6 Maximum Aggregation Query Test Case (max)

Create max_query.json

$ vim max_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "max", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @max_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000002000,23.1],[1400000004000,24.1],[1400000006000,24.3],[1400000008000,24.5],[1400000010000,24.7],[1400000012000,24.9],[1400000014000,25.1],[1400000016000,25.3],[1400000018000,25.5],[1400000020000,25.7],[1400000022000,25.9],[1400000024000,26.1],[1400000026000,26.3],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000002000, 23.1 ], [ 1400000004000, 24.1 ], [ 1400000006000, 24.3 ], [ 1400000008000, 24.5 ], [ 1400000010000, 24.7 ], [ 1400000012000, 24.9 ], [ 1400000014000, 25.1 ], [ 1400000016000, 25.3 ], [ 1400000018000, 25.5 ], [ 1400000020000, 25.7 ], [ 1400000022000, 25.9 ], [ 1400000024000, 26.1 ], [ 1400000026000, 26.3 ], [ 1400000027000, 26.4 ] ] } ] } ] } 

2.2.2.7 Minimum Aggregation Query Test Case (min)

Create min_query.json

$ vim min_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "min", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @min_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000003000,24.0],[1400000005000,24.2],[1400000007000,24.4],[1400000009000,24.6],[1400000011000,24.8],[1400000013000,25.0],[1400000015000,25.2],[1400000017000,25.4],[1400000019000,25.6],[1400000021000,25.8],[1400000023000,26.0],[1400000025000,26.2],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 13.2 ], [ 1400000003000, 24 ], [ 1400000005000, 24.2 ], [ 1400000007000, 24.4 ], [ 1400000009000, 24.6 ], [ 1400000011000, 24.8 ], [ 1400000013000, 25 ], [ 1400000015000, 25.2 ], [ 1400000017000, 25.4 ], [ 1400000019000, 25.6 ], [ 1400000021000, 25.8 ], [ 1400000023000, 26 ], [ 1400000025000, 26.2 ], [ 1400000027000, 26.4 ] ] } ] } ] } 
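first, last, max and min all select one point per 2-second bucket, and the responses show that the returned timestamp is the timestamp of the selected point itself: for the bucket {13.2, 23.1}, last reports 1400000002000 while first reports 1400000001000. A combined sketch on that bucket:

```python
bucket = [(1400000001000, 13.2), (1400000002000, 23.1)]  # one 2-second window

first = bucket[0]                          # earliest point in the window
last = bucket[-1]                          # latest point in the window
maximum = max(bucket, key=lambda p: p[1])  # point with the largest value
minimum = min(bucket, key=lambda p: p[1])  # point with the smallest value

assert first == (1400000001000, 13.2)      # matches the first response
assert last == (1400000002000, 23.1)       # matches the last response
assert maximum == (1400000002000, 23.1)    # matches the max response
assert minimum == (1400000001000, 13.2)    # matches the min response
```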

2.2.2.8 Sum Aggregation Query Test Case (sum)

Create sum_query.json

$ vim sum_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "sum", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @sum_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,36.3],[1400000003000,48.1],[1400000005000,48.5],[1400000007000,48.9],[1400000009000,49.3],[1400000011000,49.7],[1400000013000,50.1],[1400000015000,50.5],[1400000017000,50.9],[1400000019000,51.3],[1400000021000,51.7],[1400000023000,52.1],[1400000025000,52.5],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 36.3 ], [ 1400000003000, 48.1 ], [ 1400000005000, 48.5 ], [ 1400000007000, 48.9 ], [ 1400000009000, 49.3 ], [ 1400000011000, 49.7 ], [ 1400000013000, 50.1 ], [ 1400000015000, 50.5 ], [ 1400000017000, 50.9 ], [ 1400000019000, 51.3 ], [ 1400000021000, 51.7 ], [ 1400000023000, 52.1 ], [ 1400000025000, 52.5 ], [ 1400000027000, 26.4 ] ] } ] } ] } 
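Each sum value adds the points inside one 2-second bucket, e.g. 13.2 + 23.1 = 36.3 for the window labelled 1400000001000; the single-point edge buckets pass their value through unchanged. A sketch on the first three buckets of the test data:

```python
# (timestamp, value) pairs of the first three buckets from the test data
buckets = [[(1400000000000, 12.3)],
           [(1400000001000, 13.2), (1400000002000, 23.1)],
           [(1400000003000, 24.0), (1400000004000, 24.1)]]

# One row per bucket: first timestamp, sum of the values
sums = [(b[0][0], sum(v for _, v in b)) for b in buckets]

expected = [(1400000000000, 12.3), (1400000001000, 36.3), (1400000003000, 48.1)]
assert all(t == te and abs(v - ve) < 1e-9 for (t, v), (te, ve) in zip(sums, expected))
```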

2.2.2.9 First-Order Difference Aggregation Query Test Case (diff)

Create diff_query.json

$ vim diff_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "diff" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @diff_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000001000,0.8999999999999986],[1400000002000,9.900000000000002],[1400000003000,0.8999999999999986],[1400000004000,0.10000000000000142],[1400000005000,0.09999999999999787],[1400000006000,0.10000000000000142],[1400000007000,0.09999999999999787],[1400000008000,0.10000000000000142],[1400000009000,0.10000000000000142],[1400000010000,0.09999999999999787],[1400000011000,0.10000000000000142],[1400000012000,0.09999999999999787],[1400000013000,0.10000000000000142],[1400000014000,0.10000000000000142],[1400000015000,0.09999999999999787],[1400000016000,0.10000000000000142],[1400000017000,0.09999999999999787],[1400000018000,0.10000000000000142],[1400000019000,0.10000000000000142],[1400000020000,0.09999999999999787],[1400000021000,0.10000000000000142],[1400000022000,0.09999999999999787],[1400000023000,0.10000000000000142],[1400000024000,0.10000000000000142],[1400000025000,0.09999999999999787],[1400000026000,0.10000000000000142],[1400000027000,0.09999999999999787]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000001000, 0.8999999999999986 ], [ 1400000002000, 9.900000000000002 ], [ 1400000003000, 0.8999999999999986 ], [ 1400000004000, 0.10000000000000142 ], [ 1400000005000, 0.09999999999999787 ], [ 1400000006000, 0.10000000000000142 ], [ 1400000007000, 0.09999999999999787 ], [ 1400000008000, 0.10000000000000142 ], [ 1400000009000, 0.10000000000000142 ], [ 1400000010000, 0.09999999999999787 ], [ 1400000011000, 0.10000000000000142 ], [ 1400000012000, 0.09999999999999787 ], [ 1400000013000, 0.10000000000000142 ], [ 1400000014000, 0.10000000000000142 ], [ 1400000015000, 0.09999999999999787 ], [ 1400000016000, 0.10000000000000142 ], [ 1400000017000, 0.09999999999999787 ], [ 1400000018000, 0.10000000000000142 ], [ 1400000019000, 0.10000000000000142 ], [ 1400000020000, 0.09999999999999787 ], [ 1400000021000, 0.10000000000000142 ], [ 1400000022000, 0.09999999999999787 ], [ 1400000023000, 0.10000000000000142 ], [ 1400000024000, 0.10000000000000142 ], [ 1400000025000, 0.09999999999999787 ], [ 1400000026000, 0.10000000000000142 ], [ 1400000027000, 0.09999999999999787 ] ] } ] } ] } 
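diff needs no sampling block: for every point after the first it emits the difference from the previous point, so 28 inputs yield 27 outputs. The decimal noise in the response (0.8999999999999986 instead of 0.9) is ordinary binary floating-point rounding. A sketch:

```python
# The 28 test values: 12.3, 13.2, 23.1, then 24.0 rising by 0.1 per second
values = [12.3, 13.2, 23.1] + [24.0 + 0.1 * k for k in range(25)]

# Pairwise first-order differences
diffs = [b - a for a, b in zip(values, values[1:])]

assert len(diffs) == len(values) - 1            # 27 output points
assert abs(diffs[0] - 0.9) < 1e-9               # 13.2 - 12.3
assert abs(diffs[1] - 9.9) < 1e-9               # 23.1 - 13.2
assert all(abs(d - 0.1) < 1e-9 for d in diffs[3:])  # steady 0.1 steps afterwards
```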

2.2.2.10 Division Aggregation Query Test Case (div)

Create div_query.json

$ vim div_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "div", "divisor": "2" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @div_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,6.15],[1400000001000,6.6],[1400000002000,11.55],[1400000003000,12.0],[1400000004000,12.05],[1400000005000,12.1],[1400000006000,12.15],[1400000007000,12.2],[1400000008000,12.25],[1400000009000,12.3],[1400000010000,12.35],[1400000011000,12.4],[1400000012000,12.45],[1400000013000,12.5],[1400000014000,12.55],[1400000015000,12.6],[1400000016000,12.65],[1400000017000,12.7],[1400000018000,12.75],[1400000019000,12.8],[1400000020000,12.85],[1400000021000,12.9],[1400000022000,12.95],[1400000023000,13.0],[1400000024000,13.05],[1400000025000,13.1],[1400000026000,13.15],[1400000027000,13.2]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 6.15 ], [ 1400000001000, 6.6 ], [ 1400000002000, 11.55 ], [ 1400000003000, 12 ], [ 1400000004000, 12.05 ], [ 1400000005000, 12.1 ], [ 1400000006000, 12.15 ], [ 1400000007000, 12.2 ], [ 1400000008000, 12.25 ], [ 1400000009000, 12.3 ], [ 1400000010000, 12.35 ], [ 1400000011000, 12.4 ], [ 1400000012000, 12.45 ], [ 1400000013000, 12.5 ], [ 1400000014000, 12.55 ], [ 1400000015000, 12.6 ], [ 1400000016000, 12.65 ], [ 1400000017000, 12.7 ], [ 1400000018000, 12.75 ], [ 1400000019000, 12.8 ], [ 1400000020000, 12.85 ], [ 1400000021000, 12.9 ], [ 1400000022000, 12.95 ], [ 1400000023000, 13 ], [ 1400000024000, 13.05 ], [ 1400000025000, 13.1 ], [ 1400000026000, 13.15 ], [ 1400000027000, 13.2 ] ] } ] } ] } 
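div applies a pointwise scalar division and leaves timestamps untouched; with divisor 2 every value is halved. A sketch on the first three points:

```python
points = [(1400000000000, 12.3), (1400000001000, 13.2), (1400000002000, 23.1)]
divisor = 2.0   # note the query passes the divisor as the string "2"

halved = [(t, v / divisor) for t, v in points]

expected = [(1400000000000, 6.15), (1400000001000, 6.6), (1400000002000, 11.55)]
assert all(t == te and abs(v - ve) < 1e-9 for (t, v), (te, ve) in zip(halved, expected))
```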

2.2.2.11 Value Filter Aggregation Query Test Case (filter)

Create filter_query.json

$ vim filter_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "filter", "filter_op": "lt", "threshold": "25" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @filter_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000002000,23.1],[1400000003000,24.0],[1400000004000,24.1],[1400000005000,24.2],[1400000006000,24.3],[1400000007000,24.4],[1400000008000,24.5],[1400000009000,24.6],[1400000010000,24.7],[1400000011000,24.8],[1400000012000,24.9]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 13.2 ], [ 1400000002000, 23.1 ], [ 1400000003000, 24 ], [ 1400000004000, 24.1 ], [ 1400000005000, 24.2 ], [ 1400000006000, 24.3 ], [ 1400000007000, 24.4 ], [ 1400000008000, 24.5 ], [ 1400000009000, 24.6 ], [ 1400000010000, 24.7 ], [ 1400000011000, 24.8 ], [ 1400000012000, 24.9 ] ] } ] } ] } 
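Judging from the response above, filter_op lt with threshold 25 keeps the points whose value is below the threshold (13 of the 28 points, up to 24.9) and drops the rest. A sketch of that observed behaviour:

```python
# The 28 test values: 12.3, 13.2, 23.1, then 24.0 rising by 0.1 per second
values = [12.3, 13.2, 23.1] + [24.0 + 0.1 * k for k in range(25)]
threshold = 25.0   # the query passes it as the string "25"

# "lt": keep points whose value is below the threshold (as seen in the response)
kept = [v for v in values if v < threshold]

assert len(kept) == 13                 # matches the number of returned points
assert abs(kept[-1] - 24.9) < 1e-9     # last surviving value in the response
```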

2.2.2.12 Save-As Aggregation Query Test Case (save_as)

Create save_as_query.json

$ vim save_as_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "save_as", "metric_name": "test_save_as" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @save_as_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000002000,23.1],[1400000003000,24.0],[1400000004000,24.1],[1400000005000,24.2],[1400000006000,24.3],[1400000007000,24.4],[1400000008000,24.5],[1400000009000,24.6],[1400000010000,24.7],[1400000011000,24.8],[1400000012000,24.9],[1400000013000,25.0],[1400000014000,25.1],[1400000015000,25.2],[1400000016000,25.3],[1400000017000,25.4],[1400000018000,25.5],[1400000019000,25.6],[1400000020000,25.7],[1400000021000,25.8],[1400000022000,25.9],[1400000023000,26.0],[1400000024000,26.1],[1400000025000,26.2],[1400000026000,26.3],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 13.2 ], [ 1400000002000, 23.1 ], [ 1400000003000, 24 ], [ 1400000004000, 24.1 ], [ 1400000005000, 24.2 ], [ 1400000006000, 24.3 ], [ 1400000007000, 24.4 ], [ 1400000008000, 24.5 ], [ 1400000009000, 24.6 ], [ 1400000010000, 24.7 ], [ 1400000011000, 24.8 ], [ 1400000012000, 24.9 ], [ 1400000013000, 25 ], [ 1400000014000, 25.1 ], [ 1400000015000, 25.2 ], [ 1400000016000, 25.3 ], [ 1400000017000, 25.4 ], [ 1400000018000, 25.5 ], [ 1400000019000, 25.6 ], [ 1400000020000, 25.7 ], [ 1400000021000, 25.8 ], [ 1400000022000, 25.9 ], [ 1400000023000, 26 ], [ 1400000024000, 26.1 ], [ 1400000025000, 26.2 ], [ 1400000026000, 26.3 ], [ 1400000027000, 26.4 ] ] } ] } ] } 

Then query test_save_as

$ curl -XPOST -H'Content-Type: application/json' -d '{"start_absolute":1,"end_relative":{"value":"5","unit":"days"},"time_zone":"Asia/Kabul","metrics":[{"name":"test_save_as"}]}' http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_save_as","group_by":[{"name":"type","type":"number"}],"tags":{"saved_from":["test_query"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000002000,23.1],[1400000003000,24.0],[1400000004000,24.1],[1400000005000,24.2],[1400000006000,24.3],[1400000007000,24.4],[1400000008000,24.5],[1400000009000,24.6],[1400000010000,24.7],[1400000011000,24.8],[1400000012000,24.9],[1400000013000,25.0],[1400000014000,25.1],[1400000015000,25.2],[1400000016000,25.3],[1400000017000,25.4],[1400000018000,25.5],[1400000019000,25.6],[1400000020000,25.7],[1400000021000,25.8],[1400000022000,25.9],[1400000023000,26.0],[1400000024000,26.1],[1400000025000,26.2],[1400000026000,26.3],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_save_as", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "saved_from": [ "test_query" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 13.2 ], [ 1400000002000, 23.1 ], [ 1400000003000, 24 ], [ 1400000004000, 24.1 ], [ 1400000005000, 24.2 ], [ 1400000006000, 24.3 ], [ 1400000007000, 24.4 ], [ 1400000008000, 24.5 ], [ 1400000009000, 24.6 ], [ 1400000010000, 24.7 ], [ 1400000011000, 24.8 ], [ 1400000012000, 24.9 ], [ 1400000013000, 25 ], [ 1400000014000, 25.1 ], [ 1400000015000, 25.2 ], [ 1400000016000, 25.3 ], [ 1400000017000, 25.4 ], [ 1400000018000, 25.5 ], [ 1400000019000, 25.6 ], [ 1400000020000, 25.7 ], [ 1400000021000, 25.8 ], [ 1400000022000, 25.9 ], [ 1400000023000, 26 ], [ 1400000024000, 26.1 ], [ 1400000025000, 26.2 ], [ 1400000026000, 26.3 ], [ 1400000027000, 26.4 ] ] } ] } ] } 

2.2.2.13 Rate-of-Change Aggregation Query Test Case (rate)

Rate of change: how much the value changes per unit of time between two adjacent points. The value field inside sampling has no effect; only unit determines the time unit of the rate.

Create rate_query.json

$ vim rate_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "rate", "sampling": { "value": 1, "unit": "seconds" } } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @rate_query.json http://[host]:[port]/api/v1/datapoints/query 

Response:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000001000,0.8999999999999986],[1400000002000,9.900000000000002],[1400000003000,0.8999999999999986],[1400000004000,0.10000000000000142],[1400000005000,0.09999999999999787],[1400000006000,0.10000000000000142],[1400000007000,0.09999999999999787],[1400000008000,0.10000000000000142],[1400000009000,0.10000000000000142],[1400000010000,0.09999999999999787],[1400000011000,0.10000000000000142],[1400000012000,0.09999999999999787],[1400000013000,0.10000000000000142],[1400000014000,0.10000000000000142],[1400000015000,0.09999999999999787],[1400000016000,0.10000000000000142],[1400000017000,0.09999999999999787],[1400000018000,0.10000000000000142],[1400000019000,0.10000000000000142],[1400000020000,0.09999999999999787],[1400000021000,0.10000000000000142],[1400000022000,0.09999999999999787],[1400000023000,0.10000000000000142],[1400000024000,0.10000000000000142],[1400000025000,0.09999999999999787],[1400000026000,0.10000000000000142],[1400000027000,0.09999999999999787]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000001000, 0.8999999999999986 ], [ 1400000002000, 9.900000000000002 ], [ 1400000003000, 0.8999999999999986 ], [ 1400000004000, 0.10000000000000142 ], [ 1400000005000, 0.09999999999999787 ], [ 1400000006000, 0.10000000000000142 ], [ 1400000007000, 0.09999999999999787 ], [ 1400000008000, 0.10000000000000142 ], [ 1400000009000, 0.10000000000000142 ], [ 1400000010000, 0.09999999999999787 ], [ 1400000011000, 0.10000000000000142 ], [ 1400000012000, 0.09999999999999787 ], [ 1400000013000, 0.10000000000000142 ], [ 1400000014000, 0.10000000000000142 ], [ 1400000015000, 0.09999999999999787 ], [ 1400000016000, 0.10000000000000142 ], [ 1400000017000, 0.09999999999999787 ], [ 1400000018000, 0.10000000000000142 ], [ 1400000019000, 0.10000000000000142 ], [ 1400000020000, 0.09999999999999787 ], [ 1400000021000, 0.10000000000000142 ], [ 1400000022000, 0.09999999999999787 ], [ 1400000023000, 0.10000000000000142 ], [ 1400000024000, 0.10000000000000142 ], [ 1400000025000, 0.09999999999999787 ], [ 1400000026000, 0.10000000000000142 ], [ 1400000027000, 0.09999999999999787 ] ] } ] } ] } 
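With unit seconds and points spaced exactly one second apart, the rate values coincide with the diff values: each output is the change in value divided by the elapsed time expressed in the chosen unit. A sketch of the formula:

```python
UNIT_MS = 1000   # "seconds"; "minutes" would be 60000

def rate(p1, p2, unit_ms=UNIT_MS):
    # Change in value per chosen unit of time between two adjacent points
    (t1, v1), (t2, v2) = p1, p2
    return (v2 - v1) / ((t2 - t1) / unit_ms)

r = rate((1400000000000, 12.3), (1400000001000, 13.2))
assert abs(r - 0.9) < 1e-9   # first rate value in the response
```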

2.2.2.14 Sampling-Rate Aggregation Query Test Case (sampler)

Sampling rate = current point's value * (unit of time (unit) / (current point's timestamp - previous point's timestamp))

Number of returned points = number of raw points - 1 (the first point is not computed)

Create sampler_query.json

$ vim sampler_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "sampler", "unit": "minutes" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @sampler_query.json http://[host]:[port]/api/v1/datapoints/query 

The response is:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000001000,792.0],[1400000002000,1386.0],[1400000003000,1440.0],[1400000004000,1446.0],[1400000005000,1452.0],[1400000006000,1458.0],[1400000007000,1464.0],[1400000008000,1470.0],[1400000009000,1476.0],[1400000010000,1482.0],[1400000011000,1488.0],[1400000012000,1494.0],[1400000013000,1500.0],[1400000014000,1506.0],[1400000015000,1512.0],[1400000016000,1518.0],[1400000017000,1524.0],[1400000018000,1530.0],[1400000019000,1536.0],[1400000020000,1542.0],[1400000021000,1548.0],[1400000022000,1554.0],[1400000023000,1560.0],[1400000024000,1566.0],[1400000025000,1572.0],[1400000026000,1578.0],[1400000027000,1584.0]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000001000, 792 ], [ 1400000002000, 1386 ], [ 1400000003000, 1440 ], [ 1400000004000, 1446 ], [ 1400000005000, 1452 ], [ 1400000006000, 1458 ], [ 1400000007000, 1464 ], [ 1400000008000, 1470 ], [ 1400000009000, 1476 ], [ 1400000010000, 1482 ], [ 1400000011000, 1488 ], [ 1400000012000, 1494 ], [ 1400000013000, 1500 ], [ 1400000014000, 1506 ], [ 1400000015000, 1512 ], [ 1400000016000, 1518 ], [ 1400000017000, 1524 ], [ 1400000018000, 1530 ], [ 1400000019000, 1536 ], [ 1400000020000, 1542 ], [ 1400000021000, 1548 ], [ 1400000022000, 1554 ], [ 1400000023000, 1560 ], [ 1400000024000, 1566 ], [ 1400000025000, 1572 ], [ 1400000026000, 1578 ], [ 1400000027000, 1584 ] ] } ] } ] } 

2.2.2.15 Percentile aggregation query test case (percentile)

Create percentile_query.json

$ vim percentile_query.json 

Enter the following content:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "aggregators": [ { "name": "percentile", "sampling": { "value": "5", "unit": "seconds" }, "percentile": "0.75" } ] } ] } 

Run the following command:

$ curl -XPOST -H'Content-Type: application/json' -d @percentile_query.json http://[host]:[port]/api/v1/datapoints/query 

The response is:

{"queries":[{"sample_size":28,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1","server2"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,24.15],[1400000006000,24.65],[1400000011000,25.15],[1400000016000,25.65],[1400000021000,26.15],[1400000026000,26.3]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 28, "results": [ { "name": "test_query", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "host": [ "server1", "server2" ], "data_center": [ "DC1" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 24.15 ], [ 1400000006000, 24.65 ], [ 1400000011000, 25.15 ], [ 1400000016000, 25.65 ], [ 1400000021000, 26.15 ], [ 1400000026000, 26.3 ] ] } ] } ] } 
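Windowed percentile aggregation can be sketched in Python (a hedged illustration using the nearest-rank definition over hypothetical fixed windows; KairosDB's actual percentile algorithm may interpolate differently, so outputs need not match the response above):

```python
import math

# Bucket points into fixed-size windows and report each bucket's percentile.
def windowed_percentile(points, window_ms, p):
    buckets = {}
    for t, v in points:
        buckets.setdefault(t // window_ms, []).append(v)
    out = []
    for k in sorted(buckets):
        vals = sorted(buckets[k])
        idx = max(math.ceil(p * len(vals)) - 1, 0)  # nearest-rank index
        out.append((k * window_ms, vals[idx]))
    return out

pts = [(0, 12.3), (1000, 13.2), (2000, 23.1), (3000, 24.0), (4000, 24.1)]
print(windowed_percentile(pts, 5000, 0.75))  # [(0, 24.0)]
```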

2.3 Deletion test cases

2.3.1 Deleting data points

  1. Prepare the test data; the insert JSON is:
[ { "name": "test_query", "datapoints": [ [1400000000000, 12.3], [1400000001000, 13.2], [1400000002000, 23.1], [1400000003000, 24.0], [1400000004000, 24.1], [1400000009000, 24.6], [1400000010000, 24.7], [1400000011000, 24.8], [1400000012000, 24.9], [1400000013000, 25.0], [1400000014000, 25.1], [1400000015000, 25.2], [1400000016000, 25.3], [1400000017000, 25.4], [1400000023000, 26.0], [1400000024000, 26.1], [1400000025000, 26.2], [1400000026000, 26.3], [1400000027000, 26.4] ], "tags": { "host": "server1", "data_center": "DC1" } }, { "name": "test_query", "datapoints": [ [1400000005000, 24.2], [1400000006000, 24.3], [1400000007000, 24.4], [1400000008000, 24.5], [1400000018000, 25.5], [1400000019000, 25.6], [1400000020000, 25.7], [1400000021000, 25.8], [1400000022000, 25.9] ], "tags": { "host": "server2", "data_center": "DC1" } } ] 

The write procedure is the same as in the insert test:

$ curl -XPOST -H'Content-Type: application/json' -d @insert.json http://[host]:[port]/api/v1/datapoints 
  2. Create the test JSON file:
$ vim delete.json 

Enter the following JSON:

{ "start_absolute" : 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query", "tags": { "host": [ "server2" ] } } ] } 

This JSON deletes all data whose metric is test_query and whose host is server2.

  3. Send the delete request to the IKR service:
$ curl -XPOST -H'Content-Type: application/json' -d @delete.json http://[host]:[port]/api/v1/datapoints/delete 
  4. Query the data:
$ curl -XPOST -H'Content-Type: application/json' -d @query.json http://[host]:[port]/api/v1/datapoints/query 

where query.json is:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query" }] } 

The result is:

{"queries":[{"sample_size":19,"results":[{"name":"test_query","group_by":[{"name":"type","type":"number"}],"tags":{"host":["server1"],"data_center":["DC1"]},"values":[[1400000000000,12.3],[1400000001000,13.2],[1400000002000,23.1],[1400000003000,24.0],[1400000004000,24.1],[1400000009000,24.6],[1400000010000,24.7],[1400000011000,24.8],[1400000012000,24.9],[1400000013000,25.0],[1400000014000,25.1],[1400000015000,25.2],[1400000016000,25.3],[1400000017000,25.4],[1400000023000,26.0],[1400000024000,26.1],[1400000025000,26.2],[1400000026000,26.3],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries":[ { "sample_size":19, "results":[ { "name":"test_query", "group_by": [ { "name":"type", "type":"number" }], "tags": { "host":["server1"], "data_center":["DC1"] }, "values": [ [1400000000000,12.3], [1400000001000,13.2], [1400000002000,23.1], [1400000003000,24.0], [1400000004000,24.1], [1400000009000,24.6], [1400000010000,24.7], [1400000011000,24.8], [1400000012000,24.9], [1400000013000,25.0], [1400000014000,25.1], [1400000015000,25.2], [1400000016000,25.3], [1400000017000,25.4], [1400000023000,26.0], [1400000024000,26.1], [1400000025000,26.2], [1400000026000,26.3], [1400000027000,26.4] ] }] }] } 

The response contains no data with host server2, indicating the deletion succeeded.
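The delete-by-tag semantics can be illustrated with a small Python simulation (not IKR code): every series matching the tag filter host=server2 is removed, while the server1 series is untouched.

```python
# The two series written in step 1, summarized by point count.
series = [
    {"tags": {"host": "server1", "data_center": "DC1"}, "n_points": 19},
    {"tags": {"host": "server2", "data_center": "DC1"}, "n_points": 9},
]

def delete_by_tags(all_series, tag_filter):
    """Keep only series that do not match every key/value in tag_filter."""
    return [s for s in all_series
            if not all(s["tags"].get(k) in vals for k, vals in tag_filter.items())]

remaining = delete_by_tags(series, {"host": ["server2"]})
print(sum(s["n_points"] for s in remaining))  # 19 of the 28 written points remain
```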

2.3.2 Deleting a metric

  1. Send the delete-metric request to the IKR service:
$ curl -XDELETE http://[host]:[port]/api/v1/metric/[metric_name] 

Replace [metric_name] with test_query to delete the entire test_query metric:

$ curl -XDELETE http://[host]:[port]/api/v1/metric/test_query 
  2. Query the data:
$ curl -XPOST -H'Content-Type: application/json' -d @query.json http://[host]:[port]/api/v1/datapoints/query 

where query.json is:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "days" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "test_query" }] } 

The result is:

{"queries":[{"sample_size":0,"results":[{"name":"test_query","group_by":[],"tags":{},"values":[]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 0, "results": [ { "name": "test_query", "group_by": [], "tags": {}, "values": [] } ] } ] } 

The response contains no data at all, indicating the metric was deleted successfully.

2.4 Metadata test cases

  1. Write data:
$ curl -XPOST -H'Content-Type: application/json' --data "t_value" http://[host]:[port]/api/v1/metadata/t_service/t_service_key/t_key
$ curl -XPOST -H'Content-Type: application/json' --data "t_value2" http://[host]:[port]/api/v1/metadata/t_service/t_service_key/t_key2
$ curl -XPOST -H'Content-Type: application/json' --data "t_value3" http://[host]:[port]/api/v1/metadata/t_service/t_service_key2/t_key3
$ curl -XPOST -H'Content-Type: application/json' --data "t_value4" http://[host]:[port]/api/v1/metadata/t_service/t_service_key2/t_key4
$ curl -XPOST -H'Content-Type: application/json' --data "t_value5" http://[host]:[port]/api/v1/metadata/t_service2/t_service_key3/t_key5

After running the commands above, the stored data should be as shown in the table below:

| service    | service key    | key    | value    |
|------------|----------------|--------|----------|
| t_service  | t_service_key  | t_key  | t_value  |
| t_service  | t_service_key  | t_key2 | t_value2 |
| t_service  | t_service_key2 | t_key3 | t_value3 |
| t_service  | t_service_key2 | t_key4 | t_value4 |
| t_service2 | t_service_key3 | t_key5 | t_value5 |
  2. Read data:
$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key/t_key
t_value
$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key/t_key2
t_value2
$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key2/t_key3
t_value3
$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key2/t_key4
t_value4
$ curl http://[host]:[port]/api/v1/metadata/t_service2/t_service_key3/t_key5
t_value5
$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key
{"results":["t_key","t_key2"]}
$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key2
{"results":["t_key3","t_key4"]}
$ curl http://[host]:[port]/api/v1/metadata/t_service2/t_service_key3
{"results":["t_key5"]}
$ curl http://[host]:[port]/api/v1/metadata/t_service
{"results":["t_service_key","t_service_key2"]}
$ curl http://[host]:[port]/api/v1/metadata/t_service2
{"results":["t_service_key3"]}

Running the commands above returns the corresponding values.

  3. Update data: updating works the same way as inserting.

  4. Delete data:

$ curl -XDELETE http://[host]:[port]/api/v1/metadata/t_service/t_service_key/t_key2 

The command above deletes the corresponding value; then query:

$ curl http://[host]:[port]/api/v1/metadata/t_service/t_service_key/t_key2 

The response is empty, i.e. the value was deleted successfully.
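The service → service key → key → value hierarchy exercised above can be modeled as nested maps; this is an illustrative sketch, not IKR's storage code:

```python
meta = {}

def put(service, service_key, key, value):
    meta.setdefault(service, {}).setdefault(service_key, {})[key] = value

def get(service, service_key=None, key=None):
    if key is not None:                      # /metadata/{service}/{sk}/{key}
        return meta[service][service_key][key]
    if service_key is not None:              # /metadata/{service}/{sk}
        return {"results": sorted(meta[service][service_key])}
    return {"results": sorted(meta[service])}  # /metadata/{service}

put("t_service", "t_service_key", "t_key", "t_value")
put("t_service", "t_service_key", "t_key2", "t_value2")
put("t_service2", "t_service_key3", "t_key5", "t_value5")

print(get("t_service", "t_service_key", "t_key"))  # t_value
print(get("t_service", "t_service_key"))           # {'results': ['t_key', 't_key2']}
print(get("t_service2"))                           # {'results': ['t_service_key3']}
```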

2.5 Roll-up test cases

2.5.1 Preparing data

The query test data is used as the test data for the Roll-up feature. Note in particular that a rollup task only aggregates data from the moment the task is created up to the current moment. The timestamps in the JSON below therefore need to be adjusted to the actual execution time, otherwise the query may not find any rollup results:

$ vim insert.json 

The insert JSON is:

[ { "name": "test_query", "datapoints": [ [1400000000000, 12.3], [1400000001000, 13.2], [1400000002000, 23.1], [1400000003000, 24.0], [1400000004000, 24.1], [1400000009000, 24.6], [1400000010000, 24.7], [1400000011000, 24.8], [1400000012000, 24.9], [1400000013000, 25.0], [1400000014000, 25.1], [1400000015000, 25.2], [1400000016000, 25.3], [1400000017000, 25.4], [1400000023000, 26.0], [1400000024000, 26.1], [1400000025000, 26.2], [1400000026000, 26.3], [1400000027000, 26.4] ], "tags": { "host": "server1", "data_center": "DC1" } }, { "name": "test_query", "datapoints": [ [1400000005000, 24.2], [1400000006000, 24.3], [1400000007000, 24.4], [1400000008000, 24.5], [1400000018000, 25.5], [1400000019000, 25.6], [1400000020000, 25.7], [1400000021000, 25.8], [1400000022000, 25.9] ], "tags": { "host": "server2", "data_center": "DC1" } } ] 

The write procedure is the same as in the insert test:

$ curl -XPOST -H'Content-Type: application/json' -d @insert.json http://[host]:[port]/api/v1/datapoints 

2.5.2 Creating a Roll-up task

  1. Create the JSON file for the Roll-up task:
$ vim create_rollup.json 

Enter the following JSON:

{ "name": "MyRollup1", "execution_interval": { "value": 2, "unit": "seconds" }, "rollups": [ { "save_as": "rollup1", "query": { "start_relative": { "value": "10", "unit": "years" }, "end_relative": { "value": "1", "unit": "seconds" }, "metrics": [ { "name": "test_query", "tags": {}, "aggregators": [ { "name": "sum", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } } ] } 
  2. Send the request to the IKR service:
$ curl -XPOST -H'Content-Type: application/json' -d @create_rollup.json http://[host]:[port]/api/v1/rollups 

A response similar to the following is returned:

{"id":"1557338016912","name":"MyRollup1","attributes":{"url":"/api/v1/rollups/1557338016912"}} 

The value of id is the timestamp at which the rollup task was created. Watching IKR's log output, lines similar to the following are printed periodically:

2019-05-09 02:00:21,936 INFO cn.edu.tsinghua.iotdb.kairosdb.rollup.RollUp:73 - Roll-up id: 1557338016912, name: MyRollup1, execution_interval: 2 SECONDS 
  3. Query the rollup aggregation data:
$ curl -XPOST -H'Content-Type: application/json' -d @query.json http://[host]:[port]/api/v1/datapoints/query 

where query.json is:

{ "start_absolute": 1, "end_relative": { "value": "5", "unit": "seconds" }, "time_zone": "Asia/Kabul", "metrics": [ { "name": "rollup1" }] } 

The result is:

{"queries":[{"sample_size":15,"results":[{"name":"rollup1","group_by":[{"name":"type","type":"number"}],"tags":{"saved_from":["test_query"]},"values":[[1400000000000,12.3],[1400000001000,36.3],[1400000003000,48.1],[1400000005000,48.5],[1400000007000,48.9],[1400000009000,49.3],[1400000011000,49.7],[1400000013000,50.1],[1400000015000,50.5],[1400000017000,50.9],[1400000019000,51.3],[1400000021000,51.7],[1400000023000,52.1],[1400000025000,52.5],[1400000027000,26.4]]}]}]} 

Formatted:

{ "queries": [ { "sample_size": 15, "results": [ { "name": "rollup1", "group_by": [ { "name": "type", "type": "number" } ], "tags": { "saved_from": [ "test_query" ] }, "values": [ [ 1400000000000, 12.3 ], [ 1400000001000, 36.3 ], [ 1400000003000, 48.1 ], [ 1400000005000, 48.5 ], [ 1400000007000, 48.9 ], [ 1400000009000, 49.3 ], [ 1400000011000, 49.7 ], [ 1400000013000, 50.1 ], [ 1400000015000, 50.5 ], [ 1400000017000, 50.9 ], [ 1400000019000, 51.3 ], [ 1400000021000, 51.7 ], [ 1400000023000, 52.1 ], [ 1400000025000, 52.5 ], [ 1400000027000, 26.4 ] ] } ] } ] } 

The rollup task's query results were successfully written into the rollup1 metric, which carries a tag key saved_from with tag value test_query.
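The 2-second "sum" windows can be reproduced with a short sketch (a simulation, not IKR code). The window origin below is an assumption chosen to match the sample output; the real alignment depends on the rollup query's time range.

```python
ORIGIN = 1399999999000   # hypothetical window origin (ms)
WINDOW = 2000            # sampling: 2 seconds

def rollup_sum(points):
    buckets = {}
    for t, v in sorted(points):
        k = (t - ORIGIN) // WINDOW
        if k not in buckets:
            buckets[k] = [t, 0.0]   # timestamp of first point in window
        buckets[k][1] += v
    # round to hide float noise from repeated addition
    return [(ts, round(s, 10)) for ts, s in (buckets[k] for k in sorted(buckets))]

pts = [(1400000000000, 12.3), (1400000001000, 13.2), (1400000002000, 23.1),
       (1400000003000, 24.0), (1400000004000, 24.1)]
print(rollup_sum(pts))
# [(1400000000000, 12.3), (1400000001000, 36.3), (1400000003000, 48.1)]
```

These first three sums match the beginning of the rollup1 response above: 12.3 alone, then 13.2 + 23.1 = 36.3, then 24.0 + 24.1 = 48.1.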

2.5.3 Querying Roll-up tasks

  1. Query all rollup tasks:
$ curl http://[host]:[port]/api/v1/rollups 

Running the command above returns the list of all rollup tasks, similar to:

[{"id":"1557338016912", "name": "MyRollup1", "execution_interval": { "value": 2, "unit": "seconds" }, "rollups": [ { "save_as": "rollup1", "query": { "start_relative": { "value": "10", "unit": "years" }, "end_relative": { "value": "1", "unit": "seconds" }, "metrics": [ { "name": "test_query", "tags": { }, "aggregators": [ { "name": "sum", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } } ] }] 
  2. Query a rollup task by id:
$ curl http://[host]:[port]/api/v1/rollups/[id] 

The id in the command above is the id returned when the rollup task was created. Running the command returns the JSON of the rollup task with that id, similar to:

{"id":"1557338016912", "name": "MyRollup1", "execution_interval": { "value": 2, "unit": "seconds" }, "rollups": [ { "save_as": "rollup1", "query": { "start_relative": { "value": "10", "unit": "years" }, "end_relative": { "value": "1", "unit": "seconds" }, "metrics": [ { "name": "test_query", "tags": { }, "aggregators": [ { "name": "sum", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } } ] } 

2.5.4 Updating a Roll-up task

http://[host]:[port]/api/v1/rollups/{id} 
  1. Create the JSON file for updating the Roll-up task:
$ vim update_rollup.json 

Enter the following JSON:

{ "name": "MyRollup1Update", "execution_interval": { "value": 3, "unit": "seconds" }, "rollups": [ { "save_as": "rollup1", "query": { "start_relative": { "value": "10", "unit": "years" }, "end_relative": { "value": "1", "unit": "seconds" }, "metrics": [ { "name": "test_query", "tags": {}, "aggregators": [ { "name": "sum", "sampling": { "value": 2, "unit": "seconds" } } ] } ] } } ] } 
  2. Send the request to the IKR service:
$ curl -XPUT -H'Content-Type: application/json' -d @update_rollup.json http://[host]:[port]/api/v1/rollups/[id] 

A response similar to the following is returned:

{"id":"1557338016912","name":"MyRollup1Update","attributes":{"url":"/api/v1/rollups/1557338016912"}} 

The id is still the id of the rollup task created earlier. Watching IKR's log output, lines similar to the following are printed periodically:

2019-05-09 11:12:10,953 INFO cn.edu.tsinghua.iotdb.kairosdb.rollup.RollUp:73 - Roll-up id: 1557371470757, name: MyRollup1Update, execution_interval: 3 SECONDS 

The output interval becomes 3 seconds and the name becomes MyRollup1Update, consistent with the update JSON, indicating the update succeeded.

2.5.5 Deleting a Roll-up task

$ curl -XDELETE http://[host]:[port]/api/v1/rollups/[id] 

Running the command above deletes the rollup task with the given id. The periodic log lines for that task stop appearing in the IKR log, indicating the rollup task was deleted successfully. Updating works much like creating; the differences are that the update URL contains the id of the rollup task being updated and that the request method is PUT.

2.6 Health-check test cases

  1. Send the status health-check request to the IKR service:
$ curl http://[host]:[port]/api/v1/health/status 

Under normal conditions the response is:

["JVM-Thread-Deadlock: OK","Datastore-Query: OK"] 
  2. Send the check health-check request to the IKR service:
$ curl -w %{http_code} http://[host]:[port]/api/v1/health/check 

The response is an HTTP status code:

204 

2.7 Metric-name query test cases

  1. Send a request to query all metric names:
$ curl http://[host]:[port]/api/v1/metricnames 

A response similar to the following is returned:

{"results":["archive_file_search","archive_file_tracked","rollup1"]} 
  2. Send a request to query metric names beginning with a given string:
# Mac
$ curl http://[host]:[port]/api/v1/metricnames\?prefix=[prefix]
# Ubuntu
$ curl http://[host]:[port]/api/v1/metricnames/?prefix=[prefix]

Replace [prefix] with ar to query metrics beginning with ar:

# Mac
$ curl http://[host]:[port]/api/v1/metricnames\?prefix=ar
# Ubuntu
$ curl http://[host]:[port]/api/v1/metricnames/?prefix=ar

A response similar to the following is returned:

{"results":["archive_file_search","archive_file_tracked"]} 

Every metric name in the response begins with ar; if nothing matches, the results array is empty:

{"results":[]} 
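The prefix filter can be illustrated with a minimal sketch (an illustration, not IKR's implementation):

```python
# Filtering metric names by prefix, as /api/v1/metricnames?prefix=... does.
names = ["archive_file_search", "archive_file_tracked", "rollup1"]

def metric_names(prefix=""):
    return {"results": [n for n in names if n.startswith(prefix)]}

print(metric_names("ar"))  # {'results': ['archive_file_search', 'archive_file_tracked']}
print(metric_names("zz"))  # {'results': []}
```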
