- implementing the 'compression' attribute
- adjusting tests
- updating Readme.md

Readme.md (54 lines changed)
@@ -132,22 +132,16 @@ Queries are applied to a range. The following types of ranges are supported:
 
 ## Query Data
 
-### `compressed`: data is compressed by default
+### `compression`: compression of data can be enabled
 
-To save bandwidth, all data transferred from the server to the client is compressed (gzipped) by default. In case compressing the data is too processor-intensive, it can be disabled by specifying `compressed=false` in the body of the request.
+By default, no data is compressed when transferred from the server to the client. However, compression can be enabled by setting the `compression` attribute to a value other than `none`, i.e. to `gzip` or `deflate`.
 
-Because of this, we have to tell `curl` that the data is compressed so that it is decompressed automatically. `curl` decompresses the response when the `--compressed` parameter is set:
+If compression is enabled, we have to tell `curl` that the data is compressed so that it is decompressed automatically. `curl` decompresses the response when the `--compressed` parameter is set:
 
 #### Example
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl --compressed -H "Content-Type: application/json" -X POST -d '{"compression":"gzip","range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
-
-If we want the raw data uncompressed from the server, we have to specify this in the query body by setting `compressed=false`:
-
-```bash
-curl -H "Content-Type: application/json" -X POST -d '{"compressed":false,"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
-```
 
 
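
Clients that do not go through `curl` have to undo the compression themselves. A minimal illustrative sketch in Java, assuming the same endpoint and request body as the gzip example above (this class is not part of the commit):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;

public class CompressedQueryExample {
    public static void main(String[] args) throws Exception {
        // Request body taken from the gzip example above.
        String body = "{\"compression\":\"gzip\",\"range\":{\"startPulseId\":0,\"endPulseId\":3},"
                + "\"channels\":[\"Channel_01\"]}";

        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://data-api.psi.ch/sf/query").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // The server gzips the payload, so wrap the response stream before reading it.
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                new GZIPInputStream(conn.getInputStream()), StandardCharsets.UTF_8))) {
            reader.lines().forEach(System.out::println);
        }
    }
}
```
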
@@ -160,7 +154,7 @@ CSV export does not support `index` and `extrema` aggregations.
 #### Example
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"responseFormat":"csv","range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"],"fields":["channel","pulseId","iocMillis","iocNanos","globalMillis","globalNanos","shape","eventCount","value"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"responseFormat":"csv","range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"],"fields":["channel","pulseId","iocMillis","iocNanos","globalMillis","globalNanos","shape","eventCount","value"]}' http://data-api.psi.ch/sf/query
 ```
 
 #### Response example
@@ -197,29 +191,29 @@ The following attributes can be specified:
 
 - **channels**: Array of channel names to be queried.
 - **range**: The range of the query (see [Query Range](Readme.md#query_range)).
-- **ordering**: The ordering of the data (see [here](https://github.psi.ch/projects/ST/repos/ch.psi.daq.common/browse/src/main/java/ch/psi/daq/common/ordering/Ordering.java) for possible values).
+- **ordering**: The ordering of the data (see [here](https://git.psi.ch/sf_daq/ch.psi.daq.common/blob/master/src/main/java/ch/psi/daq/common/ordering/Ordering.java) for possible values).
-- **fields**: The requested fields (see [here](https://github.psi.ch/projects/ST/repos/ch.psi.daq.query/browse/src/main/java/ch/psi/daq/query/model/QueryField.java) for possible values).
+- **fields**: The requested fields (see [here](https://git.psi.ch/sf_daq/ch.psi.daq.query/blob/master/src/main/java/ch/psi/daq/query/model/QueryField.java) for possible values).
 - **nrOfBins**: Activates data binning. Specifies the number of bins the pulse/time range should be divided into.
 - **binSize**: Activates data binning. Specifies the number of pulses per bin for pulse-range queries or the number of milliseconds per bin for time-range queries (using number of pulses and number of milliseconds makes this binning strategy consistent between channels with different update frequencies).
-- **aggregations**: Activates data aggregation. Array of requested aggregations (see [here](https://github.psi.ch/projects/ST/repos/ch.psi.daq.query/browse/src/main/java/ch/psi/daq/query/model/Aggregation.java) for possible values). These values will be added to the *data* array response.
+- **aggregations**: Activates data aggregation. Array of requested aggregations (see [here](https://git.psi.ch/sf_daq/ch.psi.daq.query/blob/master/src/main/java/ch/psi/daq/query/model/Aggregation.java) for possible values). These values will be added to the *data* array response.
-- **aggregationType**: Specifies the type of aggregation (see [here](https://github.psi.ch/projects/ST/repos/ch.psi.daq.query/browse/src/main/java/ch/psi/daq/query/model/AggregationType.java)). The default type is *value* aggregation (e.g., sum([1,2,3])=6). Alternatively, it is possible to define *index* aggregation for multiple arrays in combination with binning (e.g., sum([1,2,3], [3,2,1]) = [4,4,4]).
+- **aggregationType**: Specifies the type of aggregation (see [here](https://git.psi.ch/sf_daq/ch.psi.daq.query/blob/master/src/main/java/ch/psi/daq/query/model/AggregationType.java)). The default type is *value* aggregation (e.g., sum([1,2,3])=6). Alternatively, it is possible to define *index* aggregation for multiple arrays in combination with binning (e.g., sum([1,2,3], [3,2,1]) = [4,4,4]).
 - **aggregateChannels**: Specifies whether the data of the requested channels should be combined together using the defined aggregation (values: true|**false**)
 - **dbMode**: Defines the database to access (values: **databuffer**|archiverappliance)
-- **compressed**: Defines whether the response should be compressed or not (values: **true**|false)
+- **compression**: Defines the compression algorithm to use; the default value is **none** (see all values [here](https://git.psi.ch/sf_daq/ch.psi.daq.domain/blob/master/src/main/java/ch/psi/daq/domain/Compression.java)).
-- **responseFormat**: Specifies the format the response of the requested data is in, either in JSON or CSV format (values: **json**|csv)
+- **responseFormat**: Specifies the format of the response of the requested data, either JSON or CSV; the default value is **json** (see all values [here](https://git.psi.ch/sf_daq/ch.psi.daq.domain/blob/master/src/main/java/ch/psi/daq/domain/ResponseType.java)).
 
 ### Example
 
 Compressed data but uncompressed by `curl`:
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"]}' http://data-api.psi.ch/sf/query
+curl --compressed -H "Content-Type: application/json" -X POST -d '{"compression":"deflate","range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"]}' http://data-api.psi.ch/sf/query
 ```
 
 Raw, uncompressed data (returns non-human-readable data):
 
 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"compressed": false,"range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"]}' http://data-api.psi.ch/sf/query
 ```
 
 ### Response example
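
These attributes map directly onto the `DAQQuery` object exercised by the tests in this commit. A small hypothetical sketch that builds the body of the deflate example above programmatically; the constructor shape and Jackson serialization are assumptions inferred from the test code:

```java
import com.fasterxml.jackson.databind.ObjectMapper;

import ch.psi.daq.cassandra.request.range.RequestRangePulseId;
import ch.psi.daq.domain.Compression;
import ch.psi.daq.query.model.impl.DAQQuery;

public class BuildQueryBodyExample {
    public static void main(String[] args) throws Exception {
        // Pulse range 0..4 and two channels, as in the deflate example above
        // (constructor arguments follow the pattern used in the tests).
        DAQQuery request = new DAQQuery(
                new RequestRangePulseId(0, 4),
                new String[] {"channel1", "channel2"});
        // The new 'compression' attribute; responseFormat is left at its default (json).
        request.setCompression(Compression.DEFLATE);

        // Serialize to the JSON body that would be POSTed to /sf/query.
        System.out.println(new ObjectMapper().writeValueAsString(request));
    }
}
```
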
@@ -360,7 +354,7 @@ The following examples build on waveform data (see below). They also work for sc
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -388,7 +382,7 @@ See JSON representation of the data above.
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"range":{"startMillis":0,"startNanos":0,"endMillis":30,"endNanos":999999},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"range":{"startMillis":0,"startNanos":0,"endMillis":30,"endNanos":999999},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -418,7 +412,7 @@ Supported format is ISO8601 *YYYY-MM-DDThh:mm:ss.sTZD* (e.g. *1997-07-16T19:20:3
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"range":{"startDate":"1970-01-01T01:00:00.000","startNanos":0,"endDate":"1970-01-01T01:00:00.030","endNanos":999999},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"range":{"startDate":"1970-01-01T01:00:00.000","startNanos":0,"endDate":"1970-01-01T01:00:00.030","endNanos":999999},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -450,7 +444,7 @@ Archiver Appliance supports queries by *time range* and *date range* only (as it
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"dbmode":"archiverappliance","range":{"startMillis":0,"startNanos":0,"endMillis":30,"endNanos":999999},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"dbmode":"archiverappliance","range":{"startMillis":0,"startNanos":0,"endMillis":30,"endNanos":999999},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -479,7 +473,7 @@ Allows for server side optimizations since not all data needs to be retrieved.
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -532,7 +526,7 @@ Use **none** in case ordering does not matter (allows for server side optimizati
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"ordering":"desc","fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"ordering":"desc","fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -585,7 +579,7 @@ curl --compressed -H "Content-Type: application/json" -X POST -d '{"ordering":"d
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"aggregationType":"value","aggregations":["min","max","mean"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"aggregationType":"value","aggregations":["min","max","mean"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -659,7 +653,7 @@ Array value [aggregations](https://github.psi.ch/projects/ST/repos/ch.psi.daq.qu
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"nrOfBins":2,"aggregationType":"value","aggregations":["min","max","mean"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"nrOfBins":2,"aggregationType":"value","aggregations":["min","max","mean"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -718,7 +712,7 @@ Array value [aggregations](https://github.psi.ch/projects/ST/repos/ch.psi.daq.qu
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"binSize":10,"aggregationType":"value","aggregations":["min","max","mean"],"fields":["globalMillis","value"],"range":{"globalMillis":0,"globalMillis":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"binSize":10,"aggregationType":"value","aggregations":["min","max","mean"],"fields":["globalMillis","value"],"range":{"globalMillis":0,"globalMillis":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -775,7 +769,7 @@ Array value [aggregations](https://github.psi.ch/projects/ST/repos/ch.psi.daq.qu
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"nrOfBins":1,"aggregationType":"index","aggregations":["min","max","mean","sum"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"nrOfBins":1,"aggregationType":"index","aggregations":["min","max","mean","sum"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

@@ -844,7 +838,7 @@ curl --compressed -H "Content-Type: application/json" -X POST -d '{"nrOfBins":1,
 ###### Command
 
 ```bash
-curl --compressed -H "Content-Type: application/json" -X POST -d '{"aggregationType":"extrema","aggregations":["min","max","sum"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"aggregationType":"extrema","aggregations":["min","max","sum"],"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' http://data-api.psi.ch/sf/query
 ```
 
 ###### Response

daqlocal-compression-benchmark.csv (new file, 5 lines)

@@ -0,0 +1,5 @@
+Query duration (s),none / time (s),none / space (mb),gzip / time (s),gzip / space (mb),deflate / time (s),deflate / space (mb)
+10,1,35.3,5.1,16.2,5.1,16.2
+30,3,108,15.7,49.7,15.6,49.7
+60,6,208,30.4,95.5,30.1,95.5
+300,25,900,129,413,127,413

AbstractResponseStreamWriter.java

@@ -4,7 +4,6 @@
 package ch.psi.daq.queryrest.response;
 
 import java.io.OutputStream;
-import java.util.zip.GZIPOutputStream;
 
 import javax.servlet.http.HttpServletResponse;
 

@@ -44,8 +43,8 @@ public abstract class AbstractResponseStreamWriter implements ResponseStreamWrit
         response.addHeader("Content-Type", contentType);
         if (query.isCompressed()) {
             response.addHeader("Content-Disposition", "attachment; filename=data.gz");
-            response.addHeader("Content-Encoding", "gzip");
-            out = new GZIPOutputStream(out);
+            response.addHeader("Content-Encoding", query.getCompression().toString());
+            out = query.getCompression().wrapStream(out);
         } else {
             response.addHeader("Content-Disposition", "attachment; filename=data.csv");
         }
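
The `Compression` enum itself lives in `ch.psi.daq.domain` and is only referenced by this diff. For orientation, a minimal sketch of the shape the writer relies on, i.e. the constants used above and in the tests plus a `wrapStream` hook; this is illustrative only, not the actual implementation:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.DeflaterOutputStream;
import java.util.zip.GZIPOutputStream;

// Illustrative sketch of ch.psi.daq.domain.Compression as used by
// AbstractResponseStreamWriter: three constants and a stream-wrapping hook.
public enum Compression {
    NONE {
        @Override
        public OutputStream wrapStream(OutputStream out) {
            return out; // pass-through, no compression
        }
    },
    GZIP {
        @Override
        public OutputStream wrapStream(OutputStream out) throws IOException {
            return new GZIPOutputStream(out);
        }
    },
    DEFLATE {
        @Override
        public OutputStream wrapStream(OutputStream out) throws IOException {
            return new DeflaterOutputStream(out);
        }
    };

    // The writer also calls toString() on this enum to fill the Content-Encoding header.
    public abstract OutputStream wrapStream(OutputStream out) throws IOException;
}
```

Whichever way the real enum is implemented, the writer only needs the stream wrapping plus a `toString()` that yields a usable `Content-Encoding` value (HTTP content-coding names are case-insensitive).
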

QueryRestControllerCsvTest.java

@@ -29,6 +29,7 @@ import ch.psi.daq.cassandra.request.range.RequestRangePulseId;
 import ch.psi.daq.cassandra.request.range.RequestRangeTime;
 import ch.psi.daq.cassandra.util.test.CassandraDataGen;
 import ch.psi.daq.common.ordering.Ordering;
+import ch.psi.daq.domain.Compression;
 import ch.psi.daq.domain.ResponseFormat;
 import ch.psi.daq.query.model.Aggregation;
 import ch.psi.daq.query.model.AggregationType;

@@ -69,7 +70,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
                 0,
                 1),
             TEST_CHANNEL_NAMES);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();

@@ -146,7 +147,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
                 0,
                 1),
             channelName);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();

@@ -218,7 +219,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
                 0,
                 10),
             TEST_CHANNEL_NAMES);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();

@@ -296,7 +297,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
                 startDate,
                 endDate),
             TEST_CHANNEL_NAMES);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();

@@ -375,7 +376,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
             Ordering.asc,
             AggregationType.extrema,
             TEST_CHANNEL_NAMES[0]);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         String content = mapper.writeValueAsString(request);

@@ -407,7 +408,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
             Ordering.asc,
             AggregationType.index,
             TEST_CHANNEL_NAMES[0]);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         String content = mapper.writeValueAsString(request);

@@ -441,7 +442,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
                 endDate),
             TEST_CHANNEL_01);
         request.setNrOfBins(2);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();

@@ -527,7 +528,7 @@ public class QueryRestControllerCsvTest extends AbstractDaqRestTest {
                 endDate),
             TEST_CHANNEL_01);
         request.setBinSize(100);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
         request.setResponseFormat(ResponseFormat.CSV);
 
         LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();

QueryRestControllerJsonTest.java

@@ -15,6 +15,7 @@ import ch.psi.daq.cassandra.request.range.RequestRangePulseId;
 import ch.psi.daq.cassandra.request.range.RequestRangeTime;
 import ch.psi.daq.cassandra.util.test.CassandraDataGen;
 import ch.psi.daq.common.ordering.Ordering;
+import ch.psi.daq.domain.Compression;
 import ch.psi.daq.query.model.AggregationType;
 import ch.psi.daq.query.model.impl.DAQQuery;
 import ch.psi.daq.queryrest.controller.QueryRestController;

@@ -173,7 +174,7 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
                 10,
                 11),
             TEST_CHANNEL_NAMES);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
 
         String content = mapper.writeValueAsString(request);
         System.out.println(content);

@@ -209,7 +210,7 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
                 100,
                 110),
             TEST_CHANNEL_NAMES);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
 
         String content = mapper.writeValueAsString(request);
 

@@ -246,7 +247,7 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
                 startDate,
                 endDate),
             TEST_CHANNEL_NAMES);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
 
         String content = mapper.writeValueAsString(request);
         System.out.println(content);

@@ -287,7 +288,7 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
             Ordering.asc,
             AggregationType.extrema,
             TEST_CHANNEL_NAMES[0]);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
 
         String content = mapper.writeValueAsString(request);
 

@@ -331,7 +332,7 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
                 endDate),
             TEST_CHANNEL_01);
         request.setNrOfBins(2);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
 
         String content = mapper.writeValueAsString(request);
         System.out.println(content);

@@ -369,7 +370,7 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
                 endDate),
             TEST_CHANNEL_01);
         request.setBinSize(100);
-        request.setCompressed(false);
+        request.setCompression(Compression.NONE);
 
         String content = mapper.writeValueAsString(request);
         System.out.println(content);

@@ -418,4 +419,49 @@ public class QueryRestControllerJsonTest extends AbstractDaqRestTest {
             .andExpect(MockMvcResultMatchers.jsonPath("$[0].data[9].globalMillis").value(1900))
             .andExpect(MockMvcResultMatchers.jsonPath("$[0].data[9].eventCount").value(10));
     }
+
+    @Test
+    public void testGzipCompression() throws Exception {
+        DAQQuery request = new DAQQuery(
+            new RequestRangePulseId(
+                10,
+                11),
+            TEST_CHANNEL_NAMES);
+        request.setCompression(Compression.GZIP);
+
+        String content = mapper.writeValueAsString(request);
+        System.out.println(content);
+
+        this.mockMvc
+            .perform(MockMvcRequestBuilders
+                .post(QueryRestController.QUERY)
+                .contentType(MediaType.APPLICATION_JSON)
+                .content(content))
+
+            .andDo(MockMvcResultHandlers.print())
+            .andExpect(MockMvcResultMatchers.status().isOk());
+    }
+
+    @Test
+    public void testDeflateCompression() throws Exception {
+        DAQQuery request = new DAQQuery(
+            new RequestRangePulseId(
+                10,
+                11),
+            TEST_CHANNEL_NAMES);
+        request.setCompression(Compression.DEFLATE);
+
+        String content = mapper.writeValueAsString(request);
+        System.out.println(content);
+
+        this.mockMvc
+            .perform(MockMvcRequestBuilders
+                .post(QueryRestController.QUERY)
+                .contentType(MediaType.APPLICATION_JSON)
+                .content(content))
+
+            .andDo(MockMvcResultHandlers.print())
+            .andExpect(MockMvcResultMatchers.status().isOk());
+    }
+
 }
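
The two new tests assert only the HTTP status. Since the writer now sets `Content-Encoding` from `query.getCompression().toString()`, a possible tightening, sketched here with the helpers already visible in this test class, would be to assert that header as well (hypothetical test, not part of the commit):

```java
@Test
public void testGzipCompressionSetsContentEncoding() throws Exception {
    // Same request as testGzipCompression() above.
    DAQQuery request = new DAQQuery(
        new RequestRangePulseId(10, 11),
        TEST_CHANNEL_NAMES);
    request.setCompression(Compression.GZIP);

    String content = mapper.writeValueAsString(request);

    this.mockMvc
        .perform(MockMvcRequestBuilders
            .post(QueryRestController.QUERY)
            .contentType(MediaType.APPLICATION_JSON)
            .content(content))
        .andExpect(MockMvcResultMatchers.status().isOk())
        // Compare against the same expression the writer uses for the header,
        // so the assertion does not depend on how Compression renders itself.
        .andExpect(MockMvcResultMatchers.header().string(
            "Content-Encoding", Compression.GZIP.toString()));
}
```
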