Config query fields.

parent 593228ead5
commit 0ff19f14a9

53 Readme.md
@@ -282,7 +282,11 @@ A request is performed by sending a valid JSON object in the HTTP request body.
       "endPulseId":3
    },
    "ordering":"asc",
-   "fields":[
+   "configFields":[
       "globalDate",
       "type"
    ],
+   "eventFields":[
+      "pulseId",
+      "globalDate",
+      "value"
@@ -328,7 +332,8 @@ A request is performed by sending a valid JSON object in the HTTP request body.
 - **range**: The range of the query (see [here](Readme.md#query_range)).
 - **limit**: An optional limit for the number of elements to retrieve. A limit together with aggregation does not make sense and thus is not supported.
 - **ordering**: The ordering of the data (see [here](Readme.md#data_ordering)).
-- **fields**: Array of requested fields (see [here](Readme.md#requested_fields)).
+- **configFields**: Array of requested config fields (see [here](Readme.md#requested_fields)). Omitting this field disables the config query.
+- **eventFields**: Array of requested event fields (see [here](Readme.md#requested_fields)). Omitting this field results in a default set of event fields.
 - **aggregation**: Setting this attribute activates data aggregation (see [here](Readme.md#data_aggregation) for its specification).
 - **response**: Specifies the format of the response of the requested data (see [here](Readme.md#response_format)). If this value is not set, it defaults to JSON.
 - **mapping**: Activates a table-like alignment of the response which allows a mapping of values belonging to the same pulse-id/global time (see [here](Readme.md#value_mapping) - usually left undefined).
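Taken together, the two new arrays replace the former single **fields** array. Below is a sketch of a request body that exercises both at once; the channel name and pulse range are illustrative, and the endpoint is the one used by the curl examples elsewhere in this Readme.

```shell
# Illustrative request body combining a config query (configFields)
# with an event query (eventFields). Values are made up.
QUERY='{
  "range":{"startPulseId":0,"endPulseId":3},
  "ordering":"asc",
  "configFields":["globalDate","type"],
  "eventFields":["pulseId","globalDate","value"],
  "channels":["Channel_01"]
}'

# Validate/pretty-print the body locally before sending it:
echo "$QUERY" | python3 -m json.tool

# Then POST it, as in the other examples in this Readme:
# curl -H "Content-Type: application/json" -X POST -d "$QUERY" https://data-api.psi.ch/sf/query
```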
@@ -437,14 +442,20 @@ Queries are applied to a range. The following types of ranges are supported.

### Requested Fields

```json
-"fields":[
+"configFields":[
   "pulseId",
   "globalDate",
   "type"
-]
+],
+"eventFields":[
+   "pulseId",
+   "globalDate",
+   "value"
+]
```

-- **fields**: Array of requested fields (see [here](https://github.psi.ch/sf_daq/ch.psi.daq.domain/blob/master/src/main/java/ch/psi/daq/domain/query/operation/EventField.java) for possible values of data queries and [here](https://github.psi.ch/sf_daq/ch.psi.daq.domain/blob/master/src/main/java/ch/psi/daq/domain/query/operation/ConfigField.java) for possible values of config queries).
+- **configFields**: Array of requested config fields (see [here](https://github.psi.ch/sf_daq/ch.psi.daq.domain/blob/master/src/main/java/ch/psi/daq/domain/query/operation/ConfigField.java) for possible values of config queries).
+- **eventFields**: Array of requested event fields (see [here](https://github.psi.ch/sf_daq/ch.psi.daq.domain/blob/master/src/main/java/ch/psi/daq/domain/query/operation/EventField.java) for possible values of event queries).

It is possible to request the time in seconds since midnight, January 1, 1970 UTC (the UNIX epoch) as a decimal value including fractional seconds (fields *globalSeconds* and *iocSeconds*), in milliseconds since the same epoch (fields *globalMillis* and *iocMillis*), or as an ISO 8601 formatted string such as 1997-07-16T19:20:30.123456789+02:00 (fields *globalDate* and *iocDate*).
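As a sanity check on how these three representations relate, the following sketch converts a made-up *globalSeconds* value to its *globalMillis* and *globalDate* equivalents; the numbers are illustrative and not real channel data.

```shell
# Convert an illustrative globalSeconds value (UNIX epoch, fractional
# seconds) to globalMillis and a globalDate-style ISO 8601 string.
GLOBAL_SECONDS="869080830.123456"
python3 - "$GLOBAL_SECONDS" <<'EOF'
import sys
from datetime import datetime, timezone

sec = float(sys.argv[1])
millis = int(sec * 1000)  # globalMillis: same epoch, millisecond resolution
date = datetime.fromtimestamp(sec, timezone.utc)

print(millis)             # 869080830123
print(date.isoformat())   # 1997-07-16T19:20:30.123456+00:00
EOF
```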
@@ -885,7 +896,7 @@ Allows for server side optimizations since not all data needs to be retrieved.

 ```json
 {
-   "fields":["pulseId","value"],
+   "eventFields":["pulseId","value"],
    "range":{
       "startPulseId":0,
       "endPulseId":3
@@ -899,7 +910,7 @@ Allows for server side optimizations since not all data needs to be retrieved.
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"eventFields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -948,7 +959,7 @@ curl -H "Content-Type: application/json" -X POST -d '{"fields":["pulseId","valu
       "channel1",
       "channel2"
    ],
-   "fields":[
+   "eventFields":[
       "channel",
       "pulseId",
       "iocSeconds",
@@ -963,7 +974,7 @@ curl -H "Content-Type: application/json" -X POST -d '{"fields":["pulseId","valu
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"response":{"format":"csv"},"range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"],"fields":["channel","pulseId","iocSeconds","globalSeconds","shape","eventCount","value"]}' https://data-api.psi.ch/sf/query
+curl -H "Content-Type: application/json" -X POST -d '{"response":{"format":"csv"},"range":{"startPulseId":0,"endPulseId":4},"channels":["channel1","channel2"],"eventFields":["channel","pulseId","iocSeconds","globalSeconds","shape","eventCount","value"]}' https://data-api.psi.ch/sf/query
 ```

 ##### Response
@@ -989,7 +1000,7 @@ testChannel2;4;0.040000000;0.040000000;[1];1;4
 ```json
 {
    "ordering":"desc",
-   "fields":["pulseId","value"],
+   "eventFields":["pulseId","value"],
    "range":{
       "startPulseId":0,
       "endPulseId":3
@@ -1006,7 +1017,7 @@ Use **none** in case ordering does not matter (allows for server side optimizati
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"ordering":"desc","fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"ordering":"desc","eventFields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -1050,7 +1061,7 @@ curl -H "Content-Type: application/json" -X POST -d '{"ordering":"desc","fields
       "aggregationType":"value",
       "aggregations":["min","mean","max"]
    },
-   "fields":["pulseId","value"],
+   "eventFields":["pulseId","value"],
    "range":{
       "startPulseId":0,
       "endPulseId":3
@@ -1064,7 +1075,7 @@ curl -H "Content-Type: application/json" -X POST -d '{"ordering":"desc","fields
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"aggregationType":"value","aggregations":["min","mean","max"]},"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"aggregationType":"value","aggregations":["min","mean","max"]},"eventFields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -1129,7 +1140,7 @@ Illustration of array value aggregation:
       "aggregationType":"value",
       "aggregations":["min","mean","max"]
    },
-   "fields":["pulseId","value"],
+   "eventFields":["pulseId","value"],
    "range":{
       "startPulseId":0,
       "endPulseId":3
@@ -1143,7 +1154,7 @@ Illustration of array value aggregation:
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"nrOfBins":2,"aggregationType":"value","aggregations":["min","mean","max"]},"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"nrOfBins":2,"aggregationType":"value","aggregations":["min","mean","max"]},"eventFields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -1195,7 +1206,7 @@ Illustration of array value aggregation with additional binning:
       "aggregationType":"value",
       "aggregations":["min","mean","max"]
    },
-   "fields":["globalMillis","value"],
+   "eventFields":["globalMillis","value"],
    "range":{
       "startSeconds":"0.0",
       "endSeconds":"0.030000000"
@@ -1209,7 +1220,7 @@ Illustration of array value aggregation with additional binning:
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"pulsesPerBin":2,"aggregationType":"value","aggregations":["min","mean","max"]},"fields":["globalMillis","value"],"range":{"startSeconds":"0.0","endSeconds":"0.030000000"},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"pulsesPerBin":2,"aggregationType":"value","aggregations":["min","mean","max"]},"eventFields":["globalMillis","value"],"range":{"startSeconds":"0.0","endSeconds":"0.030000000"},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -1257,7 +1268,7 @@ Illustration of array value aggregation with additional binning:
       "aggregationType":"index",
       "aggregations":["min","mean","max","sum"]
    },
-   "fields":["pulseId","value"],
+   "eventFields":["pulseId","value"],
    "range":{
       "startPulseId":0,
       "endPulseId":3
@@ -1271,7 +1282,7 @@ Illustration of array value aggregation with additional binning:
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"nrOfBins":1,"aggregationType":"index","aggregations":["min","max","mean","sum"]},"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"aggregation":{"nrOfBins":1,"aggregationType":"index","aggregations":["min","max","mean","sum"]},"eventFields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01"]}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -1329,7 +1340,7 @@ Maps values based on their pulse-id/global time. Please note that the response f

 ```json
 {
-   "fields":["pulseId","value"],
+   "eventFields":["pulseId","value"],
    "range":{
       "startPulseId":0,
       "endPulseId":3
@@ -1347,7 +1358,7 @@ Maps values based on their pulse-id/global time. Please note that the response f
 ##### Command

 ```bash
-curl -H "Content-Type: application/json" -X POST -d '{"fields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01","Channel_02"],"mapping":{"incomplete":"provide-as-is"}}' https://data-api.psi.ch/sf/query | python -m json.tool
+curl -H "Content-Type: application/json" -X POST -d '{"eventFields":["pulseId","value"],"range":{"startPulseId":0,"endPulseId":3},"channels":["Channel_01","Channel_02"],"mapping":{"incomplete":"provide-as-is"}}' https://data-api.psi.ch/sf/query | python -m json.tool
 ```

 ##### Response
@@ -1432,7 +1443,7 @@ A request is performed by sending a valid JSON object in the HTTP request body.
       "endPulseId":3
    },
    "ordering":"asc",
-   "fields":[
+   "configFields":[
       "pulseId",
       "globalDate",
       "type"
@@ -38,6 +38,7 @@ import com.fasterxml.jackson.databind.ObjectMapper;
 import com.google.common.collect.Lists;

 import ch.psi.daq.common.ordering.Ordering;
+import ch.psi.daq.common.tuple.Quadruple;
 import ch.psi.daq.domain.backend.Backend;
 import ch.psi.daq.domain.config.DomainConfig;
 import ch.psi.daq.domain.events.ChannelConfiguration;
@@ -54,6 +55,7 @@ import ch.psi.daq.domain.query.channels.LongHash;
 import ch.psi.daq.domain.query.operation.Aggregation;
 import ch.psi.daq.domain.query.operation.AggregationType;
 import ch.psi.daq.domain.query.operation.Compression;
+import ch.psi.daq.domain.query.operation.ConfigField;
 import ch.psi.daq.domain.query.operation.EventField;
 import ch.psi.daq.domain.query.response.Response;
 import ch.psi.daq.domain.query.response.ResponseFormat;
@@ -386,7 +388,7 @@ public class QueryRestController implements ApplicationContextAware {

       httpResponse.validateQuery(queries);
       // execute query
-      final List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>> result =
+      final List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>> result =
            queryManager.queryEvents(queries);

       httpResponse.respond(
@@ -436,11 +438,42 @@ public class QueryRestController implements ApplicationContextAware {
       return Lists.newArrayList(ResponseFormat.values());
    }

+   /**
+    * Returns the current list of {@link ConfigField}s available.
+    *
+    * @return list of {@link ConfigField}s as String array
+    */
+   @RequestMapping(
+         value = DomainConfig.PATH_PARAMETERS_ROOT + "/configfields",
+         method = {RequestMethod.GET},
+         produces = {MediaType.APPLICATION_JSON_VALUE})
+   public @ResponseBody List<ConfigField> getConfigFieldValues() {
+      return Arrays.stream(ConfigField.values())
+            .filter(queryField -> queryField.isPublish())
+            .collect(Collectors.toList());
+   }
+
+   /**
+    * Returns the current list of {@link EventField}s available.
+    *
+    * @return list of {@link EventField}s as String array
+    */
+   @RequestMapping(
+         value = DomainConfig.PATH_PARAMETERS_ROOT + "/eventfields",
+         method = {RequestMethod.GET},
+         produces = {MediaType.APPLICATION_JSON_VALUE})
+   public @ResponseBody List<EventField> getEventFieldValues() {
+      return Arrays.stream(EventField.values())
+            .filter(queryField -> queryField.isPublish())
+            .collect(Collectors.toList());
+   }
+
    /**
     * Returns the current list of {@link EventField}s available.
     *
     * @return list of {@link EventField}s as String array
     */
+   @Deprecated
    @RequestMapping(
          value = DomainConfig.PATH_PARAMETERS_ROOT + "/queryfields",
          method = {RequestMethod.GET},
@@ -50,8 +50,8 @@ public class ConfigQueryValidator implements Validator, ApplicationContextAware

    private void checkElement(final DAQConfigQuery query, final Errors errors) {
       // set default values (if not set)
-      if (query.getFields() == null || query.getFields().isEmpty()) {
-         query.setFields(new LinkedHashSet<>(queryResponseFields));
+      if (query.getConfigFields() == null || query.getConfigFields().isEmpty()) {
+         query.setConfigFields(new LinkedHashSet<>(queryResponseFields));
       }

       RequestRange range = query.getRange();
@@ -64,8 +64,8 @@ public class EventQueryValidator implements Validator, ApplicationContextAware {

    private void checkElement(final DAQQueryElement query, final Errors errors) {
       // set default values (if not set)
-      if (query.getFields() == null || query.getFields().isEmpty()) {
-         query.setFields(new LinkedHashSet<>(defaultResponseFields));
+      if (query.getEventFields() == null || query.getEventFields().isEmpty()) {
+         query.setEventFields(new LinkedHashSet<>(defaultResponseFields));
       }

       RequestRange range = query.getRange();
@@ -134,7 +134,7 @@ public class EventQueryValidator implements Validator, ApplicationContextAware {

       if (query.getValueTransformations() != null && !query.getValueTransformations().isEmpty()) {
          // without this field, json will not contain transformedValue
-         query.addField(EventField.transformedValue);
+         query.addEventField(EventField.transformedValue);

          for (final ValueTransformationSequence transformationSequence : query.getValueTransformations()) {
             transformationSequence.setExecutionEnvironment(ExecutionEnvironment.QUERYING);
@@ -6,6 +6,7 @@ import java.util.stream.Stream;

 import org.apache.commons.lang3.tuple.Triple;

+import ch.psi.daq.common.tuple.Quadruple;
 import ch.psi.daq.domain.events.ChannelConfiguration;
 import ch.psi.daq.domain.json.ChannelName;
 import ch.psi.daq.domain.query.DAQConfigQuery;
@@ -35,6 +36,6 @@ public interface QueryManager {
    Entry<DAQConfigQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>> queryConfigs(final DAQConfigQuery query)
          throws Exception;

-   List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>> queryEvents(final DAQQueries queries)
+   List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>> queryEvents(final DAQQueries queries)
          throws Exception;
 }
@@ -16,6 +16,7 @@ import org.springframework.beans.BeansException;
 import org.springframework.context.ApplicationContext;
 import org.springframework.context.ApplicationContextAware;

+import ch.psi.daq.common.tuple.Quadruple;
 import ch.psi.daq.domain.DataEvent;
 import ch.psi.daq.domain.backend.Backend;
 import ch.psi.daq.domain.config.DomainConfig;
@@ -122,18 +123,18 @@ public class QueryManagerImpl implements QueryManager, ApplicationContextAware {
    }

    @Override
-   public List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>> queryEvents(
+   public List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>> queryEvents(
          final DAQQueries queries) {
       // set backends if not defined yet
       for (DAQQueryElement daqQuery : queries) {
          channelsCache.configureBackends(daqQuery.getChannels());
       }

-      final List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>> results =
+      final List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>> results =
            new ArrayList<>(queries.getQueries().size());

       for (final DAQQueryElement queryElement : queries) {
-         Stream<Triple<BackendQuery, ChannelName, ?>> resultStreams =
+         Stream<Quadruple<BackendQuery, ChannelName, ?, ?>> resultStreams =
               BackendQueryImpl
                     .getBackendQueries(queryElement)
                     .stream()
@@ -146,6 +147,7 @@ public class QueryManagerImpl implements QueryManager, ApplicationContextAware {
                        query.getBackend().getBackendAccess().getQueryProcessor();
                  final BackendQueryAnalyzer queryAnalizer = queryAnalizerFactory.apply(query);

+                 // ChannelEvent query
                  /* all the magic happens here */
                  final Stream<Entry<ChannelName, Stream<? extends DataEvent>>> channelToDataEvents =
                        processor.process(queryAnalizer);
@@ -153,8 +155,17 @@ public class QueryManagerImpl implements QueryManager, ApplicationContextAware {
                  final Stream<Entry<ChannelName, ?>> channelToData =
                        queryAnalizer.postProcess(channelToDataEvents);

+                 // ChannelConfig query
+                 final BackendQuery configQuery = new BackendQueryImpl(query, queryElement.getConfigFields());
+                 final Map<String, Stream<? extends ChannelConfiguration>> channelToConfig =
+                       configQuery.getChannelConfigurations();
+
                  return channelToData.map(entry -> {
-                    return Triple.of(query, entry.getKey(), entry.getValue());
+                    return Quadruple.of(
+                          query,
+                          entry.getKey(),
+                          channelToConfig.get(entry.getKey().getName()),
+                          entry.getValue());
                  });
               });
@@ -53,8 +53,8 @@ public class CSVHTTPResponse extends AbstractHTTPResponse {
       }


-      if (!ArrayUtils.contains(query.getColumns(), FieldNames.FIELD_GLOBAL_TIME)) {
-         query.addField(EventField.globalMillis);
+      if (!ArrayUtils.contains(query.getRequest().getColumns(), FieldNames.FIELD_GLOBAL_TIME)) {
+         query.addEventField(EventField.globalMillis);
       }
    }
 }
@@ -17,6 +17,7 @@ import java.util.Set;
 import java.util.concurrent.atomic.AtomicReference;
 import java.util.function.Function;
 import java.util.function.ToLongFunction;
+import java.util.stream.Collectors;
 import java.util.stream.Stream;

 import javax.servlet.ServletResponse;
@@ -24,7 +25,6 @@ import javax.servlet.ServletResponse;

 import org.apache.commons.csv.CSVFormat;
 import org.apache.commons.csv.CSVPrinter;
 import org.apache.commons.lang3.tuple.Pair;
-import org.apache.commons.lang3.tuple.Triple;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
 import org.springframework.beans.BeansException;
@@ -36,9 +36,11 @@ import ch.psi.daq.common.stream.match.MapCreator;
 import ch.psi.daq.common.stream.match.MapFiller;
 import ch.psi.daq.common.stream.match.Padder;
 import ch.psi.daq.common.stream.match.StreamMatcher;
+import ch.psi.daq.common.tuple.Quadruple;
 import ch.psi.daq.domain.DataEvent;
 import ch.psi.daq.domain.backend.Backend;
 import ch.psi.daq.domain.config.DomainConfig;
+import ch.psi.daq.domain.events.ChannelConfiguration;
 import ch.psi.daq.domain.json.ChannelName;
 import ch.psi.daq.domain.query.DAQQueries;
 import ch.psi.daq.domain.query.DAQQueryElement;
@@ -47,8 +49,9 @@ import ch.psi.daq.domain.query.backend.analyzer.BackendQueryAnalyzer;
 import ch.psi.daq.domain.query.mapping.IncompleteStrategy;
 import ch.psi.daq.domain.query.mapping.Mapping;
 import ch.psi.daq.domain.query.operation.Aggregation;
-import ch.psi.daq.domain.query.operation.Extrema;
+import ch.psi.daq.domain.query.operation.ConfigField;
+import ch.psi.daq.domain.query.operation.EventField;
 import ch.psi.daq.domain.query.operation.Extrema;
 import ch.psi.daq.domain.query.response.Response;
 import ch.psi.daq.queryrest.config.QueryRestConfig;
 import ch.psi.daq.queryrest.response.AbstractHTTPResponse;
@@ -94,7 +97,7 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio
          final AbstractHTTPResponse response,
          final ResponseFormatter<R> formatter) throws Exception {
       if (query instanceof DAQQueries) {
-         respond((List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>>) result,
+         respond((List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>>) result,
               out, response);
       } else {
          final String message = String.format("'%s' has no response type for '%s'.", query);
@@ -104,7 +107,7 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio
    }

    @SuppressWarnings("unchecked")
-   public void respond(final List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>> results,
+   public void respond(final List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>> results,
          final OutputStream out, final Response response) throws Exception {
       if (results.size() > 1) {
          throw new IllegalStateException("CSV format does not allow for multiple queries.");
|
||||
|
||||
final AtomicReference<Exception> exception = new AtomicReference<>();
|
||||
|
||||
final Map<ChannelName, Stream<DataEvent>> streams = new LinkedHashMap<>(results.size());
|
||||
final List<String> header = new ArrayList<>();
|
||||
final Collection<Pair<ChannelName, Function<DataEvent, String>>> accessors = new ArrayList<>();
|
||||
final Map<ChannelName, List<ChannelConfiguration>> configLists = new LinkedHashMap<>(results.size());
|
||||
final List<String> configHeader = new ArrayList<>();
|
||||
final Collection<Pair<ChannelName, Function<ChannelConfiguration, String>>> configAccessors = new ArrayList<>();
|
||||
|
||||
final Map<ChannelName, Stream<DataEvent>> eventStreams = new LinkedHashMap<>(results.size());
|
||||
final List<String> eventHeader = new ArrayList<>();
|
||||
final Collection<Pair<ChannelName, Function<DataEvent, String>>> eventAccessors = new ArrayList<>();
|
||||
final AtomicReference<DAQQueryElement> daqQueryRef = new AtomicReference<>();
|
||||
final AtomicReference<BackendQuery> backendQueryRef = new AtomicReference<>();
|
||||
|
||||
@@ -125,16 +132,27 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio

          entry.getValue()
               .sequential()
-              .forEach(triple -> {
-                 backendQueryRef.compareAndSet(null, triple.getLeft());
+              .forEach(quadruple -> {
+                 backendQueryRef.compareAndSet(null, quadruple.getFirst());

-                 if (triple.getRight() instanceof Stream) {
-                    setupChannelColumns(query, triple.getLeft(), triple.getMiddle(), header, accessors);
+                 if (query.hasConfigFields() && quadruple.getThird() instanceof Stream) {
+                    setupChannelConfigColumns(query, quadruple.getFirst(), quadruple.getSecond(), configHeader,
+                          configAccessors);

-                    final Stream<DataEvent> eventStream = ((Stream<DataEvent>) triple.getRight());
-                    streams.put(triple.getMiddle(), eventStream);
+                    final List<ChannelConfiguration> configList =
+                          ((Stream<ChannelConfiguration>) quadruple.getThird())
+                                .collect(Collectors.toList());
+                    configLists.put(quadruple.getSecond(), configList);
+                 }
+
+                 if (quadruple.getFourth() instanceof Stream) {
+                    setupChannelEventColumns(query, quadruple.getFirst(), quadruple.getSecond(), eventHeader,
+                          eventAccessors);
+
+                    final Stream<DataEvent> eventStream = ((Stream<DataEvent>) quadruple.getFourth());
+                    eventStreams.put(quadruple.getSecond(), eventStream);
                 } else {
-                    final String message = String.format("Expect a DataEvent Stream for '%s'.", triple.getMiddle());
+                    final String message = String.format("Expect a DataEvent Stream for '%s'.", quadruple.getSecond());
                    LOGGER.warn(message);
                 }
              });
@@ -152,8 +170,8 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio
             new MapFiller<>(),
             null,
             padder,
-            streams.keySet(),
-            streams.values());
+            eventStreams.keySet(),
+            eventStreams.values());
       final Iterator<Map<ChannelName, DataEvent>> streamsMatchIter = streamMatcher.iterator();

       // prepare csv output
@@ -167,13 +185,32 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio
          // response.getCharacterEncoding()));
          csvFilePrinter = new CSVPrinter(writer, csvFormat);

-         csvFilePrinter.printRecord(header);
+         if (!configHeader.isEmpty()) {
+            csvFilePrinter.printRecord(configHeader);
+
+            // ensure correct order
+            final Stream<String> rowStream = configAccessors.stream().sequential()
+                  .map(accessorPair -> {
+                     final List<ChannelConfiguration> configs = configLists.get(accessorPair.getKey());
+                     if (!configs.isEmpty()) {
+                        // print the last (current) config
+                        return accessorPair.getValue().apply(configs.get(configs.size() - 1));
+                     } else {
+                        return EMPTY_VALUE;
+                     }
+                  });
+
+            csvFilePrinter.printRecord(new StreamIterable<String>(rowStream));
+            csvFilePrinter.println();
+         }
+
+         csvFilePrinter.printRecord(eventHeader);

          while (streamsMatchIter.hasNext()) {
             final Map<ChannelName, DataEvent> match = streamsMatchIter.next();

             // ensure correct order
-            final Stream<String> rowStream = accessors.stream().sequential()
+            final Stream<String> rowStream = eventAccessors.stream().sequential()
                  .map(accessorPair -> {
                     DataEvent event = match.get(accessorPair.getKey());
                     if (event != null) {
@@ -218,17 +255,35 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio
    }


-   private void setupChannelColumns(final DAQQueryElement daqQuery, final BackendQuery backendQuery,
+   private void setupChannelConfigColumns(final DAQQueryElement daqQuery, final BackendQuery backendQuery,
          final ChannelName channelName,
+         final Collection<String> header,
+         Collection<Pair<ChannelName, Function<ChannelConfiguration, String>>> accessors) {
+      final Set<ConfigField> queryFields = daqQuery.getConfigFields();
+
+      for (final ConfigField field : queryFields) {
+         final StringBuilder buf = new StringBuilder(3)
+               .append(channelName.getName())
+               .append(DELIMITER_CHANNELNAME_FIELDNAME)
+               .append(field.name());
+
+         header.add(buf.toString());
+         accessors.add(Pair.of(channelName, new ConfigFieldStringifyer(field.getAccessor(), EMPTY_VALUE,
+               DELIMITER_ARRAY)));
+      }
+   }
+
+   private void setupChannelEventColumns(final DAQQueryElement daqQuery, final BackendQuery backendQuery,
+         final ChannelName channelName,
          final Collection<String> header, Collection<Pair<ChannelName, Function<DataEvent, String>>> accessors) {
-      final Set<EventField> queryFields = daqQuery.getFields();
+      final Set<EventField> eventFields = daqQuery.getEventFields();
       final List<Aggregation> aggregations =
             daqQuery.getAggregation() != null ? daqQuery.getAggregation().getAggregations() : null;
       final List<Extrema> extrema = daqQuery.getAggregation() != null ? daqQuery.getAggregation().getExtrema() : null;

       final BackendQueryAnalyzer queryAnalyzer = queryAnalizerFactory.apply(backendQuery);

-      for (final EventField field : queryFields) {
+      for (final EventField field : eventFields) {
         if (!(EventField.value.equals(field) && queryAnalyzer.isAggregationEnabled())) {
            final StringBuilder buf = new StringBuilder(3)
                  .append(channelName.getName())
@@ -236,7 +291,7 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio
                  .append(field.name());

            header.add(buf.toString());
-           accessors.add(Pair.of(channelName, new QueryFieldStringifyer(field.getAccessor(), EMPTY_VALUE,
+           accessors.add(Pair.of(channelName, new EventFieldStringifyer(field.getAccessor(), EMPTY_VALUE,
                 DELIMITER_ARRAY)));
        }
     }
@@ -257,7 +312,7 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio

       if (extrema != null && queryAnalyzer.isAggregationEnabled()) {
          for (final Extrema extremum : extrema) {
-            for (final EventField field : queryFields) {
+            for (final EventField field : eventFields) {
               final Function<Object, Object> accessor = extremum.getAccessor(field);
               if (accessor != null) {
                  final StringBuilder buf = new StringBuilder(7)
@@ -271,7 +326,7 @@ public class CSVResponseStreamWriter implements ResponseStreamWriter, Applicatio

                  header.add(buf.toString());
                  accessors
-                       .add(Pair.of(channelName, new QueryFieldStringifyer(accessor, EMPTY_VALUE,
+                       .add(Pair.of(channelName, new EventFieldStringifyer(accessor, EMPTY_VALUE,
                             DELIMITER_ARRAY)));
               }
            }
@@ -0,0 +1,21 @@
package ch.psi.daq.queryrest.response.csv;

import java.util.function.Function;

import ch.psi.daq.domain.events.ChannelConfiguration;

public class ConfigFieldStringifyer extends QueryFieldStringifyer implements Function<ChannelConfiguration, String> {

public ConfigFieldStringifyer(Function<Object, Object> accessor, String nonValue, String arraySeparator) {
super(accessor, nonValue, arraySeparator);
}

@Override
public String apply(ChannelConfiguration config) {
if (config == null) {
return getNonValue();
}

return toString(getAccessor().apply(config));
}
}
@@ -0,0 +1,21 @@
package ch.psi.daq.queryrest.response.csv;

import java.util.function.Function;

import ch.psi.daq.domain.DataEvent;

public class EventFieldStringifyer extends QueryFieldStringifyer implements Function<DataEvent, String> {

public EventFieldStringifyer(Function<Object, Object> accessor, String nonValue, String arraySeparator) {
super(accessor, nonValue, arraySeparator);
}

@Override
public String apply(DataEvent event) {
if (event == null) {
return getNonValue();
}

return toString(getAccessor().apply(event));
}
}
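The two new classes above follow one pattern: a generic stringifying core (`QueryFieldStringifyer`) plus a thin typed `Function<T, String>` adapter per domain type. A minimal self-contained sketch of that pattern, with simplified stand-in types (`DemoEvent`, `FieldStringifyer`, and `asString` are hypothetical names; the real classes handle `PrimitiveList` values and live in `ch.psi.daq.queryrest.response.csv`):

```java
import java.util.function.Function;

// Generic core: extracts a field value via an accessor and renders it as text.
class FieldStringifyer {
    private final Function<Object, Object> accessor;
    private final String nonValue;

    FieldStringifyer(Function<Object, Object> accessor, String nonValue) {
        this.accessor = accessor;
        this.nonValue = nonValue;
    }

    protected Function<Object, Object> getAccessor() { return accessor; }
    protected String getNonValue() { return nonValue; }

    // Simplified: the real toString also expands array-like values.
    public String asString(Object value) {
        return value == null ? nonValue : value.toString();
    }
}

// Typed adapter, analogous to EventFieldStringifyer.
class EventStringifyer extends FieldStringifyer implements Function<DemoEvent, String> {
    EventStringifyer(Function<Object, Object> accessor, String nonValue) {
        super(accessor, nonValue);
    }

    @Override
    public String apply(DemoEvent event) {
        if (event == null) {
            return getNonValue();
        }
        return asString(getAccessor().apply(event));
    }
}

// Hypothetical stand-in for ch.psi.daq.domain.DataEvent.
class DemoEvent {
    final long pulseId;
    DemoEvent(long pulseId) { this.pulseId = pulseId; }
}

public class StringifyerSketch {
    public static void main(String[] args) {
        EventStringifyer s = new EventStringifyer(o -> ((DemoEvent) o).pulseId, "N/A");
        System.out.println(s.apply(new DemoEvent(42)));  // 42
        System.out.println(s.apply(null));               // N/A
    }
}
```

Splitting the base class from the typed adapters lets the same core serve both `DataEvent` and `ChannelConfiguration` rows in the CSV writer.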
@@ -3,10 +3,9 @@ package ch.psi.daq.queryrest.response.csv;
import java.util.function.Function;

import ch.psi.daq.common.util.Arrays;
import ch.psi.daq.domain.DataEvent;
import ch.psi.data.collection.PrimitiveList;

public class QueryFieldStringifyer implements Function<DataEvent, String> {
public class QueryFieldStringifyer {
public static final String OPEN_BRACKET = "[";
public static final String CLOSE_BRACKET = "]";

@@ -20,14 +19,16 @@ public class QueryFieldStringifyer implements Function<DataEvent, String> {
this.arraySeparator = arraySeparator;
}

@SuppressWarnings("rawtypes")
@Override
public String apply(DataEvent event) {
if (event == null) {
return nonValue;
}
protected Function<Object, Object> getAccessor() {
return accessor;
}

Object value = accessor.apply(event);
protected String getNonValue(){
return nonValue;
}

@SuppressWarnings("rawtypes")
public String toString(Object value) {
if (value == null) {
return nonValue;
} else if (value instanceof PrimitiveList) {
@@ -50,7 +50,7 @@ public class DAQConfigQueryResponseFormatter
final JsonGenerator generator = factory.createGenerator(out, JsonEncoding.UTF8);

final DAQConfigQueryElement daqQuery = result.getKey();
final Set<String> includedFields = getFields(daqQuery, true);
final Set<String> includedFields = getConfigFields(daqQuery.getConfigFields(), true);
final ObjectWriter writer = DAQQueriesResponseFormatter.configureWriter(mapper, null, includedFields);

try {
@@ -104,10 +104,8 @@
}
}

private static Set<String> getFields(final DAQConfigQueryElement query,
public static Set<String> getConfigFields(final Set<? extends QueryField> queryFields,
final boolean removeIdentifiers) {
final Set<? extends QueryField> queryFields = query.getFields();

final Set<String> includedFields =
new LinkedHashSet<String>(queryFields.size());

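The reworked `getConfigFields` above now takes the requested field set directly (instead of the whole query element), so the queries formatter can reuse it. A minimal sketch of the idea, using plain strings instead of the real `QueryField` enum (the identifier names `channel` and `backend` are assumptions drawn from the event-field handling elsewhere in this commit):

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class ConfigFieldsSketch {
    // Collect requested config field names in insertion order; optionally drop
    // identifier fields that the response format supplies separately.
    static Set<String> getConfigFields(Set<String> queryFields, boolean removeIdentifiers) {
        Set<String> included = new LinkedHashSet<>(queryFields);
        if (removeIdentifiers) {
            included.remove("channel");
            included.remove("backend");
        }
        return included;
    }

    public static void main(String[] args) {
        Set<String> fields = new LinkedHashSet<>(List.of("channel", "globalDate", "type"));
        System.out.println(getConfigFields(fields, true));  // [globalDate, type]
    }
}
```

A `LinkedHashSet` keeps the caller's field order, which matters for column ordering in table-style responses.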
@@ -6,15 +6,14 @@ import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.Map.Entry;
import java.util.Set;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Function;
import java.util.function.ToLongFunction;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.apache.commons.lang3.tuple.Triple;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.BeansException;
@@ -34,7 +33,9 @@ import ch.psi.daq.common.stream.match.ListFiller;
import ch.psi.daq.common.stream.match.Padder;
import ch.psi.daq.common.stream.match.StreamMatcher;
import ch.psi.daq.common.time.TimeUtils;
import ch.psi.daq.common.tuple.Quadruple;
import ch.psi.daq.domain.DataEvent;
import ch.psi.daq.domain.events.ChannelConfiguration;
import ch.psi.daq.domain.json.ChannelName;
import ch.psi.daq.domain.query.DAQQueryElement;
import ch.psi.daq.domain.query.backend.BackendQuery;
@@ -56,10 +57,13 @@ import ch.psi.daq.queryrest.response.AbstractHTTPResponse;
import ch.psi.daq.queryrest.response.ResponseFormatter;
import ch.psi.daq.queryrest.response.json.JSONResponseStreamWriter;

public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>>>, ApplicationContextAware {
public class DAQQueriesResponseFormatter
implements ResponseFormatter<List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>>>,
ApplicationContextAware {
private static final Logger LOGGER = LoggerFactory.getLogger(JSONResponseStreamWriter.class);

public static final String DATA_RESP_FIELD = "data";
public static final String META_RESP_FIELD = "meta";

public static final Mapping DEFAULT_MAPPING = new Mapping(IncompleteStrategy.PROVIDE_AS_IS);
private static final long MILLIS_PER_PULSE = TimeUtils.MILLIS_PER_PULSE;
@@ -69,11 +73,11 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
// buckets.
private static final ToLongFunction<DataEvent> MATCHER_PROVIDER = (event) -> event.getGlobalMillis()
/ MILLIS_PER_PULSE;


// In case ArchiverAppliance had several events within the 10ms mapping interval, return these
// aggregations (only used for table format)
private Set<String> defaultEventResponseAggregations;


@SuppressWarnings("unchecked")
@Override
public void setApplicationContext(ApplicationContext context) throws BeansException {
@@ -88,7 +92,7 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
public void format(
final JsonFactory factory,
final ObjectMapper mapper,
final List<Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>>> results,
final List<Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>>> results,
final OutputStream out,
final AbstractHTTPResponse response) throws Exception {
final AtomicReference<Exception> exception = new AtomicReference<>();
@@ -104,7 +108,10 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
final DAQQueryElement daqQuery = entryy.getKey();

if (response.useTableFormat(daqQuery)) {
final Set<String> includedFields = getFields(daqQuery, false);
final Set<String> includedConfigFields =
DAQConfigQueryResponseFormatter.getConfigFields(daqQuery.getConfigFields(), true);

final Set<String> includedFields = getEventFields(daqQuery, false);
/* make sure identifiers are available */
includedFields.add(EventField.channel.name());
includedFields.add(EventField.backend.name());
@@ -113,12 +120,14 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
includedFields.addAll(defaultEventResponseAggregations);
}

final ObjectWriter writer = configureWriter(mapper, includedFields, null);
final ObjectWriter writer = configureWriter(mapper, includedFields, includedConfigFields);

writeTableFormat(generator, writer, entryy, exception);
} else {
final Set<String> includedFields = getFields(daqQuery, true);
final ObjectWriter writer = configureWriter(mapper, includedFields, null);
final Set<String> includedConfigFields =
DAQConfigQueryResponseFormatter.getConfigFields(daqQuery.getConfigFields(), true);
final Set<String> includedFields = getEventFields(daqQuery, true);
final ObjectWriter writer = configureWriter(mapper, includedFields, includedConfigFields);

writeArrayFormat(generator, writer, entryy, exception);
}
@@ -138,7 +147,7 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
}

private static void writeArrayFormat(final JsonGenerator generator, final ObjectWriter writer,
final Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>> entryy,
final Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>> entryy,
final AtomicReference<Exception> exception) {
final DAQQueryElement daqQuery = entryy.getKey();

@@ -149,21 +158,28 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
/* ensure elements are sequentially written */
.sequential()
.forEach(
triple -> {
quadruple -> {
try {
generator.writeStartObject();
generator.writeFieldName(EventField.channel.name());
writer.writeValue(generator, triple.getMiddle());
writer.writeValue(generator, quadruple.getSecond());
if (daqQuery.hasConfigFields()) {
generator.writeFieldName(DAQConfigQueryResponseFormatter.CONFIGS_RESP_FIELD);
writer.writeValue(generator, quadruple.getThird());
}
generator.writeFieldName(DATA_RESP_FIELD);
writer.writeValue(generator, triple.getRight());
writer.writeValue(generator, quadruple.getFourth());
generator.writeEndObject();
} catch (Exception e) {
LOGGER.error("Could not write channel name of channel '{}'", triple.getMiddle(),
LOGGER.error("Could not write channel name of channel '{}'", quadruple.getSecond(),
e);
exception.compareAndSet(null, e);
} finally {
if (triple.getRight() instanceof Stream) {
((Stream<?>) (triple.getRight())).close();
if (quadruple.getThird() instanceof Stream) {
((Stream<?>) (quadruple.getThird())).close();
}
if (quadruple.getFourth() instanceof Stream) {
((Stream<?>) (quadruple.getFourth())).close();
}
}
});
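The formatter above now receives a `Quadruple` per channel instead of a `Triple`: backend query, channel name, config result, and event result. A minimal self-contained sketch of such a carrier, mirroring the `getFirst()`..`getFourth()` accessors used in the diff (the real class lives in `ch.psi.daq.common.tuple` and its exact shape is an assumption):

```java
public class QuadrupleSketch {
    // Immutable 4-tuple: (backend query, channel, config result, event result).
    static final class Quadruple<A, B, C, D> {
        private final A first;
        private final B second;
        private final C third;
        private final D fourth;

        Quadruple(A first, B second, C third, D fourth) {
            this.first = first;
            this.second = second;
            this.third = third;
            this.fourth = fourth;
        }

        A getFirst() { return first; }
        B getSecond() { return second; }
        C getThird() { return third; }
        D getFourth() { return fourth; }
    }

    public static void main(String[] args) {
        Quadruple<String, String, String, String> q =
            new Quadruple<>("backendQuery", "testChannel1", "configStream", "eventStream");
        System.out.println(q.getSecond() + " -> " + q.getFourth());
    }
}
```

Widening the tuple lets one pass carry both the optional config stream (third slot) and the event stream (fourth slot), which is why the `finally` block now closes both slots when they hold streams.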
@@ -177,28 +193,31 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry

@SuppressWarnings("unchecked")
private static void writeTableFormat(JsonGenerator generator, ObjectWriter writer,
Entry<DAQQueryElement, Stream<Triple<BackendQuery, ChannelName, ?>>> entryy,
Entry<DAQQueryElement, Stream<Quadruple<BackendQuery, ChannelName, ?, ?>>> entryy,
AtomicReference<Exception> exception) {
final Map<ChannelName, Stream<ChannelConfiguration>> configStreams = new LinkedHashMap<>();
/* get DataEvent stream of sub-queries for later match */
final Map<ChannelName, Stream<DataEvent>> streams =
new LinkedHashMap<>();
final Map<ChannelName, Stream<DataEvent>> eventStreams = new LinkedHashMap<>();
final AtomicReference<BackendQuery> backendQueryRef = new AtomicReference<>();
final DAQQueryElement daqQuery = entryy.getKey();

entryy.getValue()
.sequential()
.forEach(
triple -> {
backendQueryRef.compareAndSet(null, triple.getLeft());
quadruple -> {
backendQueryRef.compareAndSet(null, quadruple.getFirst());

if (triple.getRight() instanceof Stream) {
streams.put(triple.getMiddle(), ((Stream<DataEvent>) triple.getRight()));
if (entryy.getKey().hasConfigFields() && quadruple.getThird() instanceof Stream) {
configStreams.put(quadruple.getSecond(), ((Stream<ChannelConfiguration>) quadruple.getThird()));
}
if (quadruple.getFourth() instanceof Stream) {
eventStreams.put(quadruple.getSecond(), ((Stream<DataEvent>) quadruple.getFourth()));
} else {
final String message =
String.format("Expect a DataEvent Stream for '%s' but got '%s'.",
triple.getMiddle(), triple.getRight().getClass().getSimpleName());
quadruple.getSecond(), quadruple.getFourth().getClass().getSimpleName());
LOGGER.warn(message);
streams.put(triple.getMiddle(), Stream.empty());
eventStreams.put(quadruple.getSecond(), Stream.empty());
}
});

@@ -233,12 +252,44 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
new ListFiller<ChannelName, DataEvent>(),
new BinnedValueCombiner(binningStrategy),
padder,
streams.keySet(),
streams.values());
eventStreams.keySet(),
eventStreams.values());
final Iterator<List<DataEvent>> streamsMatchIter = streamMatcher.iterator();

try {
generator.writeStartObject();
// configs if available (use same format as for array form)
if (!configStreams.isEmpty()) {
generator.writeFieldName(META_RESP_FIELD);
generator.writeStartArray();

configStreams.entrySet()
/* ensure elements are sequentially written */
.stream()
.forEach(
entry -> {
try {
generator.writeStartObject();
generator.writeFieldName(EventField.channel.name());
writer.writeValue(generator, entry.getKey());

generator.writeFieldName(DAQConfigQueryResponseFormatter.CONFIGS_RESP_FIELD);
writer.writeValue(generator, entry.getValue());

generator.writeEndObject();
} catch (Exception e) {
LOGGER.error("Could not write channel name of channel '{}'", entry.getKey(),
e);
exception.compareAndSet(null, e);
} finally {
entry.getValue().close();
}
});

generator.writeEndArray();
}

// write event table
generator.writeFieldName(DATA_RESP_FIELD);
writer.writeValue(generator, streamsMatchIter);
generator.writeEndObject();
@@ -284,18 +335,18 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry
final ObjectWriter writer = mapper.writer(propertyFilter);
return writer;
}

private static Set<String> getFields(final DAQQueryElement query, final boolean removeIdentifiers) {
final Set<? extends QueryField> queryFields = query.getFields();

private static Set<String> getEventFields(final DAQQueryElement query, final boolean removeIdentifiers) {
final Set<? extends QueryField> eventFields = query.getEventFields();
final List<Aggregation> aggregations =
query.getAggregation() != null ? query.getAggregation().getAggregations() : null;
final List<Extrema> extrema = query.getAggregation() != null ? query.getAggregation().getExtrema() : null;

final Set<String> includedFields =
new LinkedHashSet<String>(queryFields.size() + (aggregations != null ? aggregations.size() : 0)
new LinkedHashSet<String>(eventFields.size() + (aggregations != null ? aggregations.size() : 0)
+ (extrema != null ? extrema.size() : 0));

for (final QueryField field : queryFields) {
for (final QueryField field : eventFields) {
includedFields.add(field.getName());
}
if (aggregations != null) {
@@ -317,7 +368,7 @@ public class DAQQueriesResponseFormatter implements ResponseFormatter<List<Entry

return includedFields;
}


private static boolean containsAggregation(final Set<String> includedFields) {
for (final Aggregation aggregation : Aggregation.values()) {
if (includedFields.contains(aggregation.name())) {
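The renamed `getEventFields` above builds the included-field set as a union of the requested event fields plus any aggregation and extrema names, pre-sizing the `LinkedHashSet` accordingly. A minimal sketch of that logic with plain strings in place of the `QueryField`/`Aggregation` enums (the method and parameter names here are illustrative, not the real API):

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

public class EventFieldsSketch {
    // Union of requested event field names and optional aggregation names,
    // pre-sized so the set never rehashes while being filled.
    static Set<String> getEventFields(List<String> eventFields, List<String> aggregations) {
        Set<String> included = new LinkedHashSet<>(
            eventFields.size() + (aggregations != null ? aggregations.size() : 0));
        included.addAll(eventFields);
        if (aggregations != null) {
            included.addAll(aggregations);
        }
        return included;
    }

    public static void main(String[] args) {
        System.out.println(getEventFields(List.of("pulseId", "value"), List.of("min", "max")));
    }
}
```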
@@ -75,8 +75,8 @@ public class JSONHTTPResponse extends AbstractHTTPResponse {
final DAQQueries queries = (DAQQueries) queryObj;
for (final DAQQueryElement query : queries) {
if (query.getMapping() != null) {
if (!ArrayUtils.contains(query.getColumns(), FieldNames.FIELD_GLOBAL_TIME)) {
query.addField(EventField.globalMillis);
if (!ArrayUtils.contains(query.getRequest().getColumns(), FieldNames.FIELD_GLOBAL_TIME)) {
query.addEventField(EventField.globalMillis);
}
}
}
@@ -58,12 +58,12 @@ public abstract class AbstractQueryRestControllerTableTest extends AbstractDaqRe
101),
TEST_CHANNEL_NAMES);
request.setMapping(new Mapping());
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addField(EventField.value);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.addEventField(EventField.value);
request.setResponse(getResponse());

String content = mapper.writeValueAsString(request);
@@ -56,12 +56,12 @@ public abstract class AbstractQueryRestControllerTest extends AbstractDaqRestTes
100,
101),
TEST_CHANNEL_NAMES);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addField(EventField.value);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.addEventField(EventField.value);
request.setResponse(getResponse());

String content = mapper.writeValueAsString(request);
@@ -18,20 +18,26 @@ import org.apache.commons.csv.CSVRecord;
import org.apache.http.client.utils.URIBuilder;
import org.junit.After;
import org.junit.Test;
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.MvcResult;
import org.springframework.test.web.servlet.request.MockMvcRequestBuilders;
import org.springframework.test.web.servlet.result.MockMvcResultHandlers;
import org.springframework.test.web.servlet.result.MockMvcResultMatchers;

import ch.psi.bsread.message.Type;
import ch.psi.daq.common.ordering.Ordering;
import ch.psi.daq.common.time.TimeUtils;
import ch.psi.daq.domain.backend.Backend;
import ch.psi.daq.domain.config.DomainConfig;
import ch.psi.daq.domain.query.DAQQuery;
import ch.psi.daq.domain.query.operation.Aggregation;
import ch.psi.daq.domain.query.operation.AggregationDescriptor;
import ch.psi.daq.domain.query.operation.AggregationType;
import ch.psi.daq.domain.query.operation.Compression;
import ch.psi.daq.domain.query.operation.ConfigField;
import ch.psi.daq.domain.query.operation.Extrema;
import ch.psi.daq.domain.query.operation.EventField;
import ch.psi.daq.domain.request.range.RequestRangeDate;
@@ -45,13 +51,21 @@ import ch.psi.daq.test.queryrest.AbstractDaqRestTest;
/**
* Tests the {@link DaqController} implementation.
*/
public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
public class CSVQueryRestControllerTest extends AbstractDaqRestTest implements ApplicationContextAware {

public static final String TEST_CHANNEL = "testChannel";
public static final String TEST_CHANNEL_01 = TEST_CHANNEL + "1";
public static final String TEST_CHANNEL_02 = TEST_CHANNEL + "2";
public static final String[] TEST_CHANNEL_NAMES = new String[] {TEST_CHANNEL_01, TEST_CHANNEL_02};

private Backend backend;

@Override
public void setApplicationContext(ApplicationContext context) throws BeansException {
backend = context.getBean(DomainConfig.BEAN_NAME_BACKEND_DEFAULT, Backend.class);
context = backend.getApplicationContext();
}

@After
public void tearDown() throws Exception {}

@@ -65,17 +79,17 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -98,25 +112,25 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 2;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());

column = 0;
for (String channel : channels) {
@@ -160,12 +174,12 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -188,24 +202,24 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 6;
int totalEventRows = 6;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
CSVRecord record = records.remove(0);
assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());

column = 0;
for (String channel : channels) {
@@ -252,10 +266,10 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -278,17 +292,17 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 6;
int totalEventRows = 6;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals((queryFields.size() + 1) * channels.size(), record.size());
assertEquals((eventFields.size() + 1) * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
assertEquals(
@@ -296,10 +310,10 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
record.get(column++));
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals((queryFields.size() + 1) * channels.size(), record.size());
assertEquals((eventFields.size() + 1) * channels.size(), record.size());

column = 0;
for (String channel : channels) {
@@ -340,18 +354,18 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
// request.setResponse(new CSVHTTPResponse());
// channels = Arrays.asList(TEST_CHANNEL_01, TEST_CHANNEL_02, testChannel3);
//
// LinkedHashSet<QueryField> queryFields = new LinkedHashSet<>();
// queryFields.add(QueryField.channel);
// queryFields.add(QueryField.pulseId);
// queryFields.add(QueryField.iocSeconds);
// queryFields.add(QueryField.iocMillis);
// queryFields.add(QueryField.globalSeconds);
// queryFields.add(QueryField.globalMillis);
// queryFields.add(QueryField.shape);
// queryFields.add(QueryField.eventCount);
// queryFields.add(QueryField.value);
// LinkedHashSet<eventField> eventFields = new LinkedHashSet<>();
// eventFields.add(eventField.channel);
// eventFields.add(eventField.pulseId);
// eventFields.add(eventField.iocSeconds);
// eventFields.add(eventField.iocMillis);
// eventFields.add(eventField.globalSeconds);
// eventFields.add(eventField.globalMillis);
// eventFields.add(eventField.shape);
// eventFields.add(eventField.eventCount);
// eventFields.add(eventField.value);
// for (DAQQueryElement element : request) {
// element.setFields(queryFields);
// element.setFields(eventFields);
// }
//
// String content = mapper.writeValueAsString(request);
@@ -375,26 +389,26 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
//
// try {
// long pulse = 0;
// int totalRows = 2;
// int totalEventRows = 2;
//
// List<CSVRecord> records = csvParser.getRecords();
// assertEquals(totalRows + 1, records.size());
// assertEquals(totalEventRows + 1, records.size());
// // remove header
// CSVRecord record = records.remove(0);
// assertEquals(queryFields.size() * channels.size(), record.size());
// assertEquals(eventFields.size() * channels.size(), record.size());
// int column = 0;
// for (String channel : channels) {
// for (QueryField queryField : queryFields) {
// for (eventField eventField : eventFields) {
// assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME +
// queryField.name(),
// eventField.name(),
// record.get(column++));
// }
// }
//
// for (int row = 0; row < totalRows; ++row) {
// for (int row = 0; row < totalEventRows; ++row) {
// record = records.get(row);
//
// assertEquals(queryFields.size() * channels.size(), record.size());
// assertEquals(eventFields.size() * channels.size(), record.size());
//
// column = 0;
// for (String channel : channels) {
@@ -430,17 +444,17 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -463,25 +477,25 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
|
||||
|
||||
try {
|
||||
long pulse = 0;
|
||||
int totalRows = 2;
|
||||
int totalEventRows = 2;
|
||||
|
||||
List<CSVRecord> records = csvParser.getRecords();
|
||||
assertEquals(totalRows + 1, records.size());
|
||||
assertEquals(totalEventRows + 1, records.size());
|
||||
// remove header
|
||||
CSVRecord record = records.remove(0);
|
||||
assertEquals(queryFields.size() * channels.size(), record.size());
|
||||
assertEquals(eventFields.size() * channels.size(), record.size());
|
||||
int column = 0;
|
||||
for (String channel : channels) {
|
||||
for (EventField queryField : queryFields) {
|
||||
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
|
||||
for (EventField eventField : eventFields) {
|
||||
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
|
||||
record.get(column++));
|
||||
}
|
||||
}
|
||||
|
||||
for (int row = 0; row < totalRows; ++row) {
|
||||
for (int row = 0; row < totalEventRows; ++row) {
|
||||
record = records.get(row);
|
||||
|
||||
assertEquals(queryFields.size() * channels.size(), record.size());
|
||||
assertEquals(eventFields.size() * channels.size(), record.size());
|
||||
|
||||
column = 0;
|
||||
for (String channel : channels) {
|
||||
@ -514,17 +528,17 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -547,25 +561,169 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 2;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());

column = 0;
for (String channel : channels) {
assertEquals(channel, record.get(column++));
assertEquals("" + pulse, record.get(column++));
assertEquals(TimeUtils.getTimeStr(TestTimeUtils.getTimeFromPulseId(pulse)), record.get(column++));
assertEquals("" + TimeUtils.getMillis(TestTimeUtils.getTimeFromPulseId(pulse)), record.get(column++));
assertEquals(TimeUtils.getTimeStr(TestTimeUtils.getTimeFromPulseId(pulse)), record.get(column++));
assertEquals("" + TimeUtils.getMillis(TestTimeUtils.getTimeFromPulseId(pulse)), record.get(column++));
assertEquals("[1]", record.get(column++));
assertEquals("1", record.get(column++));
assertEquals("" + pulse, record.get(column++));
}
++pulse;
}
} finally {
reader.close();
csvParser.close();
}
}

@Test
public void testTimeRangeQueryConfigFields() throws Exception {
List<String> channels = Arrays.asList(TEST_CHANNEL_01, TEST_CHANNEL_02);
DAQQuery request = new DAQQuery(
new RequestRangeTime(
TimeUtils.getTimeFromMillis(0, 0),
TimeUtils.getTimeFromMillis(10, 0)),
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<ConfigField> configFields = new LinkedHashSet<>();
configFields.add(ConfigField.name);
configFields.add(ConfigField.pulseId);
configFields.add(ConfigField.globalSeconds);
configFields.add(ConfigField.globalMillis);
configFields.add(ConfigField.shape);
configFields.add(ConfigField.description);
configFields.add(ConfigField.backend);
configFields.add(ConfigField.modulo);
configFields.add(ConfigField.offset);
configFields.add(ConfigField.keyspace);
configFields.add(ConfigField.precision);
configFields.add(ConfigField.source);
configFields.add(ConfigField.type);
configFields.add(ConfigField.unit);
request.setConfigFields(configFields);

LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);

MvcResult result = this.mockMvc
.perform(MockMvcRequestBuilders
.post(DomainConfig.PATH_QUERY)
.contentType(MediaType.APPLICATION_JSON)
.content(content))
.andDo(MockMvcResultHandlers.print())
.andExpect(MockMvcResultMatchers.status().isOk())
.andReturn();

String response = result.getResponse().getContentAsString();
System.out.println("Response: " + response);

CSVFormat csvFormat = CSVFormat.EXCEL.withDelimiter(CSVResponseStreamWriter.DELIMITER_CVS);
StringReader reader = new StringReader(response);
CSVParser csvParser = new CSVParser(reader, csvFormat);

try {
long pulse = 0;
// pulse of last element
long configPulse = 1;
int totalConfigRows = 1;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalEventRows + 1 + totalConfigRows + 1 + 1, records.size());
// remove config header
CSVRecord record = records.remove(0);
assertEquals(configFields.size() * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (ConfigField configField : configFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + configField.name(),
record.get(column++));
}
}

for (int row = 0; row < totalConfigRows; ++row) {
record = records.remove(0);

assertEquals(configFields.size() * channels.size(), record.size());

column = 0;
for (String channel : channels) {
assertEquals(channel, record.get(column++));
assertEquals("" + configPulse, record.get(column++));
assertEquals(TimeUtils.getTimeStr(TestTimeUtils.getTimeFromPulseId(configPulse)), record.get(column++));
assertEquals("" + TimeUtils.getMillis(TestTimeUtils.getTimeFromPulseId(configPulse)),
record.get(column++));
assertEquals("[1]", record.get(column++));
assertEquals("", record.get(column++));
assertEquals(backend.getName(), record.get(column++));
assertEquals("1", record.get(column++));
assertEquals("0", record.get(column++));
assertEquals("1", record.get(column++));
assertEquals("0", record.get(column++));
assertEquals("unknown", record.get(column++));
assertEquals(Type.Int32.getKey(), record.get(column++));
assertEquals("", record.get(column++));
}
}

// empty line
record = records.remove(0);
assertEquals(1, record.size());
assertEquals("", record.get(0));

// remove event header
record = records.remove(0);
assertEquals(eventFields.size() * channels.size(), record.size());
column = 0;
for (String channel : channels) {
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
}

for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals(eventFields.size() * channels.size(), record.size());

column = 0;
for (String channel : channels) {
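For reference, the request body produced by `mapper.writeValueAsString(request)` in the new test takes roughly this shape. The `configFields` and `eventFields` property names match the Readme; the channel names and range formatting shown here are illustrative, not taken from the test fixtures:

```json
{
  "channels": ["Channel_01", "Channel_02"],
  "range": {
    "startSeconds": "0.000000000",
    "endSeconds": "0.010000000"
  },
  "configFields": ["name", "pulseId", "globalSeconds", "globalMillis", "shape",
                   "description", "backend", "modulo", "offset", "keyspace",
                   "precision", "source", "type", "unit"],
  "eventFields": ["channel", "pulseId", "iocSeconds", "iocMillis",
                  "globalSeconds", "globalMillis", "shape", "eventCount", "value"],
  "response": {"format": "csv"}
}
```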
@ -599,19 +757,19 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocDate);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalDate);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocDate);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalDate);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -634,25 +792,25 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 2;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals(queryFields.size() * channels.size(), record.size());
assertEquals(eventFields.size() * channels.size(), record.size());

column = 0;
for (String channel : channels) {
@ -756,16 +914,16 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
request.setAggregation(new AggregationDescriptor().setNrOfBins(2).setAggregations(aggregations));
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -788,17 +946,17 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 2;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals((queryFields.size() + aggregations.size()) * channels.size(), record.size());
assertEquals((eventFields.size() + aggregations.size()) * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
for (Aggregation aggregation : aggregations) {
@ -808,10 +966,10 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals((queryFields.size() + aggregations.size()) * channels.size(), record.size());
assertEquals((eventFields.size() + aggregations.size()) * channels.size(), record.size());

column = 0;
for (String channel : channels) {
@ -859,23 +1017,23 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
.setExtrema(extrema));
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
eventFields.add(EventField.value);
request.setFields(eventFields);

Set<EventField> extremaFields = new LinkedHashSet<>();
for (Extrema extremum : extrema) {
for (EventField queryField : queryFields) {
if (extremum.getAccessor(queryField) != null) {
extremaFields.add(queryField);
for (EventField eventField : eventFields) {
if (extremum.getAccessor(eventField) != null) {
extremaFields.add(eventField);
}
}
}
@ -900,22 +1058,22 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
CSVParser csvParser = new CSVParser(reader, csvFormat);

// will not be included as it is an aggregation
queryFields.remove(EventField.value);
eventFields.remove(EventField.value);
try {
long pulse = 0;
int totalRows = 2;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals(
(queryFields.size() + aggregations.size() + (extremaFields.size() * extrema.size())) * channels.size(),
(eventFields.size() + aggregations.size() + (extremaFields.size() * extrema.size())) * channels.size(),
record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
for (Aggregation aggregation : aggregations) {
@ -924,20 +1082,20 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
record.get(column++));
}
for (Extrema extremum : extrema) {
for (EventField queryField : extremaFields) {
for (EventField eventField : extremaFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME
+ CSVResponseStreamWriter.FIELDNAME_EXTREMA
+ CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + extremum.name()
+ CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
+ CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals((queryFields.size() + aggregations.size() + (extremaFields.size() * extrema.size()))
assertEquals((eventFields.size() + aggregations.size() + (extremaFields.size() * extrema.size()))
* channels.size(), record.size());

column = 0;
@ -995,16 +1153,16 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
request.setAggregation(new AggregationDescriptor().setDurationPerBin(100).setAggregations(aggregations));
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -1021,28 +1179,28 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
String response = result.getResponse().getContentAsString();
System.out.println("Response: " + response);

checkDateRangeQueryBinSizeAggregate(channels, aggregations, queryFields, response);
checkDateRangeQueryBinSizeAggregate(channels, aggregations, eventFields, response);
}

private void checkDateRangeQueryBinSizeAggregate(final List<String> channels, final List<Aggregation> aggregations,
final Set<EventField> queryFields, final String response) throws Exception {
final Set<EventField> eventFields, final String response) throws Exception {
CSVFormat csvFormat = CSVFormat.EXCEL.withDelimiter(CSVResponseStreamWriter.DELIMITER_CVS);
StringReader reader = new StringReader(response);
CSVParser csvParser = new CSVParser(reader, csvFormat);

try {
long pulse = 0;
int totalRows = 10;
int totalEventRows = 10;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
CSVRecord record = records.remove(0);
assertEquals((queryFields.size() + aggregations.size()) * channels.size(), record.size());
assertEquals((eventFields.size() + aggregations.size()) * channels.size(), record.size());
int column = 0;
for (String channel : channels) {
for (EventField queryField : queryFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + queryField.name(),
for (EventField eventField : eventFields) {
assertEquals(channel + CSVResponseStreamWriter.DELIMITER_CHANNELNAME_FIELDNAME + eventField.name(),
record.get(column++));
}
for (Aggregation aggregation : aggregations) {
@ -1052,10 +1210,10 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
}
}

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
record = records.get(row);

assertEquals((queryFields.size() + aggregations.size()) * channels.size(), record.size());
assertEquals((eventFields.size() + aggregations.size()) * channels.size(), record.size());

column = 0;
for (String channel : channels) {
@ -1102,16 +1260,16 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
request.setAggregation(new AggregationDescriptor().setDurationPerBin(100));
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.pulseId);
queryFields.add(EventField.iocSeconds);
queryFields.add(EventField.iocMillis);
queryFields.add(EventField.globalSeconds);
queryFields.add(EventField.globalMillis);
queryFields.add(EventField.shape);
queryFields.add(EventField.eventCount);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.pulseId);
eventFields.add(EventField.iocSeconds);
eventFields.add(EventField.iocMillis);
eventFields.add(EventField.globalSeconds);
eventFields.add(EventField.globalMillis);
eventFields.add(EventField.shape);
eventFields.add(EventField.eventCount);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -1131,7 +1289,7 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
String response = result.getResponse().getContentAsString();
System.out.println("Response: " + response);

checkDateRangeQueryBinSizeAggregate(channels, aggregations, queryFields, response);
checkDateRangeQueryBinSizeAggregate(channels, aggregations, eventFields, response);
}

@Test
@ -1144,10 +1302,10 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {
channels);
request.setResponse(new CSVHTTPResponse());

LinkedHashSet<EventField> queryFields = new LinkedHashSet<>();
queryFields.add(EventField.channel);
queryFields.add(EventField.value);
request.setFields(queryFields);
LinkedHashSet<EventField> eventFields = new LinkedHashSet<>();
eventFields.add(EventField.channel);
eventFields.add(EventField.value);
request.setFields(eventFields);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@ -1170,17 +1328,17 @@ public class CSVQueryRestControllerTest extends AbstractDaqRestTest {

try {
long pulse = 0;
int totalRows = 2;
int totalEventRows = 2;

List<CSVRecord> records = csvParser.getRecords();
assertEquals(totalRows + 1, records.size());
assertEquals(totalEventRows + 1, records.size());
// remove header
records.remove(0);

for (int row = 0; row < totalRows; ++row) {
for (int row = 0; row < totalEventRows; ++row) {
CSVRecord record = records.get(row);

assertEquals((queryFields.size() + 1) * channels.size(), record.size());
assertEquals((eventFields.size() + 1) * channels.size(), record.size());

int column = 0;
for (String channel : channels) {
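The combined CSV layout exercised by `testTimeRangeQueryConfigFields` above (config header, config rows, one empty separator line, event header, event rows) can be split section-wise with plain string handling. A minimal sketch, not the project's implementation; the sample lines and the `;` delimiter are illustrative:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CsvSectionSplitter {

    // Splits a combined CSV response into two sections: everything before
    // the first empty line (config header + rows) and everything after it
    // (event header + rows).
    public static List<List<String>> splitSections(String csv) {
        List<String> config = new ArrayList<>();
        List<String> event = new ArrayList<>();
        boolean inEventSection = false;
        for (String line : csv.split("\n", -1)) {
            if (line.isEmpty()) {
                inEventSection = true; // empty line separates the sections
                continue;
            }
            (inEventSection ? event : config).add(line);
        }
        return Arrays.asList(config, event);
    }

    public static void main(String[] args) {
        // Illustrative response: config header, one config row, separator,
        // event header, two event rows.
        String response = String.join("\n",
                "ChannelA;name",
                "ChannelA;1",
                "",
                "ChannelA;pulseId",
                "ChannelA;0",
                "ChannelA;1");
        List<List<String>> sections = splitSections(response);
        System.out.println(sections.get(0).size() + " config lines, "
                + sections.get(1).size() + " event lines");
    }
}
```

This mirrors the test's bookkeeping (`totalConfigRows + 1` config lines, then the empty line, then `totalEventRows + 1` event lines) without a CSV library.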
@ -27,6 +27,7 @@ import org.springframework.test.web.servlet.result.MockMvcResultMatchers;
|
||||
import com.jayway.jsonpath.Configuration;
|
||||
import com.jayway.jsonpath.JsonPath;
|
||||
|
||||
import ch.psi.bsread.message.Type;
|
||||
import ch.psi.daq.common.ordering.Ordering;
|
||||
import ch.psi.daq.common.time.TimeUtils;
|
||||
import ch.psi.daq.domain.backend.Backend;
|
||||
@ -40,6 +41,7 @@ import ch.psi.daq.domain.query.mapping.Mapping;
|
||||
import ch.psi.daq.domain.query.operation.Aggregation;
|
||||
import ch.psi.daq.domain.query.operation.AggregationDescriptor;
|
||||
import ch.psi.daq.domain.query.operation.AggregationType;
|
||||
import ch.psi.daq.domain.query.operation.ConfigField;
|
||||
import ch.psi.daq.domain.query.operation.Extrema;
|
||||
import ch.psi.daq.domain.query.operation.EventField;
|
||||
import ch.psi.daq.domain.query.transform.ValueTransformationSequence;
|
||||
@ -95,11 +97,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
|
||||
101),
|
||||
TEST_CHANNEL_NAMES);
|
||||
request.setMapping(new Mapping());
|
||||
request.addField(EventField.pulseId);
|
||||
request.addField(EventField.globalSeconds);
|
||||
request.addField(EventField.globalMillis);
|
||||
request.addField(EventField.iocSeconds);
|
||||
request.addField(EventField.iocMillis);
|
||||
request.addEventField(EventField.pulseId);
|
||||
request.addEventField(EventField.globalSeconds);
|
||||
request.addEventField(EventField.globalMillis);
|
||||
request.addEventField(EventField.iocSeconds);
|
||||
request.addEventField(EventField.iocMillis);
|
||||
|
||||
String content = mapper.writeValueAsString(request);
|
||||
System.out.println(content);
|
||||
@ -174,11 +176,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
|
||||
101),
|
||||
TEST_CHANNEL_NAMES);
|
||||
request.setMapping(new Mapping(IncompleteStrategy.FILL_NULL));
|
||||
request.addField(EventField.pulseId);
|
||||
request.addField(EventField.globalSeconds);
|
||||
request.addField(EventField.globalMillis);
|
||||
request.addField(EventField.iocSeconds);
|
||||
request.addField(EventField.iocMillis);
|
||||
request.addEventField(EventField.pulseId);
|
||||
request.addEventField(EventField.globalSeconds);
|
||||
request.addEventField(EventField.globalMillis);
|
||||
request.addEventField(EventField.iocSeconds);
|
||||
request.addEventField(EventField.iocMillis);
|
||||
|
||||
String content = mapper.writeValueAsString(request);
|
||||
System.out.println(content);
|
||||
@ -255,12 +257,12 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
|
||||
101),
|
||||
TEST_CHANNEL_NAMES);
|
||||
request.setMapping(new Mapping());
|
||||
request.addField(EventField.pulseId);
|
||||
request.addField(EventField.globalSeconds);
|
||||
request.addField(EventField.globalMillis);
|
||||
request.addField(EventField.iocSeconds);
|
||||
request.addField(EventField.iocMillis);
|
||||
request.addField(EventField.value);
|
||||
request.addEventField(EventField.pulseId);
|
||||
request.addEventField(EventField.globalSeconds);
|
||||
request.addEventField(EventField.globalMillis);
|
||||
request.addEventField(EventField.iocSeconds);
|
||||
request.addEventField(EventField.iocMillis);
|
||||
request.addEventField(EventField.value);
|
||||
|
||||
AggregationDescriptor aggregation = new AggregationDescriptor(AggregationType.value);
|
||||
aggregation.setNrOfBins(1);
|
||||
@@ -320,11 +322,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
101),
TEST_CHANNEL_02, TEST_CHANNEL_01);
request.setMapping(new Mapping());
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -397,11 +399,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
101),
TEST_CHANNEL_01);
request.setMapping(new Mapping());
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -455,8 +457,8 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
new AggregationDescriptor().setNrOfBins(2),
TEST_CHANNEL_NAMES);
request.setMapping(new Mapping());
request.addField(EventField.pulseId);
request.addField(EventField.eventCount);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.eventCount);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -665,6 +667,139 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem

.andDo(MockMvcResultHandlers.print())
.andExpect(MockMvcResultMatchers.status().isOk())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.data").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0]").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][0]").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][0].channel").value(TEST_CHANNEL_01))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][0].backend").value(backend.getName()))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))

.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][1]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][1]").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][1].channel").value(TEST_CHANNEL_02))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][1].backend").value(backend.getName()))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][1].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0][1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))

.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][0]").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][0].channel").value(TEST_CHANNEL_01))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][0].backend").value(backend.getName()))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][0].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)))

.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][1]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][1]").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][1].channel").value(TEST_CHANNEL_02))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][1].backend").value(backend.getName()))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$.data[1][1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)));
}

@Test
public void testTimeRangeQuery_01_ConfigFields() throws Exception {
DAQQuery request = new DAQQuery(
new RequestRangeTime(
TimeUtils.getTimeFromMillis(2000, 0),
TimeUtils.getTimeFromMillis(2010, 0)),
TEST_CHANNEL_NAMES);
request.addConfigField(ConfigField.pulseId);
request.addConfigField(ConfigField.globalSeconds);
request.addConfigField(ConfigField.globalMillis);
request.addConfigField(ConfigField.shape);
request.addConfigField(ConfigField.description);
request.addConfigField(ConfigField.backend);
request.addConfigField(ConfigField.modulo);
request.addConfigField(ConfigField.offset);
request.addConfigField(ConfigField.keyspace);
request.addConfigField(ConfigField.precision);
request.addConfigField(ConfigField.source);
request.addConfigField(ConfigField.type);
request.addConfigField(ConfigField.unit);
request.setMapping(new Mapping());

String content = mapper.writeValueAsString(request);

this.mockMvc.perform(MockMvcRequestBuilders
.post(DomainConfig.PATH_QUERY)
.contentType(MediaType.APPLICATION_JSON)
.content(content))

.andDo(MockMvcResultHandlers.print())
.andExpect(MockMvcResultMatchers.status().isOk())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].channel").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].channel.name").value(TEST_CHANNEL_01))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].channel.backend").value(backend.getName()))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].globalMillis").value(2000))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].keyspace").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[0].unit").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].globalMillis").value(2010))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].keyspace").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[0].configs[1].unit").doesNotExist())

.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].channel").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].channel.name").value(TEST_CHANNEL_02))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].channel.backend").value(backend.getName()))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].globalMillis").value(2000))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].keyspace").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[0].unit").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].globalMillis").value(2010))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].keyspace").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$.meta[1].configs[1].unit").doesNotExist())

.andExpect(MockMvcResultMatchers.jsonPath("$.data").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$.data[0]").isArray())
@@ -2313,11 +2448,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
100,
101),
channelName);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.setMapping(new Mapping());
request.addValueTransformation(
new ValueTransformationSequence(
@@ -2360,11 +2495,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
100,
101),
channelName);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.setMapping(new Mapping());
request.addValueTransformation(
new ValueTransformationSequence(
@@ -2404,11 +2539,11 @@ public class JsonQueryRestControllerTableTest extends AbstractDaqRestTest implem
100,
101),
channelName, channelName2);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.setMapping(new Mapping());
request.addValueTransformation(
new ValueTransformationSequence(

@@ -38,6 +38,7 @@ import ch.psi.daq.domain.query.operation.Aggregation;
import ch.psi.daq.domain.query.operation.AggregationDescriptor;
import ch.psi.daq.domain.query.operation.AggregationType;
import ch.psi.daq.domain.query.operation.Compression;
import ch.psi.daq.domain.query.operation.ConfigField;
import ch.psi.daq.domain.query.operation.Extrema;
import ch.psi.daq.domain.query.operation.EventField;
import ch.psi.daq.domain.query.transform.ValueTransformationSequence;
@@ -74,7 +75,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
private Backend backend;
private Backend backend2;
private Backend backend3;

private ObjectMapper objectMapper = new ObjectMapper();

@Override
@@ -126,7 +127,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
.andExpect(MockMvcResultMatchers.jsonPath("$[2].channels[1]").value("BoolWaveform"));

}

@Test
public void testChannelsHash() throws Exception {
MvcResult result = this.mockMvc
@@ -140,10 +141,10 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements

String response = result.getResponse().getContentAsString();
System.out.println("Response: " + response);

LongHash hash1 =
objectMapper.readValue(response, LongHash.class);

result = this.mockMvc
.perform(
MockMvcRequestBuilders
@@ -155,10 +156,10 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements

response = result.getResponse().getContentAsString();
System.out.println("Response: " + response);

LongHash hash2 =
objectMapper.readValue(response, LongHash.class);

assertEquals(hash1.getHash(), hash2.getHash());
}

@@ -334,11 +335,11 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
100,
101),
TEST_CHANNEL_NAMES);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -395,8 +396,8 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
199),
new AggregationDescriptor().setNrOfBins(2),
TEST_CHANNEL_NAMES);
request.addField(EventField.pulseId);
request.addField(EventField.eventCount);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.eventCount);

String content = mapper.writeValueAsString(request);
System.out.println(content);
@@ -546,6 +547,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
.andExpect(MockMvcResultMatchers.jsonPath("$[0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].channel").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].channel.name").value(TEST_CHANNEL_01))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data[0].globalSeconds").value(
@@ -556,6 +558,114 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
.andExpect(MockMvcResultMatchers.jsonPath("$[1]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].channel").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].channel.name").value(TEST_CHANNEL_02))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)));
}

@Test
public void testTimeRangeQuery_01_ConfigFields() throws Exception {
DAQQuery request = new DAQQuery(
new RequestRangeTime(
TimeUtils.getTimeFromMillis(2000, 0),
TimeUtils.getTimeFromMillis(2010, 0)),
TEST_CHANNEL_NAMES);
request.addConfigField(ConfigField.pulseId);
request.addConfigField(ConfigField.globalSeconds);
request.addConfigField(ConfigField.globalMillis);
request.addConfigField(ConfigField.shape);
request.addConfigField(ConfigField.description);
request.addConfigField(ConfigField.backend);
request.addConfigField(ConfigField.modulo);
request.addConfigField(ConfigField.offset);
request.addConfigField(ConfigField.keyspace);
request.addConfigField(ConfigField.precision);
request.addConfigField(ConfigField.source);
request.addConfigField(ConfigField.type);
request.addConfigField(ConfigField.unit);

String content = mapper.writeValueAsString(request);

this.mockMvc.perform(MockMvcRequestBuilders
.post(DomainConfig.PATH_QUERY)
.contentType(MediaType.APPLICATION_JSON)
.content(content))

.andDo(MockMvcResultHandlers.print())
.andExpect(MockMvcResultMatchers.status().isOk())
.andExpect(MockMvcResultMatchers.jsonPath("$").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[0]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].channel").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].channel.name").value(TEST_CHANNEL_01))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].globalMillis").value(2000))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].keyspace").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[0].unit").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].globalMillis").value(2010))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].keyspace").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].configs[1].unit").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data[0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data[1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$[0].data[1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1]").exists())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].channel").isMap())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].channel.name").value(TEST_CHANNEL_02))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].globalMillis").value(2000))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].keyspace").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[0].unit").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].pulseId").value(201))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].globalSeconds").value(
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].globalMillis").value(2010))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].shape").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].description").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].modulo").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].offset").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].keyspace").value(1))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].precision").value(0))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].source").value("unknown"))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].type").value(Type.Int32.getKey()))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].configs[1].unit").doesNotExist())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data").isArray())
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[0].pulseId").value(200))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[0].globalSeconds").value(
@@ -654,7 +764,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[1].globalSeconds").value(
TestTimeUtils.getTimeStr(1, 10000000)));
}

@Test
public void testOpenTimeRangeQueryStart_01() throws Exception {
DAQQuery request = new DAQQuery(
@@ -696,7 +806,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[2]").doesNotExist());
}

@Test
public void testOpenTimeRangeQueryStart_01_Exclusive() throws Exception {
DAQQuery request = new DAQQuery(
@@ -739,7 +849,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
TestTimeUtils.getTimeStr(2, 20000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[2]").doesNotExist());
}

@Test
public void testOpenTimeRangeQueryStartDate_01_Exclusive() throws Exception {
DAQQuery request = new DAQQuery(
@@ -919,7 +1029,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
TestTimeUtils.getTimeStr(2, 0)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[2]").doesNotExist());
}

@Test
public void testOpenTimeRangeQueryEndDate_01_Exclusive() throws Exception {
DAQQuery request = new DAQQuery(
@@ -988,7 +1098,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
assertTrue(true);
}
}

@Test
public void testOpenTimeRangeQueryEnd_03() throws Exception {
DAQQuery request = new DAQQuery(
@@ -1014,7 +1124,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
assertTrue(true);
}
}

@Test
public void testOpenPulseRangeQueryStart_01() throws Exception {
DAQQuery request = new DAQQuery(
@@ -1056,7 +1166,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
TestTimeUtils.getTimeStr(2, 10000000)))
.andExpect(MockMvcResultMatchers.jsonPath("$[1].data[2]").doesNotExist());
}

@Test
public void testOpenPulseRangeQueryStart_01_Exclusive() throws Exception {
DAQQuery request = new DAQQuery(
@@ -1261,7 +1371,7 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
assertTrue(true);
}
}

@Test
public void testOpenPulseRangeQueryEnd_03() throws Exception {
DAQQuery request = new DAQQuery(
@@ -1888,11 +1998,11 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
100,
101),
channelName);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.addValueTransformation(
new ValueTransformationSequence(
ValueTransformationSequence.ALL_CHANNELS,
@@ -1934,11 +2044,11 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
100,
101),
channelName);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.addValueTransformation(
new ValueTransformationSequence(
channelName,
@@ -1977,11 +2087,11 @@ public class JsonQueryRestControllerTest extends AbstractDaqRestTest implements
100,
101),
channelName, channelName2);
request.addField(EventField.pulseId);
request.addField(EventField.globalSeconds);
request.addField(EventField.globalMillis);
request.addField(EventField.iocSeconds);
request.addField(EventField.iocMillis);
request.addEventField(EventField.pulseId);
request.addEventField(EventField.globalSeconds);
request.addEventField(EventField.globalMillis);
request.addEventField(EventField.iocSeconds);
request.addEventField(EventField.iocMillis);
request.addValueTransformation(
new ValueTransformationSequence(
null,