When our customers analyse data via the data explorer, they frequently complain about the large jump in aggregation intervals between 1 minute and 1 hour. Something like 5 or 10 minutes would be a good intermediate average for analysis.
Thanks for the feedback. In the short term, we do not plan to enhance the aggregation functionality of our platform.
Currently, we recommend creating aggregated measurements with a microservice for batch processing (e.g. every hour) or with an Apama streaming analytics rule.
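As a rough illustration of the batch-processing workaround, here is a minimal sketch of a script (run e.g. hourly by a scheduler or microservice) that reads the last hour of raw measurements through the Cumulocity REST API and writes back one averaged measurement. The tenant URL, credentials, device ID, fragment/series names and measurement type are placeholders, not part of the product; this is an assumption-laden sketch, not an official implementation.

```python
# Sketch: aggregate the last hour of raw measurements into one averaged
# measurement via the Cumulocity REST API. All identifiers below are placeholders.
from datetime import datetime, timedelta, timezone

import requests

BASE_URL = "https://<tenant>.cumulocity.com"          # placeholder tenant URL
AUTH = ("<tenant>/<user>", "<password>")              # placeholder credentials
DEVICE_ID = "12345"                                   # placeholder device ID
FRAGMENT, SERIES = "c8y_TemperatureMeasurement", "T"  # placeholder fragment/series


def aggregate_last_hour():
    now = datetime.now(timezone.utc)
    date_from = (now - timedelta(hours=1)).isoformat()
    date_to = now.isoformat()

    # Read the raw measurements of the last hour for one device and one fragment.
    resp = requests.get(
        f"{BASE_URL}/measurement/measurements",
        params={
            "source": DEVICE_ID,
            "dateFrom": date_from,
            "dateTo": date_to,
            "valueFragmentType": FRAGMENT,
            "pageSize": 2000,  # sketch ignores paging beyond the first page
        },
        auth=AUTH,
    )
    resp.raise_for_status()
    values = [
        m[FRAGMENT][SERIES]["value"]
        for m in resp.json().get("measurements", [])
        if FRAGMENT in m and SERIES in m[FRAGMENT]
    ]
    if not values:
        return

    # Write back a single aggregated (averaged) measurement.
    aggregated = {
        "source": {"id": DEVICE_ID},
        "type": "c8y_HourlyAverage",  # placeholder measurement type
        "time": date_to,
        FRAGMENT: {SERIES: {"value": sum(values) / len(values), "unit": "C"}},
    }
    requests.post(
        f"{BASE_URL}/measurement/measurements",
        json=aggregated,
        auth=AUTH,
        headers={"Content-Type": "application/vnd.com.nsn.cumulocity.measurement+json"},
    ).raise_for_status()


if __name__ == "__main__":
    aggregate_last_hour()
```

The same idea works for other intervals (5 or 10 minutes) by adjusting the query window and schedule; a production version would also page through results and handle multiple devices and series.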
Hello everybody, I was in a session with the customer, and for aggregation they need a custom interval, because one hour is not enough and one day is too much. If this were available, they could build very specific dashboards.
Thanks for the feedback. In the short term, we do not plan to enhance the aggregation functionality of our platform.
Currently, we recommend creating aggregated measurements with a microservice for batch processing (e.g. every hour) or with an Apama streaming analytics rule.