MetricTimeSeries.Builder builder = new MetricTimeSeries.Builder(metricName, entry.getValue());
List<Map.Entry<Long, Double>> points = records.stream()
        .filter(record -> record.hasField(entry.getKey()) && record.getField(entry.getKey()).isSet())
        // (the mapping step from record to (timestamp, value) entry is elided in the original snippet)
        .collect(Collectors.toList());
points.forEach(kv -> builder.point(kv.getKey(), kv.getValue()));
builder.start(firstTS)
        .end(lastTS)
        .attributes(attributes)
        .attribute("id", batchUID)
        .build();
builder.points(new LongList(timestamps, lastPointIndex), new DoubleList(values, lastPointIndex));
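The bulk `points(...)` call above hands the builder a backing array plus a logical length (`lastPointIndex`), so only the filled prefix of a possibly oversized buffer is treated as points. A minimal sketch of that idea, using a hypothetical `PrefixLongList` rather than the codebase's actual `LongList`:

```java
import java.util.Arrays;

// Hypothetical stand-in for an array-backed list that exposes only the
// first `size` elements of a (possibly larger) backing array.
public class PrefixLongList {
    private final long[] backing;
    private final int size;

    public PrefixLongList(long[] backing, int size) {
        if (size < 0 || size > backing.length) {
            throw new IllegalArgumentException("size out of range: " + size);
        }
        this.backing = backing;
        this.size = size;
    }

    public int size() {
        return size;
    }

    public long get(int i) {
        if (i >= size) {
            throw new IndexOutOfBoundsException("index " + i);
        }
        return backing[i];
    }

    // Copies out only the valid prefix, ignoring any unused tail capacity.
    public long[] toArray() {
        return Arrays.copyOf(backing, size);
    }
}
```

This lets a writer grow the arrays with amortized doubling and still pass them to the builder without trimming first.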
return new MetricTimeSeries.Builder(name, type) .points(timestamps, values) .attributes(attributes) .build();
MetricTimeSeries.Builder metricBuilder = metrics.get(metric);
if (metricBuilder == null) {
    metricBuilder = new MetricTimeSeries.Builder(metricName, METRIC_TYPE);
    for (Map.Entry<String, String> tagEntry : tags.entrySet()) {
        metricBuilder.attribute(tagEntry.getKey(), tagEntry.getValue());
    }
    metrics.put(metric, metricBuilder);
}
metricBuilder.point(timestamp.toEpochMilli(), value);
builder.point(times.get(i).longValue(), values.get(i));
String type = doc.getFieldValue(Schema.TYPE).toString();
MetricTimeSeries.Builder ts = new MetricTimeSeries.Builder(name, type);
if (field.getValue() instanceof ByteBuffer) {
    ts.attribute(field.getKey(), ((ByteBuffer) field.getValue()).array());
} else {
    ts.attribute(field.getKey(), field.getValue());
}
IOUtils.closeQuietly(decompressed);
return ts.build();
@Override
public Iterable<MetricTimeSeries> parse(InputStream stream) throws FormatParseException {
    Map<String, MetricTimeSeries.Builder> metrics = new HashMap<>();

    BufferedReader reader = new BufferedReader(new InputStreamReader(stream, UTF_8));
    String line;
    try {
        while ((line = reader.readLine()) != null) {
            // Format is: <metric path> <metric value> <metric timestamp>
            String[] parts = StringUtils.split(line, ' ');
            if (parts.length != 3) {
                throw new FormatParseException("Expected 3 parts, found " + parts.length + " in line '" + line + "'");
            }

            String metricName = getMetricName(parts);
            double value = getMetricValue(parts);
            Instant timestamp = getMetricTimestamp(parts);

            // If the metric is already known, add a point. Otherwise create the metric and add the point.
            MetricTimeSeries.Builder metricBuilder = metrics.get(metricName);
            if (metricBuilder == null) {
                metricBuilder = new MetricTimeSeries.Builder(metricName, METRIC_TYPE);
                metrics.put(metricName, metricBuilder);
            }
            metricBuilder.point(timestamp.toEpochMilli(), value);
        }
    } catch (IOException e) {
        throw new FormatParseException("IO exception while parsing Graphite format", e);
    }

    return metrics.values().stream().map(MetricTimeSeries.Builder::build).collect(Collectors.toList());
}
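The Graphite parser above splits each plaintext-protocol line into three whitespace-separated parts. A self-contained sketch of that per-line step using only the JDK; the record name `GraphitePoint` is illustrative, not part of the codebase, and the real code feeds a `MetricTimeSeries.Builder` instead of returning a value object:

```java
import java.time.Instant;

public class GraphiteLineParser {

    // Illustrative value type for one parsed line.
    public record GraphitePoint(String metric, double value, Instant timestamp) {}

    // Format: "<metric path> <metric value> <metric timestamp>",
    // where the timestamp is in epoch seconds (Graphite plaintext convention).
    public static GraphitePoint parseLine(String line) {
        String[] parts = line.trim().split("\\s+");
        if (parts.length != 3) {
            throw new IllegalArgumentException(
                    "Expected 3 parts, found " + parts.length + " in line '" + line + "'");
        }
        return new GraphitePoint(
                parts[0],
                Double.parseDouble(parts[1]),
                Instant.ofEpochSecond(Long.parseLong(parts[2])));
    }
}
```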
@Override
public MetricTimeSeries from(BinaryTimeSeries binaryTimeSeries, long queryStart, long queryEnd) {
    LOGGER.debug("Converting {} to MetricTimeSeries starting at {} and ending at {}", binaryTimeSeries, queryStart, queryEnd);

    //get the metric
    String metric = binaryTimeSeries.get(MetricTSSchema.METRIC).toString();

    //build a minimal time series
    MetricTimeSeries.Builder builder = new MetricTimeSeries.Builder(metric);

    //add all user-defined attributes
    binaryTimeSeries.getFields().forEach((field, value) -> {
        if (MetricTSSchema.isUserDefined(field)) {
            builder.attribute(field, value);
        }
    });

    //Default serialization is protocol buffers.
    if (binaryTimeSeries.getPoints().length > 0) {
        fromProtocolBuffers(binaryTimeSeries, queryStart, queryEnd, builder);
    } else if (binaryTimeSeries.getFields().containsKey(DATA_AS_JSON_FIELD)) {
        //fall back to json
        fromJson(binaryTimeSeries, queryStart, queryEnd, builder);
    } else {
        //we have no data: just set the start and end
        builder.start(binaryTimeSeries.getStart());
        builder.end(binaryTimeSeries.getEnd());
    }

    return builder.build();
}
/**
 * Adds a point to the given metrics map. If the metric doesn't exist in the map, it will be created.
 *
 * @param metrics    Metric map.
 * @param metricName Name of the metric.
 * @param timestamp  Timestamp of the point.
 * @param value      Value of the point.
 * @param tags       Tags for the metric. These are only used if the metric doesn't already exist in the metrics map.
 */
private void addPoint(Map<Metric, MetricTimeSeries.Builder> metrics, String metricName, Instant timestamp, double value, Map<String, String> tags) {
    Metric metric = new Metric(metricName, tags);
    MetricTimeSeries.Builder metricBuilder = metrics.get(metric);
    if (metricBuilder == null) {
        metricBuilder = new MetricTimeSeries.Builder(metricName, METRIC_TYPE);
        for (Map.Entry<String, String> tagEntry : tags.entrySet()) {
            metricBuilder.attribute(tagEntry.getKey(), tagEntry.getValue());
        }
        metrics.put(metric, metricBuilder);
    }
    metricBuilder.point(timestamp.toEpochMilli(), value);
}
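The get / null-check / put sequence in `addPoint` (repeated in the KairosDB and OpenTSDB parsers) is the classic map-or-create idiom; with `Map.computeIfAbsent` it collapses into one call. A sketch against plain JDK collections, using a hypothetical `SeriesBuilder` in place of `MetricTimeSeries.Builder`:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PointAggregator {

    // Hypothetical minimal stand-in for MetricTimeSeries.Builder.
    public static class SeriesBuilder {
        public final List<Long> timestamps = new ArrayList<>();
        public final List<Double> values = new ArrayList<>();

        public void point(long timestamp, double value) {
            timestamps.add(timestamp);
            values.add(value);
        }
    }

    private final Map<String, SeriesBuilder> metrics = new HashMap<>();

    // computeIfAbsent creates the builder on first sight of a metric,
    // replacing the explicit get / null-check / put dance.
    public void addPoint(String metricName, long timestampMillis, double value) {
        metrics.computeIfAbsent(metricName, name -> new SeriesBuilder())
               .point(timestampMillis, value);
    }

    public Map<String, SeriesBuilder> metrics() {
        return metrics;
    }
}
```

The real `addPoint` keys the map by a `(name, tags)` pair rather than by name alone; the same idiom applies as long as the key type implements `equals`/`hashCode`.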
@Override
public Iterable<MetricTimeSeries> parse(InputStream stream) throws FormatParseException {
    Map<Metric, MetricTimeSeries.Builder> metrics = new HashMap<>();

    List<KairosDbMetric> kairosMetrics = parseJson(stream);
    for (KairosDbMetric kairosMetric : kairosMetrics) {
        // If the metric is already known, add a point. Otherwise create the metric and add the point.
        Metric metric = new Metric(kairosMetric.getName(), kairosMetric.getTags());
        MetricTimeSeries.Builder metricBuilder = metrics.get(metric);
        if (metricBuilder == null) {
            metricBuilder = new MetricTimeSeries.Builder(kairosMetric.getName(), METRIC_TYPE);
            for (Map.Entry<String, String> tagEntry : kairosMetric.getTags().entrySet()) {
                metricBuilder.attribute(tagEntry.getKey(), tagEntry.getValue());
            }
            metrics.put(metric, metricBuilder);
        }
        Instant timestamp = convertTimestamp(kairosMetric.getTimestamp());
        metricBuilder.point(timestamp.toEpochMilli(), kairosMetric.getValue());
    }

    return metrics.values().stream().map(MetricTimeSeries.Builder::build).collect(Collectors.toList());
}
@Override
public Iterable<MetricTimeSeries> parse(InputStream stream) throws FormatParseException {
    Map<Metric, MetricTimeSeries.Builder> metrics = new HashMap<>();

    List<TsdbMetric> tsdbMetrics = parseJson(stream);
    for (TsdbMetric tsdbMetric : tsdbMetrics) {
        // If the metric is already known, add a point. Otherwise create the metric and add the point.
        Metric metric = new Metric(tsdbMetric.getMetric(), tsdbMetric.getTags());
        MetricTimeSeries.Builder metricBuilder = metrics.get(metric);
        if (metricBuilder == null) {
            metricBuilder = new MetricTimeSeries.Builder(tsdbMetric.getMetric(), METRIC_TYPE);
            // Assuming tag entry is always a string, which it should be
            for (Map.Entry<String, String> tagEntry : tsdbMetric.getTags().entrySet()) {
                metricBuilder.attribute(tagEntry.getKey().concat("_s"), tagEntry.getValue());
            }
            metrics.put(metric, metricBuilder);
        }
        Instant timestamp = convertTimestamp(tsdbMetric.getTimestamp());
        metricBuilder.point(timestamp.toEpochMilli(), tsdbMetric.getValue());
    }

    return metrics.values().stream().map(MetricTimeSeries.Builder::build).collect(Collectors.toList());
}
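The `convertTimestamp` call used by both parsers above is not shown in these snippets. A common convention for OpenTSDB-style input is to accept either epoch seconds or epoch milliseconds and disambiguate by magnitude; the sketch below implements that heuristic as an assumption, and it is not necessarily what the codebase's `convertTimestamp` does:

```java
import java.time.Instant;

public class TimestampConverter {

    // Assumption: values of 13+ digits are treated as epoch milliseconds,
    // shorter values as epoch seconds. This mirrors a common OpenTSDB
    // convention; the real convertTimestamp may differ.
    public static Instant convertTimestamp(long raw) {
        return raw >= 1_000_000_000_000L
                ? Instant.ofEpochMilli(raw)
                : Instant.ofEpochSecond(raw);
    }
}
```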
private MetricTimeSeries map(TimeSeries<Long, Double> timeSeries) {
    MetricTimeSeries.Builder builder = new MetricTimeSeries.Builder(timeSeries.getAttribute("metric").toString());

    //add points
    Iterator<Pair<Long, Double>> it = timeSeries.iterator();
    //ignore the first element
    if (it.hasNext()) {
        it.next();
    }
    while (it.hasNext()) {
        Pair<Long, Double> pair = it.next();
        builder.point(pair.getFirst(), pair.getSecond());
    }

    //add attributes
    timeSeries.getAttributes().forEachRemaining(attribute -> builder.attribute(attribute.getKey(), attribute.getValue()));

    return builder.build();
}
/**
 * Copies a given metric time series.
 *
 * @param ts the time series
 * @return builder preconfigured with values from the given time series
 */
public MetricTimeSeries.Builder copy(MetricTimeSeries ts) {
    MetricTimeSeries.Builder result = new MetricTimeSeries.Builder(ts.getName(), ts.getType());
    result.start(ts.getStart());
    result.end(ts.getEnd());
    result.points(ts.getTimestamps(), ts.getValues());
    result.attributes(ts.attributes());
    return result;
}
/**
 * Shifts every timestamp in the series by the given delta.
 *
 * @param delta the offset by which the whole timestamp list is shifted
 * @return a new instance with shifted timestamps
 */
public MetricTimeSeries shift(final long delta) {
    return new MetricTimeSeries.Builder(metric + " shifted by " + delta).points(timestamps.shift(delta), values).build();
}
/**
 * Scales every value in the series by the given factor.
 *
 * @param scale the factor applied to the values of this series
 * @return a new instance with scaled values
 */
public MetricTimeSeries scale(final double scale) {
    return new MetricTimeSeries.Builder(metric + " scaled by " + scale).points(timestamps, values.scale(scale)).build();
}
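Both `shift` and `scale` above build a brand-new series instead of mutating the receiver. A self-contained sketch of the underlying array operations (the class and method names here are illustrative, not taken from the codebase):

```java
import java.util.Arrays;

public class SeriesMath {

    // Shifts every timestamp by a fixed delta; the input array is left untouched.
    public static long[] shift(long[] timestamps, long delta) {
        long[] shifted = Arrays.copyOf(timestamps, timestamps.length);
        for (int i = 0; i < shifted.length; i++) {
            shifted[i] += delta;
        }
        return shifted;
    }

    // Multiplies every value by a fixed factor; again returns a fresh array.
    public static double[] scale(double[] values, double factor) {
        double[] scaled = Arrays.copyOf(values, values.length);
        for (int i = 0; i < scaled.length; i++) {
            scaled[i] *= factor;
        }
        return scaled;
    }
}
```

Returning fresh arrays keeps the original series immutable, which is why the methods above can safely reuse `timestamps` or `values` from the receiver in the new builder.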