private void updateGauge(Iterable<MetricResult<GaugeResult>> gauges) {
  for (MetricResult<GaugeResult> metricResult : gauges) {
    String flinkMetricName = getFlinkMetricNameString(metricResult);
    GaugeResult update = metricResult.getAttempted();
    // update flink metric
    FlinkGauge gauge = flinkGaugeCache.get(flinkMetricName);
    if (gauge == null) {
      gauge = runtimeContext.getMetricGroup().gauge(flinkMetricName, new FlinkGauge(update));
      flinkGaugeCache.put(flinkMetricName, gauge);
    } else {
      gauge.update(update);
    }
  }
}
@Override
protected boolean matchesSafely(MetricResult<T> item) {
  final T metricValue = isCommitted ? item.getCommitted() : item.getAttempted();
  return Objects.equals(namespace, item.getName().getNamespace())
      && Objects.equals(name, item.getName().getName())
      && item.getStep().contains(step)
      && metricResultsEqual(value, metricValue);
}
@Override
protected void describeMismatchSafely(MetricResult<T> item, Description mismatchDescription) {
  mismatchDescription.appendText("MetricResult{");
  final T metricValue = isCommitted ? item.getCommitted() : item.getAttempted();
  describeMetricsResultMembersMismatch(item, mismatchDescription, namespace, name, step);
  if (!Objects.equals(value, metricValue)) {
    mismatchDescription
        .appendText(String.format("%s: ", metricState))
        .appendValue(value)
        .appendText(" != ")
        .appendValue(metricValue);
  }
  mismatchDescription.appendText("}");
}
};
/**
 * Create a feature row from Metrics.
 * The granularity is unrelated to the metrics themselves, and simply indicates the
 * granularity at which to store and overwrite the metrics downstream.
 *
 * @param counter metric result whose attempted value is exported
 * @param jobName name of the job the metric belongs to, used as the job id
 * @param granularity storage granularity for the exported feature row
 * @return a FeatureRow carrying the metric value and its identifying features
 */
public static FeatureRow makeFeatureRow(
    MetricResult<Long> counter, String jobName, Granularity.Enum granularity) {
  String jobId = jobName;
  String namespace = counter.getName().getNamespace();
  String step = counter.getStep();
  String name = counter.getName().getName();
  Long attempted = counter.getAttempted();
  String entityId = String.join(":", new String[] {jobId, namespace, step, name});
  return FeatureRow.newBuilder()
      .setEntityName(METRICS_ENTITY_NAME)
      .setEntityKey(entityId)
      .setGranularity(granularity)
      .setEventTimestamp(DateUtil.toTimestamp(DateTime.now()))
      .addFeatures(Features.of(METRICS_FEATURE_JOB_ID, Values.ofString(jobId)))
      .addFeatures(Features.of(METRICS_FEATURE_NAMESPACE, Values.ofString(namespace)))
      .addFeatures(Features.of(METRICS_FEATURE_STEP, Values.ofString(step)))
      .addFeatures(Features.of(METRICS_FEATURE_NAME, Values.ofString(name)))
      .addFeatures(Features.of(METRICS_FEATURE_ATTEMPTED, Values.ofInt64(attempted)))
      .build();
}
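The entity key above is a colon-delimited join of job id, namespace, step, and name. A minimal self-contained sketch of that scheme (the field values here are hypothetical examples, not taken from a real pipeline):

```java
public class EntityKeyExample {
    // Same scheme as makeFeatureRow above: colon-delimited key components.
    static String makeEntityKey(String jobId, String namespace, String step, String name) {
        return String.join(":", jobId, namespace, step, name);
    }

    public static void main(String[] args) {
        // Prints "jobA:ns:ParDo(Anonymous):elements"
        System.out.println(makeEntityKey("jobA", "ns", "ParDo(Anonymous)", "elements"));
    }
}
```

Note that the step name may itself contain colons, so the key is not unambiguously splittable; it is only used as an opaque identifier downstream.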
public static void queryMetrics(PipelineResult result) {
  MetricQueryResults metrics =
      result.metrics().queryMetrics(
          MetricsFilter.builder()
              .addNameFilter(MetricNameFilter.inNamespace("PollingExample"))
              .build());
  Iterable<MetricResult<Long>> counters = metrics.getCounters();
  for (MetricResult<Long> counter : counters) {
    System.out.println(
        counter.getName().getName() + " : " + counter.getAttempted() + " " + Instant.now());
  }
}
@VisibleForTesting
static String getFlinkMetricNameString(MetricResult<?> metricResult) {
  MetricName metricName = metricResult.getName();
  // We use only the MetricName here, the step name is already contained
  // in the operator name which is passed to Flink's MetricGroup to which
  // the metric with the following name will be added.
  return metricName.getNamespace() + METRIC_KEY_SEPARATOR + metricName.getName();
}
MetricQueryResults result = dataflowMetrics.queryMetrics(null);
try {
  result.getDistributions().iterator().next().getCommitted();
  fail("Expected UnsupportedOperationException");
} catch (UnsupportedOperationException expected) {
@VisibleForTesting
String renderName(MetricResult<?> metricResult) {
  String renderedStepName = metricResult.getStep().replaceAll(ILLEGAL_CHARACTERS_AND_PERIOD, "_");
  if (renderedStepName.endsWith("_")) {
    renderedStepName = renderedStepName.substring(0, renderedStepName.length() - 1);
  }
  MetricName metricName = metricResult.getName();
  return (renderedStepName + "." + metricName.getNamespace() + "." + metricName.getName())
      .replaceAll(ILLEGAL_CHARACTERS, "_");
}
}
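The sanitization in `renderName` can be sketched without the Beam types. The two regex patterns here are assumptions (the real runner defines its own `ILLEGAL_CHARACTERS` constants); the idea is that the step segment must not contain periods, since periods delimit the step/namespace/name parts of the rendered key:

```java
public class MetricNameSanitizer {
    // Assumed patterns, for illustration only: the step part also forbids '.'
    // because '.' separates the three key segments in the final name.
    private static final String ILLEGAL_CHARACTERS = "[^A-Za-z0-9_.-]";
    private static final String ILLEGAL_CHARACTERS_AND_PERIOD = "[^A-Za-z0-9_-]";

    static String renderName(String step, String namespace, String name) {
        // Replace illegal characters (including '.') in the step, drop a trailing '_'.
        String renderedStep = step.replaceAll(ILLEGAL_CHARACTERS_AND_PERIOD, "_");
        if (renderedStep.endsWith("_")) {
            renderedStep = renderedStep.substring(0, renderedStep.length() - 1);
        }
        // Join the segments with '.', then sanitize the remainder (keeping '.').
        return (renderedStep + "." + namespace + "." + name)
            .replaceAll(ILLEGAL_CHARACTERS, "_");
    }

    public static void main(String[] args) {
        // Prints "ParDo_Map__Process.myNamespace.count"
        System.out.println(renderName("ParDo(Map)/Process", "myNamespace", "count"));
    }
}
```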
@Override
protected void describeMismatchSafely(
    MetricResult<DistributionResult> item, Description mismatchDescription) {
  mismatchDescription.appendText("MetricResult{");
  describeMetricsResultMembersMismatch(item, mismatchDescription, namespace, name, step);
  DistributionResult metricValue = isCommitted ? item.getCommitted() : item.getAttempted();
  if (!Objects.equals(min, metricValue.getMin())) {
    mismatchDescription
        .appendText(String.format("%sMin: ", metricState))
        .appendValue(min)
        .appendText(" != ")
        .appendValue(metricValue.getMin());
  }
  if (!Objects.equals(max, metricValue.getMax())) {
    mismatchDescription
        .appendText(String.format("%sMax: ", metricState))
        .appendValue(max)
        .appendText(" != ")
        .appendValue(metricValue.getMax());
  }
  mismatchDescription.appendText("}");
}
};
public static Map<String, Long> getMetrics(PipelineResult result) {
  final MetricQueryResults metricQueryResults =
      result.metrics().queryMetrics(MetricsFilter.builder().build());
  final Map<String, Long> gauges =
      StreamSupport.stream(metricQueryResults.getGauges().spliterator(), false)
          .collect(
              Collectors.groupingBy(
                  MetricResult::getName,
                  Collectors.reducing(
                      GaugeResult.empty(),
                      GET_COMMITTED_GAUGE,
                      BinaryOperator.maxBy(Comparator.comparing(GaugeResult::getTimestamp)))))
          .entrySet().stream()
          .collect(Collectors.toMap(e -> e.getKey().getName(), e -> e.getValue().getValue()));
  final Map<String, Long> counters =
      StreamSupport.stream(metricQueryResults.getCounters().spliterator(), false)
          .collect(
              Collectors.groupingBy(
                  m -> m.getName().getName(), Collectors.summingLong(GET_COMMITTED_COUNTER)));
  Map<String, Long> ret = new HashMap<>();
  ret.putAll(gauges);
  ret.putAll(counters);
  return Collections.unmodifiableMap(ret);
}
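The gauge branch above uses `Collectors.groupingBy` with `reducing(maxBy(...))` to keep, per metric name, the value with the latest timestamp. A self-contained sketch of that pattern with a stand-in `Sample` class (hypothetical names and values, no Beam types):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.function.BinaryOperator;
import java.util.stream.Collectors;

public class LatestPerName {
    // Stand-in for a gauge result: a named value with a timestamp.
    static final class Sample {
        final String name;
        final long value;
        final long timestamp;
        Sample(String name, long value, long timestamp) {
            this.name = name;
            this.value = value;
            this.timestamp = timestamp;
        }
    }

    static Map<String, Long> latest(List<Sample> samples) {
        return samples.stream()
            // Group by name; within each group, reduce to the sample with the
            // greatest timestamp, starting from a sentinel identity element.
            .collect(Collectors.groupingBy(
                (Sample s) -> s.name,
                Collectors.reducing(
                    new Sample("", 0L, Long.MIN_VALUE),
                    BinaryOperator.maxBy(Comparator.comparingLong((Sample s) -> s.timestamp)))))
            .entrySet().stream()
            .collect(Collectors.toMap(Map.Entry::getKey, e -> e.getValue().value));
    }

    public static void main(String[] args) {
        List<Sample> samples = Arrays.asList(
            new Sample("queueSize", 1L, 10L),
            new Sample("queueSize", 5L, 20L),   // later timestamp wins
            new Sample("watermarkLag", 7L, 5L));
        System.out.println(latest(samples));
    }
}
```

Counters, by contrast, are summed across steps (`Collectors.summingLong`), because counter values accumulate rather than supersede each other.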
MetricQueryResults result = dataflowMetrics.queryMetrics(null);
try {
  result.getCounters().iterator().next().getCommitted();
  fail("Expected UnsupportedOperationException");
} catch (UnsupportedOperationException expected) {
private void updateGauge(Iterable<MetricResult<GaugeResult>> gauges) {
  for (MetricResult<GaugeResult> metricResult : gauges) {
    String flinkMetricName = getFlinkMetricNameString(GAUGE_PREFIX, metricResult);
    GaugeResult update = metricResult.getAttempted();
    // update flink metric
    FlinkGauge gauge = flinkGaugeCache.get(flinkMetricName);
    if (gauge == null) {
      gauge = runtimeContext.getMetricGroup().gauge(flinkMetricName, new FlinkGauge(update));
      flinkGaugeCache.put(flinkMetricName, gauge);
    } else {
      gauge.update(update);
    }
  }
}
@Override
protected boolean matchesSafely(MetricResult<DistributionResult> item) {
  DistributionResult metricValue = isCommitted ? item.getCommitted() : item.getAttempted();
  return Objects.equals(namespace, item.getName().getNamespace())
      && Objects.equals(name, item.getName().getName())
      && item.getStep().contains(step)
      && Objects.equals(min, metricValue.getMin())
      && Objects.equals(max, metricValue.getMax());
}
@Test
public void testMetricNameGeneration() {
  MetricResult mock = Mockito.mock(MetricResult.class);
  when(mock.getStep()).thenReturn("step");
  MetricName metricName = MetricName.named("namespace", "name");
  when(mock.getName()).thenReturn(metricName);
  String name = FlinkMetricContainer.getFlinkMetricNameString(mock);
  assertThat(name, is("namespace.name"));
}
@Override
public void writeMetrics(MetricQueryResults metricQueryResults) throws Exception {
  counterValue =
      metricQueryResults.getCounters().iterator().hasNext()
          ? metricQueryResults.getCounters().iterator().next().getAttempted()
          : 0L;
}
}
    String name,
    String step) {
  if (!Objects.equals(namespace, item.getName().getNamespace())) {
    mismatchDescription
        .appendText("inNamespace: ")
        .appendValue(namespace)
        .appendText(" != ")
        .appendValue(item.getName().getNamespace());
  }
  if (!Objects.equals(name, item.getName().getName())) {
    mismatchDescription
        .appendText("name: ")
        .appendValue(name)
        .appendText(" != ")
        .appendValue(item.getName().getName());
  }
  if (!item.getStep().contains(step)) {
    mismatchDescription
        .appendText("step: ")
        .appendValue(step)
        .appendText(" != ")
        .appendValue(item.getStep());
private void updateDistributions(Iterable<MetricResult<DistributionResult>> distributions) {
  for (MetricResult<DistributionResult> metricResult : distributions) {
    String flinkMetricName = getFlinkMetricNameString(metricResult);
    DistributionResult update = metricResult.getAttempted();
    // update flink metric
    FlinkDistributionGauge gauge = flinkDistributionGaugeCache.get(flinkMetricName);
    if (gauge == null) {
      gauge =
          runtimeContext
              .getMetricGroup()
              .gauge(flinkMetricName, new FlinkDistributionGauge(update));
      flinkDistributionGaugeCache.put(flinkMetricName, gauge);
    } else {
      gauge.update(update);
    }
  }
}
private void updateDistributions(Iterable<MetricResult<DistributionResult>> distributions) {
  for (MetricResult<DistributionResult> metricResult : distributions) {
    String flinkMetricName = getFlinkMetricNameString(DISTRIBUTION_PREFIX, metricResult);
    DistributionResult update = metricResult.getAttempted();
    // update flink metric
    FlinkDistributionGauge gauge = flinkDistributionGaugeCache.get(flinkMetricName);
    if (gauge == null) {
      gauge =
          runtimeContext
              .getMetricGroup()
              .gauge(flinkMetricName, new FlinkDistributionGauge(update));
      flinkDistributionGaugeCache.put(flinkMetricName, gauge);
    } else {
      gauge.update(update);
    }
  }
}
private void updateCounters(Iterable<MetricResult<Long>> counters) {
  for (MetricResult<Long> metricResult : counters) {
    String flinkMetricName = getFlinkMetricNameString(COUNTER_PREFIX, metricResult);
    Long update = metricResult.getAttempted();
    // update flink metric
    Counter counter =
        flinkCounterCache.computeIfAbsent(
            flinkMetricName, n -> runtimeContext.getMetricGroup().counter(n));
    // Flink counters only expose inc/dec, so reset to zero before applying
    // the latest attempted value.
    counter.dec(counter.getCount());
    counter.inc(update);
  }
}
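The `dec(getCount())` followed by `inc(update)` idiom above sets a counter to an absolute value through an increment-only interface. A minimal sketch with a stand-in counter class (`SimpleCounter` is hypothetical; it mimics the `inc`/`dec`/`getCount` shape used above):

```java
public class CounterReset {
    // Stand-in for a cumulative counter that only supports inc/dec.
    static final class SimpleCounter {
        private long count;
        void inc(long n) { count += n; }
        void dec(long n) { count -= n; }
        long getCount() { return count; }
    }

    // Reset-then-set: zero out the current count, then apply the new absolute value.
    static void setTo(SimpleCounter counter, long update) {
        counter.dec(counter.getCount());
        counter.inc(update);
    }

    public static void main(String[] args) {
        SimpleCounter c = new SimpleCounter();
        c.inc(3);
        setTo(c, 10);
        System.out.println(c.getCount()); // 10
    }
}
```

This matters because the Beam value is already an absolute snapshot of attempted work, so re-applying it as a plain increment would double-count on every update cycle.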