/**
 * Adds an attribute to a flow file.
 *
 * @param key attribute key
 * @param value attribute value
 * @param flowFile flow file to update
 * @param processSession session
 * @return updated flow file
 */
public static FlowFile addAttribute(String key, String value, FlowFile flowFile, ProcessSession processSession) {
    Map<String, String> attributes = new HashMap<>();
    attributes.put(key, value);
    return processSession.putAllAttributes(flowFile, attributes);
}
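All of these helpers share one contract: a NiFi `FlowFile` is immutable, so `putAllAttributes` returns a new object and the caller must keep the returned reference (which is why the helpers reassign or directly return it). A minimal standalone sketch of that copy-on-write pattern, using a hypothetical `ImmutableFlowFile` class rather than the real NiFi API:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

/**
 * Stand-in (NOT the NiFi API) illustrating the copy-on-write attribute
 * model: every mutation returns a NEW object; the original is unchanged.
 */
final class ImmutableFlowFile {
    private final Map<String, String> attributes;

    ImmutableFlowFile(Map<String, String> attributes) {
        // Defensive copy, then freeze, so instances are truly immutable
        this.attributes = Collections.unmodifiableMap(new HashMap<>(attributes));
    }

    /** Returns a new instance with the extra attributes merged in. */
    ImmutableFlowFile putAllAttributes(Map<String, String> extra) {
        Map<String, String> merged = new HashMap<>(attributes);
        merged.putAll(extra);
        return new ImmutableFlowFile(merged);
    }

    String getAttribute(String key) {
        return attributes.get(key);
    }
}
```

Discarding the return value (calling `ff.putAllAttributes(...)` on its own line) silently loses the update, which is exactly the bug the `flowFile = session.putAllAttributes(...)` reassignment in the helpers guards against.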
protected FlowFile populateErrorAttributes(final ProcessSession session, FlowFile flowFile, String query, String message) {
    Map<String, String> attributes = new HashMap<>();
    attributes.put(INFLUX_DB_ERROR_MESSAGE, String.valueOf(message));
    attributes.put(INFLUX_DB_EXECUTED_QUERY, String.valueOf(query));
    return session.putAllAttributes(flowFile, attributes);
}
private FlowFile populateAttributes(final ProcessSession session, FlowFile flowFile, Map<String, Object> result) {
    Map<String, String> resultAttributes = new HashMap<>();
    resultAttributes.put(RETHINKDB_DELETE_RESULT_ERROR_KEY, String.valueOf(result.get(RESULT_ERROR_KEY)));
    resultAttributes.put(RETHINKDB_DELETE_RESULT_DELETED_KEY, String.valueOf(result.get(RESULT_DELETED_KEY)));
    resultAttributes.put(RETHINKDB_DELETE_RESULT_INSERTED_KEY, String.valueOf(result.get(RESULT_INSERTED_KEY)));
    resultAttributes.put(RETHINKDB_DELETE_RESULT_REPLACED_KEY, String.valueOf(result.get(RESULT_REPLACED_KEY)));
    resultAttributes.put(RETHINKDB_DELETE_RESULT_SKIPPED_KEY, String.valueOf(result.get(RESULT_SKIPPED_KEY)));
    resultAttributes.put(RETHINKDB_DELETE_RESULT_UNCHANGED_KEY, String.valueOf(result.get(RESULT_UNCHANGED_KEY)));
    return session.putAllAttributes(flowFile, resultAttributes);
}
private FlowFile updateAttributes(ProcessSession processSession, FlowFile splitFlowFile, long splitLineCount, long splitFlowFileSize,
        String splitId, int splitIndex, String origFileName) {
    Map<String, String> attributes = new HashMap<>();
    attributes.put(SPLIT_LINE_COUNT, String.valueOf(splitLineCount));
    // Use the size passed in by the caller rather than re-reading it from the flow file
    attributes.put(FRAGMENT_SIZE, String.valueOf(splitFlowFileSize));
    attributes.put(FRAGMENT_ID, splitId);
    attributes.put(FRAGMENT_INDEX, String.valueOf(splitIndex));
    attributes.put(SEGMENT_ORIGINAL_FILENAME, origFileName);
    return processSession.putAllAttributes(splitFlowFile, attributes);
}
private FlowFile writeHitFlowFile(String json, ProcessSession session, FlowFile hitFlowFile, Map<String, String> attributes) {
    hitFlowFile = session.write(hitFlowFile, out -> out.write(json.getBytes()));
    return session.putAllAttributes(hitFlowFile, attributes);
}
private FlowFile populateAttributes(final ProcessSession session, FlowFile flowFile, HashMap<String, Object> result) {
    Map<String, String> resultAttributes = new HashMap<>();
    resultAttributes.put(RETHINKDB_INSERT_RESULT, result.toString());
    resultAttributes.put(RETHINKDB_INSERT_RESULT_ERROR_KEY, String.valueOf(result.get(RESULT_ERROR_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_DELETED_KEY, String.valueOf(result.get(RESULT_DELETED_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_GENERATED_KEYS_KEY, String.valueOf(result.get(RESULT_GENERATED_KEYS_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_INSERTED_KEY, String.valueOf(result.get(RESULT_INSERTED_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_REPLACED_KEY, String.valueOf(result.get(RESULT_REPLACED_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_SKIPPED_KEY, String.valueOf(result.get(RESULT_SKIPPED_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_UNCHANGED_KEY, String.valueOf(result.get(RESULT_UNCHANGED_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_FIRST_ERROR_KEY, String.valueOf(result.get(RESULT_FIRST_ERROR_KEY)));
    resultAttributes.put(RETHINKDB_INSERT_RESULT_WARNINGS_KEY, String.valueOf(result.get(RESULT_WARNINGS_KEY)));
    return session.putAllAttributes(flowFile, resultAttributes);
}
protected List<FlowFile> processClientException(final ProcessSession session, List<FlowFile> flowFiles, AmazonClientException exception) {
    List<FlowFile> failedFlowFiles = new ArrayList<>();
    for (FlowFile flowFile : flowFiles) {
        Map<String, String> attributes = new HashMap<>();
        attributes.put(DYNAMODB_ERROR_EXCEPTION_MESSAGE, exception.getMessage());
        attributes.put(DYNAMODB_ERROR_RETRYABLE, Boolean.toString(exception.isRetryable()));
        failedFlowFiles.add(session.putAllAttributes(flowFile, attributes));
    }
    return failedFlowFiles;
}
public static FlowFile copyAttributesToOriginal(final ProcessSession processSession, final FlowFile originalFlowFile,
        final String fragmentId, final int fragmentCount) {
    final Map<String, String> attributesToOriginal = new HashMap<>();
    if (fragmentId != null && !fragmentId.isEmpty()) {
        attributesToOriginal.put(FRAGMENT_ID.key(), fragmentId);
    }
    attributesToOriginal.put(FRAGMENT_COUNT.key(), String.valueOf(fragmentCount));
    return processSession.putAllAttributes(originalFlowFile, attributesToOriginal);
}
private FlowFile setTagAttributes(ProcessSession session, FlowFile flowFile, List<Tag> tags) {
    flowFile = session.removeAllAttributes(flowFile, Pattern.compile("^s3\\.tag\\..*"));
    final Map<String, String> tagAttrs = new HashMap<>();
    tags.forEach(t -> tagAttrs.put("s3.tag." + t.getKey(), t.getValue()));
    return session.putAllAttributes(flowFile, tagAttrs);
}
private FlowFile writeAggregationFlowFileContents(String name, String json, ProcessSession session, FlowFile aggFlowFile,
        Map<String, String> attributes) {
    aggFlowFile = session.write(aggFlowFile, out -> out.write(json.getBytes()));
    if (name != null) {
        aggFlowFile = session.putAttribute(aggFlowFile, "aggregation.name", name);
    }
    return session.putAllAttributes(aggFlowFile, attributes);
}
/**
 * Constructs {@link FlowFile} attributes from a {@link TreeEvent}.
 *
 * @param treeEvent a {@link TreeEvent}
 * @param flowFile instance of the {@link FlowFile} to update
 * @param processSession instance of {@link ProcessSession}
 * @return updated {@link FlowFile}
 */
public static FlowFile updateFlowFileAttributesWithTreeEventProperties(TreeEvent treeEvent, FlowFile flowFile, ProcessSession processSession) {
    Map<String, String> attributes = new HashMap<>();
    addWalkOidValues(attributes, treeEvent.getVariableBindings());
    return processSession.putAllAttributes(flowFile, attributes);
}
protected FlowFile saveRequestDetailsAsAttributes(final HttpServletRequest request, final ProcessSession session,
        String foundSubject, FlowFile flowFile) {
    Map<String, String> attributes = new HashMap<>();
    addMatchingRequestHeaders(request, attributes);
    flowFile = session.putAllAttributes(flowFile, attributes);
    flowFile = session.putAttribute(flowFile, "restlistener.remote.source.host", request.getRemoteHost());
    flowFile = session.putAttribute(flowFile, "restlistener.request.uri", request.getRequestURI());
    flowFile = session.putAttribute(flowFile, "restlistener.remote.user.dn", foundSubject);
    return flowFile;
}
private void completeFlowFile(final ProcessSession session, final FlowFile flowFile, final RecordSetWriter writer,
        final Relationship relationship, final String details) throws IOException {
    final WriteResult writeResult = writer.finishRecordSet();
    writer.close();

    final Map<String, String> attributes = new HashMap<>(writeResult.getAttributes());
    attributes.put("record.count", String.valueOf(writeResult.getRecordCount()));
    attributes.put(CoreAttributes.MIME_TYPE.key(), writer.getMimeType());

    // putAllAttributes returns a new FlowFile; transfer that one, not the stale reference
    final FlowFile updated = session.putAllAttributes(flowFile, attributes);
    session.transfer(updated, relationship);
    session.getProvenanceReporter().route(updated, relationship, details);
}
private void writeData(final ProcessSession session, ConsumerRecord<byte[], byte[]> record, final TopicPartition topicPartition) {
    FlowFile flowFile = session.create();
    final BundleTracker tracker = new BundleTracker(record, topicPartition, keyEncoding);
    tracker.incrementRecordCount(1);
    final byte[] value = record.value();
    if (value != null) {
        flowFile = session.write(flowFile, out -> out.write(value));
    }
    flowFile = session.putAllAttributes(flowFile, getAttributes(record));
    tracker.updateFlowFile(flowFile);
    populateAttributes(tracker);
    session.transfer(tracker.flowFile, REL_SUCCESS);
}
protected void writeBatch(String payload, FlowFile parent, ProcessContext context, ProcessSession session,
        Map<String, String> extraAttributes, Relationship rel) throws UnsupportedEncodingException {
    String charset = context.getProperty(CHARSET).evaluateAttributeExpressions(parent).getValue();
    FlowFile flowFile = parent != null ? session.create(parent) : session.create();
    flowFile = session.importFrom(new ByteArrayInputStream(payload.getBytes(charset)), flowFile);
    flowFile = session.putAllAttributes(flowFile, extraAttributes);
    session.getProvenanceReporter().receive(flowFile, getURI(context));
    session.transfer(flowFile, rel);
}
protected static void transferEvent(final Event event, ProcessSession session, Relationship relationship) {
    FlowFile flowFile = session.create();
    flowFile = session.putAllAttributes(flowFile, event.getHeaders());
    flowFile = session.write(flowFile, out -> out.write(event.getBody()));
    session.getProvenanceReporter().create(flowFile);
    session.transfer(flowFile, relationship);
}
private FlowFile savePartDetailsAsAttributes(final ProcessSession session, final Part part, final FlowFile flowFile,
        final int sequenceNumber, final int allPartsCount) {
    final Map<String, String> attributes = new HashMap<>();
    for (String headerName : part.getHeaderNames()) {
        final String headerValue = part.getHeader(headerName);
        putAttribute(attributes, "http.headers.multipart." + headerName, headerValue);
    }
    putAttribute(attributes, "http.multipart.size", part.getSize());
    putAttribute(attributes, "http.multipart.content.type", part.getContentType());
    putAttribute(attributes, "http.multipart.name", part.getName());
    putAttribute(attributes, "http.multipart.filename", part.getSubmittedFileName());
    putAttribute(attributes, "http.multipart.fragments.sequence.number", sequenceNumber + 1);
    putAttribute(attributes, "http.multipart.fragments.total.number", allPartsCount);
    return session.putAllAttributes(flowFile, attributes);
}
@Override
public long writeEvent(ProcessSession session, String transitUri, T eventInfo, long currentSequenceId, Relationship relationship) {
    FlowFile flowFile = session.create();
    flowFile = session.write(flowFile, outputStream -> {
        super.startJson(outputStream, eventInfo);
        writeJson(eventInfo); // Nothing in the body
        super.endJson();
    });
    flowFile = session.putAllAttributes(flowFile, getCommonAttributes(currentSequenceId, eventInfo));
    session.transfer(flowFile, relationship);
    session.getProvenanceReporter().receive(flowFile, transitUri);
    return currentSequenceId + 1;
}
private FlowFile savePartAttributes(ProcessContext context, ProcessSession session, Part part, FlowFile flowFile,
        final int i, final int allPartsCount) {
    final Map<String, String> attributes = new HashMap<>();
    for (String headerName : part.getHeaderNames()) {
        final String headerValue = part.getHeader(headerName);
        putAttribute(attributes, "http.headers.multipart." + headerName, headerValue);
    }
    putAttribute(attributes, "http.multipart.size", part.getSize());
    putAttribute(attributes, "http.multipart.content.type", part.getContentType());
    putAttribute(attributes, "http.multipart.name", part.getName());
    putAttribute(attributes, "http.multipart.filename", part.getSubmittedFileName());
    putAttribute(attributes, "http.multipart.fragments.sequence.number", i + 1);
    putAttribute(attributes, "http.multipart.fragments.total.number", allPartsCount);
    return session.putAllAttributes(flowFile, attributes);
}