MetadataMapperService

How to use MetadataMapperService in com.netflix.conductor.core.metadata

Best Java code snippets using com.netflix.conductor.core.metadata.MetadataMapperService (Showing top 20 results out of 315)

origin: Netflix/conductor

@Test(expected = ApplicationException.class)
public void testMetadataPopulationMissingDefinitions() {
  String nameTaskDefinition1 = "task4";
  WorkflowTask workflowTask1 = createWorkflowTask(nameTaskDefinition1);
  String nameTaskDefinition2 = "task5";
  WorkflowTask workflowTask2 = createWorkflowTask(nameTaskDefinition2);
  TaskDef taskDefinition = createTaskDefinition(nameTaskDefinition1);
  WorkflowDef workflowDefinition = createWorkflowDefinition("testMetadataPopulation");
  workflowDefinition.setTasks(ImmutableList.of(workflowTask1, workflowTask2));
  when(metadataDAO.getTaskDef(nameTaskDefinition1)).thenReturn(taskDefinition);
  when(metadataDAO.getTaskDef(nameTaskDefinition2)).thenReturn(null);
  metadataMapperService.populateTaskDefinitions(workflowDefinition);
}
origin: Netflix/conductor

public WorkflowDef lookupForWorkflowDefinition(String name, Integer version) {
  Optional<WorkflowDef> potentialDef =
      version == null ? lookupLatestWorkflowDefinition(name) : lookupWorkflowDefinition(name, version);
  //Check if the workflow definition is valid
  WorkflowDef workflowDefinition = potentialDef
      .orElseThrow(() -> {
            logger.error("There is no workflow defined with name {} and version {}", name, version);
            return new ApplicationException(
                ApplicationException.Code.NOT_FOUND,
                String.format("No such workflow defined. name=%s, version=%s", name, version)
            );
          }
      );
  return workflowDefinition;
}
origin: Netflix/conductor

public Workflow populateWorkflowWithDefinitions(Workflow workflow) {
  WorkflowDef workflowDefinition = Optional.ofNullable(workflow.getWorkflowDefinition())
      .orElseGet(() -> {
        WorkflowDef wd = lookupForWorkflowDefinition(workflow.getWorkflowName(), workflow.getWorkflowVersion());
        workflow.setWorkflowDefinition(wd);
        return wd;
      });
  workflowDefinition.collectTasks().forEach(
      workflowTask -> {
        if (shouldPopulateDefinition(workflowTask)) {
          workflowTask.setTaskDefinition(metadataDAO.getTaskDef(workflowTask.getName()));
        } else if (workflowTask.getType().equals(TaskType.SUB_WORKFLOW.name())) {
          populateVersionForSubWorkflow(workflowTask);
        }
      }
  );
  checkNotEmptyDefinitions(workflowDefinition);
  return workflow;
}
origin: Netflix/conductor

private WorkflowTask populateWorkflowTaskWithDefinition(WorkflowTask workflowTask) {
  if (shouldPopulateDefinition(workflowTask)) {
    workflowTask.setTaskDefinition(metadataDAO.getTaskDef(workflowTask.getName()));
  } else if (workflowTask.getType().equals(TaskType.SUB_WORKFLOW.name())) {
    populateVersionForSubWorkflow(workflowTask);
  }
  return workflowTask;
}
origin: Netflix/conductor

    Map<String, String> taskToDomain
) {
  WorkflowDef workflowDefinition = metadataMapperService.lookupForWorkflowDefinition(name, version);
  workflowDefinition = metadataMapperService.populateTaskDefinitions(workflowDefinition);
origin: Netflix/conductor

wf = metadataMapperService.populateWorkflowWithDefinitions(wf);
origin: Netflix/conductor

@Test(expected = IllegalArgumentException.class)
public void testLookupLatestWorkflowDefinition() {
  String workflowName = "test";
  when(metadataDAO.getLatest(workflowName)).thenReturn(Optional.of(new WorkflowDef()));
  Optional<WorkflowDef> optionalWorkflowDef = metadataMapperService.lookupLatestWorkflowDefinition(workflowName);
  assertTrue(optionalWorkflowDef.isPresent());
  metadataMapperService.lookupLatestWorkflowDefinition(null);
}
origin: Netflix/conductor

/**
 * This method is used to get the List of dynamic workflow tasks and their input based on the {@link WorkflowTask#getDynamicForkTasksParam()}
 *
 * @param taskToSchedule:       The Task of type FORK_JOIN_DYNAMIC that needs to scheduled, which has the input parameters
 * @param workflowInstance:     The instance of the {@link Workflow} which represents the workflow being executed.
 * @param dynamicForkTaskParam: The key representing the dynamic fork join json payload which is available in {@link WorkflowTask#getInputParameters()}
 * @throws TerminateWorkflowException : In case of input parameters of the dynamic fork tasks not represented as {@link Map}
 * @return a {@link Pair} representing the list of dynamic fork tasks in {@link Pair#getLeft()} and the input for the dynamic fork tasks in {@link Pair#getRight()}
 */
@SuppressWarnings("unchecked")
@VisibleForTesting
Pair<List<WorkflowTask>, Map<String, Map<String, Object>>> getDynamicForkTasksAndInput(WorkflowTask taskToSchedule, Workflow workflowInstance,
                                            String dynamicForkTaskParam) throws TerminateWorkflowException {
  Map<String, Object> input = parametersUtils.getTaskInput(taskToSchedule.getInputParameters(), workflowInstance, null, null);
  Object dynamicForkTasksJson = input.get(dynamicForkTaskParam);
  List<WorkflowTask> dynamicForkWorkflowTasks = objectMapper.convertValue(dynamicForkTasksJson, ListOfWorkflowTasks);
  for (WorkflowTask workflowTask : dynamicForkWorkflowTasks) {
    if (MetadataMapperService.shouldPopulateDefinition(workflowTask)) {
      workflowTask.setTaskDefinition(metadataDAO.getTaskDef(workflowTask.getName()));
    }
  }
  Object dynamicForkTasksInput = input.get(taskToSchedule.getDynamicForkTasksInputParamName());
  if (!(dynamicForkTasksInput instanceof Map)) {
    throw new TerminateWorkflowException("Input to the dynamically forked tasks is not a map -> expecting a map of K,V  but found " + dynamicForkTasksInput);
  }
  return new ImmutablePair<>(dynamicForkWorkflowTasks, (Map<String, Map<String, Object>>) dynamicForkTasksInput);
}
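The guard at the end of getDynamicForkTasksAndInput rejects any forked-task input payload that is not a Map before the cast. A minimal self-contained sketch of that check, using illustrative class and method names rather than Conductor's real API:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for the instanceof guard shown above: the input to
// the dynamically forked tasks must be a Map, otherwise the workflow run
// is terminated (modeled here as IllegalArgumentException).
public class DynamicForkInputCheck {

    @SuppressWarnings("unchecked")
    static Map<String, Map<String, Object>> requireForkInput(Object dynamicForkTasksInput) {
        if (!(dynamicForkTasksInput instanceof Map)) {
            throw new IllegalArgumentException(
                "Input to the dynamically forked tasks is not a map: " + dynamicForkTasksInput);
        }
        return (Map<String, Map<String, Object>>) dynamicForkTasksInput;
    }

    public static void main(String[] args) {
        Map<String, Map<String, Object>> input = new HashMap<>();
        input.put("task_a", Collections.singletonMap("k", (Object) "v"));
        // A well-formed payload passes through unchanged.
        System.out.println(requireForkInput(input).size());

        // Anything else is rejected before the unchecked cast can go wrong.
        try {
            requireForkInput("not-a-map");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected");
        }
    }
}
```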
origin: Netflix/conductor

@Test(expected = IllegalArgumentException.class)
public void testLookupWorkflowDefinition() {
  try{
    String workflowName = "test";
    when(metadataDAO.get(workflowName, 0)).thenReturn(Optional.of(new WorkflowDef()));
    Optional<WorkflowDef> optionalWorkflowDef = metadataMapperService.lookupWorkflowDefinition(workflowName, 0);
    assertTrue(optionalWorkflowDef.isPresent());
    metadataMapperService.lookupWorkflowDefinition(null, 0);
  } catch (ConstraintViolationException ex){
    Assert.assertEquals(1, ex.getConstraintViolations().size());
    Set<String> messages = getConstraintViolationMessages(ex.getConstraintViolations());
    assertTrue(messages.contains("WorkflowIds list cannot be null."));
  }
}
origin: Netflix/conductor

public WorkflowDef populateTaskDefinitions(WorkflowDef workflowDefinition) {
  workflowDefinition.collectTasks().forEach(
      this::populateWorkflowTaskWithDefinition
  );
  checkNotEmptyDefinitions(workflowDefinition);
  return workflowDefinition;
}
origin: Netflix/conductor

public Task populateTaskWithDefinition(Task task) {
  populateWorkflowTaskWithDefinition(task.getWorkflowTask());
  return task;
}
origin: Netflix/conductor

workflow = metadataMapperService.populateWorkflowWithDefinitions(workflow);
origin: Netflix/conductor

dynamicForkJoinWorkflowTask.setName(dynamicForkJoinTask.getTaskName());
dynamicForkJoinWorkflowTask.setType(dynamicForkJoinTask.getType());
if (MetadataMapperService.shouldPopulateDefinition(dynamicForkJoinWorkflowTask)) {
  dynamicForkJoinWorkflowTask.setTaskDefinition(
      metadataDAO.getTaskDef(dynamicForkJoinTask.getTaskName()));
}
origin: Netflix/conductor

@Test
public void testMetadataPopulationOnSimpleTask() {
  String nameTaskDefinition = "task1";
  TaskDef taskDefinition = createTaskDefinition(nameTaskDefinition);
  WorkflowTask workflowTask = createWorkflowTask(nameTaskDefinition);
  when(metadataDAO.getTaskDef(nameTaskDefinition)).thenReturn(taskDefinition);
  WorkflowDef workflowDefinition = createWorkflowDefinition("testMetadataPopulation");
  workflowDefinition.setTasks(ImmutableList.of(workflowTask));
  metadataMapperService.populateTaskDefinitions(workflowDefinition);
  assertEquals(1, workflowDefinition.getTasks().size());
  WorkflowTask populatedWorkflowTask = workflowDefinition.getTasks().get(0);
  assertNotNull(populatedWorkflowTask.getTaskDefinition());
  verify(metadataDAO).getTaskDef(nameTaskDefinition);
}
com.netflix.conductor.core.metadata.MetadataMapperService

Javadoc

Populates metadata definitions within workflow objects. Benefits of loading and populating metadata definitions upfront include:

  • Immutable definitions within a workflow execution, with the added benefit of guaranteed consistency at runtime.
  • Reduced stress on the storage layer.

Most used methods

  • populateTaskDefinitions
  • lookupLatestWorkflowDefinition
  • lookupWorkflowDefinition
  • checkNotEmptyDefinitions
  • lookupForWorkflowDefinition
  • populateVersionForSubWorkflow
  • populateWorkflowTaskWithDefinition
  • populateWorkflowWithDefinitions
  • shouldPopulateDefinition
  • <init>
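The upfront-population pattern the Javadoc describes can be sketched with simplified stand-in types (not Conductor's real WorkflowDef/TaskDef/MetadataDAO classes): resolve every missing task definition from a registry once, then fail fast if anything is still undefined, mirroring populateTaskDefinitions followed by checkNotEmptyDefinitions.

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch of upfront metadata population. The registry plays the
// role of the MetadataDAO; definitions are plain Strings for brevity.
public class UpfrontPopulationSketch {

    static class Task {
        final String name;
        String definition;                        // null until populated
        Task(String name) { this.name = name; }
    }

    static void populate(List<Task> tasks, Map<String, String> registry) {
        for (Task t : tasks) {
            if (t.definition == null) {
                t.definition = registry.get(t.name);   // one lookup, upfront
            }
        }
        for (Task t : tasks) {                         // fail fast on gaps,
            if (t.definition == null) {                // like checkNotEmptyDefinitions
                throw new IllegalStateException("No definition for task: " + t.name);
            }
        }
    }

    public static void main(String[] args) {
        List<Task> tasks = Arrays.asList(new Task("task1"), new Task("task2"));
        Map<String, String> registry = new HashMap<>();
        registry.put("task1", "def1");
        registry.put("task2", "def2");
        populate(tasks, registry);
        // After population, every task carries its definition for the rest
        // of the execution; no further storage reads are needed.
        System.out.println(tasks.get(0).definition);
    }
}
```

Once populated, the tasks carry immutable copies of their definitions, so later changes in the metadata store cannot alter an in-flight execution.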
