ExecutionConfig.setGlobalJobParameters

How to use the setGlobalJobParameters method in org.apache.flink.api.common.ExecutionConfig

Best Java code snippets using org.apache.flink.api.common.ExecutionConfig.setGlobalJobParameters (Showing top 20 results out of 315)

origin: apache/flink

PythonStreamExecutionEnvironment(StreamExecutionEnvironment env, Path tmpLocalDir, String scriptName) {
  this.env = env;
  this.pythonTmpCachePath = tmpLocalDir;
  env.getConfig().setGlobalJobParameters(new PythonJobParameters(scriptName));
  registerJythonSerializers(this.env);
}
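The PythonJobParameters class used above is a custom subclass of ExecutionConfig.GlobalJobParameters whose source is not part of this snippet. A minimal sketch of how such a subclass could look follows; the class name, field, and key are illustrative, and overriding toMap() is what makes the values visible as job configuration in the web interface.

import org.apache.flink.api.common.ExecutionConfig;

import java.util.Collections;
import java.util.Map;

// Hypothetical custom job-parameters class; the real PythonJobParameters in the
// Flink sources may differ. GlobalJobParameters is Serializable, so the subclass
// only needs to carry serializable fields.
public class ScriptJobParameters extends ExecutionConfig.GlobalJobParameters {

  private final String scriptName;

  public ScriptJobParameters(String scriptName) {
    this.scriptName = scriptName;
  }

  public String getScriptName() {
    return scriptName;
  }

  @Override
  public Map<String, String> toMap() {
    // Entries returned here are shown under the job's configuration in the web UI.
    return Collections.singletonMap("python.script.name", scriptName);
  }
}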
origin: apache/flink

public static StreamExecutionEnvironment prepareExecutionEnv(ParameterTool parameterTool)
  throws Exception {
  if (parameterTool.getNumberOfParameters() < 5) {
    System.out.println("Missing parameters!\n" +
      "Usage: Kafka --input-topic <topic> --output-topic <topic> " +
      "--bootstrap.servers <kafka brokers> " +
      "--zookeeper.connect <zk quorum> --group.id <some id>");
    throw new Exception("Missing parameters!\n" +
      "Usage: Kafka --input-topic <topic> --output-topic <topic> " +
      "--bootstrap.servers <kafka brokers> " +
      "--zookeeper.connect <zk quorum> --group.id <some id>");
  }
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.getConfig().disableSysoutLogging();
  env.getConfig().setRestartStrategy(RestartStrategies.fixedDelayRestart(4, 10000));
  env.enableCheckpointing(5000); // create a checkpoint every 5 seconds
  env.getConfig().setGlobalJobParameters(parameterTool); // make parameters available in the web interface
  env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
  return env;
}
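Once the ParameterTool has been registered with setGlobalJobParameters as above, it can be read back inside user functions through the runtime context. A short sketch under that assumption (not part of the snippet above; the function name and the parameter key are illustrative):

import org.apache.flink.api.common.functions.RichFilterFunction;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.configuration.Configuration;

// Hypothetical filter that reads the globally registered parameters in open().
public class NonEmptyFilter extends RichFilterFunction<String> {

  private String groupId;

  @Override
  public void open(Configuration parameters) throws Exception {
    // The object passed to setGlobalJobParameters is returned here; cast it
    // back to ParameterTool to access the command-line values.
    ParameterTool globalParams =
      (ParameterTool) getRuntimeContext().getExecutionConfig().getGlobalJobParameters();
    groupId = globalParams.getRequired("group.id");
  }

  @Override
  public boolean filter(String value) {
    // Trivial predicate; the point is that groupId is available to the function.
    return value != null && !value.isEmpty();
  }
}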
origin: apache/flink

public static void main(String[] args) throws Exception {
  // parse the parameters
  final ParameterTool params = ParameterTool.fromArgs(args);
  final long windowSize = params.getLong("windowSize", 2000);
  final long rate = params.getLong("rate", 3L);
  System.out.println("Using windowSize=" + windowSize + ", data rate=" + rate);
  System.out.println("To customize example, use: WindowJoin [--windowSize <window-size-in-millis>] [--rate <elements-per-second>]");
  // obtain execution environment, run this example in "ingestion time"
  StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  env.setStreamTimeCharacteristic(TimeCharacteristic.IngestionTime);
  // make parameters available in the web interface
  env.getConfig().setGlobalJobParameters(params);
  // create the data sources for both grades and salaries
  DataStream<Tuple2<String, Integer>> grades = GradeSource.getSource(env, rate);
  DataStream<Tuple2<String, Integer>> salaries = SalarySource.getSource(env, rate);
  // run the actual window join program
  // for testability, this functionality is in a separate method.
  DataStream<Tuple3<String, Integer, Integer>> joinedStream = runWindowJoin(grades, salaries, windowSize);
  // print the results with a single thread, rather than in parallel
  joinedStream.print().setParallelism(1);
  // execute program
  env.execute("Windowed Join Example");
}
origin: apache/flink

final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.getConfig().setGlobalJobParameters(params);
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
env.setParallelism(1);
origin: apache/flink

public static void main(final String[] args) throws Exception {
  final ParameterTool params = ParameterTool.fromArgs(args);
  final ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
  // make parameters available in the web interface
  env.getConfig().setGlobalJobParameters(params);
  // get the data set
  final DataSet<StringTriple> file = getDataSet(env, params);
  // filter lines with empty fields
  final DataSet<StringTriple> filteredLines = file.filter(new EmptyFieldFilter());
  // Here, we could do further processing with the filtered lines...
  JobExecutionResult result;
  // output the filtered lines
  if (params.has("output")) {
    filteredLines.writeAsCsv(params.get("output"));
    // execute program
    result = env.execute("Accumulator example");
  } else {
    System.out.println("Printing result to stdout. Use --output to specify output path.");
    filteredLines.print();
    result = env.getLastJobExecutionResult();
  }
  // get the accumulator result via its registration key
  final List<Integer> emptyFields = result.getAccumulatorResult(EMPTY_FIELD_ACCUMULATOR);
  System.out.format("Number of detected empty fields per column: %s\n", emptyFields);
}
origin: apache/flink

env.getConfig().setGlobalJobParameters(params); // make parameters available in the web interface

(The remaining results are further one-line occurrences of the same call, differing only in the name of the ParameterTool variable, e.g. pt instead of params.)
org.apache.flink.api.common.ExecutionConfig.setGlobalJobParameters

Javadoc

Register a custom, serializable user configuration object.

Popular methods of ExecutionConfig

  • <init>
  • isObjectReuseEnabled
    Returns whether object reuse has been enabled or disabled. @see #enableObjectReuse()
  • disableSysoutLogging
    Disables the printing of progress update messages to System.out
  • getAutoWatermarkInterval
    Returns the interval of the automatic watermark emission.
  • enableObjectReuse
    Enables reusing objects that Flink internally uses for deserialization and passing data to user-code functions.
  • setAutoWatermarkInterval
    Sets the interval of the automatic watermark emission. Watermarks are used throughout the streaming API to keep track of the progress of time.
  • disableObjectReuse
    Disables reusing objects that Flink internally uses for deserialization and passing data to user-code functions.
  • getRestartStrategy
    Returns the restart strategy which has been set for the current job.
  • isSysoutLoggingEnabled
    Gets whether progress update messages should be printed to System.out
  • registerKryoType
    Registers the given type with the serialization stack. If the type is eventually serialized as a POJO, it is registered with the POJO serializer; if it ends up being serialized with Kryo, it is registered at Kryo.
  • registerTypeWithKryoSerializer
    Registers the given Serializer via its class as a serializer for the given type at the KryoSerializer.
  • setRestartStrategy
    Sets the restart strategy to be used for recovery.

Other frequently used methods: getParallelism, addDefaultKryoSerializer, getGlobalJobParameters, getNumberOfExecutionRetries, getRegisteredKryoTypes, setParallelism, getDefaultKryoSerializerClasses
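Several of the methods listed above are typically configured together on the ExecutionConfig obtained from env.getConfig(). A brief sketch under that assumption; the concrete values and the MyEvent type are placeholders, not taken from the snippets above.

import org.apache.flink.api.common.ExecutionConfig;
import org.apache.flink.api.common.restartstrategy.RestartStrategies;
import org.apache.flink.api.java.utils.ParameterTool;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ExecutionConfigSetup {

  // A trivial user type, registered with Kryo purely for illustration.
  public static class MyEvent {
    public long timestamp;
    public String payload;
  }

  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    ExecutionConfig config = env.getConfig();

    config.setRestartStrategy(RestartStrategies.fixedDelayRestart(3, 10000)); // 3 attempts, 10 s apart
    config.setAutoWatermarkInterval(200);   // emit watermarks every 200 ms
    config.enableObjectReuse();             // reuse internal objects; user code must not hold on to inputs
    config.registerKryoType(MyEvent.class); // register the placeholder type with the Kryo serializer
    config.setGlobalJobParameters(ParameterTool.fromArgs(args)); // show parameters in the web interface
  }
}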
