JavaDStream.cache

How to use the cache method in org.apache.spark.streaming.api.java.JavaDStream

Best Java code snippets using org.apache.spark.streaming.api.java.JavaDStream.cache (Showing top 8 results out of 315)
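
Before the indexed snippets, a minimal, self-contained sketch of the typical use of JavaDStream.cache(): cache a stream that feeds more than one output operation so each batch RDD is computed only once. This is not taken from the snippets below; the class name, source, host and port are placeholders, and it assumes the Spark 2.x Java streaming API.

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class CacheSketch {
  public static void main(String[] args) throws InterruptedException {
    SparkConf conf = new SparkConf().setAppName("CacheSketch").setMaster("local[2]");
    JavaStreamingContext jsc = new JavaStreamingContext(conf, Durations.seconds(1));

    // socketTextStream is only a convenient source for the sketch
    JavaDStream<String> lines = jsc.socketTextStream("localhost", 9999);

    // cache() persists each batch RDD the DStream produces (MEMORY_ONLY_SER by
    // default for DStreams), so the two output operations below reuse the mapped
    // data instead of recomputing it for every batch
    JavaDStream<Integer> lengths = lines.map(s -> s.length()).cache();

    lengths.count().print();              // first output operation
    lengths.reduce(Integer::sum).print(); // second output operation, served from the cache

    jsc.start();
    jsc.awaitTermination();
  }
}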

origin: databricks/learning-spark

= logData.map(new Functions.ParseFromLogLine()).cache();
origin: co.cask.cdap/hydrator-spark-core2

@Override
public SparkCollection<T> cache() {
 return wrap(stream.cache());
}
origin: co.cask.cdap/hydrator-spark-core2

 @Override
 public void run() {
  Compat.foreachRDD(stream.cache(), new StreamingSparkSinkFunction<T>(sec, stageSpec));
 }
};
origin: co.cask.cdap/hydrator-spark-core2

 @Override
 public void run() {
  // cache since the streaming sink function will check if the RDD is empty, which can cause recomputation
  // and confusing metrics if it's not cached.
  Compat.foreachRDD(stream.cache(), new StreamingBatchSinkFunction<>(sinkFunction, sec, stageSpec));
 }
};
origin: org.apache.beam/beam-runners-spark

@Override
@SuppressWarnings("unchecked")
public void cache(String storageLevel, Coder<?> coder) {
 // we "force" MEMORY storage level in streaming
 if (!StorageLevel.fromString(storageLevel).equals(StorageLevel.MEMORY_ONLY_SER())) {
  LOG.warn(
    "Provided StorageLevel: {} is ignored for streams, using the default level: {}",
    storageLevel,
    StorageLevel.MEMORY_ONLY_SER());
 }
 // Caching can cause serialization, so we need to encode to bytes first
 // more details in https://issues.apache.org/jira/browse/BEAM-2669
 Coder<WindowedValue<T>> wc = (Coder<WindowedValue<T>>) coder;
 this.dStream =
   dStream.map(CoderHelpers.toByteFunction(wc)).cache().map(CoderHelpers.fromByteFunction(wc));
}
origin: baghelamit/iot-traffic-monitor

filteredIotDataStream.cache();
origin: sectong/SparkToParquet

    return list;
}).cache();
origin: XavientInformationSystems/Data-Ingestion-Platform

public static void main(String[] args) throws DataIngestException {
  CmdLineParser cmdLineParser = new CmdLineParser();
  final AppArgs appArgs = cmdLineParser.validateArgs(args);
  
  System.setProperty("HADOOP_USER_NAME", appArgs.getProperty(DiPConfiguration.HADOOP_USER_NAME));
  SparkConf conf = new SparkConf().setAppName("SparkTwitterStreaming")
      .setMaster("local[*]");
  try (JavaStreamingContext jsc = new JavaStreamingContext(new JavaSparkContext(conf), new Duration(1000))) {
    JavaPairReceiverInputDStream<String, String> stream = KafkaUtils.createStream(jsc,
        appArgs.getProperty(DiPConfiguration.ZK_HOST)+":"+appArgs.getProperty(DiPConfiguration.ZK_PORT), "spark-stream", getKafkaTopics(appArgs));
    JavaDStream<Object[]> twitterStreams = stream.map(tuple -> FlatJsonConverter.convertToValuesArray(tuple._2))
        .cache();
    
    SparkHdfsWriter.write(twitterStreams, appArgs);
    new SparkHBaseWriter(jsc.sparkContext(), appArgs).write(twitterStreams);
    SparkJdbcSourceWriter jdbcSourceWriter = new SparkJdbcSourceWriter(new SQLContext(jsc.sparkContext()),
        appArgs);
    new TopNLocationByTweets(jdbcSourceWriter,Integer.valueOf(appArgs.getProperty("topN"))).compute(twitterStreams);
    new TopNUsersWithMaxFollowers(jdbcSourceWriter,Integer.valueOf(appArgs.getProperty("topN"))).compute(twitterStreams);
    jsc.start();
    jsc.awaitTermination();
  }
}

Popular methods of JavaDStream (a combined usage sketch follows this list)

  • foreachRDD
  • map
  • mapToPair
  • union
  • filter
  • flatMap
  • dstream
  • countByValue
  • transformToPair
  • window
  • count
  • transform
  • countByValueAndWindow
  • flatMapToPair
  • print
  • reduceByWindow
  • repartition
  • glom
  • mapPartitions
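
To show how several of the methods listed above fit together, here is a hedged sketch (not from the indexed snippets) that chains flatMap, filter, window, countByValue and foreachRDD on one stream; the source, host, port and window sizes are placeholders, and it assumes the Spark 2.x Java streaming API, where flatMap returns an Iterator.

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class PopularMethodsSketch {
  public static void main(String[] args) throws InterruptedException {
    SparkConf conf = new SparkConf().setAppName("PopularMethodsSketch").setMaster("local[2]");
    JavaStreamingContext jsc = new JavaStreamingContext(conf, Durations.seconds(1));

    JavaDStream<String> lines = jsc.socketTextStream("localhost", 9999);

    // flatMap splits each line into words; filter drops empty tokens
    JavaDStream<String> words = lines
        .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
        .filter(word -> !word.isEmpty());

    // countByValue over a 10-second window that slides every 2 seconds
    JavaPairDStream<String, Long> counts =
        words.window(Durations.seconds(10), Durations.seconds(2)).countByValue();

    // foreachRDD is where results leave the streaming pipeline
    counts.foreachRDD(rdd -> rdd.take(10).forEach(System.out::println));

    jsc.start();
    jsc.awaitTermination();
  }
}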
