JavaDStream.reduceByWindow

How to use the reduceByWindow method in org.apache.spark.streaming.api.java.JavaDStream
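Before the indexed snippets, a minimal, self-contained sketch of the two reduceByWindow overloads. It assumes a local JavaStreamingContext reading lines from a socket; the class name, host, port, durations and checkpoint directory are illustrative and not taken from the snippets below.

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class ReduceByWindowSketch {
  public static void main(String[] args) throws Exception {
    SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("ReduceByWindowSketch");
    JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(1000));

    // Map every incoming line to 1L so the windowed reduce becomes a count.
    JavaDStream<Long> ones = jssc.socketTextStream("localhost", 9999).map(line -> 1L);

    // Overload 1: re-reduce the whole window on every slide.
    JavaDStream<Long> counts = ones.reduceByWindow(
        (a, b) -> a + b,          // associative, commutative reduce function
        new Duration(30000),      // window length
        new Duration(10000));     // slide interval

    // Overload 2: with an inverse function the window is maintained incrementally
    // (entering batches are added, expired batches are subtracted); requires checkpointing.
    jssc.checkpoint("/tmp/spark-checkpoint");
    JavaDStream<Long> incrementalCounts = ones.reduceByWindow(
        (a, b) -> a + b,          // add data entering the window
        (a, b) -> a - b,          // subtract data leaving the window
        new Duration(30000),
        new Duration(10000));

    counts.print();
    incrementalCounts.print();

    jssc.start();
    jssc.awaitTermination();
  }
}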

Best Java code snippets using org.apache.spark.streaming.api.java.JavaDStream.reduceByWindow (Showing top 5 results out of 315)

origin: databricks/learning-spark

public Long call(ApacheAccessLog entry) {
 return 1L;                             // each access-log entry counts as one request
}}).reduceByWindow(new Function2<Long, Long, Long>() {
  public Long call(Long v1, Long v2) {
   return v1 + v2;                      // sum the per-entry counts across the window
  }
}, new Duration(30 * 1000), new Duration(10 * 1000)); // window and slide durations are illustrative

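In the learning-spark example this pattern computes a per-window request count over a DStream of ApacheAccessLog records: every record is mapped to 1L and the windowed reduce sums those ones. The same shape works for any JavaDStream once each element is mapped to a numeric value.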
origin: org.apache.spark/spark-streaming_2.11

@Test
public void testReduceByWindow() {
 List<List<Integer>> inputData = Arrays.asList(
  Arrays.asList(1, 2, 3),
  Arrays.asList(4, 5, 6),
  Arrays.asList(7, 8, 9));
 List<List<Integer>> expected = Arrays.asList(
  Arrays.asList(6),
  Arrays.asList(21),
  Arrays.asList(39),
  Arrays.asList(24));
 JavaDStream<Integer> stream = JavaTestUtils.attachTestInputStream(ssc, inputData, 1);
 JavaDStream<Integer> reducedWindowed = stream.reduceByWindow(
  (x, y) -> x + y, (x, y) -> x - y, new Duration(2000), new Duration(1000));
 JavaTestUtils.attachTestOutputStream(reducedWindowed);
 List<List<Integer>> result = JavaTestUtils.runStreams(ssc, 4, 4);
 Assert.assertEquals(expected, result);
}
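Why the expected output is 6, 21, 39 and 24: the three input batches reduce to 6, 15 and 24; with a 2000 ms window sliding every 1000 ms over 1000 ms batches, successive windows cover [6], [6, 15], [15, 24] and finally only [24]. The second lambda, (x, y) -> x - y, is the inverse reduce function used to subtract the batch that slides out of the window.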
origin: org.apache.spark/spark-streaming_2.10 (identical to the spark-streaming_2.11 snippet above)
origin: org.apache.spark/spark-streaming_2.11

@SuppressWarnings("unchecked")
private void testReduceByWindow(boolean withInverse) {
 List<List<Integer>> inputData = Arrays.asList(
   Arrays.asList(1,2,3),
   Arrays.asList(4,5,6),
   Arrays.asList(7,8,9));
 List<List<Integer>> expected = Arrays.asList(
   Arrays.asList(6),
   Arrays.asList(21),
   Arrays.asList(39),
   Arrays.asList(24));
 JavaDStream<Integer> stream = JavaTestUtils.attachTestInputStream(ssc, inputData, 1);
 JavaDStream<Integer> reducedWindowed;
 if (withInverse) {
  reducedWindowed = stream.reduceByWindow(new IntegerSum(),
                      new IntegerDifference(),
                      new Duration(2000),
                      new Duration(1000));
 } else {
  reducedWindowed = stream.reduceByWindow(new IntegerSum(),
                      new Duration(2000), new Duration(1000));
 }
 JavaTestUtils.attachTestOutputStream(reducedWindowed);
 List<List<Integer>> result = JavaTestUtils.runStreams(ssc, 4, 4);
 Assert.assertEquals(expected, result);
}
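IntegerSum and IntegerDifference are not shown on this page. A minimal sketch of what such helpers could look like, assuming they are plain Function2 implementations (the actual classes in Spark's test suite may differ):

import org.apache.spark.api.java.function.Function2;

// Illustrative stand-ins for the IntegerSum / IntegerDifference helpers used above.
class IntegerSum implements Function2<Integer, Integer, Integer> {
  @Override
  public Integer call(Integer a, Integer b) {
    return a + b;
  }
}

class IntegerDifference implements Function2<Integer, Integer, Integer> {
  @Override
  public Integer call(Integer a, Integer b) {
    return a - b;
  }
}

With the inverse function supplied (the withInverse branch), the window is maintained incrementally by adding entering batches and subtracting expired ones; without it, the whole window is re-reduced on every slide.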
origin: org.apache.spark/spark-streaming_2.10 (identical to the spark-streaming_2.11 snippet above)

Popular methods of JavaDStream

  • foreachRDD
  • map
  • mapToPair
  • union
  • filter
  • flatMap
  • dstream
  • countByValue
  • cache
  • transformToPair
  • window
  • count
  • transform
  • countByValueAndWindow
  • flatMapToPair
  • print
  • repartition
  • glom
  • mapPartitions
