Convert.to

How to use the to method in org.apache.beam.sdk.schemas.transforms.Convert

Best Java code snippets using org.apache.beam.sdk.schemas.transforms.Convert.to (Showing top 15 results out of 315)

origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <Row>} into a {@link PCollection}{@literal <OutputT>}.
 *
 * <p>The output schema will be inferred using the schema registry. A schema must be registered
 * for this type, or the conversion will fail.
 */
public static <OutputT> PTransform<PCollection<Row>, PCollection<OutputT>> fromRows(
  Class<OutputT> clazz) {
 return to(clazz);
}
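
A minimal usage sketch of fromRows (the UserPojo class and its fields are hypothetical): the target type must have a schema the registry can infer, for example via @DefaultSchema(JavaFieldSchema.class).

import org.apache.beam.sdk.schemas.JavaFieldSchema;
import org.apache.beam.sdk.schemas.annotations.DefaultSchema;
import org.apache.beam.sdk.schemas.transforms.Convert;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// Hypothetical target type; the annotation registers a schema inferred from its public fields.
@DefaultSchema(JavaFieldSchema.class)
public class UserPojo {
  public String name;
  public int age;
}

// Inside a pipeline, assuming `rows` has a schema with matching `name` and `age` fields:
// PCollection<Row> rows = ...;
// PCollection<UserPojo> users = rows.apply(Convert.fromRows(UserPojo.class));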
origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <Row>} into a {@link PCollection}{@literal <OutputT>}.
 *
 * <p>The output schema will be inferred using the schema registry. A schema must be registered
 * for this type, or the conversion will fail.
 */
public static <OutputT> PTransform<PCollection<Row>, PCollection<OutputT>> fromRows(
  TypeDescriptor<OutputT> typeDescriptor) {
 return to(typeDescriptor);
}
origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <InputT>} into a {@link PCollection}{@literal <Row>}.
 *
 * <p>The input {@link PCollection} must have a schema attached. The output collection will have
 * the same schema as the input.
 */
public static <InputT> PTransform<PCollection<InputT>, PCollection<Row>> toRows() {
 return to(Row.class);
}
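
Conversely, a short toRows sketch (EventPojo is hypothetical): any collection whose elements carry a schema can be converted to Row values with that same schema.

// Hypothetical schema-carrying type:
@DefaultSchema(JavaFieldSchema.class)
public class EventPojo {
  public String id;
  public long timestamp;
}

// PCollection<EventPojo> events = pipeline.apply(Create.of(new EventPojo()));
// PCollection<Row> rows = events.apply(Convert.toRows());
// `rows` now has the schema inferred from EventPojo's fields.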
origin: org.apache.beam/beam-sdks-java-core

/**
 * Convert a {@link PCollection}{@literal <InputT>} to a {@link PCollection}{@literal <OutputT>}.
 *
 * <p>This function allows converting between two types as long as the two types have
 * <i>compatible</i> schemas. Two schemas are said to be <i>compatible</i> if they recursively
 * have fields with the same names, but possibly different orders.
 */
public static <InputT, OutputT> PTransform<PCollection<InputT>, PCollection<OutputT>> to(
  Class<OutputT> clazz) {
 return to(TypeDescriptor.of(clazz));
}
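
To make "compatible schemas" concrete, here is a sketch with two hypothetical types whose fields share the same names but are declared in a different order; Convert.to matches them by field name.

@DefaultSchema(JavaFieldSchema.class)
public class Source {
  public String id;
  public long count;
}

@DefaultSchema(JavaFieldSchema.class)
public class Target {
  public long count;   // same field names as Source, different order
  public String id;
}

// PCollection<Source> in = ...;
// PCollection<Target> out = in.apply(Convert.to(Target.class));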
origin: org.apache.beam/beam-sdks-java-core

 @Test
 @Category(NeedsRunner.class)
 public void testGeneralConvert() {
  PCollection<POJO2> pojos =
    pipeline.apply(Create.of(new POJO1())).apply(Convert.to(POJO2.class));
  PAssert.that(pojos).containsInAnyOrder(new POJO2());
  pipeline.run();
 }
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testSelectAll() {
 PCollection<POJO1> pojos =
   pipeline
     .apply(Create.of(new POJO1()))
     .apply(Select.fieldAccess(FieldAccessDescriptor.withAllFields()))
     .apply(Convert.to(POJO1.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testSimpleSelect() {
 PCollection<POJO1Selected> pojos =
   pipeline
     .apply(Create.of(new POJO1()))
     .apply(Select.fieldNames("field1", "field3"))
     .apply(Convert.to(POJO1Selected.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO1Selected());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testComplexCast() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(All2.class);
 PCollection<All2> pojos =
   pipeline
     .apply(Create.of(new All1()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(All2.class));
 PAssert.that(pojos).containsInAnyOrder(new All2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testTypeNarrow() throws Exception {
 // narrowing is the opposite of widening
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(TypeWiden1.class);
 PCollection<TypeWiden1> pojos =
   pipeline
     .apply(Create.of(new TypeWiden2()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(TypeWiden1.class));
 PAssert.that(pojos).containsInAnyOrder(new TypeWiden1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testTypeWiden() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(TypeWiden2.class);
 PCollection<TypeWiden2> pojos =
   pipeline
     .apply(Create.of(new TypeWiden1()))
     .apply(Cast.widening(outputSchema))
     .apply(Convert.to(TypeWiden2.class));
 PAssert.that(pojos).containsInAnyOrder(new TypeWiden2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testWeakenNullable() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(Nullable2.class);
 PCollection<Nullable2> pojos =
   pipeline
     .apply(Create.of(new Nullable1()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(Nullable2.class));
 PAssert.that(pojos).containsInAnyOrder(new Nullable2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testProjection() throws Exception {
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(Projection2.class);
 PCollection<Projection2> pojos =
   pipeline
     .apply(Create.of(new Projection1()))
     .apply(Cast.widening(outputSchema))
     .apply(Convert.to(Projection2.class));
 PAssert.that(pojos).containsInAnyOrder(new Projection2());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testIgnoreNullable() throws Exception {
 // ignoring nullable is opposite of weakening
 Schema outputSchema = pipeline.getSchemaRegistry().getSchema(Nullable1.class);
 PCollection<Nullable1> pojos =
   pipeline
     .apply(Create.of(new Nullable2()))
     .apply(Cast.narrowing(outputSchema))
     .apply(Convert.to(Nullable1.class));
 PAssert.that(pojos).containsInAnyOrder(new Nullable1());
 pipeline.run();
}
origin: org.apache.beam/beam-sdks-java-core

 @Test
 @Category(NeedsRunner.class)
 public void testSelectNestedPartial() {
  PCollection<POJO2NestedPartial> pojos =
    pipeline
      .apply(Create.of(new POJO2()))
      .apply(
        Select.fieldAccess(
          FieldAccessDescriptor.create()
            .withNestedField(
              "field2", FieldAccessDescriptor.withFieldNames("field1", "field3"))))
      .apply(Convert.to(POJO2NestedPartial.class));
  PAssert.that(pojos).containsInAnyOrder(new POJO2NestedPartial());
  pipeline.run();
 }
}
origin: org.apache.beam/beam-sdks-java-core

@Test
@Category(NeedsRunner.class)
public void testSelectNestedAll() {
 PCollection<POJO2NestedAll> pojos =
   pipeline
     .apply(Create.of(new POJO2()))
     .apply(
       Select.fieldAccess(
         FieldAccessDescriptor.create()
           .withNestedField("field2", FieldAccessDescriptor.withAllFields())))
     .apply(Convert.to(POJO2NestedAll.class));
 PAssert.that(pojos).containsInAnyOrder(new POJO2NestedAll());
 pipeline.run();
}
org.apache.beam.sdk.schemas.transforms.Convert.to

Javadoc

Convert a PCollection<InputT> to a PCollection<OutputT>.

This function allows converting between two types as long as the two types have compatible schemas. Two schemas are said to be compatible if they recursively have fields with the same names, but possibly different orders.
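
Since to(Class) simply delegates to to(TypeDescriptor.of(clazz)), the TypeDescriptor overload can also be called directly; a minimal sketch, reusing the hypothetical Target type from the sketch above:

import org.apache.beam.sdk.values.TypeDescriptor;

// PCollection<Source> in = ...;
// PCollection<Target> out = in.apply(Convert.to(TypeDescriptor.of(Target.class)));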

Popular methods of Convert

  • toRows
    Convert a PCollection<InputT> into a PCollection<Row>. The input PCollection must have a schema attached. The output collection will have the same schema as the input.
  • fromRows
    Convert a PCollection<Row> into a PCollection<OutputT>. The output schema will be inferred using the schema registry. A schema must be registered for this type, or the conversion will fail.
