GradoopFlinkConfig.getLogicalGraphFactory

How to use getLogicalGraphFactory method in org.gradoop.flink.util.GradoopFlinkConfig

Best Java code snippets using org.gradoop.flink.util.GradoopFlinkConfig.getLogicalGraphFactory (showing top results out of 315)

origin: org.gradoop/gradoop-flink

@Override
public BaseGraphFactory<GraphHead, Vertex, Edge, LogicalGraph> getFactory() {
 return config.getLogicalGraphFactory();
}
origin: org.gradoop/gradoop-flink

/**
 * Creates a new logical graph by uniting the vertex and edge sets of all graphs
 * contained in the given collection.
 *
 * @param collection input collection
 * @return combined graph
 */
@Override
public LogicalGraph execute(GraphCollection collection) {
 return collection.getConfig().getLogicalGraphFactory().fromDataSets(
  collection.getVertices(), collection.getEdges());
}
origin: dbs-leipzig/gradoop

/**
 * Builds a {@link LogicalGraph} from the graph referenced by the given
 * graph variable.
 *
 * @param variable graph variable used in GDL script
 * @return LogicalGraph
 */
public LogicalGraph getLogicalGraphByVariable(String variable) {
 GraphHead graphHead = getGraphHeadByVariable(variable);
 Collection<Vertex> vertices = getVerticesByGraphVariables(variable);
 Collection<Edge> edges = getEdgesByGraphVariables(variable);
 return config.getLogicalGraphFactory().fromCollections(graphHead, vertices, edges);
}
origin: dbs-leipzig/gradoop

@Override
public LogicalGraph getLogicalGraph() {
 DataSet<Vertex> vertices = config.getExecutionEnvironment().readTextFile(jsonPath)
  .map(new MinimalJsonToVertex(config.getVertexFactory()));
 return config.getLogicalGraphFactory().fromDataSets(vertices);
}
origin: org.gradoop/gradoop-flink

@Override
public GraphCollection executeForGVELayout(GraphCollection collection) {
 // the resulting logical graph holds multiple graph heads
 LogicalGraph modifiedGraph = executeInternal(
  collection.getGraphHeads(),
  collection.getVertices(),
  collection.getEdges(),
  collection.getConfig().getLogicalGraphFactory());
 return collection.getFactory().fromDataSets(
  modifiedGraph.getGraphHead(),
  modifiedGraph.getVertices(),
  modifiedGraph.getEdges());
}
origin: org.gradoop/gradoop-flink

/**
 * {@inheritDoc}
 */
@Override
public LogicalGraph execute(LogicalGraph graph) {
 DataSet<GraphHead> newGraphHead = graph
  .aggregate(new VertexCount(), new EdgeCount())
  .getGraphHead()
  .map(new CalculateDensity(SamplingEvaluationConstants.PROPERTY_KEY_DENSITY));
 return graph.getConfig().getLogicalGraphFactory()
  .fromDataSets(newGraphHead, graph.getVertices(), graph.getEdges());
}
origin: org.gradoop/gradoop-flink

@Override
public LogicalGraph executeInGelly(Graph<GradoopId, NullValue, NullValue> graph)
 throws Exception {
 DataSet<Tuple3<GradoopId, GradoopId, GradoopId>> triangles =
  new org.apache.flink.graph.library.TriangleEnumerator<GradoopId, NullValue, NullValue>()
  .run(graph);
 DataSet<GraphHead> resultHead = currentGraph.getGraphHead()
  .map(new WritePropertyToGraphHeadMap(
   PROPERTY_KEY_TRIANGLES, PropertyValue.create(triangles.count())));
 return currentGraph.getConfig().getLogicalGraphFactory().fromDataSets(
  resultHead, currentGraph.getVertices(), currentGraph.getEdges());
}
origin: dbs-leipzig/gradoop

@Override
public LogicalGraph executeInGelly(Graph<GradoopId, PropertyValue, NullValue> graph) {
 DataSet<Vertex> labeledVertices = executeInternal(graph)
  .join(currentGraph.getVertices())
  .where(0).equalTo(new Id<>())
  .with(new LPVertexJoin(propertyKey));
 // return labeled graph
 return currentGraph.getConfig().getLogicalGraphFactory()
  .fromDataSets(labeledVertices, currentGraph.getEdges());
}
origin: org.gradoop/gradoop-flink

@Override
public LogicalGraph executeInGelly(Graph<GradoopId, NullValue, NullValue> graph)
 throws Exception {
 DataSet<Vertex> newVertices = hits.runInternal(graph)
  .join(currentGraph.getVertices())
  .where(new HitsResultKeySelector()).equalTo(new Id<>())
  .with(new HITSToAttributes(authorityPropertyKey, hubPropertyKey));
 return currentGraph.getConfig().getLogicalGraphFactory()
  .fromDataSets(newVertices, currentGraph.getEdges());
}
origin: dbs-leipzig/gradoop

@Override
public LogicalGraph getGraph(final GradoopId graphID) {
 // filter vertices and edges based on given graph id
 DataSet<GraphHead> graphHead = getGraphHeads()
  .filter(new BySameId<>(graphID));
 DataSet<Vertex> vertices = getVertices()
  .filter(new InGraph<>(graphID));
 DataSet<Edge> edges = getEdges()
  .filter(new InGraph<>(graphID));
 return new LogicalGraph(
  config.getLogicalGraphFactory().fromDataSets(graphHead, vertices, edges),
  getConfig());
}
origin: org.gradoop/gradoop-flink

@Override
public LogicalGraph executeInGelly(Graph<GradoopId, NullValue, Double> graph)
 throws Exception {
 DataSet<Vertex> newVertices = new org.apache.flink.graph.library.SingleSourceShortestPaths
  <GradoopId, NullValue>(srcVertexId, iterations)
  .run(graph)
  .join(currentGraph.getVertices())
  .where(0)
  .equalTo(new Id<>())
  .with(new SingleSourceShortestPathsAttribute(propertyKeyVertex));
 return currentGraph.getConfig().getLogicalGraphFactory().fromDataSets(newVertices,
  currentGraph.getEdges());
}
origin: org.gradoop/gradoop-flink

@Override
public LogicalGraph execute(LogicalGraph graph) {
 DataSet<Vertex> vertices = graph.getVertices();
 DataSet<Edge> edges = graph.getEdges();
 DataSet<Map<String, PropertyValue>> aggregate = aggregateVertices(vertices)
  .union(aggregateEdges(edges))
  .reduceGroup(new CombinePartitionAggregates(aggregateFunctions));
 DataSet<GraphHead> graphHead = graph.getGraphHead()
  .map(new SetAggregateProperty(aggregateFunctions))
  .withBroadcastSet(aggregate, SetAggregateProperty.VALUE);
 return graph.getConfig().getLogicalGraphFactory()
  .fromDataSets(graphHead, vertices, edges);
}
origin: org.gradoop/gradoop-flink

@Override
public LogicalGraph executeInGelly(Graph<GradoopId, NullValue, NullValue> graph)
 throws Exception {
 DataSet<Vertex> newVertices =
  new org.apache.flink.graph.library.linkanalysis.PageRank<GradoopId, NullValue, NullValue>(
   dampingFactor, iterations).setIncludeZeroDegreeVertices(includeZeroDegrees).run(graph)
  .join(currentGraph.getVertices())
  .where(new PageRankResultKey()).equalTo(new Id<>())
  .with(new PageRankToAttribute(propertyKey));
 return currentGraph.getConfig().getLogicalGraphFactory().fromDataSets(
  currentGraph.getGraphHead(), newVertices, currentGraph.getEdges());
}
origin: org.gradoop/gradoop-flink

@Override
public LogicalGraph execute(LogicalGraph graph) {
 VertexCount vertexCountAggregate = new VertexCount();
 SumVertexProperty localCCAggregate = new SumVertexProperty(
  ClusteringCoefficientBase.PROPERTY_KEY_LOCAL);
 graph = new GellyLocalClusteringCoefficientDirected().execute(graph)
  .aggregate(vertexCountAggregate, localCCAggregate);
 DataSet<GraphHead> graphHead = graph.getGraphHead().map(new AddAverageCCValueToGraphHeadMap(
   localCCAggregate.getAggregatePropertyKey(), vertexCountAggregate.getAggregatePropertyKey(),
   PROPERTY_KEY_AVERAGE));
 return graph.getConfig().getLogicalGraphFactory().fromDataSets(
  graphHead, graph.getVertices(), graph.getEdges());
}
org.gradoop.flink.util.GradoopFlinkConfig.getLogicalGraphFactory

Javadoc

Returns a factory that is able to create logical graph layouts.
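
Below is a minimal usage sketch (not taken from the index above): it obtains the factory from a GradoopFlinkConfig and assembles a small LogicalGraph with fromCollections, as in the snippets above. The element-factory calls (createGraphHead, createVertex, createEdge), the example class name, and the import packages are assumptions based on a recent Gradoop release and may differ between versions.

import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;

import org.apache.flink.api.java.ExecutionEnvironment;
import org.gradoop.common.model.impl.pojo.Edge;
import org.gradoop.common.model.impl.pojo.GraphHead;
import org.gradoop.common.model.impl.pojo.Vertex;
import org.gradoop.flink.model.impl.epgm.LogicalGraph;
import org.gradoop.flink.util.GradoopFlinkConfig;

public class LogicalGraphFactoryExample {
 public static void main(String[] args) throws Exception {
  // Bind the Gradoop configuration to a Flink execution environment.
  ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
  GradoopFlinkConfig config = GradoopFlinkConfig.createConfig(env);

  // Build a tiny graph in memory using the element factories of the config
  // (factory method names are assumed here).
  GraphHead head = config.getGraphHeadFactory().createGraphHead("example");
  Vertex alice = config.getVertexFactory().createVertex("Person");
  Vertex bob = config.getVertexFactory().createVertex("Person");
  Edge knows = config.getEdgeFactory().createEdge("knows", alice.getId(), bob.getId());

  Collection<Vertex> vertices = Arrays.asList(alice, bob);
  Collection<Edge> edges = Collections.singletonList(knows);

  // getLogicalGraphFactory() returns the factory that assembles the LogicalGraph.
  LogicalGraph graph = config.getLogicalGraphFactory()
   .fromCollections(head, vertices, edges);

  graph.getVertices().print();
 }
}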

Popular methods of GradoopFlinkConfig

  • getGraphCollectionFactory
    Returns a factory that is able to create graph collection layouts.
  • getExecutionEnvironment
    Returns the Flink execution environment.
  • getVertexFactory
    Returns the factory that is used to create vertex instances.
  • getEdgeFactory
    Returns the factory that is used to create edge instances.
  • createConfig
    Creates a Gradoop Flink configuration using the given parameters.
  • getGraphHeadFactory
    Returns the factory that is used to create graph head instances.
  • <init>
    Creates a new Configuration.
  • setGraphCollectionLayoutFactory
    Sets the layout factory for building layouts that represent a GraphCollection.
  • setLogicalGraphLayoutFactory
    Sets the layout factory for building layouts that represent a LogicalGraph.
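
As a rough orientation, the sketch below strings together a few of the methods listed above; createEmptyGraph and createEmptyCollection are assumed to exist on the respective factories, and package names may differ between Gradoop versions.

import org.apache.flink.api.java.ExecutionEnvironment;
import org.gradoop.flink.model.impl.epgm.GraphCollection;
import org.gradoop.flink.model.impl.epgm.LogicalGraph;
import org.gradoop.flink.util.GradoopFlinkConfig;

public class GradoopFlinkConfigTour {
 public static void main(String[] args) {
  // createConfig binds the Gradoop configuration to a Flink execution environment.
  ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
  GradoopFlinkConfig config = GradoopFlinkConfig.createConfig(env);

  // getExecutionEnvironment returns the environment the config was created with.
  System.out.println(config.getExecutionEnvironment() == env); // true

  // Two sibling factories: one for single logical graphs, one for graph collections
  // (createEmptyGraph / createEmptyCollection are assumed factory methods).
  LogicalGraph emptyGraph = config.getLogicalGraphFactory().createEmptyGraph();
  GraphCollection emptyCollection = config.getGraphCollectionFactory().createEmptyCollection();
 }
}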
