How to write a Spark DataFrame to a Neo4j database


I'd like to build the following workflow:

  • preprocess the data with Spark, ending up with a DataFrame
  • write that DataFrame to Neo4j as a set of nodes

My idea is quite basic: write each row of the DataFrame as a node, where each column value becomes the value of one of that node's attributes.
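To make that concrete, here is a small sketch of the mapping I have in mind (the `Person` label, the column names and the sample values are only placeholders for illustration):

```scala
import org.apache.spark.sql.SparkSession

// Illustration of the intended row-to-node mapping only.
val spark = SparkSession.builder
  .appName("df-to-neo4j")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

// Desired outcome in Neo4j: one node per row, one property per column, e.g.
//   (:Person {id: 1, name: "alice"})
//   (:Person {id: 2, name: "bob"})
```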

I have seen many articles, including neo4j-spark-connector and Introducing the Neo4j 3.0 Apache Spark Connector, but they focus on importing data from a Neo4j database into Spark... So far, I haven't been able to find a clear example of writing a Spark DataFrame to a Neo4j database.

Any pointer to documentation or a basic example would be appreciated.

You can write your own routine and use the open-source Neo4j Java driver

https://github.com/neo4j/neo4j-java-driver

for example.

Simply serialise the result of the RDD (using rdd.toJSON), then use the above driver to create the Neo4j nodes and push them to your Neo4j instance.
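Along those lines, here is a minimal Scala sketch of such a routine, assuming the 1.x Neo4j Java driver (org.neo4j.driver:neo4j-java-driver) is on the classpath and a Bolt endpoint is reachable at bolt://localhost:7687. The URL, credentials, `Person` label and batch size below are placeholders you would replace with your own values:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.neo4j.driver.v1.{AuthTokens, GraphDatabase, Values}

import scala.collection.JavaConverters._

object DataFrameToNeo4j {

  // Writes every row of `df` as one node with the given label; every column
  // becomes a node property. Works out of the box only for column types the
  // driver understands (strings, numbers, booleans, ...).
  def writeNodes(df: DataFrame, label: String): Unit = {
    val columns = df.columns

    // One driver/session per partition: the driver is not serializable, so it
    // has to be created on the executors, not in the driver program.
    df.rdd.foreachPartition { rows =>
      val neo4j = GraphDatabase.driver(
        "bolt://localhost:7687",
        AuthTokens.basic("neo4j", "password"))
      val session = neo4j.session()
      try {
        // Batch the rows so each statement sends a reasonable amount of data.
        rows.grouped(1000).foreach { batch =>
          val rowMaps = batch.map(r => columns.zip(r.toSeq).toMap.asJava).asJava
          session.run(
            s"UNWIND {rows} AS row CREATE (n:$label) SET n = row",
            Values.parameters("rows", rowMaps))
        }
      } finally {
        session.close()
        neo4j.close()
      }
    }
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("df-to-neo4j").getOrCreate()
    import spark.implicits._

    val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")
    writeNodes(df, "Person")
    spark.stop()
  }
}
```

The JSON route mentioned above works the same way: parse each rdd.toJSON string on the executor into a map of properties, then run the same UNWIND ... CREATE statement with those maps as the parameter.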

