Hi there.
I couldn't find any documentation on initializing PySpark.
I've initialized the Spark session with
```
%%classpath add mvn
org.apache.spark spark-sql_2.11 2.3.1
```
followed by
Everything works fine up to that point.
But when I try to connect to the Spark session from Python, I don't have an entry point to Spark, because Spark was initialized through the jar files rather than through Python.
Is there a tutorial or documentation that outlines how to use Spark from Python, i.e. through PySpark?
Cheers.