Thursday, October 11, 2018

Write a conditional statement with an empty Spark data frame

I need to write an if-else condition in PySpark in the following manner.

from pyspark.sql.types import StructType

# sqlCtx and sc are the SQLContext and SparkContext provided by the Spark shell
schema = StructType([])
final = sqlCtx.createDataFrame(sc.emptyRDD(), schema)

if final.rdd.isEmpty:
    print('abc')
else:
    print('pqr')

But I can't seem to find the right syntax to check whether the data frame is empty. The above final.rdd.isEmpty always evaluates to true, even when the final data frame is not empty.
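For reference, a minimal sketch of one likely cause and fix, assuming the same sqlCtx and sc as in the snippet above: RDD.isEmpty is a method, so referencing it without parentheses gives a bound-method object, which Python always treats as truthy; calling it with parentheses returns the actual boolean.

from pyspark.sql.types import StructType

# sqlCtx and sc assumed to come from the Spark shell, as in the question
schema = StructType([])
final = sqlCtx.createDataFrame(sc.emptyRDD(), schema)

# Calling isEmpty() with parentheses returns True/False instead of the method object
if final.rdd.isEmpty():
    print('abc')
else:
    print('pqr')

A common alternative with the same effect is checking len(final.head(1)) == 0, which avoids converting the data frame to an RDD at all.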
