Where and Filter in Spark DataFrames

Filtering rows from a DataFrame is one of the most basic tasks performed when analyzing data with Spark. Spark provides two functions for this: where and filter. Both work in exactly the same way (where is simply an alias for filter), but we will mostly use “where” because of its familiarity from SQL.

Using Where / Filter in a Spark DataFrame

We can easily filter rows on a condition, just as we would in SQL, using the where function. Say we need to find all rows where the number of flights between two countries is more than 50.
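The original code isn't reproduced here, so as a minimal PySpark sketch, assuming a hypothetical flight-summary DataFrame with columns DEST_COUNTRY_NAME, ORIGIN_COUNTRY_NAME, and count:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("where-filter-demo").getOrCreate()

# Hypothetical flight-summary data: destination, origin, and number of flights
flight_df = spark.createDataFrame(
    [
        ("United States", "India", 62),
        ("United States", "Ireland", 344),
        ("Egypt", "United States", 15),
    ],
    ["DEST_COUNTRY_NAME", "ORIGIN_COUNTRY_NAME", "count"],
)

# where accepts a SQL-style string expression, just like a WHERE clause
flight_df.where("count > 50").show()
```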

We can also use column expressions instead of SQL strings. This time we will use the filter function to get the desired rows from the DataFrame.
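Since filter is an alias for where, it accepts the same two forms: a SQL string or a Column expression. A sketch using the same hypothetical flight_df:

```python
from pyspark.sql.functions import col

# The same condition, expressed as a Column expression instead of a SQL string
flight_df.filter(col("count") > 50).show()
```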

Chaining Multiple Conditions

Though it is possible to combine multiple where conditions in one statement, it is not necessary. Even when we chain multiple conditions one after another, Spark will optimize them into a single filter step when it creates the physical plan for execution.

That is why it is usually a better idea to write multiple where conditions separately: the code is easier to understand when reading it, as the sketch below shows.
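A sketch of both styles, again using the hypothetical flight_df; the country name in the second condition is just an illustrative value. Calling explain() on the chained version lets you verify that Catalyst collapses the chained calls into a single Filter step in the physical plan:

```python
from pyspark.sql.functions import col

# One statement, with the conditions combined using & ...
combined = flight_df.where(
    (col("count") > 50) & (col("ORIGIN_COUNTRY_NAME") != "Ireland")
)

# ...or equivalent chained where calls, one condition each (easier to read)
chained = (
    flight_df
    .where(col("count") > 50)
    .where(col("ORIGIN_COUNTRY_NAME") != "Ireland")
)

# Both produce the same physical plan: the two filters are merged into one step
chained.explain()
```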


I hope you found this useful :). See you in the next blog.
