spark2.2.0 support

spark2.2.0 support

john cheng
Hi carbon guys, At now carbondata seems not support spark2.2.0. I add spark2.2 as a new profile, and build like this: mvn -DskipTests -Pspark-2.2 -Dspark.version=2.2.0 -Dhadoop.version=2.6.0 clean package

But there're errors on spark common module:

[ERROR] /Users/zhengqh/Github/carbondata-parent-1.1.1/integration/spark-common/src/main/scala/org/apache/spark/sql/optimizer/CarbonDecoderOptimizerHelper.scala:87: error: value child is not a member of org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable
[INFO]       case i: InsertIntoTable => process(i.child, nodeList)
[INFO]                                            ^
[WARNING] 11 warnings found
[ERROR] one error found

Do you guys plan to support Spark 2.2.0, or should I downgrade to Spark 2.1.x for now?
Re: spark2.2.0 support

john cheng
If I build CarbonData against Spark 2.1.x, it works. But our Spark version is 2.2.0, and if I take the jar built against Spark 2.1.x and run it on Spark 2.2.0, there are still errors when creating a CarbonSession: ClassNotFoundException: o.a.s.sql.hive.HiveSessionState

2017-08-03 14:45 GMT+08:00 john cheng <[hidden email]>:

Re: spark2.2.0 support

john cheng
After digging into the source where the error happens (https://github.com/apache/carbondata/blob/master/integration/spark-common/src/main/scala/org/apache/spark/sql/optimizer/CarbonDecoderOptimizerHelper.scala#L87) and comparing Spark 1.6/2.1.x with Spark 2.2.0, I noticed that in Spark 1.6 InsertIntoTable has a child LogicalPlan attribute, but Spark 2.2.0 renamed it to query. However, CarbonDecoderOptimizerHelper lives in the spark-common module, which means Spark 1.6/2.1 and Spark 2.2 all use this class. So I changed the code to i.query, rebuilt against Spark 2.2.0, and it was OK. Shouldn't this code be version compatible, or version independent?
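For reference, one way such code can be made version independent is to resolve the accessor by reflection at runtime instead of naming the field at compile time. This is only a sketch of the idea under stated assumptions: the case classes below are stand-ins that mimic the field shapes of the two Spark versions, not the real org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable, and insertedPlan is a hypothetical helper, not CarbonData's actual fix.

```scala
// Stand-in case classes mimicking the two Spark versions (not the real API).
case class InsertIntoTable21(child: String) // Spark <= 2.1 shape: field is `child`
case class InsertIntoTable22(query: String) // Spark 2.2 shape: field is `query`

// Resolve the accessor at runtime, so one compiled artifact works
// against either field name.
def insertedPlan(insert: AnyRef): AnyRef = {
  val cls = insert.getClass
  val accessor =
    try cls.getMethod("child")
    catch { case _: NoSuchMethodException => cls.getMethod("query") }
  accessor.invoke(insert)
}

println(insertedPlan(InsertIntoTable21("plan-a"))) // plan-a
println(insertedPlan(InsertIntoTable22("plan-b"))) // plan-b
```

The reflection lookup costs a little per call, but it keeps a single spark-common jar usable across Spark versions without separate source trees.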

2017-08-03 14:52 GMT+08:00 john cheng <[hidden email]>:


Re: spark2.2.0 support

Ravindra Pesala
Hi,

In the current version we support only Spark 2.1.1. As Spark 2.2.0 is a relatively new major release, we need more time to upgrade and fully test CarbonData against it, and we will plan the release of the upgraded version according to that effort.

And about your query: yes, the code should be version independent, meaning the same code should work for 2.1.0, 2.1.1 and 2.2.0 as well.

Regards,
Ravindra.

On 3 August 2017 at 12:57, john cheng <[hidden email]> wrote:
--
Thanks & Regards,
Ravi