I was upgrading my Java-based application, which uses Scala, Spark, and json4s for some processing; we are migrating from Java 1.8 to Java 17 and are currently facing an issue during deployment.

With json4s-jackson_2.12 below 4.0.0, the error is `java.lang.ClassNotFoundException: org.json4s.JsonAssoc$`, but with json4s-jackson_2.12 4.0.0 or above I instead get:

java.lang.ClassNotFoundException: org.json4s.JsonAST$JValue

Since I need to use Java 17, I have to use Scala 2.12.15+ if staying on Scala 2.12.x, per the Scala/JDK compatibility matrix.

The issue seems to be a binary incompatibility between Spark and the json4s-jackson library. Can someone help with this and guide me to a Spark version that is compatible with json4s-jackson on Scala 2.12.15 and JDK 17? I can provide additional details if required.
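One common workaround, sketched below under the assumption that the Spark distribution bundles the json4s 3.x line (the exact bundled version can be confirmed with `ls $SPARK_HOME/jars | grep json4s`), is to pin the application's json4s dependency to the bundled 3.x version instead of upgrading to 4.x. The coordinates and versions here are illustrative, not prescriptive:

```xml
<!-- Maven sketch (versions are assumptions; match them to the json4s jar
     actually found in $SPARK_HOME/jars of your Spark distribution). -->
<dependencies>
  <!-- Spark provides its own json4s 3.x at runtime, so mark Spark provided
       to avoid shipping a second, conflicting copy. -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.3.0</version>
    <scope>provided</scope>
  </dependency>
  <!-- Pin json4s to the 3.x line Spark bundles; mixing 3.x and 4.x on one
       classpath produces exactly the JsonAssoc$ / JsonAST$JValue
       ClassNotFoundExceptions described above. -->
  <dependency>
    <groupId>org.json4s</groupId>
    <artifactId>json4s-jackson_2.12</artifactId>
    <version>3.7.0-M11</version>
  </dependency>
</dependencies>
```

If the application code was written against the json4s 4.x API, it would need to be compiled against the same 3.x version instead, since the two lines are not binary compatible.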
iamdnair changed the title from "Spark 3.3.0 or greater than 3.3.0 version not working with json4s" to "Spark 3.3.0 or greater than 3.3.0 versions giving class not found with json4s library" on Feb 13, 2024.
json4s versions tried: 3.6.10, 4.1.0-M4
Scala version: 2.12.15
JDK version: 17
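To confirm which json4s binary layout actually ends up on the classpath at runtime, a small probe like the one below can help (`Json4sProbe` is a hypothetical helper name, not part of json4s or Spark):

```java
import java.util.List;

// Minimal classpath probe: report whether each candidate json4s AST class
// can be loaded, to reveal which json4s line (3.x or 4.x) is on the classpath.
public class Json4sProbe {
    // Returns "present" if the named class can be loaded, "missing" otherwise.
    static String probe(String className) {
        try {
            Class.forName(className);
            return "present";
        } catch (ClassNotFoundException e) {
            return "missing";
        }
    }

    public static void main(String[] args) {
        // json4s 3.x defines JValue nested inside the JsonAST object, so its
        // binary class is org.json4s.JsonAST$JValue; json4s 4.x moved the AST
        // types to the top-level package, so the class is org.json4s.JValue.
        for (String name : List.of("org.json4s.JsonAST$JValue", "org.json4s.JValue")) {
            System.out.println(name + ": " + probe(name));
        }
    }
}
```

Code compiled against one layout fails with `ClassNotFoundException` when only the other layout is present at runtime, which matches the pair of errors reported above.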