logging - Override Spark log4j configurations


I'm running Spark on a YARN cluster and have log4j.properties configured so that, by default, all logs go to a log file. However, for some Spark jobs I want the logs to go to the console instead, without changing the log4j.properties file or the code of the actual job. What is the best way to achieve this? Thanks, all.

I know of at least four solutions to this problem.

  1. Modify log4j.properties on the Spark machines (see the sketch after this list).

  2. When submitting a job with spark-submit, it is better to attach a log4j configuration file to the submit command, for example:

    bin/spark-submit --class com.viaplay.log4jtest.log4jtest --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/users/feng/sparklog4j/sparklog4jtest/target/log4j2.properties" --master local[*] /users/feng/sparklog4j/sparklog4jtest/target/sparklog4jtest-1.0-jar-with-dependencies.jar

  3. Import the log4j API directly in your code:

    import org.apache.log4j.Logger;
    import org.apache.log4j.Level;

    // Put these right after the SparkContext() is created:
    Logger.getLogger("org").setLevel(Level.INFO);
    Logger.getLogger("akka").setLevel(Level.INFO);

  4. With Spark 2.x, use spark.sql.SparkSession:

    import org.apache.spark.sql.SparkSession
    val spark = SparkSession.builder.getOrCreate()
    spark.sparkContext.setLogLevel("ERROR")
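
For option 1, here is a minimal sketch of what a console-only log4j.properties on the Spark machines could look like. It mirrors Spark's bundled conf/log4j.properties.template; the log level and conversion pattern are just example choices, so adjust them to your needs:

    # Route everything to the console (stderr) instead of a file
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n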

