logging - Override Spark log4j configurations
I'm running Spark on a YARN cluster and have log4j.properties configured so that all logs go to a log file by default. However, for some Spark jobs I want the logs to go to the console instead, without changing the log4j file or the code of the actual job. What is the best way to achieve this? Thanks, all.
I know of at least 4 solutions to this problem.
1. You can modify the log4j.properties on your Spark machines.
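For reference, a minimal log4j.properties that routes everything to the console could look like the sketch below; the root level and pattern layout here are illustrative defaults, not taken from the question:

# Send all logs to stderr instead of a file
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n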
2. When running a job on Spark, it is better to attach the log4j configuration file to the spark-submit command, for example:
bin/spark-submit --class com.viaplay.log4jtest.log4jtest --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/users/feng/sparklog4j/sparklog4jtest/target/log4j2.properties" --master local[*] /users/feng/sparklog4j/sparklog4jtest/target/sparklog4jtest-1.0-jar-with-dependencies.jar
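Since the question is about YARN: in yarn mode the properties file also has to be shipped to the cluster with --files so the containers can find it. A sketch of that variant, where the class name, jar, and file names are placeholders:

bin/spark-submit --class com.example.MyJob \
  --master yarn \
  --files /local/path/console-log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=console-log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=console-log4j.properties" \
  my-job.jar

Note that for the driver in client mode, the file runs locally, so an absolute file: path to the local copy may be needed instead of the shipped file name.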
3. Try to import the log4j logic into your code:
import org.apache.log4j.Logger
import org.apache.log4j.Level

Put these loggers in your sparkContext() function:

Logger.getLogger("org").setLevel(Level.INFO)
Logger.getLogger("akka").setLevel(Level.INFO)
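A minimal, self-contained sketch of where these calls usually sit; the object name and app name are made up for illustration:

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

object Log4jTest {
  def main(args: Array[String]): Unit = {
    // Set the level of Spark's ("org") and Akka's loggers programmatically,
    // before the SparkContext starts producing its startup output.
    Logger.getLogger("org").setLevel(Level.INFO)
    Logger.getLogger("akka").setLevel(Level.INFO)

    val sc = new SparkContext(new SparkConf().setAppName("log4j-test"))
    sc.parallelize(1 to 10).foreach(_ => ())  // your actual job logic goes here
    sc.stop()
  }
}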
4. Spark: use spark.sql.SparkSession:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder.getOrCreate()
spark.sparkContext.setLogLevel("ERROR")
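Note that setLogLevel overrides any user-defined log settings; the accepted values are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE and WARN.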