[Original] Uncle's Experience Sharing (65): Spark cannot read Hive tables
Spark 2.4.3
Steps for reading Hive tables from Spark:
1)hive-site.xml
Put hive-site.xml under $SPARK_HOME/conf.
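A minimal hive-site.xml usually only needs to tell Spark where the Hive metastore lives. The sketch below is not the exact file used in this post: the thrift URI is a placeholder for your own metastore address, and the warehouse dir simply matches the one that shows up in the log further down.

<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://your-metastore-host:9083</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>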
2)enableHiveSupport
SparkSession.builder.enableHiveSupport().getOrCreate()
3) Test code
val sparkConf = new SparkConf().setAppName(getName)
val sc = new SparkContext(sparkConf)
val spark = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
spark.sql("show databases").rdd.foreach(println)
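The job was then packaged and submitted with spark-submit. A typical invocation looks roughly like the following; the class name, master, and jar path are placeholders, not taken from the original post:

$SPARK_HOME/bin/spark-submit \
  --class com.example.HiveReadTest \
  --master yarn \
  --deploy-mode client \
  /path/to/your-app.jar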
After submitting the job with $SPARK_HOME/bin/spark-submit, it turned out the Hive databases could not be read. The relevant log:
19/05/31 13:11:31 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
19/05/31 13:11:31 INFO SharedState: loading hive config file: file:/export/spark-2.4.3-bin-hadoop2.6/conf/hive-site.xml
19/05/31 13:11:31 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
19/05/31 13:11:31 INFO SharedState: Warehouse path is '/user/hive/warehouse'.
19/05/31 13:11:31 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoin
This shows that hive-site.xml was indeed loaded.
Testing further: both $SPARK_HOME/bin/spark-sql and $SPARK_HOME/bin/spark-shell can read the Hive databases without any problem, which is rather puzzling.
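A quick way to see which catalog a given session actually ended up with (a small verification sketch added here, not part of the original post) is to read spark.sql.catalogImplementation from the underlying SparkConf: "hive" means the Hive external catalog is active, while the default "in-memory" means Hive tables will be invisible.

// Works in spark-shell (where `spark` is predefined) or inside application code.
val catalogImpl = spark.sparkContext.getConf.get("spark.sql.catalogImplementation", "in-memory")
println(s"catalog implementation: $catalogImpl")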
$SPARK_HOME/bin/spark-shell launches the class org.apache.spark.repl.Main:
"${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
Digging into the org.apache.spark.repl.Main code:
...
val builder = SparkSession.builder.config(conf)
if (conf.get(CATALOG_IMPLEMENTATION.key, "hive").toLowerCase(Locale.ROOT) == "hive") {
  if (SparkSession.hiveClassesArePresent) {
    // In the case that the property is not set at all, builder's config
    // does not have this value set to 'hive' yet. The original default
    // behavior is that when there are hive classes, we use hive catalog.
    sparkSession = builder.enableHiveSupport().getOrCreate()
    logInfo("Created Spark session with Hive support")
  } else {
    // Need to change it back to 'in-memory' if no hive classes are found
    // in the case that the property is set to hive in spark-defaults.conf
    builder.config(CATALOG_IMPLEMENTATION.key, "in-memory")
    sparkSession = builder.getOrCreate()
    logInfo("Created Spark session")
  }
} else {
  // In the case that the property is set but not to 'hive', the internal
  // default is 'in-memory'. So the sparkSession will use in-memory catalog.
  sparkSession = builder.getOrCreate()
  logInfo("Created Spark session")
}
sparkContext = sparkSession.sparkContext
sparkSession
...
This differs slightly from the test code. The key part is near the end: the SparkSession is created first, and the SparkContext is then obtained from it. Also note the earlier WARN-level log message:
19/05/31 13:11:31 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
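That warning comes from SparkContext.getOrCreate: when an active SparkContext already exists, the SparkConf passed in is not applied to it, so the spark.sql.catalogImplementation=hive setting registered by enableHiveSupport apparently never reaches the existing context, which would explain the in-memory catalog. The following is only an illustrative sketch of that reuse behavior (assuming a local master for the demo), not the original post's code:

import org.apache.spark.{SparkConf, SparkContext}

// Create a context first, then ask for another one with extra settings.
val sc1 = new SparkContext(new SparkConf().setMaster("local[1]").setAppName("first"))
val sc2 = SparkContext.getOrCreate(
  new SparkConf().setMaster("local[1]").setAppName("second")
    .set("spark.sql.catalogImplementation", "hive"))
println(sc1 eq sc2)                                                       // true: the existing context is reused
println(sc2.getConf.get("spark.sql.catalogImplementation", "in-memory"))  // "in-memory": the new setting was dropped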
Modified test code:
val sparkConf = new SparkConf().setAppName(getName)
//val sc = new SparkContext(sparkConf)
val spark = SparkSession.builder.config(sparkConf).enableHiveSupport().getOrCreate()
val sc = spark.sparkContext
spark.sql("show databases").rdd.foreach(println)
This time it works as expected. The detailed cause will be examined when there is time; to be continued.
Reposted from: https://www.cnblogs.com/barneywill/p/10959418.html
總結
以上是生活随笔為你收集整理的【原创】大叔经验分享(65)spark读取不到hive表的全部內容,希望文章能夠幫你解決所遇到的問題。
- 上一篇: 2019春第二次课程设计实验报告
- 下一篇: 用户体验评价