DBeaver the Squirrel: Connecting to Hive with the Database GUI Client DBeaver
Preface
This post explains how to connect to Hive with the database GUI client DBeaver, and how to solve the problem of the driver failing to download.
1. Why use a GUI client tool
Why use a GUI client instead of working with Hive on the command line?
- Inspecting and analyzing the data in Hive is far more convenient through a GUI.
- Business users often have no permission to reach Hive from the command line.
- Managers prefer browsing Hive data in a GUI tool.
2. Why DBeaver
A quick web search turns up plenty of tools that can connect to Hive. I chose DBeaver because it is what I already use for relational databases, and it happens to support Hive as well. In my experience it is genuinely good: it works with a wide range of relational databases (connecting to Oracle, for example, does not require maintaining a connection file by hand the way PL/SQL Developer does; you simply enter the URL, username, and password in the UI), and its keyboard shortcuts match Eclipse's, such as toggle comment, delete line, and duplicate line.
3. Downloading and installing DBeaver
I had been using an old version for a long time. I recently downloaded the latest DBeaver from the official site, found the UI noticeably improved over the old one, and verified that it connects to Hive without problems.
Download: https://dbeaver.io/download/
I took the portable 64-bit Windows build (no installer, no bundled JRE); pick whichever build matches your platform. Note that the no-JRE build expects a Java runtime to be installed already.
Since it is the portable version, just unzip it and run dbeaver.exe.
4. Starting the Hive services
Before testing the connection, start the services Hive depends on.
1. Start HDFS and YARN:

    /opt/hadoop-2.7.5/sbin/start-dfs.sh
    /opt/hadoop-2.7.5/sbin/start-yarn.sh
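To confirm the daemons came up, jps (bundled with the JDK) gives a quick check; the daemon names below are the usual ones to expect, as a rough guide:

    jps
    # The output should include, among others:
    #   NameNode
    #   DataNode
    #   ResourceManager
    #   NodeManager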
2. Start HiveServer2 (Hive 0.11.0 and later)
To connect to Hive remotely, HiveServer2 must be running.
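For a quick foreground test you can launch it directly (the nohup variants for keeping it running in the background appear in the restart notes near the end of this post):

    hive --service hiveserver2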
Note: the startall.sh script shown in the original screenshot is simply a wrapper around the HDFS and YARN startup commands above.
5. Create a Hive test table
(The original post showed the table DDL only in a screenshot.)
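As a stand-in, here is a minimal sketch of a comparable test table created from the shell; the table name is hypothetical, and the columns are modeled on the query results shown later in this post:

    hive -e "
    CREATE TABLE IF NOT EXISTS test_company (
      id                  BIGINT,
      company_year_months STRING,
      insert_time         BIGINT
    );
    "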
6. Connecting DBeaver to Hive
Connecting DBeaver to a relational database is straightforward; connecting to Hive also involves configuring and downloading a driver, so the steps are spelled out in detail here.
6.1 File -> New -> Database Connection (the new version is localized in Chinese, while the old one was English only; I rather like that).
6.2 Select the connection type -> Apache Hive (you can see here just how many databases DBeaver supports).
6.3 Enter the IP of the HiveServer2 host and the Hive database name.
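Under the hood this is a standard Hive JDBC URL. To sanity-check connectivity outside DBeaver first, a beeline sketch; the host, port, database, and user below match this post's setup and are assumptions for yours:

    beeline -u "jdbc:hive2://sparkproject1:10000/default" -n root -e "SHOW TABLES;"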
DBeaver will then offer to download the driver automatically (the driver list appeared in a screenshot in the original post).
7. Importing the driver manually
If the automatically downloaded driver does not work, import a suitable driver yourself: add all three of the jars from the screenshot in the original post (the Hive JDBC driver plus its dependencies).
8. Note: VPN and connection troubleshooting
In my case the office network requires a VPN, so that had to be set up as well; the unzip and installation steps were shown in screenshots in the original post.
Reference: https://blog.csdn.net/dkl12/article/details/81381122
With that done, the connection still failed. Running HiveServer2 in the foreground printed:
    [root@sparkproject1 ~]# hive --service hiveserver2
    Starting HiveServer2
    18/10/23 07:47:50 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
    [root@sparkproject1 ~]#

Hive's default log location is the hive.log file under /tmp/<user.name>, i.e. /tmp/<current user>/hive.log.
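To follow the log live while reproducing the error (assuming Hive runs as root, as it does in this post):

    tail -1000f /tmp/root/hive.log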
    [root@sparkproject1 root]# pwd
    /tmp/root
    [root@sparkproject1 root]# ll
    total 1140
    -rw-r--r-- 1 root root      0 Aug 29 10:51 037b34c7-36be-4dcd-9632-71655d05169e1031340479492035973.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 11:18 1db7ed16-0774-4d65-9357-8f9fc8d81e668892554412007814531.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 11:17 240986ce-3a86-4652-9111-9cf70ebb27b64167364938441024841.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 11:17 36ec3029-86a4-4ead-8c84-15184bba84e02398473527124955987.pipeout
    drwx------ 2 root root   4096 Aug 30 06:04 455348fc-5dcc-4b9b-aa07-55f0b1ac69ae
    -rw-r--r-- 1 root root      0 Aug 29 11:08 6a1f5a65-8a14-4416-8e07-0dd09bc6eba81619849933300463947.pipeout
    -rw-r--r-- 1 root root      0 Oct 23 05:41 715ddddf-a824-418c-a210-1083aa3268f72490915124940117231.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 10:50 7acadaf5-fcec-4102-9114-ca55669930408559664959062910903.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 13:09 7d5dfd06-8819-4688-9739-c54cb28e0dd57223042234491136050.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 11:12 7e51556c-9225-49b8-af93-dd83a9136e028742797957346775102.pipeout
    drwx------ 2 root root   4096 Aug 30 06:06 8195f4c9-2bbd-42d6-bcef-16e460c02a65
    -rw-r--r-- 1 root root      0 Aug 29 11:18 8fd11d48-4810-4073-b6c1-a4a2b0eb10e35292843796529942308.pipeout
    drwx------ 2 root root   4096 Aug 30 06:03 90166cb7-a851-4501-87f9-9e00ffc3db04
    -rw-r--r-- 1 root root      0 Aug 29 11:17 96c2aabf-c5f9-41cd-94b7-698ff8531e485233529610306324479.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 13:08 b8a8a766-d4b1-4528-8ba8-53bdd5dd47237286042327080022040.pipeout
    drwx------ 2 root root   4096 Aug 30 06:04 beba773a-eb0f-4f1d-8744-1113748cdc90
    -rw-r--r-- 1 root root      0 Aug 29 13:11 d31e3001-18d9-4b94-ac27-611d8718dccc5011869932572437153.pipeout
    drwx------ 2 root root   4096 Aug 30 06:07 d4351240-f29e-4e38-9786-44e9925584c2
    -rw-r--r-- 1 root root      0 Aug 29 11:17 e01f2814-e122-4dd6-bac3-2da015debdfb1336515642746251190.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 11:03 eb783cd8-4fec-4120-a820-7e08f5e747221661939143361966384.pipeout
    -rw-r--r-- 1 root root      0 Aug 29 13:11 ec10ac35-0e55-460d-b68c-7256625bc40a6919750079144918582.pipeout
    drwx------ 2 root root   4096 Aug 30 06:03 f767c03c-818e-4dd8-81ea-b258c697bb77
    -rw-r--r-- 1 root root 494108 Oct 23 07:44 hive.log
    -rw-r--r-- 1 root root  50213 Aug 17 03:42 hive.log.2018-08-17
    -rw-r--r-- 1 root root   4411 Aug 18 14:12 hive.log.2018-08-18
    -rw-r--r-- 1 root root  79775 Aug 19 13:52 hive.log.2018-08-19
    -rw-r--r-- 1 root root 418827 Aug 29 13:30 hive.log.2018-08-29
    -rw-r--r-- 1 root root   3564 Oct  7 21:27 hive.log.2018-10-07
    -rw-r--r-- 1 root root  54170 Oct 21 08:40 hive.log.2018-10-21
    [root@sparkproject1 root]#

The error in hive.log (the Linux dates are wrong; ignore that):
    2018-10-23 07:44:07,054 WARN [main]: conf.HiveConf (HiveConf.java:initialize(1534)) - DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
    2018-10-23 07:44:07,307 WARN [main]: util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2018-10-23 07:44:08,500 INFO [main]: session.SessionState (SessionState.java:start(385)) - No Tez session required at this point. hive.execution.engine=mr.
    2018-10-23 07:44:08,632 WARN [main]: conf.HiveConf (HiveConf.java:initialize(1534)) - DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
    2018-10-23 07:44:09,016 ERROR [Thread-9]: thrift.ThriftCLIService (ThriftBinaryCLIService.java:run(120)) - Error: org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address /192.168.1.28:10000.
        at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:93)
        at org.apache.hive.service.auth.HiveAuthFactory.getServerSocket(HiveAuthFactory.java:232)
        at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:93)
        at java.lang.Thread.run(Thread.java:745)

The error points at 192.168.1.28, which is the machine's old IP that I had never updated here. Change it to the hostname sparkproject1 and try again.
Open hive-site.xml and find:

    <property>
      <name>hive.server2.thrift.port</name>
      <value>10000</value>
      <description>Port number of HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_PORT</description>
    </property>
    <property>
      <name>hive.server2.thrift.bind.host</name>
      <value>192.168.1.28</value>
      <description>Bind host on which to run the HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_BIND_HOST</description>
    </property>
    <property>
      <name>hive.server2.authentication</name>
      <value>NONE</value>
      <description>
        Client authentication types.
        NONE: no authentication check
        LDAP: LDAP/AD based authentication
        KERBEROS: Kerberos/GSSAPI authentication
      </description>
    </property>

After the change:

    <property>
      <name>hive.server2.thrift.port</name>
      <value>10000</value>
      <description>Port number of HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_PORT</description>
    </property>
    <property>
      <name>hive.server2.thrift.bind.host</name>
      <value>sparkproject1</value>
      <description>Bind host on which to run the HiveServer2 Thrift interface. Can be overridden by setting $HIVE_SERVER2_THRIFT_BIND_HOST</description>
    </property>
    <property>
      <name>hive.server2.authentication</name>
      <value>NONE</value>
      <description>
        Client authentication types.
        NONE: no authentication check
        LDAP: LDAP/AD based authentication
        KERBEROS: Kerberos/GSSAPI authentication
      </description>
    </property>
Also configure hive.metastore.uris:

    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://sparkproject1:9083</value>
      <description>Thrift URI for the remote metastore. Used by metastore client to connect to remote metastore.</description>
    </property>

After that, the error is gone:
    2018-10-23 07:55:41,345 WARN [main]: conf.HiveConf (HiveConf.java:initialize(1534)) - DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
    2018-10-23 07:55:41,597 WARN [main]: util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    2018-10-23 07:55:42,777 INFO [main]: session.SessionState (SessionState.java:start(385)) - No Tez session required at this point. hive.execution.engine=mr.
    2018-10-23 07:55:42,892 WARN [main]: conf.HiveConf (HiveConf.java:initialize(1534)) - DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
    2018-10-23 07:56:22,079 WARN [HiveServer2-Handler-Pool: Thread-20]: fair.AllocationFileLoaderService (AllocationFileLoaderService.java:getAllocationFile(179)) - fair-scheduler.xml not found on the classpath.
    2018-10-23 07:56:22,157 INFO [HiveServer2-Handler-Pool: Thread-20]: session.SessionState (SessionState.java:start(385)) - No Tez session required at this point. hive.execution.engine=mr.
    2018-10-23 07:56:22,198 INFO [HiveServer2-Handler-Pool: Thread-20]: session.SessionState (SessionState.java:start(385)) - No Tez session required at this point. hive.execution.engine=mr.
    2018-10-23 07:56:24,331 INFO [HiveServer2-Handler-Pool: Thread-20]: parse.SemanticAnalyzer (SemanticAnalyzer.java:analyzeInternal(9381)) - Starting Semantic Analysis
A quick ps to confirm both processes:

    [root@sparkproject1 hive]# ps -ef | grep hive
    root 4575 3341 0 07:18 pts/3 00:00:10 /usr/java/latest/bin/java -Xmx256m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/local/hive/lib/hive-service-0.13.1-cdh5.3.6.jar org.apache.hadoop.hive.metastore.HiveMetaStore
    root 5332 2956 0 07:35 pts/2 00:00:00 vi hive-site.xml
    root 5796 4983 0 07:54 pts/4 00:00:00 tail -1000f hive.log
    root 5808 3341 0 07:55 pts/3 00:00:14 /usr/java/latest/bin/java -Xmx256m -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/usr/local/hadoop/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/local/hadoop -Dhadoop.id.str=root -Dhadoop.root.logger=INFO,console -Djava.library.path=/usr/local/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/local/hive/lib/hive-service-0.13.1-cdh5.3.6.jar org.apache.hive.service.server.HiveServer2
    root 6364 3341 0 08:23 pts/3 00:00:00 grep hive
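Besides ps, it can help to confirm that the Thrift port is actually listening; a quick check, assuming the default port 10000:

    netstat -nltp | grep 10000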
Reference: http://www.voidcn.com/article/p-uoyeaafw-bbb.html
Also note the procedure for restarting the Hive services (a consolidated sketch follows this list):
1. Shutdown
Use ps -ef | grep hive to find the Hive processes, then kill the relevant PIDs.
2. Startup
To start the metastore:

    nohup hive --service metastore 2>&1 &

To start HiveServer2:

    nohup hive --service hiveserver2 2>&1 &

Check the log to confirm that both started normally.
Important: if HiveServer2 is not running, JDBC connections will fail.
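A minimal restart sketch combining the two steps; the PIDs are the illustrative ones from the ps output above and will differ on your machine:

    ps -ef | grep hive                       # note the HiveMetaStore and HiveServer2 PIDs
    kill 4575 5808                           # illustrative PIDs; substitute your own
    nohup hive --service metastore 2>&1 &
    nohup hive --service hiveserver2 2>&1 &
    tail -1000f /tmp/root/hive.log           # watch the log for a clean start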
DBeaver now connects successfully (the connection screenshots appeared in the original post).
Handy shortcuts:
1. To run a SQL statement, select it and press Ctrl+Enter.
2. To open a new SQL editor tab, press Ctrl+].
3. To copy query results out into exactly the text format you want, use Advanced Copy: select the cells to copy, right-click, and choose Advanced Copy (shown in a screenshot in the original post).
Using Advanced Copy to include both the row numbers and the column headers, the copied result looks like this:

    #  id          company_year_months                                                                  insert_time
    1  1097654298  江蘇樂能電池股份有限公司_2017_12@威爾耐車輪制造有限公司_2012_1  1562230437000
    2  1097615671  中藝財富藝品城_2016_7@北京七彩云南珠寶商城_2014_3  1562201068000
    3  1097637851  均輝有限公司_2013_7@上海東方嘉盛供應鏈物流有限公司_2010_9@德威物流上海有限公司_2008_1@上海美國學校_2005_5@上海美國學校_2004_5  1562220708000
You can also require columns to be separated by "," and rows by "|", which produces:

    #,id,company_year_months,insert_time|1,1097654298,江蘇樂能電池股份有限公司_2017_12@威爾耐車輪制造有限公司_2012_1,1562230437000|2,1097615671,中藝財富藝品城_2016_7@北京七彩云南珠寶商城_2014_3,1562201068000|3,1097637851,均輝有限公司_2013_7@上海東方嘉盛供應鏈物流有限公司_2010_9@德威物流上海有限公司_2008_1@上海美國學校_2005_5@上海美國學校_2004_5,1562220708000
Summary
This post covered downloading and installing DBeaver, starting HDFS, YARN, and the Hive services, fixing the HiveServer2 bind-host error, importing the JDBC driver, and a few handy DBeaver shortcuts. I hope it helps you solve the same problems.