Connecting to the Spark Thrift Server with the beeline under spark/bin
After starting the Spark Thrift Server, you must connect with the beeline under the spark/bin directory. Connecting with the cluster's default (Hive) beeline fails with the following error:
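For context, the failing run looked roughly like this — the Hive 3.x beeline that is first on the PATH in HDP 3.1 (the exact invocation is assumed; it is not shown in the log):

```shell
# Fails against the Spark Thrift Server on port 10016 with the error below
# (this is the HDP Hive 3.x beeline resolved from the PATH, not Spark's)
beeline -u jdbc:hive2://hadoop2:10016/default
```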
Connecting to jdbc:hive2://hadoop2:10016/default
20/12/26 16:11:40 [main]: ERROR jdbc.HiveConnection: Error opening session
org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default})
at org.apache.thrift.TApplicationException.read(TApplicationException.java:111) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:79) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:176) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.service.rpc.thrift.TCLIService$Client.OpenSession(TCLIService.java:163) ~[hive-exec-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:853) [hive-jdbc-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:316) [hive-jdbc-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107) [hive-jdbc-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at java.sql.DriverManager.getConnection(DriverManager.java:664) [?:1.8.0_112]
at java.sql.DriverManager.getConnection(DriverManager.java:208) [?:1.8.0_112]
at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:145) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:209) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.Commands.connect(Commands.java:1643) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.Commands.connect(Commands.java:1538) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:56) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.execCommandWithPrefix(BeeLine.java:1456) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1495) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.connectUsingArgs(BeeLine.java:917) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:803) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1108) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1082) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:546) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at org.apache.hive.beeline.BeeLine.main(BeeLine.java:528) [hive-beeline-3.1.0.3.1.4.0-315.jar:3.1.0.3.1.4.0-315]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
at org.apache.hadoop.util.RunJar.run(RunJar.java:318) [hadoop-common-3.1.1.3.1.4.0-315.jar:?]
at org.apache.hadoop.util.RunJar.main(RunJar.java:232) [hadoop-common-3.1.1.3.1.4.0-315.jar:?]
20/12/26 16:11:40 [main]: WARN jdbc.HiveConnection: Failed to connect to hadoop2:10016
Error: Could not open client transport with JDBC Uri: jdbc:hive2://hadoop2:10016/default: Could not establish connection to jdbc:hive2://hadoop2:10016/default: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{set:hiveconf:hive.server2.thrift.resultset.default.fetch.size=1000, use:database=default}) (state=08S01,code=0)
Using the beeline under the spark directory works instead. (The likely cause: the Hive 3.x beeline negotiates a newer Thrift client_protocol version than the Spark 2.3 Thrift Server, which is based on Hive 1.2.x, understands, so the server rejects the OpenSession request with "Required field 'client_protocol' is unset".)
[root@hadoop1 bin]# /usr/hdp/3.1.4.0-315/spark2/bin/beeline -u jdbc:hive2://hadoop2:10016/default
Connecting to jdbc:hive2://hadoop2:10016/default
20/12/26 16:15:01 INFO Utils: Supplied authorities: hadoop2:10016
20/12/26 16:15:01 INFO Utils: Resolved authority: hadoop2:10016
20/12/26 16:15:01 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://hadoop2:10016/default
Connected to: Spark SQL (version 2.3.2.3.1.4.0-315)
Driver: Hive JDBC (version 1.21.2.3.1.4.0-315)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.21.2.3.1.4.0-315 by Apache Hive
0: jdbc:hive2://hadoop2:10016/default>
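To avoid picking up the wrong beeline from the PATH again, a small wrapper can pin the Spark one. This is a minimal sketch: the SPARK_HOME default is taken from the HDP path shown in the log above, and the host/port are the ones from this cluster.

```shell
# Resolve the Spark beeline explicitly instead of relying on the PATH.
# The default SPARK_HOME below is assumed from the HDP 3.1.4 layout above.
SPARK_HOME=${SPARK_HOME:-/usr/hdp/3.1.4.0-315/spark2}
BEELINE="$SPARK_HOME/bin/beeline"
echo "Using: $BEELINE"
# On the cluster, connect with:
# "$BEELINE" -u jdbc:hive2://hadoop2:10016/default
```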