Note: I am running CDH 5.13.2, whose bundled Hive is 1.1.0-cdh5.13.2, with Flink 1.10. The configuration process follows.
First, edit flink-1.10/conf/sql-client-defaults.yaml and add the catalog parameters for Hive. On CDH the Hive conf directory is /etc/hive/conf.cloudera.hive:
#==============================================================================
# Catalogs
#==============================================================================
# Define catalogs here.
#catalogs: [] # empty list
# A typical catalog definition looks like:
#   - name: myhive
#     type: hive
#     hive-conf-dir: /opt/hive_conf/
#     default-database: ...
catalogs:
  - name: myhive
    type: hive
    hive-conf-dir: /etc/hive/conf.cloudera.hive
    hive-version: 1.2.1
    property-version: 1
    default-database: default
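Before launching anything, it is worth confirming that the directory `hive-conf-dir` points at actually contains hive-site.xml, since HiveCatalog reads the metastore location from that file. A small check (a sketch; the path in the usage comment is the CDH default used above):

```shell
# Fail loudly if a candidate hive-conf-dir does not hold hive-site.xml.
check_hive_conf() {
    if [ -f "$1/hive-site.xml" ]; then
        echo "ok: $1/hive-site.xml"
    else
        echo "missing hive-site.xml in $1" >&2
        return 1
    fi
}

# Usage on the CDH host described in this post:
# check_hive_conf /etc/hive/conf.cloudera.hive
```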
The Flink distribution does not bundle the Hive-related libraries, so I imported them myself:
wget https://repo1.maven.org/maven2/org/apache/flink/flink-connector-hive_2.11/1.10.0/flink-connector-hive_2.11-1.10.0.jar
wget https://repo1.maven.org/maven2/org/apache/flink/flink-hadoop-compatibility_2.11/1.10.0/flink-hadoop-compatibility_2.11-1.10.0.jar
Download Hive 1.2.1 and extract it:
http://archive.apache.org/dist/hive/hive-1.2.1/
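The Apache archive layout is uniform across releases, so the tarball URL can be derived from the version string. A small helper (the `apache-hive-<version>-bin.tar.gz` filename is the standard release convention, assumed here rather than taken from the post):

```shell
# Build the download URL for a given Hive release from the archive root above.
hive_tarball_url() {
    echo "http://archive.apache.org/dist/hive/hive-$1/apache-hive-$1-bin.tar.gz"
}

# e.g. wget "$(hive_tarball_url 1.2.1)" && tar -xzf apache-hive-1.2.1-bin.tar.gz
```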
Import the following jars into flink-1.10/lib:
-rw-r--r-- 1 root root 164368 Mar 27 15:54 antlr-runtime-3.4.jar
-rw-r--r-- 1 root root 448794 Mar 27 15:57 apache-log4j-extras-1.2.17.jar
-rw-r--r-- 1 root root 91473 Mar 27 16:11 commons-cli-2.0-mahout.jar
-rw-r--r-- 1 root root 298829 Mar 27 16:16 commons-configuration-1.6.jar
-rw-r--r-- 1 root root 62050 Mar 27 15:56 commons-logging-1.1.3.jar
-rw-r--r-- 1 root root 339666 Mar 27 15:54 datanucleus-api-jdo-3.2.6.jar
-rw-r--r-- 1 root root 1890075 Mar 27 15:54 datanucleus-core-3.2.10.jar
-rw-r--r-- 1 root root 1809447 Mar 27 15:54 datanucleus-rdbms-3.2.9.jar
-rw-r--r-- 1 root root 292290 Mar 27 15:53 hive-common-1.2.1.jar
-rw-r--r-- 1 root root 20599030 Mar 27 15:56 hive-exec-1.2.1.jar
-rw-r--r-- 1 root root 5505100 Mar 27 15:54 hive-metastore-1.2.1.jar
-rw-r--r-- 1 root root 108914 Mar 27 15:53 hive-shims-common-1.2.1.jar
-rw-r--r-- 1 root root 201124 Mar 27 15:57 jdo-api-3.0.1.jar
-rw-r--r-- 1 root root 313686 Mar 27 15:54 libfb303-0.9.2.jar
-rw-r--r-- 1 root root 481535 Mar 27 15:57 log4j-1.2.16.jar
-rw-r--r-- 1 root root 9931 Sep 2 2019 slf4j-log4j12-1.7.15.jar
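Most of those jars come straight out of the extracted Hive tree, so a loop saves some typing. A sketch; both paths in the usage comment are placeholders for wherever the tarballs were unpacked, and the commons-cli, commons-configuration and hadoop-* jars come from the CDH parcel later instead:

```shell
# Copy the Hive 1.2.1 jars listed above from an extracted Hive distribution
# into Flink's lib directory. Both arguments are directories.
copy_hive_jars() {
    hive_lib="$1"; flink_lib="$2"
    for j in antlr-runtime-3.4.jar apache-log4j-extras-1.2.17.jar \
             commons-logging-1.1.3.jar datanucleus-api-jdo-3.2.6.jar \
             datanucleus-core-3.2.10.jar datanucleus-rdbms-3.2.9.jar \
             hive-common-1.2.1.jar hive-exec-1.2.1.jar \
             hive-metastore-1.2.1.jar hive-shims-common-1.2.1.jar \
             jdo-api-3.0.1.jar libfb303-0.9.2.jar log4j-1.2.16.jar; do
        cp "$hive_lib/$j" "$flink_lib/"
    done
}

# copy_hive_jars /opt/apache-hive-1.2.1-bin/lib /usr/local/flink-1.10.0/lib
```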
Launch with bin/sql-client.sh embedded; the errors encountered, and how each was resolved:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;
at org.apache.flink.table.client.cli.CliOptionsParser.<clinit>(CliOptionsParser.java:43)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:172)
Delete commons-cli-1.2.1.jar and copy commons-cli-2.0 over from the CDH jars directory:
[root@kylin03 flink-1.10.0]# cd lib
[root@kylin03 lib]# ls
antlr-runtime-3.4.jar datanucleus-core-3.2.10.jar flink-table_2.11-1.10.0.jar hive-shims-common-1.2.1.jar slf4j-log4j12-1.7.15.jar
apache-log4j-extras-1.2.17.jar datanucleus-rdbms-3.2.9.jar flink-table-blink_2.11-1.10.0.jar jdo-api-3.0.1.jar
commons-cli-1.2.1.jar flink-connector-hive_2.11-1.10.0.jar hive-common-1.2.1.jar libfb303-0.9.2.jar
commons-logging-1.1.3.jar flink-dist_2.11-1.10.0.jar hive-exec-1.2.1.jar log4j-1.2.16.jar
datanucleus-api-jdo-3.2.6.jar flink-hadoop-compatibility_2.11-1.10.0.jar hive-metastore-1.2.1.jar mysql-connector-java-5.1.42-bin.jar
[root@kylin03 lib]# rm commons-cli-1.2.1.jar
rm: remove regular file `commons-cli-1.2.1.jar'? y
[root@kylin03 lib]# cp /opt/cloudera/parcels/CDH/jars/commons-cli-
commons-cli-1.2.jar commons-cli-2.0-mahout.jar
[root@kylin03 lib]# cp /opt/cloudera/parcels/CDH/jars/commons-cli-2.0-mahout.jar .
[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded
No default environment specified.
Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml
No session environment specified.
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)
at java.util.HashMap.forEach(HashMap.java:1288)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 14 more
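Every failure from here on follows the same NoClassDefFoundError pattern, so instead of guessing, a quick scan can report which CDH jar actually ships a missing class. Jar entry names are stored uncompressed in the zip index, so a plain grep over the raw bytes is enough; a sketch, using the parcel path from this post:

```shell
# Print every jar under a directory whose index contains the given class
# entry, e.g. 'org/apache/hadoop/conf/Configuration.class'.
find_class_jar() {
    dir="$1"; class="$2"
    for j in "$dir"/*.jar; do
        [ -e "$j" ] || continue
        # Entry names sit uncompressed in the zip directory records,
        # so grep on the raw bytes finds them without unpacking.
        grep -q "$class" "$j" && echo "$j"
    done
    return 0
}

# find_class_jar /opt/cloudera/parcels/CDH/jars 'org/apache/hadoop/conf/Configuration.class'
```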
The Hadoop classes are missing; copy hadoop-common from the CDH jars directory:
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-co
hadoop-common-2.6.0-cdh5.13.2.jar hadoop-common-2.6.0-cdh5.13.2-tests.jar hadoop-core-2.6.0-mr1-cdh5.13.2.jar
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.13.2.jar lib/
[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded
No default environment specified.
Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml
No session environment specified.
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/TaskAttemptContext
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:146)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:141)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:368)
at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:105)
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:162)
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:140)
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)
at java.util.HashMap.forEach(HashMap.java:1288)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.TaskAttemptContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 23 more
The MapReduce classes are missing; copy the client-core jar from the CDH jars directory:
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-mapreduce-client-core-2.6.0-cdh5.13.2.jar lib/
[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded
No default environment specified.
Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml
No session environment specified.
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:139)
at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:259)
at org.apache.hadoop.conf.Configuration$Resource.getRestrictParserDefault(Configuration.java:245)
at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:213)
at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:205)
at org.apache.hadoop.conf.Configuration.addResource(Configuration.java:863)
at org.apache.flink.api.java.hadoop.mapred.utils.HadoopUtils.getHadoopConfiguration(HadoopUtils.java:102)
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:171)
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:140)
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)
at java.util.HashMap.forEach(HashMap.java:1288)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 25 more
commons-configuration is missing as well; copy it from the CDH jars directory:
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/commons-
commons-beanutils-1.8.3.jar commons-compress-1.4.1.jar commons-exec-1.1.jar commons-lang-2.6.jar commons-math3-3.2.jar
commons-beanutils-1.9.2.jar commons-compress-1.4.jar commons-fileupload-1.3.2.jar commons-lang3-3.1.jar commons-math3-3.4.1.jar
commons-beanutils-core-1.8.0.jar commons-compress-1.9.jar commons-httpclient-3.0.1.jar commons-lang3-3.3.2.jar commons-net-1.4.1.jar
commons-cli-1.2.jar commons-configuration-1.6.jar commons-httpclient-3.1.jar commons-lang3-3.4.jar commons-net-3.1.jar
commons-cli-2.0-mahout.jar commons-configuration-1.7.jar commons-io-1.4.jar commons-logging-1.1.1.jar commons-pool-1.5.4.jar
commons-codec-1.4.jar commons-daemon-1.0.13.jar commons-io-2.1.jar commons-logging-1.1.3.jar commons-pool2-2.4.2.jar
commons-codec-1.6.jar commons-daemon.jar commons-io-2.3.jar commons-logging-1.1.jar commons-vfs2-2.0.jar
commons-codec-1.8.jar commons-dbcp-1.4.jar commons-io-2.4.jar commons-logging-1.2.jar
commons-codec-1.9.jar commons-digester-1.8.1.jar commons-jexl-2.1.1.jar commons-math-2.1.jar
commons-collections-3.2.2.jar commons-digester-1.8.jar commons-lang-2.4.jar commons-math-2.2.jar
commons-compiler-2.7.6.jar commons-el-1.0.jar commons-lang-2.5.jar commons-math3-3.1.1.jar
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/commons-configuration-1.6.jar lib/
[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded
No default environment specified.
Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml
No session environment specified.
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:190)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:753)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:228)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:98)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:178)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:442)
at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:487)
at org.apache.hadoop.conf.Configuration$Resource.getRestrictParserDefault(Configuration.java:245)
at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:213)
at org.apache.hadoop.conf.Configuration$Resource.<init>(Configuration.java:205)
at org.apache.hadoop.conf.Configuration.addResource(Configuration.java:863)
at org.apache.flink.api.java.hadoop.mapred.utils.HadoopUtils.getHadoopConfiguration(HadoopUtils.java:102)
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:171)
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:140)
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:84)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:371)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$4(ExecutionContext.java:547)
at java.util.HashMap.forEach(HashMap.java:1288)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$5(ExecutionContext.java:546)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:240)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:545)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:494)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:159)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:118)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:742)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 23 more
org.apache.hadoop.util.PlatformName ships in hadoop-auth; copy that from the CDH jars directory (re-copying hadoop-common along the way is harmless):
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-co
hadoop-common-2.6.0-cdh5.13.2.jar hadoop-common-2.6.0-cdh5.13.2-tests.jar hadoop-core-2.6.0-mr1-cdh5.13.2.jar
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-common-2.6.0-cdh5.13.2.jar lib/
cp: overwrite `lib/hadoop-common-2.6.0-cdh5.13.2.jar'? y
[root@kylin03 flink-1.10.0]# cp /opt/cloudera/parcels/CDH/jars/hadoop-auth-2.6.0-cdh5.13.2.jar lib/
At last, the client starts cleanly:
[root@kylin03 flink-1.10.0]# bin/sql-client.sh embedded
No default environment specified.
Searching for '/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/usr/local/flink-1.10.0/conf/sql-client-defaults.yaml
No session environment specified.
▒▓██▓██▒
▓████▒▒█▓▒▓███▓▒
▓███▓░░ ▒▒▒▓██▒ ▒
░██▒ ▒▒▓▓█▓▓▒░ ▒████
██▒ ░▒▓███▒ ▒█▒█▒
░▓█ ███ ▓░▒██
▓█ ▒▒▒▒▒▓██▓░▒░▓▓█
█░ █ ▒▒░ ███▓▓█ ▒█▒▒▒
████░ ▒▓█▓ ██▒▒▒ ▓███▒
░▒█▓▓██ ▓█▒ ▓█▒▓██▓ ░█░
▓░▒▓████▒ ██ ▒█ █▓░▒█▒░▒█▒
███▓░██▓ ▓█ █ █▓ ▒▓█▓▓█▒
░██▓ ░█░ █ █▒ ▒█████▓▒ ██▓░▒
███░ ░ █░ ▓ ░█ █████▒░░ ░█░▓ ▓░
██▓█ ▒▒▓▒ ▓███████▓░ ▒█▒ ▒▓ ▓██▓
▒██▓ ▓█ █▓█ ░▒█████▓▓▒░ ██▒▒ █ ▒ ▓█▒
▓█▓ ▓█ ██▓ ░▓▓▓▓▓▓▓▒ ▒██▓ ░█▒
▓█ █ ▓███▓▒░ ░▓▓▓███▓ ░▒░ ▓█
██▓ ██▒ ░▒▓▓███▓▓▓▓▓██████▓▒ ▓███ █
▓███▒ ███ ░▓▓▒░░ ░▓████▓░ ░▒▓▒ █▓
█▓▒▒▓▓██ ░▒▒░░░▒▒▒▒▓██▓░ █▓
██ ▓░▒█ ▓▓▓▓▒░░ ▒█▓ ▒▓▓██▓ ▓▒ ▒▒▓
▓█▓ ▓▒█ █▓░ ░▒▓▓██▒ ░▓█▒ ▒▒▒░▒▒▓█████▒
██░ ▓█▒█▒ ▒▓▓▒ ▓█ █░ ░░░░ ░█▒
▓█ ▒█▓ ░ █░ ▒█ █▓
█▓ ██ █░ ▓▓ ▒█▓▓▓▒█░
█▓ ░▓██░ ▓▒ ▓█▓▒░░░▒▓█░ ▒█
██ ▓█▓░ ▒ ░▒█▒██▒ ▓▓
▓█▒ ▒█▓▒░ ▒▒ █▒█▓▒▒░░▒██
░██▒ ▒▓▓▒ ▓██▓▒█▒ ░▓▓▓▓▒█▓
░▓██▒ ▓░ ▒█▓█ ░░▒▒▒
▒▓▓▓▓▓▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒░░▓▓ ▓░▒█░
______ _ _ _ _____ ____ _ _____ _ _ _ BETA
| ____| (_) | | / ____|/ __ \| | / ____| (_) | |
| |__ | |_ _ __ | | __ | (___ | | | | | | | | |_ ___ _ __ | |_
| __| | | | '_ \| |/ / \___ \| | | | | | | | | |/ _ \ '_ \| __|
| | | | | | | | < ____) | |__| | |____ | |____| | | __/ | | | |_
|_| |_|_|_| |_|_|\_\ |_____/ \___\_\______| \_____|_|_|\___|_| |_|\__|
Welcome! Enter 'HELP;' to list all available commands. 'QUIT;' to exit.
Flink SQL> show databases;
default_database
Flink SQL> show tables;
[INFO] Result was empty.
Flink SQL> use catalog myhive;
Flink SQL> show databases;
default
dim
temp
Flink SQL> show tables;
my_first_table
test
Flink SQL> quit;
[INFO] Exiting Flink SQL CLI Client...
Shutting down the session...
done.
[root@kylin03 flink-1.10.0]#
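For reference, the whole repair distilled into one function. The jar list is exactly what the errors above forced in, and the example paths are the ones used throughout; treat it as a starting point for other CDH layouts rather than a definitive script:

```shell
# Apply every fix from this post in one go: drop the conflicting
# commons-cli jar from Hive's libs and copy the CDH-provided dependencies.
fix_flink_lib() {
    flink_lib="$1"    # e.g. /usr/local/flink-1.10.0/lib
    cdh_jars="$2"     # e.g. /opt/cloudera/parcels/CDH/jars
    rm -f "$flink_lib/commons-cli-1.2.1.jar"   # lacks Option.builder()
    for j in commons-cli-2.0-mahout.jar \
             hadoop-common-2.6.0-cdh5.13.2.jar \
             hadoop-mapreduce-client-core-2.6.0-cdh5.13.2.jar \
             commons-configuration-1.6.jar \
             hadoop-auth-2.6.0-cdh5.13.2.jar; do
        cp "$cdh_jars/$j" "$flink_lib/"
    done
}
```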
Thanks: