Integrating Spark with Hive (running Hive queries on the Spark engine)
Steps to integrate Spark with Hive:

1. Install MySQL (it will hold the Hive metastore).
2. Create a hive-site.xml under spark/conf with the metastore connection settings:

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.224.132:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
</configuration>
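With the metastore configured, a quick way to verify the integration is to query Hive from a Spark shell. A minimal sketch, assuming a Spark 1.x-era HiveContext (matching the legacy com.mysql.jdbc.Driver used above) and that the MySQL connector jar path is adjusted to your installation:

// Launch the shell with the MySQL JDBC driver on the classpath,
// e.g.:  spark-shell --jars /path/to/mysql-connector-java.jar
// (the jar path here is a placeholder, not a real location)
import org.apache.spark.sql.hive.HiveContext

// sc is the SparkContext provided by spark-shell
val hiveContext = new HiveContext(sc)

// If hive-site.xml is picked up correctly, this lists the tables
// registered in the MySQL-backed Hive metastore
hiveContext.sql("SHOW TABLES").collect().foreach(println)

On newer Spark versions (2.x+), the equivalent is SparkSession.builder().enableHiveSupport().getOrCreate(), but the hive-site.xml placement in spark/conf works the same way.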