
sqoop command

Source: https://fypop.iteye.com/blog/2217562

(tail of an earlier example: a row count via --query "SELECT count(1) FROM user_info", followed by the usual Accumulo warning, "Setting your password on the command-line is insecure. Consider using -P instead.", and "Preparing to use a MySQL streaming resultset.")

1.4 create-hive-table

sqoop create-hive-table --connect jdbc:mysql://192.168.1.15:3306/test --table user_info \
  --username root --password root --hive-table hive_user_info_test_create \
  --fields-terminated-by "," --lines-terminated-by "\n"

WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
Logging initialized using configuration in jar:file:/opt/hadoop260/apache-hive-1.0.1-bin/lib/hive-common-1.0.1.jar!/hive-log4j.properties

hive> show tables;
OK
hive_user_info_test_create
test2
Time taken: 0.03 seconds, Fetched: 2 row(s)
hive> describe hive_user_info_test_create;
OK
id              int
username        string
userage         int
useraddress     string
updatetime      string
Time taken: 0.111 seconds, Fetched: 5 row(s)

1.5 export

Export data from HDFS to MySQL:

sqoop export --connect jdbc:mysql://192.168.1.15:3306/test --username root --password root \
  --table user_info --export-dir /user/hivetest/user_info/part-m-00000 \
  --input-fields-terminated-by ","
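The import examples in section 1.6 below write NULL columns as the literal placeholders 'NSTR' (string columns) and 'NOBJ' (non-string columns). When such files are exported back to MySQL, the export needs the matching --input-null-* options so the placeholders become SQL NULLs again. A minimal sketch, assuming the same delimiter and placeholders (the export directory is illustrative):

[hadoop@hadoopmaster ~]$ sqoop export --connect jdbc:mysql://192.168.1.15:3306/test \
  --username root -P --table user_info \
  --export-dir /user/hive/warehouse/hive_user_info \
  --input-fields-terminated-by "," \
  --input-null-string 'NSTR' --input-null-non-string 'NOBJ'
# without --input-null-*, the literal strings NSTR/NOBJ would be inserted into MySQL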
1.6 import 1

sqoop import --connect jdbc:mysql://192.168.1.4:3306/test --username root --password root \
  --table user_info --hive-import --hive-table hive_user_info -m 2 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ'

The target Hive table is created first, then the import is run:

[hadoop@hadoopmaster ~]$ sqoop create-hive-table --connect jdbc:mysql://192.168.1.4:3306/test \
  --table user_info --username root --password root --hive-table hive_user_info \
  --fields-terminated-by "," --lines-terminated-by "\n"
Warning: /opt/hadoop260/sqoop-1.4.5/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /opt/hadoop260/sqoop-1.4.5/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
/opt/hadoop260/hbase-0.98.13-hadoop2/conf/hbase-env.sh: line 140: .: filename argument required
.: usage: . filename [arguments]
15/09/15 07:35:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
15/09/15 07:35:47 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/09/15 07:35:47 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/09/15 07:35:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:35:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:35:49 WARN hive.TableDefWriter: Column updateTime had to be cast to a less precise type in Hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hadoop260/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop260/hbase-0.98.13-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop260/apache-hive-1.0.1-bin/lib/hive-jdbc-1.0.1-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/09/15 07:35:50 INFO hive.HiveImport: Loading uploaded data into Hive
15/09/15 07:35:51 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
15/09/15 07:35:51 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
Logging initialized using configuration in jar:file:/opt/hadoop260/apache-hive-1.0.1-bin/lib/hive-common-1.0.1.jar!/hive-log4j.properties
OK
Time taken: 0.948 seconds
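Every run above prints "WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead." A minimal sketch of the same import without the password on the command line, either prompting for it (-P) or reading it from a file on HDFS (--password-file, available since Sqoop 1.4.4; the file path here is only an example):

[hadoop@hadoopmaster ~]$ sqoop import --connect jdbc:mysql://192.168.1.4:3306/test \
  --username root -P \
  --table user_info --hive-import --hive-table hive_user_info -m 2 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ'

# or: store the password in an HDFS file readable only by the submitting user
[hadoop@hadoopmaster ~]$ echo -n 'root' > mysql.password            # -n: no trailing newline
[hadoop@hadoopmaster ~]$ hdfs dfs -put mysql.password /user/hadoop/mysql.password
[hadoop@hadoopmaster ~]$ hdfs dfs -chmod 400 /user/hadoop/mysql.password
[hadoop@hadoopmaster ~]$ sqoop import --connect jdbc:mysql://192.168.1.4:3306/test \
  --username root --password-file /user/hadoop/mysql.password \
  --table user_info --hive-import --hive-table hive_user_info -m 2 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ'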
[hadoop@hadoopmaster ~]$ sqoop import --connect jdbc:mysql://192.168.1.4:3306/test \
  --username root --password root --table user_info --hive-import --hive-table hive_user_info -m 2 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ'
(the same Accumulo, ZooKeeper, hbase-env.sh and SLF4J warnings as above)
15/09/15 07:46:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
15/09/15 07:46:41 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/09/15 07:46:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/09/15 07:46:41 INFO tool.CodeGenTool: Beginning code generation
15/09/15 07:46:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:46:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:46:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop260/hadoop-2.6.0
Note: /tmp/sqoop-hadoop/compile/2d8a823f42608c5221d1acd174e1aa59/user_info.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/09/15 07:46:45 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/2d8a823f42608c5221d1acd174e1aa59/user_info.jar
15/09/15 07:46:45 WARN manager.MySQLManager: It looks like you are importing from mysql.
15/09/15 07:46:45 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
15/09/15 07:46:45 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
15/09/15 07:46:45 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
15/09/15 07:46:45 INFO mapreduce.ImportJobBase: Beginning import of user_info
15/09/15 07:46:45 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/09/15 07:46:46 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/09/15 07:46:46 INFO client.RMProxy: Connecting to ResourceManager at hadoopmaster/192.168.1.50:8032
15/09/15 07:46:49 INFO db.DBInputFormat: Using read committed transaction isolation
15/09/15 07:46:49 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(id), MAX(id) FROM user_info
15/09/15 07:46:49 INFO mapreduce.JobSubmitter: number of splits: 2
15/09/15 07:46:49 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1442327411384_0001
15/09/15 07:46:50 INFO impl.YarnClientImpl: Submitted application application_1442327411384_0001
15/09/15 07:46:50 INFO mapreduce.Job: The url to track the job: http://hadoopmaster:8088/proxy/application_1442327411384_0001/
15/09/15 07:46:50 INFO mapreduce.Job: Running job: job_1442327411384_0001
15/09/15 07:46:59 INFO mapreduce.Job: Job job_1442327411384_0001 running in uber mode : false
15/09/15 07:46:59 INFO mapreduce.Job:  map 0% reduce 0%
15/09/15 07:47:09 INFO mapreduce.Job:  map 50% reduce 0%
15/09/15 07:47:10 INFO mapreduce.Job:  map 100% reduce 0%
15/09/15 07:47:10 INFO mapreduce.Job: Job job_1442327411384_0001 completed successfully
15/09/15 07:47:10 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=229282
                HDFS: Number of bytes read=197
                HDFS: Number of bytes written=247
                HDFS: Number of write operations=4
        Job Counters
                Launched map tasks=2
                Other local map tasks=2
                Total time spent by all maps in occupied slots (ms)=16838
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=16838
                Total vcore-seconds taken by all map tasks=16838
                Total megabyte-seconds taken by all map tasks=17242112
        Map-Reduce Framework
                Map input records=6
                Map output records=6
                Input split bytes=197
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=120
                CPU time spent (ms)=2310
                Physical memory (bytes) snapshot=352079872
                Virtual memory (bytes) snapshot=1753681920
                Total committed heap usage (bytes)=169869312
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=247
15/09/15 07:47:10 INFO mapreduce.ImportJobBase: Transferred 247 bytes in 24.0238 seconds (10.2815 bytes/sec)
15/09/15 07:47:10 INFO mapreduce.ImportJobBase: Retrieved 6 records.
15/09/15 07:47:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:47:10 WARN hive.TableDefWriter: Column updateTime had to be cast to a less precise type in Hive
15/09/15 07:47:10 INFO hive.HiveImport: Loading uploaded data into Hive
15/09/15 07:47:11 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
15/09/15 07:47:11 WARN conf.HiveConf: HiveConf of name hive.metastore.local does not exist
Logging initialized using configuration in jar:file:/opt/hadoop260/apache-hive-1.0.1-bin/lib/hive-common-1.0.1.jar!/hive-log4j.properties
OK
Time taken: 0.684 seconds
Loading data to table default.hive_user_info
Table default.hive_user_info stats: [numFiles=2, totalSize=247]
OK
Time taken: 0.511 seconds
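The numFiles=2 in the table stats above matches the two map tasks requested with -m 2. To look at the files Hive now owns, the warehouse directory can be listed directly; the path below is Hive's default warehouse location and may differ in other setups:

[hadoop@hadoopmaster ~]$ hdfs dfs -ls /user/hive/warehouse/hive_user_info
[hadoop@hadoopmaster ~]$ hdfs dfs -cat /user/hive/warehouse/hive_user_info/part-m-00000
# each line is one row, fields separated by "," and NULLs stored as NSTR/NOBJ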
id      userName  userAge  userAddress  updateTime
(six rows, ids 1-6; the row with id 6 has NULL userName/userAge/userAddress and updateTime 2015-09-15 22:32:35)

hive> show tables;
OK
hive_user_info
hive_user_info_test_create
test2
Time taken: 0.031 seconds, Fetched: 3 row(s)
hive> describe hive_user_info;
OK
id              int
username        string
userage         int
useraddress     string
updatetime      string
Time taken: 0.201 seconds, Fetched: 5 row(s)
hive> select * from hive_user_info;
OK
(6 rows, ids 1-6; in the row with id 6 the NULL string columns appear as the NSTR placeholder)
Time taken: 0.489 seconds, Fetched: 6 row(s)

1.7 import 2

sqoop import --connect jdbc:mysql://192.168.1.4:3306/test --username root --password root \
  --table user_info --hive-import --create-hive-table -m 2 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ'

hive> describe user_info;
OK
id              int
username        string
userage         int
useraddress     string
updatetime      string
Time taken: 0.106 seconds, Fetched: 5 row(s)

1.8 incremental import 1

sqoop import --connect jdbc:mysql://192.168.1.4:3306/test --username root --password root \
  --table user_info --hive-import --hive-table hive_user_info -m 1 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ' \
  --incremental append --check-column id --last-value 6

id      userName  userAge  userAddress  updateTime
(the source table now holds seven rows, ids 1-7; the row with id 7 was added after the previous import and is the only one past --last-value 6)
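Tracking --last-value by hand quickly gets tedious. Sqoop can store it in a saved job and update it automatically after every run; a minimal sketch, with user_info_incr as an illustrative job name:

[hadoop@hadoopmaster ~]$ sqoop job --create user_info_incr -- import \
  --connect jdbc:mysql://192.168.1.4:3306/test --username root -P \
  --table user_info --hive-import --hive-table hive_user_info -m 1 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ' \
  --incremental append --check-column id --last-value 0
# each --exec imports only rows with id greater than the stored last value,
# then writes the new maximum id back into Sqoop's local metastore
[hadoop@hadoopmaster ~]$ sqoop job --exec user_info_incr
[hadoop@hadoopmaster ~]$ sqoop job --show user_info_incr   # shows the saved incremental.last.value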
[hadoop@hadoopmaster ~]$ sqoop import --connect jdbc:mysql://192.168.1.4:3306/test \
  --username root --password root --table user_info --hive-import --hive-table hive_user_info -m 1 \
  --fields-terminated-by "," --null-string 'NSTR' --null-non-string 'NOBJ' \
  --incremental append --check-column id --last-value 6
(the same Accumulo, ZooKeeper, hbase-env.sh and SLF4J warnings as above)
15/09/15 07:52:24 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
15/09/15 07:52:24 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/09/15 07:52:24 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/09/15 07:52:24 INFO tool.CodeGenTool: Beginning code generation
15/09/15 07:52:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:52:25 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM user_info AS t LIMIT 1
15/09/15 07:52:25 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/hadoop260/hadoop-2.6.0
Note: /tmp/sqoop-hadoop/compile/aa808a25be571a2e7fcdc17a4ba1acd6/user_info.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/09/15 07:52:27 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/aa808a25be571a2e7fcdc17a4ba1acd6/user_info.jar
15/09/15 07:52:28 INFO tool.ImportTool: Maximal id query for free form incremental import: SELECT MAX(id) FROM user_info
15/09/15 07:52:28 INFO tool.ImportTool: Incremental import based on column id
15/09/15 07:52:28 INFO tool.ImportTool: Lower bound value: 6
15/09/15 07:52:28 INFO tool.ImportTool: Upper bound value: 7
15/09/15 07:52:28 WARN manager.MySQLManager: It looks like you are importing from mysql.
15/09/15 07:52:28 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
15/09/15 07:52:28 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
15/09/15 07:52:28 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
15/09/15 07:52:28 INFO mapreduce.ImportJobBase: Beginning import of user_info
15/09/15 07:52:28 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/09/15 07:52:28 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/09/15 07:52:29 INFO client.RMProxy: Connecting to ResourceManager at hadoopmaster/192.168.1.50:8032
15/09/15 07:52:30 INFO db.DBInputFormat: Using read committed transaction isolation
15/09/15 07:52:30 INFO mapreduce.JobSubmitter: number of splits: 1
15/09/15 07:52:30 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1442327411384_0002
15/09/15 07:52:31 INFO impl.YarnClientImpl: Submitted application application_1442327411384_0002
15/09/15 07:52:31 INFO mapreduce.Job: The url to track the job: http://hadoopmaster:8088/proxy/application_1442327411384_0002/
15/09/15 07:52:31 INFO mapreduce.Job: Running job: job_1442327411384_0002
15/09/15 07:52:37 INFO mapreduce.Job: Job job_1442327411384_0002 running in uber mode : false
15/09/15 07:52:37 INFO mapreduce.Job:  map 0% reduce 0%
15/09/15 07:52:47 INFO mapreduce.Job:  map 100% reduce 0%
15/09/15 07:52:47 INFO mapreduce.Job: Job job_1442327411384_0002 completed successfully
15/09/15 07:52:47 INFO mapreduce.Job: Counters: 30
        File System Counters
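Append mode only picks up rows with a higher id; it never re-imports rows that were merely updated. Because the table also carries an updateTime column, the usual alternative is a lastmodified incremental import. A minimal sketch against a plain HDFS target directory (directory and timestamp are illustrative; lastmodified mode is normally used without --hive-import):

[hadoop@hadoopmaster ~]$ sqoop import --connect jdbc:mysql://192.168.1.4:3306/test \
  --username root -P --table user_info -m 1 \
  --target-dir /user/hadoop/user_info_lastmod \
  --incremental lastmodified --check-column updateTime \
  --last-value "2015-09-15 22:32:35" \
  --merge-key id
# rows whose updateTime is newer than --last-value are re-imported and merged
# with the existing files on id, so updated rows replace their older copies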