Environment:
Hive version: hive-0.11.0
Sqoop version: sqoop-1.4.4.bin__hadoop-1.0.0
Task: exporting from Hive to MySQL
The MySQL table:

mysql> desc cps_activation;
+------------+-------------+------+-----+---------+----------------+
| Field      | Type        | Null | Key | Default | Extra          |
+------------+-------------+------+-----+---------+----------------+
| id         | int(11)     | NO   | PRI | NULL    | auto_increment |
| day        | date        | NO   | MUL | NULL    |                |
| pkgname    | varchar(50) | YES  |     | NULL    |                |
| cid        | varchar(50) | YES  |     | NULL    |                |
| pid        | varchar(50) | YES  |     | NULL    |                |
| activation | int(11)     | YES  |     | NULL    |                |
+------------+-------------+------+-----+---------+----------------+
6 rows in set (0.01 sec)

The Hive table:

hive> desc active;
OK
id          int     None
day         string  None
pkgname     string  None
cid         string  None
pid         string  None
activation  int     None

Testing the connection succeeds:
[hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root --password admin
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:42:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:42:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
easyhadoop
mysql
test

[hadoop@hs11 ~]$ sqoop list-databases --connect jdbc:mysql://localhost:3306/test --username root --password admin
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:42:40 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:42:40 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
easyhadoop
mysql
test

[hadoop@hs11 ~]$ sqoop list-tables --connect jdbc:mysql://localhost:3306/test --username root --password admin
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:42:54 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:42:54 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
active

[hadoop@hs11 ~]$ sqoop create-hive-table --connect jdbc:mysql://localhost:3306/test --table active --username root --password admin --hive-table test
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/20 16:57:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/20 16:57:04 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
13/08/20 16:57:04 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
13/08/20 16:57:04 WARN tool.BaseSqoopTool: It seems that you've specified at least one of following:
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-home
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-overwrite
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --create-hive-table
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-table
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-key
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --hive-partition-value
13/08/20 16:57:04 WARN tool.BaseSqoopTool:      --map-column-hive
13/08/20 16:57:04 WARN tool.BaseSqoopTool: Without specifying parameter --hive-import. Please note that
13/08/20 16:57:04 WARN tool.BaseSqoopTool: those arguments will not be used in this session. Either
13/08/20 16:57:04 WARN tool.BaseSqoopTool: specify --hive-import to apply them correctly or remove them
13/08/20 16:57:04 WARN tool.BaseSqoopTool: from command line to remove this warning.
13/08/20 16:57:04 INFO tool.BaseSqoopTool: Please note that --hive-home, --hive-partition-key,
13/08/20 16:57:04 INFO tool.BaseSqoopTool:      hive-partition-value and --map-column-hive options are
13/08/20 16:57:04 INFO tool.BaseSqoopTool:      are also valid for HCatalog imports and exports
13/08/20 16:57:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
13/08/20 16:57:05 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `active` AS t LIMIT 1
13/08/20 16:57:05 WARN hive.TableDefWriter: Column day had to be cast to a less precise type in Hive
13/08/20 16:57:05 INFO hive.HiveImport: Loading uploaded data into Hive

1. Connection refused
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://localhost/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:14:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:14:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:14:07 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:14:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:14:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:14:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/0b5cae714a00b3940fb793c3694408ac/test.jar
13/08/21 09:14:08 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:14:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:14:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:14:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:14:09 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:14:10 INFO mapred.JobClient: Running job: job_201307251523_0059
13/08/21 09:14:11 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:14:20 INFO mapred.JobClient: Task Id : attempt_201307251523_0059_m_000000_0, Status : FAILED
java.io.IOException: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:

** BEGIN NESTED EXCEPTION **

java.net.ConnectException
MESSAGE: Connection refused

STACKTRACE:

java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351)
    at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213)
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)
    at java.net.Socket.connect(Socket.java:529)
    at java.net.Socket.connect(Socket.java:478)
    at java.net.Socket.<init>(Socket.java:375)
    at java.net.Socket.<init>(Socket.java:218)
    at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:256)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:271)
    at com.mysql.jdbc.Connection.createNewIO(Connection.java:2771)
    at com.mysql.jdbc.Connection.<init>(Connection.java:1555)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:285)
    at java.sql.DriverManager.getConnection(DriverManager.java:582)
    at java.sql.DriverManager.getConnection(DriverManager.java:185)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:294)
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:76)
    at org.apache.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:95)
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:77)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)

** END NESTED EXCEPTION **

Last packet sent to the server was 1 ms ago.
    at org.apache.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:79)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:628)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:753)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.CommunicationsException: Communications link failure due to underlying exception:

** BEGIN NESTED EXCEPTION **
java.net.ConnectException
MESSAGE: Connection refused

This was a MySQL user-privilege problem:
mysql> show grants;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' IDENTIFIED BY PASSWORD '*4ACFE3202A5FF5CF467898FC58AAB1D615029441' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;
mysql> create table test (mkey varchar(30), pkg varchar(50), cid varchar(20), pid varchar(50), count int, primary key(mkey,pkg,cid,pid));
mysql> alter ignore table cps_activation add unique index index_day_pkgname_cid_pid (`day`,`pkgname`,`cid`,`pid`);
Query OK, 0 rows affected (0.03 sec)

2. Table does not exist
===========
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:16:26 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:16:26 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:16:26 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:16:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:16:27 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:16:28 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/74d18a91ec141f2feb777dc698bf7eb4/test.jar
13/08/21 09:16:28 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:16:29 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:16:29 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:16:29 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:16:29 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:16:29 INFO mapred.JobClient: Running job: job_201307251523_0060
13/08/21 09:16:30 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:16:38 INFO mapred.JobClient: Task Id : attempt_201307251523_0060_m_000000_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
    at java.util.AbstractList$Itr.next(AbstractList.java:350)
    at test.__loadFromFields(test.java:252)
    at test.parse(test.java:201)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more

When exporting to MySQL, the target table must of course already exist; otherwise the export fails.
The cause of this error is that the fields Sqoop parses out of the export file do not line up with the columns of the MySQL table. You need to pass Sqoop the file's field delimiter on the command line so that it can split the fields correctly. Hive's default field delimiter is '\001'.
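To make the failure mode concrete, here is a minimal Python sketch (not Sqoop's actual parsing code; the sample row is invented) of what happens when the delimiter is wrong versus right:

```python
# Hive's default text format separates columns with the \001 (Ctrl-A) byte.
# Hypothetical row matching the six-column `active` table layout.
row = "1\x012013-08-20\x01com.example.app\x01c01\x01p01\x017"

# Splitting on the wrong delimiter (e.g. a comma) yields one giant field,
# so a parser expecting six tokens runs out of them -> the
# java.util.NoSuchElementException seen in the stack trace above.
assert len(row.split(",")) == 1

# Splitting on \001 recovers all six columns.
fields = row.split("\x01")
assert fields == ["1", "2013-08-20", "com.example.app", "c01", "p01", "7"]
```

This is why the later, successful run passes `--input-fields-terminated-by '\001'` to `sqoop export`.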
===========
3. The null placeholder must be specified
Because the null placeholder was not specified, the fields ended up misaligned.
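A small Python sketch (assumed sample data, not Sqoop's code) of the null-handling rule: Hive writes SQL NULL into text files as the two characters `\N`, and Sqoop must be told this via `--input-null-string` / `--input-null-non-string`, otherwise it tries to load the literal string into the target column:

```python
def parse_field(raw: str, null_token: str = "\\N"):
    """Map the Hive null token to Python's None; pass everything else through."""
    return None if raw == null_token else raw

# Hypothetical three-column row whose middle value is NULL in Hive.
row = "665A5FFA\x01\\N\x017".split("\x01")

# With null handling, the middle field becomes a real NULL.
assert [parse_field(f) for f in row] == ["665A5FFA", None, "7"]

# Without it, the field stays as the literal two-character string "\N",
# which then fails type conversion for non-string columns.
assert row[1] == "\\N" and len(row[1]) == 2
```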
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\001'
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:21:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:21:07 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:21:07 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:21:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:21:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:21:08 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/04d183c9e534cdb8d735e1bdc4be3deb/test.jar
13/08/21 09:21:08 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:21:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:21:09 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:21:09 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:21:09 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:21:10 INFO mapred.JobClient: Running job: job_201307251523_0061
13/08/21 09:21:11 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:21:17 INFO mapred.JobClient:  map 25% reduce 0%
13/08/21 09:21:19 INFO mapred.JobClient:  map 50% reduce 0%
13/08/21 09:21:21 INFO mapred.JobClient: Task Id : attempt_201307251523_0061_m_000001_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.NumberFormatException: For input string: "665A5FFA-32C9-9463-1943-840A5FEAE193"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
    at java.lang.Integer.parseInt(Integer.java:458)
    at java.lang.Integer.valueOf(Integer.java:554)
    at test.__loadFromFields(test.java:264)
    at test.parse(test.java:201)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more

===========

4. Success
[hadoop@hs11 ~]$ sqoop export --connect jdbc:mysql://10.10.20.11/test --username root --password admin --table test --export-dir /user/hive/warehouse/actmp --input-fields-terminated-by '\001' --input-null-string '\\N' --input-null-non-string '\\N'
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/08/21 09:36:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/21 09:36:13 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/08/21 09:36:13 INFO tool.CodeGenTool: Beginning code generation
13/08/21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:36:13 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/08/21 09:36:13 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/21 09:36:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/e22d31391498b790d799897cde25047d/test.jar
13/08/21 09:36:14 INFO mapreduce.ExportJobBase: Beginning export of test
13/08/21 09:36:15 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:36:15 INFO input.FileInputFormat: Total input paths to process : 1
13/08/21 09:36:15 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/21 09:36:15 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/21 09:36:16 INFO mapred.JobClient: Running job: job_201307251523_0064
13/08/21 09:36:17 INFO mapred.JobClient:  map 0% reduce 0%
13/08/21 09:36:23 INFO mapred.JobClient:  map 25% reduce 0%
13/08/21 09:36:25 INFO mapred.JobClient:  map 100% reduce 0%
13/08/21 09:36:27 INFO mapred.JobClient: Job complete: job_201307251523_0064
13/08/21 09:36:27 INFO mapred.JobClient: Counters: 18
13/08/21 09:36:27 INFO mapred.JobClient:   Job Counters
13/08/21 09:36:27 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=13151
13/08/21 09:36:27 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/08/21 09:36:27 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/08/21 09:36:27 INFO mapred.JobClient:     Rack-local map tasks=2
13/08/21 09:36:27 INFO mapred.JobClient:     Launched map tasks=4
13/08/21 09:36:27 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/08/21 09:36:27 INFO mapred.JobClient:   File Output Format Counters
13/08/21 09:36:27 INFO mapred.JobClient:     Bytes Written=0
13/08/21 09:36:27 INFO mapred.JobClient:   FileSystemCounters
13/08/21 09:36:27 INFO mapred.JobClient:     HDFS_BYTES_READ=1519
13/08/21 09:36:27 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=234149
13/08/21 09:36:27 INFO mapred.JobClient:   File Input Format Counters
13/08/21 09:36:27 INFO mapred.JobClient:     Bytes Read=0
13/08/21 09:36:27 INFO mapred.JobClient:   Map-Reduce Framework
13/08/21 09:36:27 INFO mapred.JobClient:     Map input records=6
13/08/21 09:36:27 INFO mapred.JobClient:     Physical memory (bytes) snapshot=663863296
13/08/21 09:36:27 INFO mapred.JobClient:     Spilled Records=0
13/08/21 09:36:27 INFO mapred.JobClient:     CPU time spent (ms)=3720
13/08/21 09:36:27 INFO mapred.JobClient:     Total committed heap usage (bytes)=2013790208
13/08/21 09:36:27 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=5583151104
13/08/21 09:36:27 INFO mapred.JobClient:     Map output records=6
13/08/21 09:36:27 INFO mapred.JobClient:     SPLIT_RAW_BYTES=571
13/08/21 09:36:27 INFO mapreduce.ExportJobBase: Transferred 1.4834 KB in 12.1574 seconds (124.9446 bytes/sec)
13/08/21 09:36:27 INFO mapreduce.ExportJobBase: Exported 6 records.

----------

5. The MySQL column is defined too short to hold the data
java.io.IOException: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
    at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:192)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'pid' at row 1
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2983)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1631)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3283)
    at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1332)
    at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:882)
    at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:233)

----------------------

6. Date format problem
For a MySQL `date` column, the string in Hive must be in yyyy-MM-dd form. I originally used yyyymmdd, which produced the error below.
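The yyyy-MM-dd requirement can be sketched in Python, standing in for `java.sql.Date.valueOf` (which appears in the stack trace that follows); the sample values are illustrative:

```python
from datetime import datetime

def to_date(s: str):
    """Accept only yyyy-MM-dd strings, like java.sql.Date.valueOf does."""
    return datetime.strptime(s, "%Y-%m-%d").date()

# The format MySQL's date column (and Date.valueOf) expects:
assert str(to_date("2013-08-21")) == "2013-08-21"

# The compact format the author originally stored in Hive is rejected:
try:
    to_date("20130821")
    raise AssertionError("expected a parse failure")
except ValueError:
    pass  # analogous to the java.lang.IllegalArgumentException in the log
```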
13/08/21 17:42:44 INFO mapred.JobClient: Task Id : attempt_201307251523_0079_m_000000_1, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.IllegalArgumentException
    at java.sql.Date.valueOf(Date.java:138)
    at cps_activation.__loadFromFields(cps_activation.java:308)
    at cps_activation.parse(cps_activation.java:255)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more

----------------------

7. Fields misaligned or types mismatched
Caused by: java.lang.NumberFormatException: For input string: "06701A4A-0808-E9A8-0D28-A8020B494E37"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
    at java.lang.Integer.parseInt(Integer.java:458)
    at java.lang.Integer.valueOf(Integer.java:554)
    at test.__loadFromFields(test.java:264)
    at test.parse(test.java:201)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
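A quick Python sketch of this failure mode: when the file's columns are shifted (or typed differently) relative to the MySQL table, a UUID-like string lands in an int(11) column. Java's `Integer.parseInt` throws `NumberFormatException`; Python's `int()` fails the same way:

```python
# Hypothetical misaligned value: a string column's content arriving where
# the generated record class expects an integer.
try:
    int("06701A4A-0808-E9A8-0D28-A8020B494E37")
    raise AssertionError("expected a parse failure")
except ValueError:
    # The fix is to align column order and types between the export file
    # and the target table (and check the delimiter settings above).
    pass
```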