Load HDFS data into InfiniDB cluster

gongcheng911 (qingsen zhou)
Load HDFS data into InfiniDB cluster

I got a new problem when loading HDFS data into an InfiniDB cluster (not in Hadoop mode):

[root@hadoop-master bin]# ./sqoop export -D mapred.task.timeout=0 --direct --connect jdbc:infinidb://hadoop-master/test --username root --table usersum --export-dir /testroot/data/sum_20140711*.csv --input-fields-terminated-by '\t'
14/07/26 14:46:33 INFO tool.BaseSqoopTool: Found an InfiniDB connect string, using a mysql connection string for compatibility
14/07/26 14:46:33 INFO tool.BaseSqoopTool: Using InfiniDB-specific delimiters for output if not explicitly specified
14/07/26 14:46:33 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/07/26 14:46:33 INFO tool.CodeGenTool: Beginning code generation
14/07/26 14:46:33 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `usersum` AS t LIMIT 1
14/07/26 14:46:33 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `usersum` AS t LIMIT 1
14/07/26 14:46:34 INFO orm.CompilationManager: HADOOP_HOME is /usr/cdh/hadoop-2.3.0-cdh5.0.2
Note: /tmp/sqoop-root/compile/478eed3401a8f47a392ad78c19ee54c5/usersum.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/07/26 14:46:35 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/478eed3401a8f47a392ad78c19ee54c5/usersum.jar
14/07/26 14:46:35 INFO mapreduce.ExportJobBase: Beginning export of usersum
14/07/26 14:46:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/07/26 14:46:36 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/07/26 14:46:37 WARN mapreduce.ExportJobBase: Input path hdfs://hadoop-master:8020/testroot/data/sum_20140711*.csv does not exist
14/07/26 14:46:37 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
14/07/26 14:46:37 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/07/26 14:46:37 INFO client.RMProxy: Connecting to ResourceManager at hadoop-master/192.168.1.62:8032
14/07/26 14:46:39 INFO input.FileInputFormat: Total input paths to process : 3
14/07/26 14:46:39 INFO mapreduce.InfiniDBExportInputFormat: Adding blocks to split hadoop-slave01
14/07/26 14:46:39 INFO mapreduce.InfiniDBExportInputFormat:    path hdfs://hadoop-master:8020/testroot/data/sum_201407110900.csv offset 0 length 35990671
14/07/26 14:46:39 INFO mapreduce.InfiniDBExportInputFormat: Adding blocks to split hadoop-slave02
14/07/26 14:46:39 INFO mapreduce.InfiniDBExportInputFormat:    path hdfs://hadoop-master:8020/testroot/data/sum_201407111000.csv offset 0 length 35990671
14/07/26 14:46:39 INFO mapreduce.InfiniDBExportInputFormat: Adding blocks to split hadoop-slave03
14/07/26 14:46:39 INFO mapreduce.InfiniDBExportInputFormat:    path hdfs://hadoop-master:8020/testroot/data/sum_201407110800.csv offset 0 length 35990671
14/07/26 14:46:39 INFO mapreduce.JobSubmitter: number of splits:3
14/07/26 14:46:39 INFO Configuration.deprecation: mapred.task.timeout is deprecated. Instead, use mapreduce.task.timeout
14/07/26 14:46:39 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1406356950848_0001
14/07/26 14:46:40 INFO impl.YarnClientImpl: Submitted application application_1406356950848_0001
14/07/26 14:46:40 INFO mapreduce.Job: The url to track the job: http://hadoop-master:8088/proxy/application_1406356950848_0001/
14/07/26 14:46:40 INFO mapreduce.Job: Running job: job_1406356950848_0001
14/07/26 14:46:47 INFO mapreduce.Job: Job job_1406356950848_0001 running in uber mode : false
14/07/26 14:46:47 INFO mapreduce.Job:  map 0% reduce 0%
14/07/26 14:46:47 INFO mapreduce.Job: Job job_1406356950848_0001 failed with state FAILED due to: Application application_1406356950848_0001 failed 2 times due to AM Container for appattempt_1406356950848_0001_000002 exited with exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
        at org.apache.hadoop.util.Shell.run(Shell.java:418)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
Container exited with a non-zero exit code 1
.Failing this attempt.. Failing the application.
14/07/26 14:46:47 INFO mapreduce.Job: Counters: 0
14/07/26 14:46:47 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
14/07/26 14:46:47 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 10.0758 seconds (0 bytes/sec)
14/07/26 14:46:47 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
14/07/26 14:46:47 INFO mapreduce.ExportJobBase: Exported 0 records.
14/07/26 14:46:47 ERROR tool.ExportTool: Error during export: Export job failed!

It seems Sqoop can't load the HDFS data into InfiniDB (not in Hadoop mode), or there is some error in my Hadoop cluster. The ExitCodeException has no output, so I don't know the reason yet.
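
(For reference: I think the WARN about the input path only refers to the literal glob string, since the three files are listed right afterwards, so the input side looks fine. A quick way to double-check the HDFS side, assuming the hadoop client is on the PATH and the glob is quoted so the local shell does not expand it:

# list the HDFS files behind the glob used by --export-dir
hadoop fs -ls 'hdfs://hadoop-master:8020/testroot/data/sum_20140711*.csv'
# total size of the input directory
hadoop fs -du -s /testroot/data

The failure itself happens when the AM container is launched, before any map task runs.)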

gongcheng911 (qingsen zhou)
I have found the problem

I have found the problem in the MapReduce logs on HDFS:

"Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster"

And I have checked job.xml and found this:

<property>
  <name>yarn.nodemanager.env-whitelist</name>
  <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,HADOOP_YARN_HOME</value>
  <source>yarn-default.xml</source>
</property>

Maybe I have to set these environment variables for the MapReduce job so that it can find the org.apache.hadoop.mapreduce.v2.app.MRAppMaster class.

But I have already set all of them in .bashrc for the root user and in /etc/profile.

It still can't find the class.
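
The one place I have not tried yet is the environment of the NodeManager itself. As far as I understand, the YARN daemons do not read .bashrc or /etc/profile when they are started from the service scripts, and only whitelisted variables that are present in the NodeManager's own environment get forwarded to the containers. A rough sketch of what I mean, in $HADOOP_CONF_DIR/yarn-env.sh on every node (the JAVA_HOME path is only a placeholder; HADOOP_HOME is taken from the job log above):

# sketch only -- the paths are assumptions for this layout, adjust to the real install
export JAVA_HOME=/usr/java/default
export HADOOP_COMMON_HOME=/usr/cdh/hadoop-2.3.0-cdh5.0.2
export HADOOP_HDFS_HOME=$HADOOP_COMMON_HOME
export HADOOP_YARN_HOME=$HADOOP_COMMON_HOME
export HADOOP_CONF_DIR=$HADOOP_COMMON_HOME/etc/hadoop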

Is this controlled by Sqoop?

What can I do about it?
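
(Edit: one more thing I am going to try, in case someone else hits the same MRAppMaster error: setting the application classpath explicitly in the config, so the AM container gets the Hadoop jars on its classpath instead of relying only on environment variables. This is just a sketch; the share/hadoop/* locations are my assumption for this install.

In mapred-site.xml:

<property>
  <name>mapreduce.application.classpath</name>
  <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
</property>

In yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value>$HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$HADOOP_YARN_HOME/share/hadoop/yarn/*,$HADOOP_YARN_HOME/share/hadoop/yarn/lib/*</value>
</property>

I will restart YARN after changing these and retry the Sqoop export.)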