Sqoop error when trying to connect

I am trying to run the following Sqoop command:

sqoop import --connect jdbc:mysql://localhost:3306/sunil_sqoop --table sqoop_emp --username root --password 225dvrdlr) 

However, I get this error:

17/02/04 00:04:53 WARN security.UserGroupInformation: PriviledgedActionException as:avinash (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar
17/02/04 00:04:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1093)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:267)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:388)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:481)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

What should I do?

answer


The error is:

File does not exist: hdfs://localhost:9000/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar 

The MapReduce job client resolves Sqoop's library path against fs.defaultFS (hdfs://localhost:9000 here), so it looks for the jar at that path in HDFS instead of on the local filesystem. You need to copy the slf4j-api-1.6.1.jar file into the corresponding directory in HDFS:

/home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/
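
For example, a minimal sketch using the standard HDFS shell, assuming the jar is still at the same local path on your machine:

# create the matching directory in HDFS
hdfs dfs -mkdir -p /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/
# copy the local Sqoop jar into it
hdfs dfs -put /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/

Then rerun the sqoop import command. If it fails on other jars from the same lib directory, you can -put those the same way.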

Alternatively, you can copy this jar into the Oozie sharelib.
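
A sketch of that approach, assuming a default sharelib layout under /user/oozie/share/lib (the exact lib_<timestamp> directory and the Oozie server URL depend on your installation):

# copy the jar into the sqoop sharelib directory in HDFS
hdfs dfs -put /home/avinash/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/slf4j-api-1.6.1.jar /user/oozie/share/lib/lib_<timestamp>/sqoop/
# tell Oozie to pick up the updated sharelib
oozie admin -oozie http://localhost:11000/oozie -sharelibupdate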