Fixing the error when Oozie runs Sqoop to import data from Oracle into Hive

 

When Oozie schedules a Sqoop job that imports data from Oracle into Hive, the job fails with the error below; running the same Sqoop script directly from the command line works without any problem.
51370 [Thread-67] INFO  org.apache.sqoop.hive.HiveImport  - FAILED: Error in semantic analysis: Line 2:17 Path is not legal ''hdfs://biginsightmaster:9000/user/biadmin/STUDENT'': Move from: hdfs://biginsightmaster:9000/user/biadmin/STUDENT to: hdfs://BigInsightMaster:9000/biginsights/hive/warehouse/student is not valid. Please check that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not conflict.
51404 [main] ERROR org.apache.sqoop.tool.ImportTool  - Encountered IOException running import job: java.io.IOException: Hive exited with status 10
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:364)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:314)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:226)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:205)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:174)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:467)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:435)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:371)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(AccessController.java:310)
at javax.security.auth.Subject.doAs(Subject.java:573)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
at org.apache.hadoop.mapred.Child.main(Child.java:249)

The exception appeared when importing data from the database into a Hive table through an Oozie sqoop action.

Running the same import with the sqoop command line did not report any error.

I went through the source code and found that Hive performs a mandatory check on the URI scheme. A Sqoop Hive import generates a Hive script, and the check is applied when that script runs its LOAD DATA INPATH statement.
I also went through every configuration file that gets read; they all use the same DNS name, written in upper case.
(By "scheme" here I mean the hdfs://NameNode:port prefix of the path; strictly speaking the NameNode host:port is the URI authority, and Hive compares both the scheme and the authority.)

If the two differ, even only in letter case, Hive throws a Path is not legal exception.
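
What Hive actually does is a plain, case-sensitive string comparison of the scheme and authority of the source and target URIs. Here is a minimal standalone sketch of that effect, using the two paths from the log above (the class name is mine; the comparison mirrors the Hive snippet quoted below):

import java.net.URI;

// Shows why hdfs://biginsightmaster:9000/... and hdfs://BigInsightMaster:9000/...
// are treated as different filesystems: getScheme()/getAuthority() return the
// strings exactly as written, and a case-sensitive equals() rejects the pair.
public class SchemeCheckDemo {
    public static void main(String[] args) {
        URI from = URI.create("hdfs://biginsightmaster:9000/user/biadmin/STUDENT");
        URI to = URI.create("hdfs://BigInsightMaster:9000/biginsights/hive/warehouse/student");

        System.out.println(from.getScheme().equals(to.getScheme()));       // true  ("hdfs" == "hdfs")
        System.out.println(from.getAuthority().equals(to.getAuthority())); // false (only the case differs)
    }
}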

The validation code in Hive:
    // only in 'local' mode do we copy stuff from one place to another.
    // reject different scheme/authority in other cases.
    if (!isLocal
        && (!StringUtils.equals(fromURI.getScheme(), toURI.getScheme()) || !StringUtils
            .equals(fromURI.getAuthority(), toURI.getAuthority()))) {
      String reason = "Move from: " + fromURI.toString() + " to: "
          + toURI.toString() + " is not valid. "
          + "Please check that values for params \"default.fs.name\" and "
          + "\"hive.metastore.warehouse.dir\" do not conflict.";
      throw new SemanticException(ErrorMsg.ILLEGAL_PATH.getMsg(ast, reason));
    }
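
As the comment in that snippet says, the check is skipped only for LOAD DATA LOCAL INPATH, where the file is copied into the warehouse; the non-local LOAD DATA INPATH that Sqoop generates moves the file within HDFS, so any scheme or authority mismatch between the source path and the warehouse path is rejected outright.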

Because of the same scheme case mismatch, reading the Hive table over the Hive JDBC driver from Eclipse also fails:
java.io.IOException: cannot find dir = hdfs://biginsightmaster:9000/biginsights/hive/warehouse/student/part-m-00000 in pathToPartitionInfo:[hdfs://BigInsightMaster:9000/biginsights/hive/warehouse/student];
Hive's check here is case sensitive: hdfs://biginsightmaster:9000/biginsights/hive/warehouse/student/ and hdfs://BigInsightMaster:9000/biginsights/hive/warehouse/student/ have different schemes as far as Hive is concerned, and that mismatch causes the error.
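
For reference, a minimal sketch of that kind of JDBC read. The driver class, connection URL, host, port and credentials below are assumptions (a HiveServer2-style connection is shown); substitute the values of the actual installation:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Reads the student table over Hive JDBC. With the case-mismatched table
// location still in the metastore, a query against the table fails with the
// "cannot find dir ... in pathToPartitionInfo" error shown above.
public class HiveReadDemo {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");               // assumed HiveServer2 driver
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://BigInsightMaster:10000/default", "biadmin", ""); // placeholder URL/user
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("select * from student");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}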

Solution 1: fix the scheme stored in Hive's metastore.
This Hive installation keeps its metadata in Derby, so it is enough to rewrite the hdfs scheme recorded in the Derby metastore. Connect with Derby's ij tool and run:
./ij
connect 'jdbc:derby://BigInsightMaster:1528/commonDb;user=HIVE;password=HIVE';

update DBS set DB_LOCATION_URI = replace(DB_LOCATION_URI, 'BigInsightMaster', 'biginsightmaster');

update SDS set LOCATION = replace(LOCATION, 'BigInsightMaster', 'biginsightmaster') where LOCATION like '%BigInsightMaster%';
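To verify, running select DB_LOCATION_URI from DBS; and select LOCATION from SDS; in the same ij session should now show only the lower-case hostname.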
Solution 2: fix the scheme of the student table only.
Run describe extended student to see the current location, then change it:
hive> alter table student set location 'hdfs://biginsightmaster:9000/biginsights/hive/warehouse/student';
