This article describes how to resolve a NoSuchMethodError on HTableDescriptor.addFamily. It should be a useful reference for anyone hitting the same problem; the question and accepted answer are reproduced below.

Problem description


I have installed Hadoop 2.5.2 and HBase 1.0.1.1 (which are compatible with each other). In my Hadoop code I am trying to add a column family to an HBase table.

My code is:

Configuration hbaseConfiguration = HBaseConfiguration.create();
Job hbaseImportJob = new Job(hbaseConfiguration, "FileToHBase");

HBaseAdmin hbaseAdmin = new HBaseAdmin(hbaseConfiguration);

// Create the table with one column family if it does not exist yet.
// Note: the existence check and the created table must use the same name,
// otherwise the branch re-runs (and fails) on every execution.
if (!hbaseAdmin.tableExists(Config_values.tableName)) {
    TableName tableName1 = TableName.valueOf(Config_values.tableName);
    HTableDescriptor hTableDescriptor = new HTableDescriptor(tableName1);
    HColumnDescriptor hColumnDescriptor1 = new HColumnDescriptor("columnFamily1");
    hTableDescriptor.addFamily(hColumnDescriptor1);
    hbaseAdmin.createTable(hTableDescriptor);
}

I am getting this error:

Solution

For safety, you should use the same version of HBase for both compiling and running your jar. When you run `mvn clean package -DskipTests=true`, make sure the HBase dependency in your POM matches your CDH HBase: not just by version number, but in the methods it actually contains, since CDH may not follow the Apache original exactly. You can also try the POM coordinates (Maven repository) that CDH supports on its website.
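The compile-time/runtime mismatch behind the NoSuchMethodError can be confirmed with plain reflection. The sketch below looks the HBase classes up by name (the class and method names are the real HBase 1.x ones), so it compiles and runs even without any HBase dependency; with a correct hbase-client on the classpath it should report the method as present.

```java
// Sketch: check at runtime whether the HBase on the classpath actually
// provides HTableDescriptor.addFamily(HColumnDescriptor). Everything is
// looked up by name, so no HBase jar is needed to compile this.
public class AddFamilyCheck {
    public static boolean hasAddFamily() {
        try {
            Class<?> htd = Class.forName("org.apache.hadoop.hbase.HTableDescriptor");
            Class<?> hcd = Class.forName("org.apache.hadoop.hbase.HColumnDescriptor");
            htd.getMethod("addFamily", hcd);
            return true;                  // method present: versions line up
        } catch (ClassNotFoundException e) {
            return false;                 // no HBase on the classpath at all
        } catch (NoSuchMethodException e) {
            return false;                 // an HBase version without addFamily
        }
    }

    public static void main(String[] args) {
        System.out.println("addFamily present: " + hasAddFamily());
    }
}
```

Run it with the exact classpath your job uses; `false` on a cluster that should have HBase 1.x means the runtime is loading a different HBase than the one you compiled against.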

    <name>c-cdh-maven-dep</name>
    <!-- you need to try both -->
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    <!-- I have tried this and it works well -->
    <!-- <url>http://maven.apache.org</url> -->

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>


    <!-- <repositories> <repository> <id>cloudera</id> <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url> 
        </repository> </repositories> -->

    <dependencies>
        <!-- <dependency> <groupId>junit</groupId> <artifactId>junit</artifactId> 
            <version>3.8.1</version> <scope>test</scope> </dependency> -->

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency>

    <!--    <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency> -->
<!--        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-maven-plugins</artifactId>
            <version>2.6.0-cdh5.7.0</version>
        </dependency> -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.2.0-cdh5.7.0</version>
        </dependency>
<!--        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.2.0</version>
        </dependency> -->


        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-hadoop2-compat</artifactId>
            <version>1.2.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>

        <!-- hadoop dependency start -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.0</version>
        </dependency>
        <!-- Hadoop dep end -->

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.5.1</version>
        </dependency>
        <!-- spark dep end -->

        <dependency>
            <groupId>org.clojure</groupId>
            <artifactId>clojure</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>11.0.2</version>
        </dependency>

        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>2.5.0</version>
        </dependency>
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
            <version>3.6.6.Final</version>
        </dependency>

        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.4.5</version>
        </dependency>
        <dependency>
            <groupId>org.cloudera.htrace</groupId>
            <artifactId>htrace-core</artifactId>
            <version>2.01</version>
        </dependency>


    <!--    <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>2.0.0-SNAPSHOT</version>
            <type>pom</type>
        </dependency> -->



        <!-- hbase dep start -->
        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
<!--        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>1.2.0</version>
            <type>pom</type>
        </dependency> -->



<!--        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-common</artifactId>
            <version>1.0.0</version>
        </dependency> -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.2.0</version>
        </dependency>
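Once the POM is fixed, it can still be useful to verify which jar actually supplies the HBase classes at runtime (cluster-provided jars can shadow the ones you bundled). This is a minimal sketch of the standard `ProtectionDomain`/`CodeSource` trick; the commented-out lines show how you would point it at the real HBase class on your cluster.

```java
// Sketch: report the jar (or directory) a class was loaded from, to verify
// that the HBase on the runtime classpath is the one you compiled against.
import java.net.URL;
import java.security.CodeSource;

public class JarLocator {
    /** Returns the jar/directory a class was loaded from, or null for JDK bootstrap classes. */
    public static String locate(Class<?> clazz) {
        CodeSource src = clazz.getProtectionDomain().getCodeSource();
        if (src == null) {
            return null;                  // bootstrap classes carry no code source
        }
        URL location = src.getLocation();
        return location == null ? null : location.toString();
    }

    public static void main(String[] args) throws Exception {
        // On your cluster, with the job's classpath, you would run:
        // Class<?> htd = Class.forName("org.apache.hadoop.hbase.HTableDescriptor");
        // System.out.println("HTableDescriptor loaded from: " + locate(htd));
        // If that jar's version predates addFamily, you get the NoSuchMethodError.
        System.out.println("This class loaded from: " + locate(JarLocator.class));
    }
}
```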

This concludes the article on the NoSuchMethodError from HTableDescriptor.addFamily. We hope the answer above is helpful; thank you for your support!
