Hadoop 2.4.1
HBase 0.98.5
[Reference]
http://diveintodata.org/2009/11/27/how-to-make-a-table-in-hbase-for-beginners/
Running the Java program:
$ hadoop jar hbop.jar HBoperation.HbaseOperation
where hbop.jar contains:
package HBoperation;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class HbaseOperation {
  public static void main(String[] args) throws Exception {
    // Create the HBase configuration object.
    // It is normally populated from $HBASE_HOME/conf when that directory is on
    // $HADOOP_CLASSPATH; in that case the myConf.set() call below is unnecessary.
    Configuration myConf = HBaseConfiguration.create();
    myConf.set("hbase.master", "192.168.0.7:60000");
    HBaseAdmin hbase = new HBaseAdmin(myConf); // create an Admin client to operate on HBase
    /////////////////////
    //  Create Table   //
    /////////////////////
    //HTableDescriptor desc = new HTableDescriptor("TEST"); // deprecated since 0.98
    HTableDescriptor desc = new HTableDescriptor(TableName.valueOf("TEST"));
    HColumnDescriptor meta = new HColumnDescriptor("personal".getBytes());
    HColumnDescriptor pref = new HColumnDescriptor("account".getBytes());
    desc.addFamily(meta);
    desc.addFamily(pref);
    hbase.createTable(desc);
    ///////////////////////
    //   Connect Table   //
    ///////////////////////
    HConnection hconnect = HConnectionManager.createConnection(myConf);
    HTableInterface testTable = hconnect.getTable("TEST");
    //////////////////////////
    //  Put Data to Table   //
    //////////////////////////
    Put p = new Put(Bytes.toBytes("student1"));
    p.add(Bytes.toBytes("personal"), Bytes.toBytes("name"), Bytes.toBytes("John"));
    p.add(Bytes.toBytes("account"), Bytes.toBytes("id"), Bytes.toBytes("3355454"));
    testTable.put(p);
    testTable.close();
    hconnect.close();
    hbase.close();
  }
}
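To verify the write from the Java side as well, the row can be read back with a Get before testTable.close(). A minimal sketch; Get and Result come from the org.apache.hadoop.hbase.client.* import already in the program:

// Read the row back to confirm the Put succeeded
Get g = new Get(Bytes.toBytes("student1"));
Result r = testTable.get(g);
byte[] name = r.getValue(Bytes.toBytes("personal"), Bytes.toBytes("name"));
System.out.println("name = " + Bytes.toString(name)); // should print "John"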
- Check HBase
$ hbase shell
hbase> list
- Result
TABLE
TEST
1 row(s) in 0.0390 seconds
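The inserted data can also be checked from the shell; scanning the table should show the "student1" row with its personal:name and account:id cells (the exact output format varies by shell version):
hbase> scan 'TEST'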
[Problem]
When I ran the jar file the first time, the following error occurred:
"opening socket connection to server localhost 127.0.0.1:2181 will not attempt to authenticate using SASL"
[Solution]
- THINK:
We set the HBase location (with ZooKeeper) to "192.168.0.7", so "server localhost 127.0.0.1" is weird. The HBase conf is probably not included in HADOOP_CLASSPATH, because we ran the program with the "hadoop jar" command.
- Method (an in-code alternative is sketched after these steps):
1. Modify the .bashrc file and add:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_HOME/conf
2. Reload the environment configuration:
$ . ~/.bashrc
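Alternatively (a hedged sketch, not tested here), the client can be pointed at ZooKeeper directly in code instead of relying on HADOOP_CLASSPATH, since the 0.98 client locates the cluster through the ZooKeeper quorum rather than hbase.master. The host and port below are assumed to match the cluster used above:

// Set the ZooKeeper quorum explicitly so the client doesn't fall back to localhost:2181
myConf.set("hbase.zookeeper.quorum", "192.168.0.7");       // assumed quorum host
myConf.set("hbase.zookeeper.property.clientPort", "2181"); // default client port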