Month: August 2016

How to Export/Import HBase Table


– EXPORT –

e.g., HBase table: hbase_test_table
and today's date is 20160820

1. Create a temporary folder in HDFS for the exported files:

$ hadoop fs -mkdir /tmp/hbasedump/20160820

2. Execute this shell command on any Hadoop node that has the HBase gateway installed:

$ hbase org.apache.hadoop.hbase.mapreduce.Export hbase_test_table /tmp/hbasedump/20160820/hbase_test_table
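The two steps above can be sketched as one small script. This is a sketch, not a drop-in tool: the date-stamped path convention is the one used in this post (/tmp/hbasedump/<yyyymmdd>/<table>), and the commented lines are the actual cluster commands, which need a node with the hadoop and hbase clients on the PATH:

```shell
# Build a date-stamped dump path for today's export of the table.
TABLE="hbase_test_table"
DUMP_DIR="/tmp/hbasedump/$(date +%Y%m%d)/${TABLE}"

# On a gateway node, uncomment to actually run the export:
# hadoop fs -mkdir -p "$(dirname "$DUMP_DIR")"
# hbase org.apache.hadoop.hbase.mapreduce.Export "$TABLE" "$DUMP_DIR"
echo "hbase org.apache.hadoop.hbase.mapreduce.Export $TABLE $DUMP_DIR"
```

Note that Export also accepts optional trailing arguments (number of versions, then start and end timestamps) if you only want to dump a subset of the data rather than the whole table.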

3. Don't forget to capture the table structure, so you will be able to import the data back later if needed:

$ hbase shell
hbase> describe 'hbase_test_table'
Table hbase_test_table is ENABLED
hbase_test_table
COLUMN FAMILIES DESCRIPTION
{NAME => 'test_cf', DATA_BLOCK_ENCODING => 'NONE', BLOOMFILTER => 'NONE', REPLICATION_SCOPE => '0', COMPRESSION => 'SNAPPY', VERSIONS => '1', MIN_VERSIONS => '0', TTL => 'FOREVER', KEEP_DELETED_CELLS => 'FALSE', BLOCKSIZE => '65536', IN_MEMORY => 'false', BLOCKCACHE => 'false'}
1 row(s) in 0.1290 seconds
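Rather than copying the output from the terminal, you can save the structure to a local file by piping the describe command through a non-interactive hbase shell. A minimal sketch (the file name hbase_test_table.schema is just an example):

```shell
# The describe command whose output documents the table's column families.
DESCRIBE_CMD="describe 'hbase_test_table'"
# On a gateway node, uncomment to save the structure alongside the dump:
# echo "$DESCRIBE_CMD" | hbase shell > hbase_test_table.schema
echo "$DESCRIBE_CMD"
```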

– IMPORT –

eg, hbase table: test_import_hbase_test_table

1. Let's say you have the exported dump files for that table in (HDFS) /tmp/hbasedump/20160820/hbase_test_table
and you want to import them into a new table "test_import_hbase_test_table".

2.

$ hbase shell

– Create the table "test_import_hbase_test_table" if it does not exist yet.
– Create it with the same column family name (use the information from export step #3 above).
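Based on the describe output captured in the export step, the create statement can also be issued non-interactively. A sketch, with the column family properties taken from the example above (trim the properties to what you actually need):

```shell
# Recreate the destination table with the same column family as the
# source ('test_cf' with SNAPPY compression, per the describe output).
CREATE_STMT="create 'test_import_hbase_test_table', {NAME => 'test_cf', COMPRESSION => 'SNAPPY', VERSIONS => 1}"
# On a gateway node, uncomment to actually create the table:
# echo "$CREATE_STMT" | hbase shell
echo "$CREATE_STMT"
```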

3. Start the import process:

$ hbase org.apache.hadoop.hbase.mapreduce.Import "test_import_hbase_test_table" "/tmp/hbasedump/20160820/hbase_test_table"
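After the import MapReduce job finishes, a quick sanity check is to compare row counts between the source table and the imported one; RowCounter is the standard HBase tool for that. A sketch (the commented line is the actual cluster command):

```shell
# Count rows in the imported table; run the same against the source
# table and compare the ROWS counter in the two jobs' output.
TABLE="test_import_hbase_test_table"
# On a gateway node, uncomment to run the count:
# hbase org.apache.hadoop.hbase.mapreduce.RowCounter "$TABLE"
echo "hbase org.apache.hadoop.hbase.mapreduce.RowCounter $TABLE"
```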