Knowledge increases exponentially. Today, you probably own more books than great universities of times past—Cambridge University owned fewer than two hundred books in the fifteenth century. First came the invention of writing, then alphabets, then paper, then the printing press, then mechanization. Each step caused an exponential increase in the collective human knowledge. In our generation, Al Gore invented the internet and the last barriers to the spread of knowledge have been broken. Today, everybody has the ability to contribute, communicate, and collaborate. We are all caught up in a tsunami, an avalanche, a conflagration, a veritable explosion of knowledge for the betterment of humankind. This is the blog of the good folks at Database Specialists, a brave band of Oracle database administrators from the great state of California. We bid you greeting, traveler. We hope you find something of value on these pages and we wish you good fortune in your journey.

RMAN Compression Algorithms in 11gR2

One of the many useful features of RMAN is its ability to create compressed backup sets. Before RMAN was widely adopted, most backups were compressed using OS utilities (gzip, compress, zip, WinZip). If you compress RMAN backup pieces in this manner, you will need to uncompress them manually before they can be used for recovery. This leaves room for human error and increases recovery time.
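To make the manual-uncompression point concrete, here is a minimal shell sketch. The filename and path are made up for illustration, and a dummy text file stands in for a real backup piece:

```shell
# Stand-in for an RMAN backup piece (hypothetical path; any file works).
echo "pretend this is a backup piece" > /tmp/db_01abc.bkp

# OS-level compression after the backup completes.
gzip /tmp/db_01abc.bkp            # leaves /tmp/db_01abc.bkp.gz

# Before RMAN can restore from this piece, someone must remember to
# uncompress it by hand -- the extra step that invites human error
# and lengthens recovery time.
gunzip /tmp/db_01abc.bkp.gz       # restores /tmp/db_01abc.bkp
```

With RMAN-native compressed backup sets, no such manual step exists: RMAN uncompresses the pieces itself during restore.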

As of version 11.2.0.2, there are four compression algorithms available: BASIC, LOW, MEDIUM, and HIGH. The 11g Backup and Recovery User's Guide describes these options as follows:

  • BASIC - The default compression algorithm.
  • HIGH - Best suited for backups over slower networks, where network speed is the limiting factor.
  • MEDIUM - Recommended for most environments; a good combination of compression ratio and speed.
  • LOW - Least impact on backup throughput; suited for environments where CPU resources are the limiting factor.
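If you want to confirm which algorithms your particular release supports, the dynamic performance view V$RMAN_COMPRESSION_ALGORITHM lists them. A query along these lines, run from SQL*Plus as a suitably privileged user, shows each algorithm's name, description, and whether it is the default (exact column names per the 11.2 Database Reference):

```sql
-- List the compression algorithms available to RMAN in this release.
SELECT algorithm_name, algorithm_description, is_default
  FROM v$rman_compression_algorithm
 WHERE is_valid = 'YES';
```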

Unfortunately, unless you have purchased the Advanced Compression Option, your only choice is BASIC. Regardless, I did some testing to compare the compression ratios as well as the time each backup takes. The test script I used is pretty simple: it sets the compression algorithm and then takes a full backup of the database and archived logs. As a final test, I took a non-compressed RMAN backup and then used gzip to compress it. While I wouldn't recommend doing backups this way, I think it is interesting for comparison purposes.

Test script:
#BASIC
CONFIGURE COMPRESSION ALGORITHM 'BASIC';
backup as compressed backupset database format '/tmp/basic/db%U.dbf';
backup as compressed backupset archivelog all format '/tmp/basic/arch%U.dbf';

#LOW
CONFIGURE COMPRESSION ALGORITHM 'LOW';
backup as compressed backupset database format '/tmp/low/db%U.dbf';
backup as compressed backupset archivelog all format '/tmp/low/arch%U.dbf';

#MEDIUM
CONFIGURE COMPRESSION ALGORITHM 'MEDIUM';
backup as compressed backupset database format '/tmp/medium/db%U.dbf';
backup as compressed backupset archivelog all format '/tmp/medium/arch%U.dbf';

#HIGH
CONFIGURE COMPRESSION ALGORITHM 'HIGH';
backup as compressed backupset database format '/tmp/high/db%U.dbf';
backup as compressed backupset archivelog all format '/tmp/high/arch%U.dbf';

#NONE
backup database format '/tmp/none/db%U.dbf';
backup archivelog all format '/tmp/none/arch%U.dbf';

This test was done using 11.2.0.2 Enterprise Edition on 64-bit Linux. The total amount of data that needs to be backed up (datafiles plus archived logs) is approximately 40 GB, but RMAN skips never-used blocks, making even the non-compressed backup smaller than that.

As you can see from the results below, there is a big difference in both compression ratio and elapsed time. HIGH took twice as long as LOW, but the backup is about 43% smaller (4.5 GB versus 7.9 GB). The BASIC option performed quite well, particularly considering that it is the only choice that doesn't require additional licensing. Clearly the slowest option is to take a non-compressed backup and then compress it yourself. As with every test, your results may differ from mine; the type and amount of data as well as the hardware in your environment will all make a difference. From my test, I conclude that the RMAN BASIC compression algorithm does quite a good job and there is probably no need to use any of the other options.
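Using the 23 GB non-compressed backup as the baseline, the compression ratios work out as follows (a quick shell sketch; the sizes are taken straight from the test results table):

```shell
#!/bin/sh
# Compression ratio of each option relative to the 23 GB
# non-compressed backup, using the measured sizes in GB.
for entry in BASIC:6.6 LOW:7.9 MEDIUM:6.4 HIGH:4.5 gzip:5.2; do
  name=${entry%%:*}      # label before the colon
  size=${entry##*:}      # compressed size after the colon
  awk -v n="$name" -v s="$size" 'BEGIN { printf "%s: %.1fx\n", n, 23 / s }'
done
```

This prints BASIC: 3.5x, LOW: 2.9x, MEDIUM: 3.6x, HIGH: 5.1x, gzip: 4.4x, which makes the same point numerically: BASIC buys most of the benefit of the licensed options for free.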

Test Results:

Algorithm     Time to back up (min)   Size of database backup (GB)
BASIC         33                       6.6
LOW           28                       7.9
MEDIUM        29                       6.4
HIGH          56                       4.5
NONE          31                      23.0
NONE + gzip   61                       5.2
