Sunday, June 13, 2010

Using Reduced Redundancy Storage (RRS) in S3

This is just a quick blog post to provide a few examples of using the new Reduced Redundancy Storage (RRS) feature of S3 in boto.  This new storage class in S3 gives you the option to trade off redundancy for cost.  The normal S3 service (and corresponding pricing) is based on an 11-nines (yes, that's 99.999999999%; thanks to Jeff Barr for the correction in the comments below) level of durability.  In order to achieve this extremely high level of durability, the S3 service must incorporate a high level of redundancy.  In other words, it keeps many copies of your data in many different locations so that even if multiple locations encounter failures, your data will still be safe.

That's a great feature, but not everyone needs that level of redundancy.  If you already have copies of your data locally and are just using S3 as a convenient place to store data that is actively being accessed by services within the AWS infrastructure, RRS may be for you.  It provides a much lower level of durability (99.99%) at a significantly lower cost.  If that fits the bill for you, the next three code snippets will provide you with the basics you need to start using RRS in boto.

Create a New S3 Key Using the RRS Storage Class
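A minimal sketch of this pattern, assuming the reduced_redundancy keyword argument on boto's set_contents_from_* methods; the bucket name, key name, and file path are placeholders:

    import boto

    # Connect using credentials from the environment or your boto config.
    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')  # placeholder bucket name

    # Create the new key and upload its contents with the
    # REDUCED_REDUNDANCY storage class.
    key = bucket.new_key('mykey')  # placeholder key name
    key.set_contents_from_filename('/path/to/local/file',
                                   reduced_redundancy=True)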


Convert An Existing S3 Key from Standard Storage Class to RRS
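A minimal sketch, assuming boto's Key.change_storage_class method; the bucket and key names are again placeholders:

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')  # placeholder bucket name

    # Look up the existing key and switch it to the RRS storage class.
    # Note that S3 implements this as a copy-in-place, so it counts as
    # a COPY request against the bucket.
    key = bucket.get_key('mykey')  # placeholder key name
    key.change_storage_class('REDUCED_REDUNDANCY')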


Create a Copy of an Existing S3 Key Using RRS
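A minimal sketch, assuming the reduced_redundancy parameter of boto's Key.copy; the destination key name is a placeholder:

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('mybucket')  # placeholder bucket name

    # Copy the existing key to a new name, storing the copy with RRS.
    # The original key keeps its current storage class.
    key = bucket.get_key('mykey')  # placeholder key name
    key.copy('mybucket', 'mykey_rrs', reduced_redundancy=True)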

6 comments:

  1. S3's standard storage class is actually "just" 11 9's of durability.

  2. Thanks for posting this and implementing the storage class changing feature. I wrote a very simple extension of your script which will convert all the objects in a bucket to the reduced redundancy model.
    http://www.bryceboe.com/2010/07/02/amazon-s3-convert-objects-to-reduced-redundancy-storage/

  3. I get this error

    AttributeError: 'Provider' object has no attribute 'storage_class'

    Does this mean my bucket does not have this RRS feature enabled?

  4. I too get this "AttributeError: 'Provider' object has no attribute 'storage_class'"

    Anyone found any solutions??

  5. Hi -

    Could you create an issue on the Google Code project site (http://boto.googlecode.com/)? Provide any details you can and then I can track this to resolution. Thanks!

    Mitch

  6. Hi,

I am trying to store new files to S3 using boto v1.9x, and this is what I am doing:

    headers['x-amz-acl'] = acl
    headers['x-amz-storage-class'] = 'REDUCED_REDUNDANCY'
    key.set_contents_from_filename(filename, headers,)


    However, when I list the metadata of the key after I uploaded my file to S3, the storage class is 'STANDARD' and is not 'REDUCED_REDUNDANCY' as expected.

Has anyone tried uploading to S3 using boto v1.9x and seen anything like this?

    Thanks,
    KT
