Tuesday, February 4, 2014

Back To The Future

Today is my last day at Amazon Web Services.

I have accepted a position as Director of Cloud Operations at Scopely. Scopely is an L.A.-based startup focusing on multiplayer mobile games. They have had a number of very successful games and have a clear vision of where they want to go. And they are all-in on AWS. So, I will be using boto and the AWS CLI and a host of other tools to help them achieve their goals and use AWS efficiently and effectively. They are a great group of people and I'm really excited and grateful for the opportunity. I originally created boto so I could use it to build cool things on AWS and I'm looking forward to doing that again.

Leaving AWS is difficult. I've been a customer since the beginning but being part of AWS as an employee has been a fantastic experience. The things that really stand out for me are:

  • People - Over the past two years I've been gobsmacked by the consistently high quality of the people I've worked with at AWS. It's remarkable.
  • Innovate and Iterate - People talk about the pace of innovation at AWS and it is very impressive. But innovation is just the fun part, the sexy part. What ultimately leads to success is the patient, persistent, customer-focused iteration that occurs after the initial "A ha!" moment. I've never seen anyone do it better.
  • Support for Open Source - When I joined AWS, I brought with me a mature and vibrant open source project. There were innumerable ways things could have gone pear-shaped. But they didn't. We worked together to build a partnership that allowed AWS to contribute while also allowing the boto community to contribute as they always have. In addition to boto, all of the other AWS SDKs are released as open source and welcome contributions. In my experience, AWS has shown a real respect and appreciation for open source software and the communities that emerge around it. Boto has been immeasurably improved by AWS's participation and I am glad to know it will continue in the future.

Friday, December 16, 2011

Looking at Clouds from Both Sides Now

I'll apologize up front for that horrible pun in the title.  No excuse, really.

After 18 months at Eucalyptus, the best private cloud vendor out there, I have decided to see what things are like on the public cloud side.  As of Monday, December 19, I will be a senior engineer at Amazon Web Services.

I was very reluctant to leave Eucalyptus.  It is a great company, full of great people and with a corporate culture that absolutely cannot be beat.  And, while a lot of people's attention has been focused on shiny new things over the past year, Eucalyptus has quietly and steadily built amazing sales, support, marketing and professional services teams to match their already awesome engineering team.  2012 is going to be another kick-ass year for Eucalyptus and I really hate to miss that.

But the idea of seeing how the sausage is made at the biggest public cloud is an opportunity I couldn't pass up.  In my new job, I will still be focusing on software tools and how to make it easier for developers to use cloud infrastructures, both public and private.  I will still be doing a lot of Python stuff and definitely still making sure that boto stays a popular, useful and independent open source project just as it did while I was at Eucalyptus.

It should be fun!

Wednesday, December 7, 2011

Don't reboot me, bro!

If you are an AWS user with EC2 instances running, you may have already gotten an email from AWS informing you that your instance(s) will be rebooted in the near future.  I'm not exactly sure what is prompting this massive rebooting binge but the good folks at AWS have actually provided a new EC2 API request just so you can find out about upcoming maintenance events planned for your instances.

We just committed code to boto that adds support for the new DescribeInstanceStatus request.  Using this, you can programmatically query the status of any or all of your EC2 instances and find out if there is a reboot in their future and, if so, when to expect it.

Here's an example of using the new method and accessing the data returned by it.
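A sketch along those lines, using boto 2.x naming (the attribute names on the status and event objects below follow boto's API; adapt as needed for your setup):

```python
def report_scheduled_events(statuses):
    """Build one report line per scheduled maintenance event."""
    lines = []
    for status in statuses:
        for event in status.events or []:
            # Each scheduled event carries a code (e.g. system-reboot),
            # a description, and a maintenance window.
            lines.append('%s %s: %s (%s to %s)' % (
                status.id, event.code, event.description,
                event.not_before, event.not_after))
    return lines

if __name__ == '__main__':
    import boto
    # Uses the credentials from your ~/.boto config file.
    conn = boto.connect_ec2()
    # Maps to the DescribeInstanceStatus request; with no arguments
    # it queries all of your instances.
    for line in report_scheduled_events(conn.get_all_instance_status()):
        print(line)
```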

Sunday, November 13, 2011

Mapping Requests to EC2 API Versions

I recently did some analysis of the EC2 API.  I wanted to look at the API over time so I could remember which API requests were added in each of the 23 separate versions of the API over the past 5 years.  The results were kind of interesting and I thought it would be worthwhile to share them here.

The following image shows a graph of the number of requests over time.  If you click on the image, you will see a high-res PNG version that lets you zoom in to get much greater detail.  The reddish-colored section of each bar in the graph actually contains the names of the individual requests added in that version, but those are really only readable in the high-res version of the graphic.

Note that this analysis is only looking at the request level.  I'm not diving deeper to look at the individual parameters in each request which, in some cases, have also changed over time.  I may do that analysis at some point but it's a huge amount of work and I doubt that I'll find the time.

The raw JSON data behind this can be found in the missingcloud github repo.
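As a purely hypothetical sketch of this kind of analysis: given a mapping of request name to the API version that introduced it, counting the requests added per version is a one-liner.  The actual JSON schema in the missingcloud repo may differ, and the sample data below is made up for illustration.

```python
import json
from collections import Counter

def requests_per_version(request_to_version):
    """Return (version, count) pairs, oldest API version first."""
    return sorted(Counter(request_to_version.values()).items())

# Made-up sample in the assumed request-name -> version shape.
sample = json.loads("""{
    "RunInstances": "2006-06-26",
    "DescribeInstances": "2006-06-26",
    "CreateTags": "2010-08-31",
    "DescribeInstanceStatus": "2011-11-01"
}""")

for version, count in requests_per_version(sample):
    print('%s: %d new request(s)' % (version, count))
```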

Monday, October 31, 2011

Python and AWS Cookbook Available

I recently completed a short book for O'Reilly called "Python and AWS Cookbook".  It's a collection of recipes for solving common problems in Amazon Web Services.  The solutions are all in Python and, of course, use boto heavily.  The focus of this book is EC2 and S3 although there are a couple of quick detours into IAM and SNS.  Many of the examples also work with Eucalyptus so I have included some information about using boto with Eucalyptus as well as with Google Cloud Storage for some of the S3-related recipes.

You can get a hardcopy of the book but if you buy the e-book, you get free updates and I am expecting to do quite a few updates.  Many of the recipes came from problems people have posted on the boto users group or on the boto IRC channel but I'm sure there are lots of other areas where additional example code would be useful.  If you have specific requests, let me know.  Depending on the response, I might also do additional cookbooks that focus on other services.

The bird on the cover is a Sand Grouse.  I lobbied heavily for a Honey Badger but to no avail.

Friday, October 14, 2011

Does Python Scale?

I wonder how many times I've been asked that question over the years.  Often, it's not even in the form of a question (Sorry, Mr. Trebek) but rather stated emphatically; "Python doesn't scale".  This can be the start of long, heated discussions involving Global Interpreter Locks, interpreters vs. compilers, dynamic vs. static typing, etc.  These discussions rarely end satisfactorily for any of the parties involved.  And rarely are any opinions changed as a result.

So, does Python scale?

Well, YouTube is written mostly in Python.  DropBox is written almost entirely in Python.  Reddit.  Quora.  Disqus.  FriendFeed.  These are huge sites, handling gazillions of hits a day.  They are written in Python.  Therefore, Python scales.

Yeah, but what about that web app I wrote that one time.  Hosted on a cheapo, oversubscribed VPS, running straight CGI talking to a remote MySQL database running in a virtual machine on my MacBook Air.  That thing fell over like a drunken sailor when I invited a few of my friends to go check it out.  So, yeah.  Forget what I said before.  Obviously Python doesn't scale.

The truth is, it's the wrong question.  The stuff that allows Dropbox to store a million files every 15 minutes has little to do with Python just as the things that caused my feeble web app to fail had little to do with Python.  It has to do with the overall architecture of the application.  How databases are sharded, how loosely or tightly components have been coupled, how you monitor, and how you react to the data your monitoring is providing you.  And lots of other stuff.  But you have to deal with those issues no matter what language you write the system in.

No reasonable choice of computer language is going to guarantee your success or your failure.  So pick the one you are most productive in and focus on properly architecting your app.  That scales.

Thursday, October 13, 2011

Accessing the Eucalyptus Community Cloud with boto

The Eucalyptus Community Cloud (ECC) is a great resource that allows you to try out a real cloud computing system without installing any software or incurring any costs.  It's a sandbox environment that is maintained by Eucalyptus Systems to allow people to test-drive Eucalyptus software and experiment with cloud computing.

To access the ECC, you need to sign up following the instructions here.  Once you are signed up, you will be able to download a zip file containing the necessary credentials for accessing the ECC.  If you unzip that file somewhere on your local filesystem you will find, among other things, a file called eucarc.  The contents of that file will look something like this:
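Something like the following, that is; the endpoint and key values here are illustrative placeholders, so use the actual values from your own download:

```
export EC2_URL=http://ecc.eucalyptus.com:8773/services/Eucalyptus
export S3_URL=http://ecc.eucalyptus.com:8773/services/Walrus
export EC2_ACCESS_KEY='your-access-key-id'
export EC2_SECRET_KEY='your-secret-key'
```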

To get things to work seamlessly in boto, you need to copy a few pieces of information from the eucarc file to your boto config file, which is normally found in ~/.boto.  Here's the info you need to add.  The actual values, of course, should be the ones from your own eucarc file.
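A sketch of the relevant sections; the eucalyptus_host and walrus_host option names are boto's, and I'm assuming here that you put your ECC keys into boto's standard [Credentials] options (if you also use AWS proper, you can instead pass the keys explicitly to the connect calls):

```
[Credentials]
aws_access_key_id = <value of EC2_ACCESS_KEY from eucarc>
aws_secret_access_key = <value of EC2_SECRET_KEY from eucarc>

[Boto]
eucalyptus_host = ecc.eucalyptus.com
walrus_host = ecc.eucalyptus.com
```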

Notice that the values needed for eucalyptus_host and walrus_host are just the hostname or IP address of the server as specified in the EC2_URL and S3_URL variables.  You don't have to include the port number or the http prefix.  Having edited your boto config file, you can now easily access the ECC services in boto.
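For instance, a minimal sketch using boto 2.x: the connect_euca and connect_walrus convenience functions pick up the host settings from your config file.

```python
def instance_summary(reservations):
    """Flatten EC2 reservations into (instance-id, state) pairs."""
    return [(instance.id, instance.state)
            for reservation in reservations
            for instance in reservation.instances]

if __name__ == '__main__':
    import boto
    ec2 = boto.connect_euca()       # EC2 API against Eucalyptus
    walrus = boto.connect_walrus()  # S3 API against Walrus
    for instance_id, state in instance_summary(ec2.get_all_instances()):
        print('%s: %s' % (instance_id, state))
    for bucket in walrus.get_all_buckets():
        print(bucket.name)
```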

This example assumes you are using the latest version of boto from github or the release candidate for version 2.1 of boto.