S3 Commando (Line)

Image credit: Screenshot of Hulk Hogan in Suburban Commando

In my last post I wrote about moving bavatuesdays (and the rest of my personal, server-based online life) to Reclaim Hosting. It was long overdue, and it's worth recognizing that my partner in all things Reclaim, Tim Owens, has breathed new life into just about everything I do these days. Not only did he architect the Reclaim Hosting infrastructure from the ground up, but he's also a ticket support guru. More than that, he's got me installing and playing with new open source applications like Known, imagining possibilities for virtualized server environments, and even pushing myself to wrap my head around services like Amazon S3, which I'll get to shortly.

He's injected new life into the work we do at DTLT; in fact, he's really helped us return to the experimental spirit and fun that defined this group from 2004 through 2008. I would wager I'm not the only one at DTLT who feels this way, because on top of all the awesome he brings to bear on what we do, Tim's a complete pleasure to work with: sharp sense of humor, no ego, and all about ferociously pushing the edge of edtech. En fuego.

I wrote that so I can write this: Tim has been coaching me through my move from my old host to Reclaim, and we've been in a holding pattern for a few months. The obvious reason is time; another is that I suck at passwords; and a third is that I have some crazy DNS, subdomain, and wildcard Apache redirects happening on the multi-network WordPress site I have been hacking on since 2007. It's a house of cards, and after this move I'm going to revisit how it's set up, but in the meantime making sure the various domains and subdomains mapped onto my multi-network site still worked took a bit of courage.

I finally committed to moving my online life this weekend. After doing a full backup of my cPanel account, I tried uploading it to the Reclaim Hosting server I would be moving to. Unfortunately, the 12 GB compressed file failed to transfer via both FTP and the File Manager in cPanel. The only other way I could think to get this fairly large file to Tim was uploading it directly to Amazon S3, which I did. Turns out that was a great solution, because Tim showed me how I could use the S3 command-line tool (s3cmd) to transfer that file to the server in a matter of seconds (the download from my old server took over four hours, and the upload to S3 took at least another hour). So, below is a tutorial Tim created for me explaining how I could use s3cmd to transfer my full backup from Amazon S3 to the Reclaim Hosting server. So cool!*
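As it happens, s3cmd can handle the upload side as well, which would have saved me the trip through the S3 web interface. A minimal sketch, assuming a hypothetical bucket and backup filename (substitute your own):

```shell
# Hypothetical names for illustration -- replace with your own bucket and backup file.
BUCKET="my-backup-bucket"
BACKUP="backup-full-bavatuesdays.tar.gz"

# Push the cPanel backup up to S3 (assumes s3cmd is already configured with your keys):
s3cmd put "$BACKUP" "s3://$BUCKET/$BACKUP"
```

Running this on the old server itself sidesteps pulling the file down to a local machine first, which is where the hours of FTP time went.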

The tutorial below was written by Tim Owens; I am stealing it for posterity. I blurred out various credentials for Amazon S3, but the commands he employs to make this happen are all there. I'm beginning to understand why the hardcore geeks prefer the command line: more custom control!


Using s3cmd to work with Amazon S3

s3cmd is a command line tool that will allow you to interact directly with Amazon S3 buckets that you have access to. You can do a variety of things from downloading or uploading files/folders to syncing a folder’s contents to a bucket. Here’s a quick tutorial to download a specific file (you can view all available s3cmd commands here).

1. Log in to a server via ssh

2. You will use the command s3cmd --configure to set up your access to S3. It asks for the Access Key ID and Secret Access Key. I don't currently put anything in for encryption or SSL.

3. You can test the credentials to confirm s3cmd can communicate with Amazon S3, then navigate to the directory you want to download the file into. You can list the files in a bucket by typing s3cmd ls s3://bucketname

4. Downloading a file is as easy as typing s3cmd get s3://bucketname/filename and it will start downloading. Transfer is really fast since it’s a direct connection between the server and Amazon S3.

5. You’re done! The file will now be stored locally on the server.
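Put together, the steps above boil down to a few commands. A sketch with a hypothetical bucket and filename (the configure step is interactive and only needs to be run once per server):

```shell
# One-time, interactive setup: prompts for your Access Key ID and Secret Access Key.
# s3cmd --configure

# Hypothetical names for illustration -- use your own bucket and file.
BUCKET="my-backup-bucket"
FILE="backup-full-bavatuesdays.tar.gz"

# Confirm access and find the file by listing the bucket's contents:
s3cmd ls "s3://$BUCKET"

# Download the file into the current directory:
s3cmd get "s3://$BUCKET/$FILE"
```

Because the transfer runs directly between the server and Amazon S3, it never touches your home connection, which is why a multi-gigabyte file moves in seconds rather than hours.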

*Tim's tutorial is so good, particularly because he did the transfer to Reclaim while writing it, saving me the extra step of actually using s3cmd 🙂 Although, I spent the next hour transferring random large files to the Reclaim server just to be blown away by a 1 GB file being transferred in 15 seconds.

This entry was posted in Archiving, AWS, bavatuesdays, s3. Bookmark the permalink.

