Running S3 on your local computer

Lately, I've been experimenting with AWS S3, EC2, and Lambda, and the process is pretty great. That said, I don't feel comfortable wasting compute time by spinning up an S3 bucket or EC2 instance when I can first write my code on a local machine. A second benefit is that I can work offline while traveling.


Fake S3 allows you to run a mock S3 server on your local machine. As of right now, it's only available as a Ruby gem.

sudo gem install fakes3

Once you install the Ruby gem, the next step is to start a web server on port 4567. The --limit parameter artificially caps throughput in order to simulate a mobile phone connection.

sudo fakes3 -r /mnt/fakes3_root -p 4567 --limit=50K

That's it!
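Before pointing any AWS code at the local server, it's worth a quick sanity check that it's actually listening. A minimal Ruby probe (a sketch; `fakes3_running?` is just an illustrative helper name, assuming the default host and port used above):

```ruby
require 'net/http'

# Fake S3 speaks plain HTTP, so any response from localhost:4567
# means the server is up; a refused connection means it isn't.
def fakes3_running?(host = 'localhost', port = 4567)
  Net::HTTP.get_response(host, '/', port).is_a?(Net::HTTPResponse)
rescue Errno::ECONNREFUSED, SocketError
  false
end

puts fakes3_running? ? "Fake S3 is up" : "Fake S3 is not running"
```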

Writing to S3, reading from S3, and a few other DevOps-type requests

After installing Fake S3, the next step was to test it out. The script below allows an administrator to:

  1. Create an S3 bucket.
  2. Create 26 files, a through z, each containing the words "Hello World", inside the bucket.
  3. Iterate through each file in the bucket (26 times), create a JSON file from it, and publish it to the /uploads folder.
  4. Copy the entire /uploads folder to a new folder named /ec2.
  5. Delete the bucket.

require 'rubygems'
#This is the AWS SDK
require 'aws/s3'
#This is the tool that lets you do bash-like commands
require 'fileutils'
#This is simply for demonstration purposes
require 'time'
#This is simply for the demo
require 'json'

#This is just a class that focuses on creating functions that are based on bash commands
class CopyUtil
  APP_ROOT   = File.dirname(__FILE__)
  OUTPUT_DIR = "uploads"
  EC2_DIR    = "ec2"

  #A. This is the first method fired when the class is instantiated
  def initialize
    create_output_directory
  end

  #B. Create the /uploads directory
  def create_output_directory
    #Make it platform independent
    FileUtils.mkdir_p(File.join(APP_ROOT, OUTPUT_DIR))
  end

  #Create a file and write the given content into it
  def create_file(file_name, file_content)
    @file_type = ".json"
    @mode = "w"
    @output =, file_name + @file_type), @mode)
    @output.puts file_content
    @output.close
  end

  #C. Copy the directory
  def copy_files
    FileUtils.mkdir_p(EC2_DIR)
    FileUtils.cp_r(OUTPUT_DIR + "/.", EC2_DIR)
  end
end

#This class is simply designed for the demo
class JSONUtil
  def create(key, value)
    { "#{key}" => "#{value}:#{get_timestamp}" }.to_json
  end

  def get_timestamp
  end
end

#include a library
include AWS::S3

#Create an S3 connection
AWS::S3::Base.establish_connection!(:access_key_id => "123",
                                    :secret_access_key => "abc",
                                    :server => "localhost",
                                    :port => "4567")

#The name of the bucket
my_first_bucket = 'myFirstBucket'

#Create the bucket
Bucket.create(my_first_bucket)

#Go from A - Z and store crap in the bucket
('a'..'z').each do |filename|, 'Hello World', my_first_bucket)
end

#Create a new tool that will make files and copy them over
copy_util =

#Print out the contents of the bucket
bucket = Bucket.find(my_first_bucket)
#Iterate through each item in the bucket and create a json file
json_util =
bucket.objects.each do |s3_obj|
  @key   = "#{s3_obj.key}"
  @value = "#{s3_obj.value}"
  #Print this out on terminal
  puts @key + ":" + @value
  #Create a JSON file
  @json = json_util.create(@key, @value)
  #Copy the contents of this bucket into files in /uploads
  copy_util.create_file(@key, @json)
end

#Copy over the entire directory
copy_util.copy_files

# Delete your bucket and all its keys
Bucket.delete(my_first_bucket,:force => true)
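For reference, each file the loop above writes is a one-pair JSON object whose value is the object's contents plus a timestamp. A standalone sketch of the same pattern, with no S3 involved (`to_entry` is just an illustrative name):

```ruby
require 'json'
require 'time'

# Standalone version of the JSONUtil pattern used above: one key,
# and a value that appends a timestamp to the object's contents.
def to_entry(key, value)
  { "#{key}" => "#{value}:#{}" }.to_json
end

puts to_entry('a', 'Hello World')
```

The output is a single JSON pair such as {"a":"Hello World:<timestamp>"}.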

I wrote this as a single file so that it can later be ported to a single function within AWS Lambda.

The next step is to port this over to NodeJS (or Python) so that I can use this as an AWS Lambda function.

If I'm porting this over, why did I write it in Ruby? Because I love Ruby!