Mock Amazon S3 bucket for local development
I also include a .NET Core application as an example of how to work with the local S3 bucket. Check this git repo: https://github.com/yarchiT/awsS3bucket-local
Intro
There are different reasons to run an S3 bucket locally:
- To avoid spending money and resources on a real AWS S3 bucket.
- For testing (an end-to-end testing environment was my use case).
- You can work offline!
How can we actually achieve this?
Mainly thanks to LocalStack (https://github.com/localstack/localstack), an easy-to-use, fully functional test/mocking framework for developing cloud applications. And you can still use the AWS CLI as a tool to manage your AWS services.
Setup
- Install Docker. We will run LocalStack as a Docker container, but I also added a Dockerfile and instructions on how to run it to the git repo.
- Then you need to install awscli-local (https://github.com/localstack/awscli-local) as a wrapper around LocalStack. The `awslocal` command has the same usage as the `aws` command, but simplifies local development. Install it using pip from your terminal (pip is a package manager; install it first if you don't already have it):
pip install awscli-local
Run the S3 bucket locally
Now we can run our S3 bucket in Docker. You could run it with a plain docker command, but I prefer docker-compose here, since we have many environment variables and also volumes (we will use a volume to store the files in our bucket, so that the bucket can have files in it right from the start).
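The gist linked in the next step contains the full file. As a rough starting point, a minimal docker-compose.yml for an S3-only LocalStack might look like the sketch below; the image tag, port, and variables are assumptions and may differ from the actual gist, and the volume used for pre-filling the bucket is covered in the "Fill the bucket with files" section.

```yaml
version: "3.8"
services:
  localstack:
    image: localstack/localstack    # assumption: the official LocalStack image
    ports:
      - "4566:4566"                 # LocalStack edge port used by awslocal and the SDK
    environment:
      - SERVICES=s3                 # start only the S3 service
      - DEFAULT_REGION=eu-west-1    # matches the region used in the commands below
```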
- Create a docker-compose.yml file and paste in the following gist: https://gist.github.com/yarchiT/79d24cac8c230d8a042634b3de4e8ff8 (a minimal sketch is shown above).
- Run in a terminal, from the folder containing the docker-compose.yml file:
docker-compose up -d
You can verify it’s working by going to http://localhost:4566/
- Now we can use the awslocal command to work with our running S3 service. I include a list of commands in the README inside the S3BucketSetup folder in the git repo.
A couple of examples:
1. Create bucket:
awslocal s3api create-bucket --bucket test-bucket --region eu-west-1
2. List-buckets:
awslocal s3api list-buckets --query "Buckets[].Name"
3. Copy a file (the SomeFolder directory will be created automatically. An AWS S3 bucket doesn't have a typical folder structure; both directories and files are keys, and you can tell whether a particular key is a directory by checking whether it ends with "/"). This assumes the aws.txt file is in your current directory; if not, add the path to the file.
awslocal s3 cp aws.txt s3://test-bucket/SomeFolder/aws.txt
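To see that the "folder" really is just part of the key, you can list the object keys. The command below uses the standard s3api list-objects call; the output shown assumes only the single file above has been copied:
awslocal s3api list-objects --bucket test-bucket --query "Contents[].Key"
This should print something like ["SomeFolder/aws.txt"].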
Fill the bucket with files
If you want to have a bucket already filled with files when you start your application, you need to create a volume and add the DATA_DIR env variable in the docker-compose file (check the docker-compose file in the git repo; a sketch of the relevant part follows).
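For illustration, here is a hedged sketch of the relevant docker-compose.yml additions. The DATA_DIR path and the ./init/data host folder are assumptions based on the folder mentioned below; check the file in the repo for the real values.

```yaml
services:
  localstack:
    environment:
      - SERVICES=s3
      - DEFAULT_REGION=eu-west-1
      - DATA_DIR=/tmp/localstack/data       # tells LocalStack where to persist recorded API calls
    volumes:
      - ./init/data:/tmp/localstack/data    # maps that directory to a folder in the repo
```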
With this configuration in place, you need to run your container using `docker-compose up`.
After this, create your bucket and add files to it (using the commands listed above). You will notice a new file called recorded_api_calls.json inside the /init/data folder, with one line appended per API call, each looking something like this:
{"a": "s3", "m": "PUT", "p": "/my-bucket", "d": "", "h": {"Remote-Addr": "127.0.0.1", "Host": "localhost:4566", "Accept-Encoding": "identity", "User-Agent": "aws-cli/1.19.36 Python/3.8.2 Linux/4.19.128-microsoft-standard botocore/1.20.36", "X-Amz-Date": "20210331T141536Z", "X-Amz-Content-Sha256": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", "Authorization": "AWS4-HMAC-SHA256 Credential=testkey/20210331/eu-west-1/s3/aws4_request, SignedHeaders=host;x-amz-content-sha256;x-amz-date, Signature=7497332b452fb80aba9cb2346c43fe29d89d69bee8eb26af416ceb49f8c52abe", "Content-Length": "0", "X-Forwarded-For": "127.0.0.1, localhost:4566, 127.0.0.1, localhost:4566", "x-localstack-edge": "https://localhost:4566", "content-type": "binary/octet-stream", "Connection": "close"}, "rd": "PENyZWF0ZUJ1Y2tldFJlc3BvbnNlIHhtbG5zPSJodHRwOi8vczMuYW1hem9uYXdzLmNvbS9kb2MvMjAwNi0wMy0wMSI+PENyZWF0ZUJ1Y2tldFJlc3BvbnNlPjxCdWNrZXQ+dGVzdC1idWNrZXQ8L0J1Y2tldD48L0NyZWF0ZUJ1Y2tldFJlc3BvbnNlPjwvQ3JlYXRlQnVja2V0UmVzcG9uc2U+"}
IMPORTANT!
This is the only built-in way to pre-fill the local bucket with files. LocalStack records every API call (with the request and response data encoded) in this file, reads it when you start the container again, and recreates all the keys. Make sure you commit this file!
.NET Core example
I've added a simple console application to show how the AWS S3 client works against LocalStack. Check the working code example in my GitHub repo: https://github.com/yarchiT/awsS3bucket-local
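The repo has the full application; as a minimal, hedged sketch (assuming the AWSSDK.S3 NuGet package and the test-bucket created earlier), pointing the .NET client at LocalStack boils down to setting the ServiceURL and ForcePathStyle options:

```csharp
using System;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

class Program
{
    static async Task Main()
    {
        // Point the SDK at the LocalStack edge port instead of real AWS.
        var config = new AmazonS3Config
        {
            ServiceURL = "http://localhost:4566",   // LocalStack edge endpoint
            ForcePathStyle = true,                  // keep the bucket name in the path, not the hostname
            AuthenticationRegion = "eu-west-1"
        };

        // LocalStack accepts any credentials, so dummy values are fine.
        var client = new AmazonS3Client("testkey", "testsecret", config);

        // Upload a small object into the bucket created earlier.
        await client.PutObjectAsync(new PutObjectRequest
        {
            BucketName = "test-bucket",
            Key = "SomeFolder/hello.txt",
            ContentBody = "Hello from the local S3 bucket!"
        });

        // List the keys to verify the upload worked.
        var response = await client.ListObjectsV2Async(new ListObjectsV2Request
        {
            BucketName = "test-bucket"
        });
        foreach (var obj in response.S3Objects)
            Console.WriteLine(obj.Key);
    }
}
```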
Hit Clap if you liked the explanation, and leave a comment if you have any questions :)