Previously I wrote my first Lambda (part 1) and connected it to a data store, DynamoDB (part 2). Today I am going to use the default storage service on AWS – S3. Again, the boto3 library is very handy here.
To keep things simple and treat DynamoDB as a historical table only, I decided to base my calculations on state written to a file. To calculate the current state of the wallet, I subtract the operation amount from the last total, and I keep the result as a simple JSON file in my bucket.
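The update rule can be sketched as a pure function; the `total` field name is my own illustration, not a fixed schema:

```python
def apply_operation(state: dict, amount: float) -> dict:
    """New wallet state: subtract the operation amount from the last total."""
    return {"total": state["total"] - amount}

# A 25.50 operation against a 100.00 total leaves 74.50.
```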
Reading from S3
Before I get to the basic operations, let me mention that there are no folders in S3; instead we have buckets, the basic containers for S3 objects. Here is how we can easily read a file from a bucket in a Python Lambda.
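A minimal read sketch; the bucket name `wallet-history` and key `wallet/state.json` are placeholders of my own:

```python
import json

BUCKET = "wallet-history"     # placeholder bucket name
KEY = "wallet/state.json"     # placeholder object key

def parse_state(raw: bytes) -> dict:
    """Decode the JSON body fetched from S3 into a dict."""
    return json.loads(raw)

def read_state(bucket: str = BUCKET, key: str = KEY) -> dict:
    """Read the wallet state file from the bucket."""
    import boto3  # available in the Lambda runtime by default
    obj = boto3.resource("s3").Object(bucket, key)
    return parse_state(obj.get()["Body"].read())
```

`Object.get()` returns a dict whose `Body` is a streaming body, so one `read()` pulls the whole file into memory – fine for a small JSON state file.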
Writing to S3
Writing is just as simple: use boto3.resource again. Play with it in your AWS console.
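A matching write sketch, with the same placeholder bucket and key as above:

```python
import json

BUCKET = "wallet-history"     # placeholder bucket name
KEY = "wallet/state.json"     # placeholder object key

def serialize_state(state: dict) -> bytes:
    """Encode the wallet state as UTF-8 JSON for S3."""
    return json.dumps(state).encode("utf-8")

def write_state(state: dict, bucket: str = BUCKET, key: str = KEY) -> None:
    """Overwrite the state file in the bucket with the new state."""
    import boto3  # available in the Lambda runtime by default
    boto3.resource("s3").Object(bucket, key).put(
        Body=serialize_state(state),
        ContentType="application/json",
    )
```

Each `put` replaces the whole object, which is exactly what we want for a single small state file.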
Again, we want to keep all of our fun safe and clean, so before running the Lambda you will need to grant its execution role permission to use S3. The managed policy AmazonS3FullAccess is a silver bullet that allows all S3 operations.
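If you want something narrower than full access, an inline policy scoped to a single bucket is enough for reading and writing our state file; the bucket name is again a placeholder:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::wallet-history/*"
    }
  ]
}
```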
Of course, this is only the tip of the iceberg when it comes to S3. It can store any kind of file, and it provides versioning and event triggers that can be leveraged for any processing I can imagine. But for saving a simple JSON file? It just works.