File Uploader — Local to AWS-S3 Bucket

In this article, we’ll be covering how to upload files from your local system to an Amazon S3 bucket using the Flask web framework.

Step 1: Create an S3 Bucket

  • Log in to the AWS Management Console, go to S3, and click Create Bucket. My bucket name is “mlbankproject27091995”.
  • We can also see that the bucket is initially empty.
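For completeness, bucket creation can also be scripted with boto3 instead of clicking through the console. The sketch below is a minimal version, assuming boto3 is installed and your AWS credentials are already configured; the name-validation helper is just an illustration of S3’s basic naming rules, not part of the original project.

```python
import re


def is_valid_bucket_name(name: str) -> bool:
    """Rough check against S3 naming rules: 3-63 characters,
    lowercase letters, digits, hyphens, and dots, starting and
    ending with a letter or digit."""
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name))


def create_bucket(name: str, region: str = "us-east-1"):
    if not is_valid_bucket_name(name):
        raise ValueError(f"invalid bucket name: {name}")
    # boto3 is imported lazily so the validator above works on its own.
    import boto3

    s3 = boto3.client("s3", region_name=region)
    if region == "us-east-1":
        # us-east-1 is the default region and must NOT be passed
        # as a LocationConstraint.
        s3.create_bucket(Bucket=name)
    else:
        s3.create_bucket(
            Bucket=name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )
```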

Step 2: Connect to Amazon S3

  • Before we can start uploading files, we need a way to connect to S3 and fetch the correct bucket. Log in to your AWS Management Console, and under your name (top right) select “My Security Credentials”. Then open the “Access Keys (Access Key ID and Secret Access Key)” tab and click “Create New Access Key”.
  • Note: These credentials will later be required to connect to S3 using the boto3 client.
  • boto3 is the latest version of the AWS SDK for Python; it provides a high-level interface for interacting with the AWS API.
Go to “My Security Credentials”
After creating new access keys
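Rather than pasting the new access keys into source code, a common pattern is to read them from environment variables. A minimal sketch — the variable names below are the standard ones boto3 itself recognizes:

```python
import os


def get_aws_credentials() -> tuple:
    """Read the access key pair from environment variables instead of
    hard-coding it in files that end up in source control."""
    key = os.environ["AWS_ACCESS_KEY_ID"]
    secret = os.environ["AWS_SECRET_ACCESS_KEY"]
    return key, secret
```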

Step 3: Create a Flask App

  • It looks like the image below. (The code will be uploaded to GitHub.)
  • Since I will be using this for some Machine Learning work later, we take three inputs here: Train.csv, Test.csv, and user_id.
  • user_id will help us differentiate uploads and prevent overlaps between users. Please refer to the example below.

Example (you will see this later when running the Flask app):

  • Amazon S3/Bucket_name/user1/data/train.csv and test.csv
  • Amazon S3/Bucket_name/user2/data/train.csv and test.csv
UI for Flask WebApp
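Since the app’s code is only shown as an image, here is a minimal sketch of what such a Flask app could look like. The route paths, the field names (train, test, user_id), and the inline form are my assumptions, not the author’s exact code:

```python
from flask import Flask, request

app = Flask(__name__)

# Inline HTML form matching the three inputs described above:
# two CSV files and a user_id.
FORM = """
<form method="post" action="/upload" enctype="multipart/form-data">
  <input type="file" name="train" />
  <input type="file" name="test" />
  <input type="text" name="user_id" />
  <input type="submit" value="Submit" />
</form>
"""


@app.route("/")
def index():
    return FORM


@app.route("/upload", methods=["POST"])
def upload():
    train = request.files["train"]
    test = request.files["test"]
    user_id = request.form["user_id"]
    # The actual upload to S3 happens in Step 5; here we just
    # confirm that both files and the user_id were received.
    return f"Received {train.filename} and {test.filename} for {user_id}"
```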

Step 4: Connect to AWS

  • In our project, we will create a file that uses boto3 to establish a connection to the S3 service.
  • In aws_access_key_id and aws_secret_access_key, provide the keys you obtained in Step 2 of this blog.
Connect to AWS S3

Step 5: Upload files from local system to AWS S3-Bucket

  • Now that we are successfully connected to S3, we create a function that sends the user’s files directly into our bucket.
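A sketch of such an upload function, assuming the per-user key layout shown in Step 3 (user_id/data/filename). The names upload_to_bucket and build_s3_key are hypothetical, not taken from the author’s repository:

```python
def build_s3_key(user_id: str, filename: str) -> str:
    """Mirror the layout shown earlier: <user_id>/data/<filename>."""
    return f"{user_id}/data/{filename}"


def upload_to_bucket(s3_client, file_obj, bucket: str,
                     user_id: str, filename: str) -> str:
    """Stream a file-like object straight into the bucket and
    return the key it was stored under."""
    key = build_s3_key(user_id, filename)
    # upload_fileobj streams the file without loading it all into memory.
    s3_client.upload_fileobj(file_obj, bucket, key)
    return key
```

In the Flask route from Step 3, each of request.files["train"] and request.files["test"] can be passed as file_obj.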

Step 6: Let’s see it in action !

  • Select the train.csv and test.csv files and enter the user_id.
Files selection
  • After hitting the Submit button, open your AWS console; inside your bucket you will see the user_id folder, and inside it the files uploaded by that user.
data uploaded for each user_id
train.csv and test.csv uploaded
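Instead of clicking through the console, the same check can be done programmatically by listing the bucket’s keys and grouping them by user_id. The helpers below (group_keys_by_user, list_user_uploads) are hypothetical, for illustration only:

```python
def group_keys_by_user(keys):
    """Group object keys like 'user1/data/train.csv' by their
    leading user_id segment."""
    grouped = {}
    for key in keys:
        user_id, _, rest = key.partition("/")
        grouped.setdefault(user_id, []).append(rest)
    return grouped


def list_user_uploads(s3_client, bucket: str):
    """List all keys in the bucket and group them per user."""
    resp = s3_client.list_objects_v2(Bucket=bucket)
    keys = [obj["Key"] for obj in resp.get("Contents", [])]
    return group_keys_by_user(keys)
```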


  • There you have it, folks! This is pretty much how you can upload files directly to Amazon S3 using Flask. AWS S3 is an amazing service, and you should definitely take advantage of it.

Just follow all the steps to build, train, and deploy your Machine Learning model using AWS SageMaker.

I hope you enjoyed reading this blog! Please give a clap if you found it useful. Thanks!! :)

Where can you find my code?


YouTube video for implementation details:

Please subscribe to my YouTube channel and enjoy the content.
