Create S3 bucket using AWS CLI and Python Boto3

Updated On August 13, 2020 | By Mahesh Mogal

Today we are going to learn how to create an AWS S3 bucket using the AWS CLI, Python boto3, and the S3 management console. While creating S3 buckets, we need to pay attention to access permissions as well as the region. Access permissions decide who can use a bucket and which actions are allowed for each user. The region matters when we want low latency, want to save cost, or have to follow regulatory restrictions (for example, data in the USA may have to stay in US-region buckets only). Let's get started.

S3 Bucket using AWS CLI

We can use two different commands to create an S3 bucket with the CLI: `s3api create-bucket` and `s3 mb`.

# Create s3 bucket using s3api

$ aws s3api create-bucket \
 --profile admin-analyticshut \
 --bucket testbucket-fromcli-1 \
 --region us-east-1

# output
# {
#    "Location": "/testbucket-fromcli-1"
# }

The above command creates a bucket named "testbucket-fromcli-1" in S3 in the us-east-1 region. We can set access permissions on the bucket using the --acl parameter. The ACL (Access Control List) can have one of the following values.

  • "private"
  • "public-read"
  • "public-read-write"
  • "authenticated-read"

If you do not specify any ACL, then by default all S3 buckets are created with the private ACL applied to them.
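As a quick sketch of guarding against ACL typos before a request ever reaches AWS, we can validate against the four canned ACL values listed above. The `validate_acl` helper and the usage shown in comments are our own illustration, not part of boto3 or the CLI:

```python
# The four canned ACLs listed above.
CANNED_ACLS = {'private', 'public-read', 'public-read-write', 'authenticated-read'}

def validate_acl(acl):
    """Fail fast on a typo instead of waiting for an API error from S3."""
    if acl not in CANNED_ACLS:
        raise ValueError(f'unsupported canned ACL: {acl!r}')
    return acl

# Usage sketch (requires boto3 and credentials; bucket name is hypothetical):
#   s3 = boto3.client('s3')
#   s3.create_bucket(Bucket='testbucket-fromcli-3', ACL=validate_acl('private'))
```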


We can also use the aws s3 command to create a bucket. When using the s3 command, we need to pass the bucket URI instead of the bucket name.

# create s3 bucket using mb command

$ aws s3 mb s3://testbucket-fromcli-2 --profile admin-analyticshut --region us-west-1

# output
# make_bucket: testbucket-fromcli-2

Creating S3 bucket using Python and Boto3

Let us now create an S3 bucket using Python and boto3. Boto3 offers both a client and a service resource for S3. Both of them have a create_bucket function, and both functions have the same definition and accept the same set of parameters. We can use the following code to create a bucket using the S3 client.

import boto3
import pprint

#
# setting up configured profile on your machine.
# You can ignore this step if you want to use the default AWS CLI profile.
#
boto3.setup_default_session(profile_name='admin-analyticshut')
s3 = boto3.client('s3')

# create an S3 bucket with default settings
response = s3.create_bucket(Bucket='testbucket-frompython-1')

# following parameters can be passed while creating bucket
response = s3.create_bucket(ACL='private',
                            Bucket='testbucket-frompython-3',
                            CreateBucketConfiguration={'LocationConstraint': 'ap-south-1'}
                            )

pprint.pprint(response)

Once you run the above code, you will see output like this.

Bucket created output

Here we need to pass the "CreateBucketConfiguration" parameter to set the bucket location.

By default, buckets are created in the US East (N. Virginia) region, us-east-1.
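One gotcha worth noting: because us-east-1 is the default location, S3 rejects requests that pass it explicitly as a LocationConstraint. A small helper (our own sketch, not part of boto3) can build the right keyword arguments for any region:

```python
def create_bucket_kwargs(bucket, region, acl='private'):
    """Build keyword arguments for boto3's create_bucket.

    us-east-1 is the default location and S3 rejects it as an explicit
    LocationConstraint, so CreateBucketConfiguration is omitted for it.
    """
    kwargs = {'Bucket': bucket, 'ACL': acl}
    if region != 'us-east-1':
        kwargs['CreateBucketConfiguration'] = {'LocationConstraint': region}
    return kwargs

# Usage sketch (requires boto3 and credentials; bucket name is hypothetical):
#   s3 = boto3.client('s3')
#   s3.create_bucket(**create_bucket_kwargs('testbucket-frompython-5', 'ap-south-1'))
```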


The Boto3 S3 service resource uses similar code to create an S3 bucket.

s3_resource = boto3.resource('s3')
response = s3_resource.create_bucket(Bucket="testbucket-frompython-4")

# following parameters can be passed while creating bucket
response = s3_resource.create_bucket(ACL='private',
                                     Bucket='bucket_name',
                                     CreateBucketConfiguration={'LocationConstraint': 'us-west-1'})

After running all the above commands and Python code, we can check the S3 console and verify that all the buckets have been created.

S3 buckets created
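We can also verify from Python instead of the console. The `bucket_names` helper below is our own sketch; the sample dictionary mimics the shape of the response that boto3's `list_buckets` returns:

```python
def bucket_names(list_buckets_response):
    """Pull just the bucket names out of an S3 list_buckets response."""
    return sorted(b['Name'] for b in list_buckets_response.get('Buckets', []))

# With credentials configured you would call (sketch):
#   s3 = boto3.client('s3')
#   print(bucket_names(s3.list_buckets()))
# Here we demonstrate on a sample response shaped like boto3's:
sample = {'Buckets': [{'Name': 'testbucket-frompython-3'},
                      {'Name': 'testbucket-frompython-1'}]}
print(bucket_names(sample))  # ['testbucket-frompython-1', 'testbucket-frompython-3']
```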

Creating S3 bucket on S3 management console

AWS offers simple steps to create a bucket on the S3 console as well. My issue with the S3 console is that it keeps changing, whereas the CLI stays the same. Anyway, you can check below to see how to create an S3 bucket on the console. I will try to keep this updated if AWS changes these steps again.

So which one do you prefer: creating buckets using the CLI/code, or using the S3 console?

Conclusion

We have seen simple ways to create S3 buckets, but there is a lot more to S3: versioning, ACLs and policies, encryption, and static website hosting, to name a few. We are going to explore all of that in this series of blogs. See you around.


Mahesh Mogal

I am passionate about Cloud, Data Analytics, Machine Learning, and Artificial Intelligence. I like to learn and try out new things. I have started blogging about my experience while learning these exciting technologies.
