
Copy files from local to AWS S3 bucket (AWS CLI + S3 bucket)


AWS CLI and S3 Bucket

In my current project, I needed to deploy/copy my front-end code to an AWS S3 bucket, but I did not know how to do it. One of my colleagues found a way to perform this task.

Here are the guidelines from start to end: how to install the AWS CLI, how to use it, and other functionality.

So, let’s start from the beginning.
The AWS CLI was not installed on my Mac, so I got an error when running the following command.
Open your terminal:

    $ aws --version
    
    output
    -bash: aws: command not found

(I found the solution here: https://qiita.com/maimai-swap/items/999eb69b7a4420d6ab64)

So now let’s install Homebrew, if you have not installed it yet.

Step 1. Install Homebrew

On macOS, use the command below (the Homebrew project’s official install script):


    $ /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

On Linux (Amazon Linux or another yum-based distribution), you can skip Homebrew and install the AWS CLI directly:

    $ sudo yum install aws-cli

Step 2. Check the Homebrew version


    $ brew -v
    
    output
    Homebrew 1.5.2
    Homebrew/homebrew-core (git revision 58b9f; last commit 2018-01-24)
    

Step 3. Install the AWS CLI with the following command


    $ brew install awscli

Step 4. Check the AWS CLI version


    $ aws --version
    
    output
    aws-cli/1.14.30 Python/3.6.4 Darwin/17.3.0 botocore/1.8.34

That’s great. Now it’s time to configure the AWS credentials.

Step 5. Configure the AWS profile


    $ aws configure
    AWS Access Key ID [None]: <your access key>
    AWS Secret Access Key [None]: <your secret key>
    Default region name [None]: <your region name>
    Default output format [None]: ENTER

All settings are done. You can now access your S3 bucket.
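As an optional sanity check, the sts get-caller-identity command prints the account and identity the CLI is authenticating as, so you can confirm that the credentials were picked up correctly:

    $ aws sts get-caller-identity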

From here, let’s learn how to manage our buckets.

Managing Buckets

The aws s3 commands support commonly used bucket operations, such as creating, removing, and listing buckets.

1. Listing Buckets


    $ aws s3 ls
    
    output
    2017-12-29 08:26:08 my-bucket1
    2017-11-28 18:45:47 my-bucket2

The following command lists the objects in bucket-name:


    $ aws s3 ls s3://bucket-name
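To list only the objects under a given prefix (that is, objects in bucket-name filtered by a prefix such as path/, which is just a placeholder here), append the path to the bucket name:

    $ aws s3 ls s3://bucket-name/path/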

2. Creating Buckets


    $ aws s3 mb s3://bucket-name

(The aws s3 mb command creates a new bucket. Bucket names must be globally unique across all existing bucket names in Amazon S3.)
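By default, the bucket is created in the region configured in your profile. If you need it in a specific region, you can pass the --region option (us-west-1 below is only an example value):

    $ aws s3 mb s3://bucket-name --region us-west-1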

3. Removing Buckets

To remove a bucket, use the aws s3 rb command.


    $ aws s3 rb s3://bucket-name

By default, the bucket must be empty for the operation to succeed. To remove a non-empty bucket, you need to include the --force option.


    $ aws s3 rb s3://bucket-name --force

This will first delete all objects and subfolders in the bucket and then remove the bucket.
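If you only want to empty a bucket while keeping the bucket itself, you can instead delete all of its objects with aws s3 rm and the --recursive flag:

    $ aws s3 rm s3://bucket-name --recursive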

Managing Objects

The high-level aws s3 commands make it convenient to manage Amazon S3 objects as well. The object commands include aws s3 cp, aws s3 ls, aws s3 mv, aws s3 rm, and aws s3 sync. The cp, ls, mv, and rm commands work similarly to their Unix counterparts.

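For example, you can copy a local file into a bucket, rename (move) it within the bucket, and then delete it; file.txt and my-bucket below are placeholders:

    $ aws s3 cp file.txt s3://my-bucket/file.txt
    $ aws s3 mv s3://my-bucket/file.txt s3://my-bucket/renamed.txt
    $ aws s3 rm s3://my-bucket/renamed.txt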
The cp, mv, and sync commands include a --grants option that can be used to grant permissions on the object to specified users or groups. You set the --grants option to a list of permissions using the following syntax:


    --grants Permission=Grantee_Type=Grantee_ID
             [Permission=Grantee_Type=Grantee_ID ...]

Each value contains the following elements:
* Permission – Specifies the granted permissions. Can be set to read, readacl, writeacl, or full.
* Grantee_Type – Specifies how the grantee is identified. Can be set to uri, emailaddress, or id.
* Grantee_ID – Specifies the grantee based on Grantee_Type:
    * uri – The group’s URI. For more information, see Who Is a Grantee?
    * emailaddress – The account’s email address.
    * id – The account’s canonical ID.


    aws s3 cp file.txt s3://my-bucket/ --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers full=emailaddress=user@example.com

**Copy multiple files from a directory**
If you want to copy all files from a directory to an S3 bucket, check out the command below.


    aws s3 cp <your directory path> s3://<your bucket name> --grants read=uri=http://acs.amazonaws.com/groups/global/AllUsers --recursive

We use the --recursive flag to indicate that all files under the specified directory must be copied recursively into the bucket.
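For instance, if your built front-end assets live in a local ./build folder (a hypothetical path) and you want them under an app/ prefix in the bucket, the command could look like this:

    $ aws s3 cp ./build s3://my-bucket/app/ --recursive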

We can also set the --exclude or --include flags while copying files:


    aws s3 cp <your directory path> s3://<your bucket name>/ --recursive --exclude "*.jpg"  --include "*.log"
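Before running a large copy, it can help to preview what would be transferred. The high-level s3 commands accept a --dryrun flag that displays the operations without actually performing them:

    $ aws s3 cp <your directory path> s3://<your bucket name>/ --recursive --dryrun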

For more details, please go through the official AWS documentation:
https://docs.aws.amazon.com/cli/latest/userguide/using-s3-commands.html

Thank you for taking the time to read this.

One more note on downloading: using aws s3 cp from the AWS Command Line Interface (CLI) also requires the --recursive parameter to copy multiple files from a bucket to a local directory.

    $ aws s3 cp s3://myBucket/dir localdir --recursive

The aws s3 sync command will, by default, copy a whole directory. It will only copy new/modified files.

    $ aws s3 sync s3://mybucket/dir localdir
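If you also want files that no longer exist in the source to be removed from the destination, sync supports a --delete flag:

    $ aws s3 sync s3://mybucket/dir localdir --delete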

Just experiment to get the result you want.

Documentation:

* cp command: https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
* sync command: https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
