Uly.me

cloud engineer


April 5, 2020

AWS S3 LS Recursive

Here’s how to list S3 files recursively.

aws s3 ls s3://mybucket --recursive | awk '{print $4}'

By default, the results display the date, time, size, and key of each object. Pipe the output to awk to display only the key (the fourth column).
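The same awk approach works on the other columns. For example, summing the third field gives the total size in bytes; here's a sketch against sample output (the object names are made up) so it can run without an AWS account:

```shell
# Sample `aws s3 ls --recursive` output: date, time, size, key.
# In practice this comes from the aws command itself.
printf '%s\n' \
  '2020-04-05 09:15:22    1024 logs/app-1.log' \
  '2020-04-05 09:16:10    2048 logs/app-2.log' \
  '2020-04-05 09:17:45     512 logs/app-3.log' |
awk '{sum += $3} END {print sum " bytes"}'
# prints: 3584 bytes
```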

February 2, 2020

AWS CLI S3 Recursive Copy

Here’s how to copy multiple files recursively using the AWS CLI. By default, the S3 cp command copies only a single file. To copy multiple files, use the --recursive option, optionally combined with --exclude and --include. In this example, we first exclude every file, then include only files with a .json extension.

aws s3 cp /tmp/folder s3://bucket/folder \
  --recursive \
  --exclude "*" \
  --include "*.json"

The result:

upload: ./abc.json to s3://bucket/folder/abc.json    
upload: ./xyz.json to s3://bucket/folder/xyz.json

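The order of --exclude and --include matters: the AWS CLI applies filters in the order given, and later filters take precedence over earlier ones. Here's a rough Python sketch of that rule using fnmatch-style patterns (an illustration of the behavior, not the CLI's actual implementation):

```python
from fnmatch import fnmatch

def selected(key, filters):
    """Apply (action, pattern) filters in order; the last match wins.

    Mimics the aws s3 --exclude/--include rule: files start out
    included, and each matching filter overrides earlier decisions.
    """
    include = True  # everything is included by default
    for action, pattern in filters:
        if fnmatch(key, pattern):
            include = (action == 'include')
    return include

# --exclude "*" --include "*.json"
filters = [('exclude', '*'), ('include', '*.json')]
print(selected('abc.json', filters))   # True: copied
print(selected('notes.txt', filters))  # False: skipped
```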

January 28, 2020

Standard S3 Policy

Here’s a standard S3 policy that grants an IAM user access to a bucket within an AWS account. The user is allowed to add, update, and delete objects. The three actions s3:ListAllMyBuckets, s3:GetBucketLocation, and s3:ListBucket are the additional permissions required to browse the bucket from the console. The s3:PutObjectAcl and s3:GetObjectAcl actions are required to copy, cut, and paste objects within the console.

{
   "Version":"2012-10-17",
   "Statement":[
      {
         "Effect":"Allow",
         "Action":[
            "s3:ListAllMyBuckets"
         ],
         "Resource":"arn:aws:s3:::*"
      },
      {
         "Effect":"Allow",
         "Action":[
            "s3:ListBucket",
            "s3:GetBucketLocation"
         ],
         "Resource":"arn:aws:s3:::examplebucket"
      },
      {
         "Effect":"Allow",
         "Action":[
            "s3:PutObject",
            "s3:PutObjectAcl",
            "s3:GetObject",
            "s3:GetObjectAcl",
            "s3:DeleteObject"
         ],
         "Resource":"arn:aws:s3:::examplebucket/*"
      }
   ]
}
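As a quick offline sanity check, the policy above can be parsed to confirm the bucket-level versus object-level split described earlier (this is just an illustration; examplebucket is the placeholder name from the policy):

```python
import json

# The same policy as above, in compact form
policy = json.loads('''{"Version":"2012-10-17","Statement":[
  {"Effect":"Allow","Action":["s3:ListAllMyBuckets"],
   "Resource":"arn:aws:s3:::*"},
  {"Effect":"Allow","Action":["s3:ListBucket","s3:GetBucketLocation"],
   "Resource":"arn:aws:s3:::examplebucket"},
  {"Effect":"Allow","Action":["s3:PutObject","s3:PutObjectAcl","s3:GetObject",
   "s3:GetObjectAcl","s3:DeleteObject"],
   "Resource":"arn:aws:s3:::examplebucket/*"}]}''')

# Group actions by the resource ARN they apply to
by_resource = {}
for stmt in policy['Statement']:
    resources = stmt['Resource']
    if isinstance(resources, str):
        resources = [resources]
    for resource in resources:
        by_resource.setdefault(resource, []).extend(stmt['Action'])

for resource, actions in by_resource.items():
    print(resource, '->', ', '.join(actions))
```

Bucket-level actions (ListBucket, GetBucketLocation) attach to the bucket ARN, while object-level actions attach to the `/*` ARN; mixing them up is a common reason a policy like this silently fails.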


January 9, 2020

AWS Boto3 Client

Boto is the Amazon Web Services (AWS) SDK for Python. It enables Python developers to create, configure, and manage AWS services, such as EC2 and S3. Boto provides an easy-to-use, object-oriented API, as well as low-level access to AWS services.

Here’s a quick example of how to query for AWS resources.

import boto3
from botocore.exceptions import ClientError

# use a named profile from ~/.aws/credentials
session = boto3.Session(profile_name="default")

# get buckets
s3 = session.client('s3')
try:
    results = s3.list_buckets()
except ClientError as error:
    raise SystemExit(error)

# display buckets
print('Existing buckets:')
for bucket in results['Buckets']:
    print(bucket['Name'])

# get ec2 instances. ec2 is regional, so set a region.
ec2 = session.client('ec2', region_name='us-east-1')
results = ec2.describe_instances()

# display instances. instances are grouped under reservations.
for reservation in results['Reservations']:
    for instance in reservation['Instances']:
        print(instance['InstanceId'], instance['InstanceType'],
              instance['Placement']['AvailabilityZone'], sep='\t')

# get list of users
iam = session.client('iam')
results = iam.list_users()

# display list of users
for user in results['Users']:
    print(user['UserName'])
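The nested loop over instances works because describe_instances groups instances under Reservations. That flattening logic can be checked offline against a response shaped like the real one (the instance values below are made up):

```python
# Sample response with the same nesting as ec2.describe_instances()
sample = {
    'Reservations': [
        {'Instances': [
            {'InstanceId': 'i-0abc', 'InstanceType': 't2.micro',
             'Placement': {'AvailabilityZone': 'us-east-1a'}},
            {'InstanceId': 'i-0def', 'InstanceType': 'm5.large',
             'Placement': {'AvailabilityZone': 'us-east-1b'}},
        ]},
    ],
}

# Flatten Reservations -> Instances into (id, type, az) rows
rows = [
    (i['InstanceId'], i['InstanceType'], i['Placement']['AvailabilityZone'])
    for r in sample['Reservations']
    for i in r['Instances']
]
print(rows)
```

Note that a single reservation can hold several instances (they were launched together), which is why the inner loop is needed.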


December 20, 2019

AWS S3 Replication Policy

Here’s the IAM policy for the role S3 uses to replicate objects between regions.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:Get*",
                "s3:ListBucket"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::source-bucket-name",
                "arn:aws:s3:::source-bucket-name/*"
            ]
        },
        {
            "Action": [
                "s3:ReplicateObject",
                "s3:ReplicateDelete",
                "s3:ReplicateTags",
                "s3:GetObjectVersionTagging"
            ],
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::destination-bucket-name/*"
        }
    ]
}
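This policy is attached to an IAM role that S3 assumes when replicating. That role also needs a trust policy allowing the S3 service to assume it, along these lines:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "s3.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```

Replication also requires versioning to be enabled on both the source and destination buckets.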


Copyright © 2012–2021