
How to use AWS CLI for uploading/downloading/deleting files

 Hello again

It's been a while since we last posted here, so let's start with the AWS S3 CLI. We can use the AWS CLI for various operations such as:

  • Uploading files from the local machine into an S3 bucket
  • Downloading files from an S3 bucket onto the local machine
  • Listing all the files in an S3 bucket
  • Deleting files from an S3 bucket
To perform any of the above operations we also need the relevant access permissions on the specified bucket. Apart from permissions, the three things we need are a) Bucket name b) Access key c) Secret access key

First, let's install the AWS CLI on our machine.


On Mac

Installing the AWS CLI
curl "https://awscli.amazonaws.com/AWSCLIV2.pkg" -o "AWSCLIV2.pkg"
sudo installer -pkg AWSCLIV2.pkg -target /

To test whether the installation worked, try the simple command

aws --version
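For reference, the output looks something like the line below (the version numbers here are just placeholders and will differ on your machine)

aws-cli/2.15.30 Python/3.11.8 Darwin/23.4.0 exe/x86_64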

We should get a result in the above format; the exact version numbers will vary.

Now let's configure the CLI. For that we need to run the following command

aws configure

Upon pressing Enter, it will prompt for the access key and the secret access key. The default region can be skipped, and the output format can also be skipped.
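For reference, the prompts look something like this (the key values shown are dummy placeholders, not real credentials)

AWS Access Key ID [None]: AKIAXXXXXXXXEXAMPLE
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxEXAMPLE
Default region name [None]:
Default output format [None]: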

Once this is done, these are the four basic operations which we will try to perform.

a) Uploading files from the local machine

aws s3 cp test.txt s3://bucketname/input/test.txt

If the file sits in a different directory on your machine, give its full path. Also note that "input" here is a directory (key prefix) inside the S3 bucket

aws s3 cp /Users/anirudh/downloads/test.txt s3://bucketname/input/test.txt
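If we want to upload an entire folder instead of a single file, the same cp command takes a --recursive flag (the bucket name and paths below are just placeholders, same as above)

aws s3 cp /Users/anirudh/downloads/ s3://bucketname/input/ --recursive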

b) Downloading files from S3 onto the local machine

aws s3 cp s3://bucketname/input/Dummy.csv /Users/anirudh/downloads/dummy.txt
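Similarly, a whole folder can be downloaded by pointing cp at the S3 prefix and adding --recursive (paths here are placeholders)

aws s3 cp s3://bucketname/input/ /Users/anirudh/downloads/ --recursive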

c) Listing all files in S3

aws s3 ls s3://bucketname/input
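If we also want file sizes and a total count, ls supports a few optional flags

aws s3 ls s3://bucketname/input/ --recursive --human-readable --summarize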

d) Deleting files in S3

aws s3 rm s3://bucketname/input/test.txt
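To delete everything under a prefix instead of a single file, rm also takes --recursive. Use this one carefully, since it removes every object under the given path

aws s3 rm s3://bucketname/input/ --recursive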
