
My bucket is laid out like s3://bucketname/yyyymmdd/foo/bar. I have some files directly in foo that I want to copy, but I do not want to copy any subdirectories below foo. I'm trying this in PowerShell:

aws s3 cp "s3://bucketname" . --recursive --exclude "*/foo/*/*" --include "foo/*.*" --dryrun 

or

aws s3 cp "s3://bucketname" . --recursive --exclude "yy*/foo/*/*" --include "foo/*.*" --dryrun 

These give me the files in foo, but I also get some files stored like s3://bucketname/otherdirectory/otherfiles. I tried some alternatives. This one appears to list EVERY file in the bucket:

aws s3 cp "s3://bucketname" . --recursive --exclude "*/foo/*/*" --include "*/foo/*.*" --dryrun 

My gut says it should be

aws s3 cp source destination --recursive --exclude "*" --include "*/foo/*.*" --dryrun 

meaning /directory/*anyfilename.*anyextension, but this gives me every file recursively below s3://bucketname/yyyymmdd/foo/.

I'm trying to whip up a one-liner without a foreach pattern, but I may have to resort to that.

1 Answer

Figured it out:

(Get-S3Object -BucketName bucketname | Where-Object { $_.Key -like "*/foo/*" } | Where-Object { $_.Key -notlike "*/foo/*/*" }).Key 

This lists just the files in foo, with no subfolders or subfolder files, and it does so regardless of the yyyymmdd prefix. I can now pipe this to Read-S3Object to download the files:

Get-S3Object -BucketName bucketname | Where-Object { $_.Key -like "*/foo/*" } | Where-Object { $_.Key -notlike "*/foo/*/*" } | Read-S3Object -Folder . 
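For what it's worth, a single aws s3 cp may also be able to do this, since later --exclude/--include filters take precedence over earlier ones, and the CLI's * wildcard matches across / (which is why the earlier attempts also pulled in subfolder files). A dry-run sketch along those lines, untested:

aws s3 cp "s3://bucketname" . --recursive --exclude "*" --include "*/foo/*" --exclude "*/foo/*/*" --dryrun 

The idea is that "*/foo/*" brings back the files directly in foo, while the trailing --exclude "*/foo/*/*" re-excludes anything at least one level deeper. Check the --dryrun output before dropping the flag.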
