aws s3 cli
This is the second article in the Learn AWS CLI series. It shows how to work with AWS S3 buckets using CLI commands, and briefly reviews the S3 bucket and its key components.
You should meet the following prerequisites before working through the exercises demonstrated in this article.
Amazon Web Services (AWS) provides a cloud storage service to store and retrieve files, known as Simple Storage Service, or AWS S3. You might be familiar with Dropbox or Google Drive for storing images, documents, and text files in the cloud. AWS S3 is a similar service from Amazon. Total storage is unlimited, and a single object can be up to 5 TB. It provides benefits such as flexibility, scalability, durability, and availability.
Log in to the AWS Console using either the root account or an IAM user, and then expand Services. You can see S3 listed in the Storage group as shown below.
Click on S3 to launch the S3 console. Here, you see existing buckets (if any) and options to create a new bucket.
Object URL: Once we upload an object to the AWS S3 bucket, it gets a unique URL. You can use this URL to access the object. The URL has the following format:
https://[BucketName].s3.[Region].amazonaws.com/[object_key].[file_extension]
In the following example, we can see the image URL in the same format.
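As a quick sketch, the object URL for the aarti.txt file used later in this article (in the ap-south-1 bucket sqlshackdemocli) can be assembled from its parts:

```shell
# Build the virtual-hosted-style object URL from its components.
bucket="sqlshackdemocli"
region="ap-south-1"
key="aarti.txt"
url="https://${bucket}.s3.${region}.amazonaws.com/${key}"
echo "$url"   # https://sqlshackdemocli.s3.ap-south-1.amazonaws.com/aarti.txt
```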
You can also view the S3 bucket URL representation in the following image. Each object has a different URL, although the basic format remains the same.
Once you upload an object to an S3 bucket, it follows read-after-write consistency: after uploading an object, it is immediately available to all users (with relevant access) to read. Deletes, however, are eventually consistent: it takes some time for the removal to propagate to all edge locations (caches).
By now, you should be familiar with the AWS CLI tool and the S3 bucket for storing objects. In this section, we use CLI commands to perform various tasks related to the S3 bucket.
We use the mb command in the CLI to create a new S3 bucket. You should have configured a CLI profile in your environment before executing this command. We specified the default region Asia Pacific (Mumbai), ap-south-1, in the production profile.
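As a sketch, a named profile like production can be created non-interactively with the `aws configure set` subcommand. The credential values below are placeholders; substitute your own keys:

```shell
# Hypothetical setup for the "production" profile used in this article.
# Replace the placeholder credentials with your own access keys.
aws configure set aws_access_key_id     AKIAXXXXXXXXXXXXXXXX --profile production
aws configure set aws_secret_access_key "your-secret-key"    --profile production
aws configure set region                ap-south-1           --profile production
```

Running `aws configure --profile production` interactively achieves the same result.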
Open a command prompt and execute the CLI command below. It creates a new S3 bucket named sqlshackdemocli in the default region.
aws s3 mb s3://sqlshackdemocli --profile production
In the command output, it returns the bucket name.
Now, go back to the AWS web console and refresh the S3 buckets. You can see the new bucket in the following screenshot.
Select the S3 bucket and click Copy ARN. The ARN is a unique Amazon Resource Name. For this S3 bucket, it returns the following ARN: arn:aws:s3:::sqlshackdemocli.
You should provide an S3 bucket name that follows AWS naming standards. For example, we cannot use an underscore (_) in the bucket name; it produces the following error message.
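As a hypothetical local helper (not part of the AWS CLI), you can pre-check a proposed name against the common naming rules, 3 to 63 characters of lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit, before calling mb. This is an approximation, not the full AWS specification:

```shell
# Approximate S3 bucket-name check: 3-63 chars, lowercase letters, digits,
# dots, and hyphens; must start and end with a letter or digit.
is_valid_bucket_name() {
  echo "$1" | grep -Eq '^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$'
}

is_valid_bucket_name "sqlshackdemocli" && echo "valid"
is_valid_bucket_name "sql_shack_demo" || echo "invalid: underscores not allowed"
```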
We use the ls command to list the S3 buckets in your AWS account.
aws s3 ls --profile production
As per the previous screenshot, we have three buckets in AWS. The CLI command output shows each bucket name along with its creation date.
Once we have created an S3 bucket, we need to upload objects to it. We use the copy command (cp) to copy a file from a local directory to the S3 bucket. The following command uploads a text file to S3. The upload may take time depending upon the file size and your internet bandwidth.
aws s3 cp C:\FS\aarti.txt s3://sqlshackdemocli
You can open the S3 bucket and verify that the uploaded file exists in the bucket.
Suppose you want to upload multiple files to S3. It is not practical to execute the above command once per file; we want a way to upload them without specifying each file name.
We still use the cp command, but specify a directory along with the --recursive argument. Here, we do not need to specify the file names.
aws s3 cp directory_path s3://bucket_name --recursive
For this demo, I want to upload the following 5 files from the FS folder to the S3 bucket.
This command uploads all files available in the specified folder to the AWS S3 bucket.
aws s3 cp C:\FS\ s3://sqlshackdemocli --recursive
As you can see, it goes through each file available in the specified folder and uploads it.
Refresh the S3 bucket and verify the files uploaded using the --recursive argument.
Before we move further, select the files in the S3 bucket and delete them. Now, we have an empty bucket.
Now, suppose we do not want to upload any jpg files to the S3 bucket. We can exclude specific files from the upload using the --exclude argument.
The following command excludes *.jpg files and uploads the other files. You can verify it in the following screenshot.
aws s3 cp C:\FS\Upload s3://sqlshackdemocli --recursive --exclude "*.jpg"
Similarly, we can use the --include and --exclude arguments together. For example, to exclude text files and include JPG files, use the following command.
aws s3 cp C:\FS\Upload s3://sqlshackdemocli --recursive --exclude "*.txt" --include "*.jpg"
Suppose we have various files in the source folder, and a few of them are already uploaded in the S3 bucket.
Look at the following source and S3 bucket file listings. Three files (highlighted in the source) are missing from the S3 bucket.
Source (local directory)
S3 bucket
We want to upload only the remaining files from source to destination. We can achieve this using the sync command.
aws s3 sync C:\FS\Upload s3://sqlshackdemocli
In the output, we see that it uploaded only the files that were not already present in the S3 bucket.
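The behavior of sync can be sketched locally with plain directories: copy only the files missing from the destination. (This is a conceptual illustration only; the real sync also compares file size and modification time, and the paths below are temporary placeholders.)

```shell
# Local sketch of sync semantics: copy only files absent from the destination.
mkdir -p /tmp/sync_src /tmp/sync_dst
touch /tmp/sync_src/a.txt /tmp/sync_src/b.txt
touch /tmp/sync_dst/a.txt            # a.txt is already "uploaded"

for f in /tmp/sync_src/*; do
  name=$(basename "$f")
  # Skip files that already exist in the destination.
  [ -e "/tmp/sync_dst/$name" ] || cp "$f" "/tmp/sync_dst/$name"
done

ls /tmp/sync_dst                     # now contains both a.txt and b.txt
```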
By default, uploaded files do not have public access. If you try to access the object URL, it gives the following error message.
We can also set permissions while copying files. Specify the --acl argument and set permissions to public-read.
aws s3 cp C:\FS\Upload s3://sqlshackdemocli --recursive --acl public-read
We can remove a file in a bucket using the rm command. Use the --recursive argument to delete all files.
aws s3 rm s3://sqlshackdemocli --recursive
It deletes the files from the S3 bucket and lists the names of the deleted files in the output.
We can remove an S3 bucket using the rb command. The following command removes the S3 bucket named sqlshackdemocli.
aws s3 rb s3://sqlshackdemocli
We get an error message because the bucket is not empty.
We can either remove the objects using the commands above or use the --force argument to delete the bucket along with its contents.
aws s3 rb s3://sqlshackdemocli --force
It first deletes the existing files and then removes the S3 bucket as shown below.
In this article, we explored AWS CLI commands for performing various operations on an AWS S3 bucket. The CLI makes it easy to perform tasks with simple commands and arguments. I would encourage you to explore the CLI commands and perform tasks as per your requirements. I will continue covering more CLI commands in upcoming articles.