# Amazon S3

## Amazon S3 overview

Amazon Simple Storage Service (Amazon S3) is an object storage service with a simple web services interface to store and retrieve any amount of data. It gives any developer access to the same highly scalable, reliable, fast, and inexpensive data storage infrastructure that Amazon uses to run its own global network of websites.

{% embed url="https://docs.aws.amazon.com/AmazonS3/latest/userguide/Welcome.html" %}

### S3 bucket

To store an object in Amazon S3, you create a bucket and then upload the object to the bucket.

{% hint style="info" %}
A bucket is a container for objects. An object is a file and any metadata that describes that file.
{% endhint %}

When the object is in the bucket, you can open it, download it, and move it. When you no longer need an object or a bucket, you can clean up your resources.

An Amazon S3 bucket name is globally unique, and the namespace is shared by all AWS accounts. This means that after a bucket is created, the name of that bucket cannot be used by another AWS account in any AWS Region until the bucket is deleted.
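Because the namespace is global, bucket names must also follow strict naming rules. A simplified check can be sketched as follows; the regex covers only the core rules (3–63 characters, lowercase letters, digits, dots, and hyphens, starting and ending with a letter or digit) and omits further documented restrictions such as forbidding IP-address-like names:

```python
import re

# Simplified sketch of the general-purpose S3 bucket naming rules:
# 3-63 characters, lowercase letters, digits, dots and hyphens,
# starting and ending with a letter or digit.
BUCKET_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def looks_like_valid_bucket_name(name: str) -> bool:
    return BUCKET_NAME_RE.fullmatch(name) is not None

print(looks_like_valid_bucket_name("my-example-bucket"))  # True
print(looks_like_valid_bucket_name("Invalid_Bucket"))     # False
```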

### Addressing model

There are two addressing models to access a bucket:

* `Virtual-hosted-style`:

  ```http
  https://<bucket-name>.s3.<region>.amazonaws.com/<key-name>
  ```
* `Path-style`:

  ```http
  https://s3.<region>.amazonaws.com/<bucket-name>/<key-name>
  ```
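The two styles address the same object; only the position of the bucket name differs. A minimal sketch, using illustrative bucket, region, and key values:

```python
# Build both S3 addressing styles for the same object.
# The bucket, region and key below are illustrative placeholders.
bucket = "example-bucket"
region = "us-east-1"
key = "path/to/object.txt"

virtual_hosted = f"https://{bucket}.s3.{region}.amazonaws.com/{key}"
path_style = f"https://s3.{region}.amazonaws.com/{bucket}/{key}"

print(virtual_hosted)  # https://example-bucket.s3.us-east-1.amazonaws.com/path/to/object.txt
print(path_style)      # https://s3.us-east-1.amazonaws.com/example-bucket/path/to/object.txt
```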

### Access control list

Amazon S3 access control lists (ACLs) enable you to manage access to buckets and objects. Each bucket and object has an ACL attached to it as a subresource. It defines which AWS accounts or groups are granted access and the type of access. When a request is received against a resource, Amazon S3 checks the corresponding ACL to verify that the requester has the necessary access permissions.

{% embed url="https://docs.aws.amazon.com/AmazonS3/latest/userguide/acl-overview.html" %}

## Security issues

### Bucket takeover

If an application uses a domain-linked S3 bucket that developers have deleted, while the CNAME records in [Amazon Route 53](https://aws.amazon.com/route53/) still point to it, you can claim the unclaimed S3 bucket name from another AWS account.

To verify whether bucket takeover may be possible, run:

```bash
$ curl -s https://<url-to-bucket> | grep -E -q '<Code>NoSuchBucket</Code>|<li>Code: NoSuchBucket</li>' && echo "Subdomain takeover may be possible" || echo "Subdomain takeover is not possible"
```
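The curl one-liner above greps the HTTP response for a `NoSuchBucket` marker. The same check can be sketched in Python; the sample response body below is a shortened, illustrative version of the XML error S3 returns for a nonexistent bucket:

```python
# Markers indicating the bucket referenced by the domain no longer exists.
NO_SUCH_BUCKET_MARKERS = (
    "<Code>NoSuchBucket</Code>",
    "<li>Code: NoSuchBucket</li>",
)

def takeover_may_be_possible(body: str) -> bool:
    """Return True if the response body contains a NoSuchBucket marker."""
    return any(marker in body for marker in NO_SUCH_BUCKET_MARKERS)

sample = "<Error><Code>NoSuchBucket</Code><Message>The specified bucket does not exist</Message></Error>"
print(takeover_may_be_possible(sample))  # True
```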

References:

* [Hands-on AWS S3 Bucket Takeover](https://blog.securelayer7.net/hands-on-aws-s3-bucket-account-takeover-vulnerability/)
* [Sub-Domain Take Over — AWS S3 Bucket](https://towardsaws.com/subdomain-takeover-aws-s3-bucket-4699815d1b62)
* [HackerOne: S3 bucket takeover via reddit/rpan-studio CI build script](https://hackerone.com/reports/1285598)

### Improper ACL permissions

If ACL permissions are misconfigured, you may get unauthenticated access to a bucket. Moreover, such permissions may allow you to both read and modify objects.

```bash
# configure the AWS CLI
$ aws configure
# list the contents of a bucket
$ aws s3 ls s3://<bucket-name>/
# upload file.txt to a bucket
$ aws s3 cp file.txt s3://<bucket-name>/file.txt
# remove file.txt from a bucket
$ aws s3 rm s3://<bucket-name>/file.txt
```

You can use the following tools to automate the process:

* [S3Scanner](https://github.com/sa7mon/S3Scanner) - Scan for open S3 buckets and dump the contents.
* [s3inspector](https://github.com/clario-tech/s3-inspector) - Tool to check AWS S3 bucket permissions.
* [lazys3](https://github.com/nahamsec/lazys3) - A Ruby script to brute-force AWS S3 bucket names using different permutations.
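The permutation approach used by tools such as lazys3 can be sketched as follows: combine a target word with common environment affixes to produce candidate bucket names. The affix wordlist here is illustrative, not the one any specific tool ships with:

```python
# Illustrative affixes; real tools use much larger wordlists.
AFFIXES = ["dev", "staging", "prod", "backup", "assets"]

def bucket_candidates(target: str) -> list:
    """Generate candidate bucket names by permuting the target with affixes."""
    names = [target]
    for affix in AFFIXES:
        names.extend([f"{target}-{affix}", f"{affix}-{target}", f"{target}{affix}"])
    return names

print(bucket_candidates("example")[:4])
# ['example', 'example-dev', 'dev-example', 'exampledev']
```

Each candidate can then be probed with `aws s3 ls` or an unauthenticated HTTP request to see whether it exists and is listable.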

References:

* [Hands-on AWS S3 Bucket Vulnerabilities](https://blog.securelayer7.net/hands-on-aws-s3-bucket-vulnerabilities/)

## References

* [Basics of AWS Penetration Testing for S3 Bucket Service](https://blog.securelayer7.net/aws-penetration-testing-for-s3-bucket-service-basics-security/)

