DEV Community

Hyelngtil Isaac

Posted on • Originally published at hyelngtil.awstech

Networking Series 8: Access S3 from a VPC

Introducing Today's Project!

What is Amazon VPC?
Amazon VPC creates a private cloud network in AWS where your resources (like EC2) operate securely. It allows you to control traffic, define access, and safely connect to external services like S3, making cloud architecture both flexible and secure.

How I used Amazon VPC in this project
In todayʼs project, I used Amazon VPC to create a secure cloud network and launched an EC2 instance inside it. I gave it AWS credentials and successfully used CLI commands to access and interact with an Amazon S3 bucket from within the VPC.

One thing I didn't expect in this project
One thing I didnʼt expect was that an EC2 instance in a VPC reaches S3 over the public internet by default. I assumed the traffic would stay internal, but since S3 lives outside the VPC, a VPC endpoint is needed for secure, private communication.

This project took me about 45 minutes: setting up the VPC, the EC2 instance, and access keys, then testing S3 connectivity via the CLI.


In the first part of my project

Step 1 - Architecture set up
Iʼm about to create a new VPC and launch an EC2 instance into it. This sets up the foundation for our project, giving me a secure network and a virtual server to work with. Letʼs build this cloud!

Step 2 - Connect to my EC2 instance
I'm about to connect to my EC2 instance using Instance Connect so I can access its terminal securely.

Step 3 - Set up access keys
I'm giving my EC2 instance access to my AWS environment by configuring access keys. This allows it to securely authenticate with AWS and interact with services like S3, making sure it can communicate with my account from inside the VPC.


Architecture set up

I started my project by launching a custom VPC and an EC2 instance within it to begin connecting AWS services securely from a private network.

I also set up an S3 bucket so my EC2 instance inside the VPC can store and retrieve data securely using AWS access keys.
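For readers who prefer the CLI to the console, the same setup can be sketched with AWS CLI commands. This is a rough sketch rather than my exact steps: the CIDR blocks, resource IDs, and AMI are placeholders, and a real setup also needs an internet gateway and route table so the instance is reachable.

```bash
# Create a VPC with a /16 CIDR block (range is illustrative)
aws ec2 create-vpc --cidr-block 10.0.0.0/16

# Create a subnet inside it (use the VpcId returned above)
aws ec2 create-subnet --vpc-id vpc-0123456789abcdef0 --cidr-block 10.0.1.0/24

# Launch an EC2 instance into the subnet (AMI ID is a placeholder)
aws ec2 run-instances \
  --image-id ami-xxxxxxxx \
  --instance-type t2.micro \
  --subnet-id subnet-0123456789abcdef0 \
  --associate-public-ip-address

# Create the S3 bucket used in this project
aws s3 mb s3://maven-vpc-s3
```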


Running CLI commands

The AWS CLI is a command-line tool for managing AWS services. I have access to it since it comes pre-installed on my EC2 instance (Amazon Linux AMIs include it by default).

The first command I ran was 'aws s3 ls'. This command lists my S3 buckets and verifies the EC2 instance's ability to connect using the AWS CLI. Since I hadn't provided credentials yet, this first run returned an error instead of a bucket list.

The second command I ran was 'aws configure'. This command sets up the AWS CLI credentials and default region.
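Put together, that sequence looks roughly like this on the instance (the access key shown is a fake placeholder, and us-east-1 is just an example region):

```bash
$ aws s3 ls
Unable to locate credentials. You can configure credentials by running "aws configure".

$ aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: ****************************************
Default region name [None]: us-east-1
Default output format [None]: json
```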


Access keys

Credentials
To set up my EC2 instance to interact with my AWS environment, I configured it using 'aws configure' to provide credentials and region settings securely.

Access keys are credentials used to securely access AWS services. They include an Access Key ID and Secret Access Key, which authenticate users or apps interacting with resources like S3 via AWS CLI.

Secret access keys are the confidential half of the credential pair; combined with an access key ID, they let AWS tools authenticate and access services securely.

Best practice
Although I'm using access keys in this project, a best practice alternative is to use IAM roles for secure, credential-free access from EC2 to AWS services.
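As a sketch of that alternative (the role, policy file, and instance ID here are hypothetical), an IAM role can be wrapped in an instance profile and attached to the instance, so no long-lived keys ever touch the box:

```bash
# Create a role that EC2 can assume (trust policy JSON is assumed to exist)
aws iam create-role \
  --role-name ec2-s3-access \
  --assume-role-policy-document file://ec2-trust-policy.json

# Grant it read-only S3 access via an AWS managed policy
aws iam attach-role-policy \
  --role-name ec2-s3-access \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

# Wrap the role in an instance profile and attach it to the instance
aws iam create-instance-profile --instance-profile-name ec2-s3-access
aws iam add-role-to-instance-profile \
  --instance-profile-name ec2-s3-access --role-name ec2-s3-access
aws ec2 associate-iam-instance-profile \
  --instance-id i-0123456789abcdef0 \
  --iam-instance-profile Name=ec2-s3-access
```

With the role attached, the AWS CLI on the instance picks up temporary credentials automatically, and 'aws configure' is no longer needed for authentication.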


In the second part of my project

Step 4 - Set up an S3 bucket
Iʼm going to create a bucket in Amazon S3 so my EC2 instance can store and retrieve data securely from within my VPC as part of this AWS setup.

Step 5 - Connecting to my S3 bucket
Iʼm heading back to my EC2 instance to connect it with my S3 bucket so I can test secure communication using access keys and the AWS CLI.


Connecting to my S3 bucket

After configuring my credentials, I ran 'aws s3 ls' again to verify EC2ʼs ability to connect using the AWS CLI.

This time, the terminal listed my S3 bucket, indicating AWS access was successful.



Another CLI command I ran was 'aws s3 ls s3://maven-vpc-s3', which returned bucket contents showing VPC-to-S3 access worked.
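For reference, a successful listing looks roughly like this (the dates, size, and object name are illustrative, not my actual output):

```bash
$ aws s3 ls
2024-01-01 12:00:00 maven-vpc-s3

$ aws s3 ls s3://maven-vpc-s3
2024-01-01 12:05:00        123 example-object.txt
```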


Uploading objects to S3

To upload a new file to my bucket, I first ran the command 'sudo touch /tmp/test.txt'. This command creates an empty test file on my EC2 instance for S3 upload testing.

The second command I ran was 'aws s3 cp /tmp/test.txt s3://maven-vpc-s3'. This command uploads the test file, proving that an EC2 instance in the VPC can transfer data to S3.

The third command I ran was 'aws s3 ls s3://maven-vpc-s3' again, which validated that 'test.txt' was uploaded, confirming my EC2 instance successfully accessed and updated the S3 bucket.
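The upload steps above, put together in one runnable snippet. The 'aws s3 cp' call only succeeds on a machine with configured credentials, so it is guarded here; the bucket name is the one from this project:

```shell
# Create an empty test file (plain touch is enough; /tmp is world-writable)
touch /tmp/test.txt

# Upload it if the AWS CLI is installed; report instead of failing otherwise
if command -v aws >/dev/null 2>&1; then
  aws s3 cp /tmp/test.txt s3://maven-vpc-s3 || echo "upload failed: check credentials"
else
  echo "aws CLI not installed; skipping upload"
fi
```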


🤝 The final project in this networking series is: "VPC Endpoints"
