1. Introduction
I recently undertook the AWS resume challenge, which was quite interesting, so I decided to write about it.
The primary goal of the AWS resume challenge is for participants to build a static personal website using cloud services.
It encourages hands-on experience with various cloud technologies, including infrastructure as code (IaC), serverless computing, and storage.
For this challenge, I used the following:
- Amazon Web Services (AWS): served as my cloud provider.
- Amazon S3: acted as storage to host my static files.
- Amazon CloudFront: served the static web page as a content delivery network. A content delivery network (CDN) is a geographically distributed group of servers that caches content close to users.
- Pulumi: my infrastructure as code (IaC) tool. Pulumi is an IaC tool that lets you describe and provision your cloud infrastructure using general-purpose programming languages.
2. Setting Up the Environment
The first step was to install the necessary tools and software, including Go, an IDE (I used GoLand, but any IDE will do), and Pulumi.
Next, I configured Pulumi to work with my AWS account using the AWS CLI, and made sure I had the necessary permissions to create and manage AWS resources in my account.
Then I created a directory and initialised the Pulumi project. To do this, I opened my terminal and ran: mkdir aws-resume-challenge && cd aws-resume-challenge
This creates a new folder and navigates into it.
Then I ran: pulumi new aws-go
This creates a new Pulumi AWS project and downloads the Go SDK. It also generates the files the program needs to execute successfully.
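Among the generated files is a Pulumi.yaml project file. On my machine it looked roughly like the sketch below; the name and description will vary with what you enter at the prompts:

```yaml
name: aws-resume-challenge
runtime: go
description: A minimal AWS Go Pulumi program
```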
3. Creating the Infrastructure
I copied my static website folder, which contains HTML, CSS, JavaScript, and jQuery code along with my index.html file, into the root of my project.
Because the folder contains about six subfolders with different files, I need an object for each file (each file is a separate resource). I need a program to crawl the directory and add a resource (bucket object) for each file. I used a component resource to handle the folder: I created another Go file called s3folder.go, which contains the following code:
package main

import (
	"io/fs"
	"mime"
	"path"
	"path/filepath"
	"reflect"

	"github.com/pulumi/pulumi-aws/sdk/v6/go/aws/s3"
	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
)

type Folder struct {
	pulumi.ResourceState

	bucketName pulumi.IDOutput     `pulumi:"bucketName"`
	websiteUrl pulumi.StringOutput `pulumi:"websiteUrl"`
}

func NewS3Folder(ctx *pulumi.Context, bucketName string, siteDir string, args *FolderArgs) (*Folder, error) {
	var resource Folder
	// Register the component resource so child resources can be parented to it
	err := ctx.RegisterComponentResource("pulumi:example:S3Folder", bucketName, &resource)
	if err != nil {
		return nil, err
	}
	// Create a bucket and expose a website index document
	siteBucket, err := s3.NewBucket(ctx, bucketName, &s3.BucketArgs{
		Website: s3.BucketWebsiteArgs{
			IndexDocument: pulumi.String("index.html"),
		},
	}, pulumi.Parent(&resource))
	if err != nil {
		return nil, err
	}

	// For each file in the directory, create an S3 object stored in `siteBucket`
	err = filepath.Walk(siteDir, func(name string, info fs.FileInfo, err error) error {
		if err != nil {
			return err
		}
		if !info.IsDir() {
			rel, err := filepath.Rel(siteDir, name)
			if err != nil {
				return err
			}
			if _, err := s3.NewBucketObject(ctx, rel, &s3.BucketObjectArgs{
				Bucket:      siteBucket.ID(),
				Source:      pulumi.NewFileAsset(name),
				ContentType: pulumi.String(mime.TypeByExtension(path.Ext(name))),
			}, pulumi.Parent(&resource)); err != nil {
				return err
			}
		}
		return nil
	})
	if err != nil {
		return nil, err
	}

	// Populate the component's outputs and register them as stack exports
	resource.bucketName = siteBucket.ID()
	resource.websiteUrl = siteBucket.WebsiteEndpoint
	if err := ctx.RegisterResourceOutputs(&resource, pulumi.Map{
		"bucketName": resource.bucketName,
		"websiteUrl": resource.websiteUrl,
	}); err != nil {
		return nil, err
	}
	return &resource, nil
}

type folderArgs struct {
}

type FolderArgs struct {
}

func (FolderArgs) ElementType() reflect.Type {
	return reflect.TypeOf((*folderArgs)(nil)).Elem()
}
- Folder struct: a custom Pulumi component resource with two properties, the bucket name and the website URL. These are populated after the bucket is created.
- NewS3Folder function: creates a new instance of the Folder resource. The function takes a Pulumi context, a bucket name, a directory path, and arguments for the folder resource.
It first registers a new component resource of type pulumi:example:S3Folder. The S3 bucket is then created with a website configuration that sets index.html as the index document. The function walks the provided directory and, for each file, creates a new S3 bucket object. BucketObjectArgs specifies the bucket (via the bucket ID), the source file, and the content type of the file, which is determined by the file extension.
In main.go:
func main() {
	pulumi.Run(func(ctx *pulumi.Context) error {
		// Create a bucket and expose a website index document
		f, err := NewS3Folder(ctx, "resume-bucket", "./website", &FolderArgs{})
		if err != nil {
			return err
		}
		ctx.Export("bucketName", f.bucketName)
		ctx.Export("websiteUrl", f.websiteUrl)
		return nil
	})
}
This is the entry point of a Pulumi program, using the pulumi.Run function; this is where the infrastructure is defined.
It calls the NewS3Folder function to create a new bucket called "resume-bucket" and upload the contents of the ./website directory to that bucket, returning a Folder resource.
When the Folder resource is created successfully, it exports two outputs, bucketName and websiteUrl, in the Pulumi CLI.
To deploy this infrastructure, I ran the pulumi up command, which creates these resources.
4. Configuring AWS CloudFront for the S3 Bucket
I created a new CloudFront distribution to serve the web page with low latency:
- Chose the resume bucket as the origin domain
- Under origin access, selected "Origin access control settings" and created a new OAC; the matching bucket policy is provided once the distribution is created
- Under the viewer protocol policy, selected "HTTPS only", which means CloudFront will only accept HTTPS requests
- Created the distribution and copied the policy
- Navigated to the bucket's Permissions tab and edited the bucket policy so that only traffic from CloudFront is allowed
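For reference, the OAC bucket policy that CloudFront generates looks roughly like this; the bucket name, account ID, and distribution ID below are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudFrontServicePrincipal",
      "Effect": "Allow",
      "Principal": { "Service": "cloudfront.amazonaws.com" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::resume-bucket/*",
      "Condition": {
        "StringEquals": {
          "AWS:SourceArn": "arn:aws:cloudfront::123456789012:distribution/EXAMPLE123"
        }
      }
    }
  ]
}
```

The Condition clause is what locks the bucket down: only requests made on behalf of that specific distribution are allowed to read objects.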
5. Conclusion
Participating in the AWS resume challenge has been rewarding. It let me use my Go skills, especially in crawling through the website folder and creating a bucket object for each file, and it sharpened my AWS skills.
I intend to extend this static website with further features like CI/CD, to automatically update the website when I make and push changes. Bye for now!🤗
The GitHub link to the full project can be found here.