Requirement
The requirement was to build 3 microservices that, given the username of a GitHub user, fetch 3 metrics for that user:
• Commits of the searched user
• Issues for the user
• Pull Requests made by the GitHub user.
Solution Outline
The solution involves creating 3 separate microservices to fetch the above GitHub user metrics. I chose Python to implement all three; the services are named Commit Service, Issue Service & PR Service.
Before I began, I needed an AWS account and an IAM user to perform certain tasks from the CLI. I created a user with the below permission sets and policies.
Once the user was created, I installed the AWS CLI and ran the aws configure command to set it up with the IAM user's credentials, which can be found on the "Security Credentials" tab in Fig 1. I also created a Personal Access Token on GitHub to fetch user metrics. Finally, I used Boto3 to interact with DynamoDB from my Python code.
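All three services authenticate to the GitHub REST API by sending the Personal Access Token in the Authorization header. A minimal sketch of how such a request is built, using only the standard library (the token value and username are placeholders):

```python
from urllib.request import Request

# The Personal Access Token goes into the Authorization header
# using the "token" scheme expected by api.github.com.
token = "ghp_example"  # placeholder; never hard-code the real token
req = Request(
    "https://api.github.com/users/octocat/events",
    headers={"Authorization": f"token {token}"},
)
```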
The below section explains how each service was implemented.
Commit Service
This service fetches the user's commits along with their URLs, saves them to the DynamoDB table deviq_commits, and returns a JSON response containing those commits. This service runs on port 5001.
import commitServiceUtil
from flask import Flask, jsonify
import requests
import boto3

app = Flask(__name__)

# DynamoDB
session = boto3.Session(region_name='<aws-region>')
dynamodb = session.resource('dynamodb')
table_name = '<table_name>'
table = dynamodb.Table(table_name)

@app.route('/')
def index():
    return "Hello! from Commit Service"

@app.route('/commits/<username>')
def get_user_commits(username):
    try:
        token = commitServiceUtil.get_gh_token()
        maxId = commitServiceUtil.getMaxCid()
        url = f"https://api.github.com/users/{username}/events"
        ret_commit = []
        response = requests.get(url, headers={"Authorization": f"token {token}"})
        if response.status_code == 200:
            events = response.json()
            for event in events:
                if event['type'] == 'PushEvent':
                    for commit in event['payload']['commits'][:15]:
                        table.put_item(Item={
                            'cid': maxId,
                            'gh_username': username,
                            'commit_url': commit['url']
                        })
                        maxId += 1
                        ret_commit.append(commit)
        return jsonify(ret_commit)
    except requests.exceptions.RequestException as exp:
        return f"Error occurred when fetching commits: {exp}"

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0', port=5001)
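The event-filtering step in get_user_commits can be seen in isolation with a hand-made payload shaped like the GitHub /users/{username}/events response (sample data, not real API output):

```python
# Sample events shaped like the GitHub events API response.
events = [
    {"type": "PushEvent",
     "payload": {"commits": [
         {"sha": "abc123",
          "url": "https://api.github.com/repos/octocat/hello/commits/abc123"}]}},
    {"type": "WatchEvent", "payload": {}},  # non-push events are skipped
]

# Same filter as the service: keep only PushEvent commits, at most 15 per event.
commit_urls = [
    commit["url"]
    for event in events
    if event["type"] == "PushEvent"
    for commit in event["payload"]["commits"][:15]
]
```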
The commitServiceUtil.py file contains helper functions, such as getting the next cid (the key of the table) for insertion and fetching the GitHub Personal Access Token, which is stored in AWS Secrets Manager.
commitServiceUtil.py
import boto3
import json

# Initialize a DynamoDB table resource
dynamodb = boto3.resource('dynamodb', region_name='<aws-region-name>')

# Specify the table name and column (attribute) name
table_name = '<table_name>'
column_name = '<key>'
table = dynamodb.Table(table_name)

def getMaxCid():
    # Scan the table and compute the maximum key value ourselves
    # (a DynamoDB scan does not return items in sorted order)
    response = table.scan()
    if 'Items' in response:
        items = response['Items']
        if len(items) > 0:
            max_value = max(item[column_name] for item in items)
            print(f"The maximum value for {column_name} is: {max_value}")
            return max_value + 1
        else:
            print("No items found in the table")
            return 1
    else:
        print("Error in scanning the table")
        return -1

def get_gh_token():
    secret_name = "<SECRET_NAME>"
    region_name = "<aws-region-name>"
    session = boto3.Session()
    client = session.client(service_name='secretsmanager', region_name=region_name)
    get_s = client.get_secret_value(SecretId=secret_name)
    secret = get_s['SecretString']
    ss = json.loads(secret)
    return ss['github_token']
If the table is empty, the getMaxCid function returns 1; otherwise it returns the maximum cid plus 1.
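That max-plus-one rule is easy to check with plain data. A hypothetical pure helper (next_cid is for illustration only, not part of the service) reproducing the same logic:

```python
def next_cid(items, key="cid"):
    """Return 1 for an empty table, otherwise the maximum key value plus 1."""
    if not items:
        return 1
    return max(item[key] for item in items) + 1
```

Worth noting: computing the next id via a full table scan is not atomic, so concurrent writers could pick the same cid; a DynamoDB atomic counter (update_item with an ADD update expression) would avoid that race.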
The get_gh_token function fetches the GitHub Personal Access Token from AWS Secrets Manager. The token was uploaded to Secrets Manager manually; since committing tokens to source code is not a good practice, this is a good safeguard. It also benefits the other two microservices, as they use the same token to fetch metrics.
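As written, every request triggers a fresh Secrets Manager call, so the token could be cached in-process. A sketch using functools.lru_cache; fetch_secret here is a stand-in for the boto3 get_secret_value call, not the real service code:

```python
from functools import lru_cache

calls = {"count": 0}

def fetch_secret():
    # Stand-in for client.get_secret_value(...) in commitServiceUtil.py.
    calls["count"] += 1
    return "ghp_example"

@lru_cache(maxsize=1)
def get_gh_token_cached():
    # The secret is fetched once; later calls reuse the cached value.
    return fetch_secret()
```

After two calls to get_gh_token_cached(), fetch_secret has run only once.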
Dockerfile:
FROM python:3.9
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
RUN apt-get update && \
apt-get install -y awscli && \
apt-get clean
RUN aws configure set aws_access_key_id <hidden> && \
aws configure set aws_secret_access_key <hidden> && \
aws configure set default.region ap-south-1
EXPOSE 5001
CMD ["python", "commitService.py"]
Deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
name: commitservice-deployment
spec:
replicas: 1
selector:
matchLabels:
app: commitservice
template:
metadata:
labels:
app: commitservice
spec:
containers:
- name: commitservice
image: <ecr_image_id:tag>
ports:
- containerPort: 5001
service.yaml
apiVersion: v1
kind: Service
metadata:
name: commitservice-service
spec:
selector:
app: commitservice
ports:
- protocol: TCP
port: 5001
targetPort: 5001
type: LoadBalancer # or NodePort, depending on your setup
In Part 2 we will explore how the rest of the services were written & deployed.