A very quick guide on how to create expiring links to an object on Digital Ocean Spaces with Django. (That Spaces link is an affiliate link, btw: you'll get $50 to spend in 30 days if you sign up.)
tl;dr:
import boto3
from botocore.client import Config

session = boto3.session.Session()
client = session.client(
    "s3",
    region_name=AWS_S3_REGION_NAME,
    endpoint_url=AWS_S3_ENDPOINT_URL,
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    config=Config(signature_version="s3"),
)

url = client.generate_presigned_url(
    ClientMethod="get_object",
    Params={
        "Bucket": "bucket-name",
        "Key": "object-key",
    },
    ExpiresIn=60 * 60,  # one hour
)
Why
If you store your assets (images, PDFs, audio files, etc.) in an object store like Spaces but want to restrict access to those objects, you can create a signed link that expires after a defined time.
For example, I host all the audio files from Yet Another Sermon Host on Spaces, but I need to keep track of how many downloads occur to provide analytics for my customers. I can do this by registering a download in a Django view that then redirects to the expiring Spaces URL.
This way, every download has to pass through my server, which generates the expiring URL, so I can record each one.
How
Digital Ocean Spaces is API-compatible with Amazon S3, so we can use boto3 & django-storages. I'll assume you already have django-storages set up (instructions for DO Spaces).
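If you're starting from scratch, the minimal wiring looks roughly like this (a sketch, not the full instructions; DEFAULT_FILE_STORAGE is the classic setting name, newer Django versions use the STORAGES dict instead):

# settings.py (a minimal django-storages sketch)
# pip install django-storages boto3

INSTALLED_APPS = [
    # ...your other apps...
    "storages",
]

# route Django's default FileField storage through the S3-compatible backend
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"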
After installing django-storages, your settings.py should contain something like this:
# settings.py
import os

AWS_QUERYSTRING_AUTH = False
AWS_DEFAULT_ACL = "public-read"
AWS_ACCESS_KEY_ID = os.environ["DO_SPACES_ACCESS_KEY"]
AWS_SECRET_ACCESS_KEY = os.environ["DO_SPACES_SECRET"]
AWS_STORAGE_BUCKET_NAME = "bucket-name"
AWS_S3_REGION_NAME = "do-spaces-region"
AWS_S3_ENDPOINT_URL = f"https://{AWS_S3_REGION_NAME}.digitaloceanspaces.com"
AWS_S3_CUSTOM_DOMAIN = (
    f"{AWS_STORAGE_BUCKET_NAME}.{AWS_S3_REGION_NAME}.cdn.digitaloceanspaces.com"
)
AWS_MEDIA_LOCATION = "yash-media"
AWS_S3_FILE_OVERWRITE = False  # safer behaviour than the default
AWS_S3_OBJECT_PARAMETERS = {"CacheControl": "max-age=86400"}  # do some caching, why not
And we'll add a boto3 client:
# settings.py
import boto3
from botocore.client import Config

# client for expiring urls:
aws_session = boto3.session.Session()
AWS_CLIENT = aws_session.client(
    "s3",
    region_name=AWS_S3_REGION_NAME,
    endpoint_url=AWS_S3_ENDPOINT_URL,
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    config=Config(signature_version="s3"),
)
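A quick sanity check that the client is actually talking to Spaces (a sketch; run it in python manage.py shell and it assumes your credentials are allowed to list buckets):

# e.g. in `python manage.py shell`
from django.conf import settings

# should print a list of bucket names that includes "bucket-name"
response = settings.AWS_CLIENT.list_buckets()
print([bucket["Name"] for bucket in response["Buckets"]])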
And we should have a model with a FileField:
# models.py
from django.db import models


class Asset(models.Model):
    name = models.CharField(max_length=1000)
    private_file = models.FileField()
    # other fields here
Generating the expiring link is then a matter of using the boto3 library, so
let's add it as a method to the model:
# models.py
from django.conf import settings
from django.db import models
from django.http import Http404


class Asset(models.Model):
    name = models.CharField(max_length=1000)
    private_file = models.FileField()
    # other fields here

    @property
    def storage_key(self):
        return f"{self.private_file.storage.location}/{self.private_file.name}"

    def get_expiring_url(self, expires_in=1800):
        """
        Generates a permissioned url that will expire after `expires_in` seconds.
        Defaults to 30 minutes.
        """
        if self.private_file:
            url = settings.AWS_CLIENT.generate_presigned_url(
                ClientMethod="get_object",
                Params={
                    "Bucket": settings.AWS_STORAGE_BUCKET_NAME,
                    "Key": self.storage_key,
                },
                ExpiresIn=expires_in,
            )
            return url
        raise Http404
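A quick way to try it out (a sketch: the myapp import path is a placeholder for wherever your Asset model lives, and it assumes at least one Asset with a file uploaded):

# e.g. in `python manage.py shell`
from myapp.models import Asset  # placeholder import path

asset = Asset.objects.first()
# signed URL that stops working after 5 minutes
print(asset.get_expiring_url(expires_in=300))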
And we can redirect users like this:
# views.py
from django.shortcuts import get_object_or_404
from django.views.generic import RedirectView

from . import models


class Download(RedirectView):
    def get_redirect_url(self, *args, **kwargs):
        """
        Override this method so the view doesn't try to do any
        string interpolation.
        """
        return self.url

    def get(self, request, *args, **kwargs):
        asset = get_object_or_404(models.Asset, pk=kwargs["pk"])
        # TODO: register a download or check user has permission here
        self.url = asset.get_expiring_url()
        return super().get(request, *args, **kwargs)
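To make the view reachable we just need a URL pattern (a sketch; the download/<pk>/ path and the app-level urls.py are assumptions, wire it into your project however you like):

# urls.py
from django.urls import path

from . import views

urlpatterns = [
    path("download/<int:pk>/", views.Download.as_view(), name="download"),
]

For the TODO, one simple approach is an integer downloads field on Asset that you bump with an F() expression before redirecting, so concurrent requests don't lose counts.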
Note: you have to set the ACL on the objects to private, otherwise anyone with the direct URL can still access the files without the signed URL. We can do this by making "private" the default ACL for everything we upload:
# settings.py
AWS_DEFAULT_ACL = "private"
Or we can toggle individual objects with a couple of helper methods:
# models.py
class Asset(models.Model):
    # other methods from above

    def set_private(self):
        self._set_acl("private")

    def set_public_read(self):
        self._set_acl("public-read")

    def _set_acl(self, acl):
        settings.AWS_CLIENT.put_object_acl(
            Bucket=settings.AWS_STORAGE_BUCKET_NAME,
            Key=self.storage_key,
            ACL=acl,
        )
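And if the bucket already contains public objects, you can flip the existing ones over in one pass (a sketch; myapp is a placeholder import path, and for a big bucket you'd probably run this from a management command or background task):

# e.g. in `python manage.py shell` - make every existing file private
from myapp.models import Asset  # placeholder import path

for asset in Asset.objects.exclude(private_file=""):
    asset.set_private()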