Gospel Darlington
Build a Web3 Movie Streaming dApp using NextJs, Tailwind, and Sia Renterd: Part Two

The Backend Service

Welcome Back! Please read through Part 1 if you haven't already. Now, let's dive into Part 2: Building the backend service for our web3 movie streaming platform.

We've provided a starter code for the backend, which currently displays a "Welcome" message when you start the server and visit http://localhost:9000 in your browser. Let's build on this foundation.

The backend's source directory already contains the following files; let me briefly explain them.

Utility Files
This folder, located at backend/src/utils, contains two essential files: a custom HTTP exception class and an interface for handling file upload information.

This code defines a custom HttpException class that extends the built-in JavaScript Error class, allowing for the creation of error instances with specific HTTP status codes and messages.

// backend/src/utils/HttpExceptions.ts
export default class HttpException extends Error {
  public status: number;
  public message: string;

  constructor(status: number, message: string) {
    super(message);
    this.status = status;
    this.message = message;
  }
}

This code defines an interface FileUpload that represents an uploaded file, specifying its properties such as name, data, size, encoding, and more, providing a structured way to handle file uploads in this backend application.

// backend/src/utils/interfaces.ts
export interface FileUpload {
  name: string;
  data: Buffer;
  size: number;
  encoding: string;
  tempFilePath: string;
  truncated: boolean;
  mimetype: string;
  md5: string;
  mv: Function;
}

Then, in the backend/src root folder, we have this index.ts file, which sets up an Express.js server with CORS and file upload support, defines a single GET route that returns a "Welcome" message, handles errors by catching and re-throwing them as custom HttpExceptions, and starts the server on the port specified in the environment variables (9000).

require('dotenv').config()
import cors from 'cors'
import express, { NextFunction, Request, Response } from 'express'
import fileupload from 'express-fileupload'
import { StatusCodes } from 'http-status-codes'
import HttpException from './utils/HttpExceptions'

const app = express()
const port = process.env.PORT

app.use(cors())
app.use(fileupload())

app.get('/', async (req: Request, res: Response, next: NextFunction) => {
  try {
    return res.status(StatusCodes.OK).json({ message: 'Welcome' })
  } catch (error: any) {
    next(new HttpException(StatusCodes.BAD_REQUEST, error.message))
  }
})

app.listen(port, () => {
  console.log(`Server is running on port ${port}`)
})

Now that we've covered the key files, let's create two new files in a services folder, each serving a distinct purpose in our application.

Service Files

In the backend/src folder, make a new folder called services; this is where we'll create two services:

  1. Sia Service: Handles file uploads, downloads, streaming, and caching, communicating with the Renterd service.
  2. Background Service: Manages cached files, automatically removing them after 7 days at midnight daily.

The Sia Service

Let’s create a file named sia.service.ts in the backend/src/services folder and follow the steps below to build this service.

require('dotenv').config()
import { FileUpload } from '../utils/interfaces'
import axios, { AxiosProgressEvent, AxiosResponse } from 'axios'
import fs from 'fs'
import path from 'path'
import { Readable, pipeline } from 'stream'
import { promisify } from 'util'

class SiaService {
  private siaBucket: string
  private siaUrl: string
  private siaPassword: string
  private baseUrl: string

  constructor() {
    this.siaBucket = String(process.env.SIA_API_BUCKET)
    this.siaUrl = String(process.env.SIA_API_BASE_URL)
    this.siaPassword = String(process.env.SIA_API_PASSWORD)
    this.baseUrl = String(process.env.ORIGIN)
  }

  // The rest of the code goes in here!
}

export default SiaService

This code defines a SiaService class that initializes with environment variables for the Sia API settings and an origin URL, providing a foundation for managing interactions with the Sia service. Now, let's supply the rest of the code for this service.
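For reference, here is a minimal sketch of the environment variables this class expects in backend/.env. The variable names come from the constructor above; the values below are placeholders, so substitute the bucket name, Renterd address, and API password you configured in Part 1.

# backend/.env (placeholder values — adjust to your own setup)
PORT=9000
ORIGIN=http://localhost:9000
SIA_API_BUCKET=your-bucket-name
SIA_API_BASE_URL=http://localhost:9980
SIA_API_PASSWORD=your-renterd-api-password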

Uploading Files to Sia Renterd
To upload files to the Sia Network, we will need to add these three methods to the class: two will be private, while one will be public.

private generateRandomString(length: number): string {
  const characters =
    'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789'
  let result = ''
  for (let i = 0; i < length; i++) {
    result += characters.charAt(Math.floor(Math.random() * characters.length))
  }
  return result
}

This code defines a private method generateRandomString that generates a random string of a specified length, composed of uppercase letters, lowercase letters, and numbers, using a loop to select characters randomly from a predefined string. We will use it to rename each file uniquely before shipping it to Renterd.
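For illustration, here is how this random suffix combines with a timestamp in the uploadFile method further below; the values are hypothetical.

// Hypothetical example of the naming scheme used in uploadFile below
const identifier = `${Date.now().toString()}__${this.generateRandomString(4)}` // e.g. '1717200000000__aB3x'
const fileId = `${identifier}.mp4`                                             // e.g. '1717200000000__aB3x.mp4'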

private async uploadToSiaService(
  file: FileUpload,
  folder: string,
  fileId: string
): Promise<AxiosResponse> {
  // Step 1: Construct the upload URL
  const url: string = `${this.siaUrl}/api/worker/objects/${folder}/${fileId}?bucket=${this.siaBucket}`

  // Step 2: Set up the request configuration
  let config = {
    method: 'PUT',
    maxBodyLength: Infinity,
    url,
    headers: {
      Authorization: `Basic ${Buffer.from(`:${this.siaPassword}`).toString(
        'base64'
      )}`,
      'Content-Type': file.mimetype,
    },
    data: file.data,
    onUploadProgress: (progressEvent: AxiosProgressEvent) => {
      // Step 3: Track upload progress
      const { loaded, total } = progressEvent
      const percentCompleted = Math.round((loaded / Number(total)) * 100)
      console.log(`Upload progress: ${percentCompleted}%`)
    },
  }

  try {
    // Step 4: Send the upload request and return the response
    return await axios.request(config)
  } catch (e: any) {
    // Step 5: Handle upload errors
    console.error(e.message)
    throw new Error(e.message || 'Error uploading file')
  }
}

The above code defines a private method uploadToSiaService that uploads a file to Sia Renterd using Axios, handling upload progress and errors, and returning the Axios response or throwing an error if the upload fails.
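To make that concrete, here is roughly what the method ends up sending, using hypothetical values for the base URL, bucket, folder, and file ID:

// Hypothetical values for illustration only:
//   SIA_API_BASE_URL = http://localhost:9980, SIA_API_BUCKET = movies
//
// A call like uploadToSiaService(file, 'video', '1717200000000__aB3x.mp4') issues:
//
//   PUT http://localhost:9980/api/worker/objects/video/1717200000000__aB3x.mp4?bucket=movies
//   Authorization: Basic <base64 of ":<SIA_API_PASSWORD>">
//   Content-Type: video/mp4
//   (request body: the raw file bytes from file.data)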

The Renterd endpoints are documented in the API documentation, which you can check out, or you can watch the video below, where I explain how to navigate the Sia Renterd API documentation.

Now let’s include the public method which we will later expose as an endpoint in our application.

public async uploadFile(file: FileUpload): Promise<object | Error> {
  try {
    // Step 1: Generate a unique identifier
    const timestamp = Date.now().toString()
    const randomString = this.generateRandomString(4)
    const identifier = `${timestamp}__${randomString}`

    // Step 2: Determine file folder and ID
    const folder = file.mimetype.split('/')[0]
    const extension = file.mimetype.split('/')[1]
    const fileId = `${identifier}.${extension}`

    // Step 3: Create local cache directory
    const cacheDir = path.resolve(__dirname, '..', '..', 'cache')
    const localFilePath = path.join(cacheDir, folder, fileId)

    // Ensure the directory exists
    const dirPath = path.dirname(localFilePath)
    if (!fs.existsSync(dirPath)) {
      fs.mkdirSync(dirPath, { recursive: true })
    }

    // Step 4: Save file to local cache
    const writeStream = fs.createWriteStream(localFilePath)
    const dataStream =
      file.data instanceof Buffer ? Readable.from(file.data) : file.data
    await promisify(pipeline)(dataStream, writeStream)

    // Step 5: Silently upload file to Sia service
    this.uploadToSiaService(file, folder, fileId)

    // Step 6: Return file URL and success message
    return {
      url: `${this.baseUrl}/download/${folder}/${fileId}`,
      message: 'File successfully uploaded!',
    }
  } catch (e: any) {
    // Handle Error
    console.error(e.message)
    throw new Error(e.message || 'Error uploading file')
  }
}

This code defines a public method uploadFile that uploads a file by generating a unique identifier, saving the file to a local cache, and then uploading it to Sia Renterd, returning the file's URL and a success message or throwing an error if the upload fails.

Downloading Files from Sia Renterd
To download files from the Sia Network, we will need to add these two methods to the class: one will be private and the other will be public.

private async downloadFromSiaService(
  folder: string,
  fileId: string
): Promise<NodeJS.ReadableStream> {
  // Step 1: Construct the download URL
  let url: string = `${this.siaUrl}/api/worker/objects/${folder}/${fileId}?bucket=${this.siaBucket}`

  // Step 2: Set up the request configuration
  let config = {
    method: 'GET',
    maxBodyLength: Infinity,
    url,
    headers: {
      Authorization: `Basic ${Buffer.from(`:${this.siaPassword}`).toString(
        'base64'
      )}`,
    },
    responseType: 'stream' as const,
  }

  try {
    const cacheDir = path.resolve(__dirname, '..', '..', 'cache')
    const localFilePath = path.join(cacheDir, folder, fileId)

    // Step 3: Check if the file is already cached locally
    if (!fs.existsSync(cacheDir)) {
      fs.mkdirSync(cacheDir, { recursive: true })
    }

    const folderDir = path.join(cacheDir, folder)
    if (!fs.existsSync(folderDir)) {
      fs.mkdirSync(folderDir, { recursive: true })
    }

    const fileExists = fs.existsSync(localFilePath)
    if (fileExists) {
      // Step 4: Return the cached file (if it exists)
      return fs.createReadStream(localFilePath)
    } else {
      // Step 5: Download the file from the Sia service (if not cached)
      const response = await axios.request(config)

      // Step 6: Save the downloaded file to the local cache
      const writeStream = fs.createWriteStream(localFilePath)
      promisify(pipeline)(response.data, writeStream)

      // Step 7: Return the file stream
      return response.data
    }
  } catch (e: any) {
    console.error(e)
    // Return a readable stream of a 404 image if the file is not found
    if (e.response && e.response.status === 404) {
      const notFound = path.resolve(
        __dirname,
        '..',
        '..',
        'response_files',
        '404.png'
      )
      return fs.createReadStream(notFound)
    }
    throw new Error(e.message || 'Error downloading file')
  }
}

This code defines a private method downloadFromSiaService that retrieves a file from the Sia service, caches it locally, and returns a readable stream of the file, handling errors and returning a 404 image if the file is not found.

Let’s make those response_files available in the backend directory, or we will get an error when serving the 404.png file. In the backend directory, create another folder called response_files and copy the following images into it.

404.png

401.png

Perfect, now let’s complete this file download service. Add the method below to the SiaService class as well.

public async downloadFile(
  folder: string,
  fileId: string
): Promise<NodeJS.ReadableStream> {
  const fileStream = await this.downloadFromSiaService(folder, fileId)
  // Return the file stream
  return fileStream
}

This code defines a public method downloadFile that calls the private method downloadFromSiaService to retrieve a file from Sia Renterd and returns the readable stream of the retrieved file.

Service Endpoints

It's time to couple these various methods to their respective endpoints. Currently, we have just one, but we will need an additional two for uploading and downloading files. File streaming will also utilize the download endpoint.

Head to the backend/src/index.ts file and update its content with the following code.

require('dotenv').config()
import cors from 'cors'
import express, { NextFunction, Request, Response } from 'express'
import fileupload from 'express-fileupload'
import { FileUpload } from './utils/interfaces'
import { StatusCodes } from 'http-status-codes'
import HttpException from './utils/HttpExceptions'
import SiaService from './services/sia.service'

const app = express()
const port = process.env.PORT
const siaService = new SiaService()

app.use(cors())
app.use(fileupload())

// This endpoint simply displays a welcome message
app.get('/', async (req: Request, res: Response, next: NextFunction) => {
  try {
    return res.status(StatusCodes.OK).json({ message: 'Welcome' })
  } catch (error: any) {
    next(new HttpException(StatusCodes.BAD_REQUEST, error.message))
  }
})

// This endpoint lets you write files to the Sia Network
app.post('/upload', async (req: Request, res: Response, next: NextFunction) => {
  try {
    if (!req.files) {
      throw new HttpException(StatusCodes.NO_CONTENT, 'No file uploaded')
    }

    const fileUpload: FileUpload = req.files.file as FileUpload
    const result = await siaService.uploadFile(fileUpload)
    return res.status(StatusCodes.CREATED).json(result)
  } catch (e: any) {
    next(new HttpException(StatusCodes.BAD_REQUEST, e.message))
  }
})

// This endpoint lets you read files from the Sia Network
app.get(
  '/download/:folder/:fileId',
  async (req: Request, res: Response, next: NextFunction) => {
    const { folder, fileId } = req.params
    try {
      if (!folder || !fileId) {
        return res
          .status(StatusCodes.BAD_REQUEST)
          .json({ message: 'Folder or File ID not found' })
      } else {
        const result = await siaService.downloadFile(folder, fileId)
        res.status(StatusCodes.OK)
        return result.pipe(res)
      }
    } catch (error: any) {
      next(new HttpException(StatusCodes.BAD_REQUEST, error.message))
    }
  }
)

app.listen(port, () => {
  console.log(`Server is running on port ${port}`)
})

This code sets up an Express.js server with CORS and file upload support, defining three endpoints: a welcome message, file upload to the Sia Network, and file download from the Sia Network, using the SiaService class to handle file operations and HttpException for error handling.
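Before moving on, you may want to smoke-test the new endpoints. Here is a minimal sketch, assuming Node 18+ (for the built-in fetch, FormData, and Blob) and a local sample.mp4 file; the file name and URLs are placeholders, not part of the tutorial's codebase.

// test-upload.ts — a minimal sketch, not part of the tutorial's codebase
import fs from 'fs'

async function main() {
  // Build a multipart form with a field named 'file' (matching req.files.file)
  const form = new FormData()
  const bytes = fs.readFileSync('./sample.mp4') // placeholder local file
  form.append('file', new Blob([bytes], { type: 'video/mp4' }), 'sample.mp4')

  // Upload to the backend
  const uploadRes = await fetch('http://localhost:9000/upload', {
    method: 'POST',
    body: form,
  })
  const { url, message } = (await uploadRes.json()) as { url: string; message: string }
  console.log(message, url) // e.g. 'File successfully uploaded!' http://localhost:9000/download/video/<id>.mp4

  // Stream it back through the download endpoint
  const downloadRes = await fetch(url)
  console.log('Download status:', downloadRes.status) // expect 200
}

main().catch(console.error)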

Watch this section of the video below if you require some visual aid; ensure you stop at the 01:50:44 timestamp.

We need to create a cache management service that controls how long files stay in the cache, so our server doesn't fill up with unused files. It's important to know that the only reason we cache files at all is to reduce data latency.

The Background Service

Head to the backend/src/services folder, create a file called background.service.ts, and add the following code to it.

import fs from 'fs'
import path from 'path'
import cron from 'node-cron'

class BackgroundService {
  private cacheDir: string

  constructor() {
    this.cacheDir = path.resolve(__dirname, '..', '..', 'cache')
    console.log('Background jobs mounted...')
  }

  // The rest of the code goes in here!
}

export default BackgroundService

This code defines a BackgroundService class that sets up the cache directory path and logs a confirmation message; shortly, we'll use the node-cron library to schedule daily jobs from it. Let's create a method that will be responsible for deleting files older than 7 days from the cache.

Deleting Old Files
Add this method to the BackgroundService class.

public deleteOldFiles(): void {
  // Step 1: Log the start of the job
  console.log('Starting deleteOldFiles job at:', new Date().toISOString())

  // Step 2: Read the cache directory
  fs.readdir(this.cacheDir, (err, folders) => {
    // Step 3: Handle errors (if any)
    if (err) {
      console.error('Error reading cache directory:', err)
      return
    }

    // Step 4: Iterate through folders
    folders.forEach((folder) => {
      const folderPath = path.join(this.cacheDir, folder)

      // Step 5: Read the folder contents
      fs.readdir(folderPath, (err, files) => {
        // Step 6: Handle errors (if any)
        if (err) {
          console.error('Error reading folder:', err)
          return
        }

        // Step 7: Iterate through files
        files.forEach((file) => {
          const fileCreatedAt: number = Number(file.split('__')[0])
          const targetTime: number = fileCreatedAt + 7 * 24 * 60 * 60 * 1000

          // Step 8: Check file age
          if (Date.now() > targetTime) {
            // Step 9: Delete old files
            const filePath = path.join(folderPath, file)
            fs.unlink(filePath, (err) => {
              // Step 10: Handle deletion errors (if any)
              if (err) {
                console.error(`Error deleting file ${filePath}:`, err)
              } else {
                console.log(`${folder}: ${file} expired and removed from cache`)
              }
            })
          }
        })
      })
    })
  })

  // Step 11: Log the end of the job
  console.log('Finished deleteOldFiles job at:', new Date().toISOString())
}

This code defines a method called deleteOldFiles that removes files older than 7 days from the cache directory. It reads the directory, checks each file's creation time (encoded in the timestamp prefix of the file name), deletes files that have passed the target time, and logs the start and end of the job along with any errors or successful deletions.
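As a quick worked example of the age check, take a hypothetical cached file named 1717200000000__aB3x.mp4:

// Hypothetical file name: '1717200000000__aB3x.mp4'
const fileCreatedAt = Number('1717200000000__aB3x.mp4'.split('__')[0]) // 1717200000000 (ms since epoch)
const targetTime = fileCreatedAt + 7 * 24 * 60 * 60 * 1000             // + 604,800,000 ms, i.e. 7 days later
const expired = Date.now() > targetTime                                // true once those 7 days have passed
console.log({ fileCreatedAt, targetTime, expired })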

Now, let’s write a function that will utilize the node-cron package to schedule when to execute the file deletion.

private dailyJobs(): void {
  // Schedule the cleanup to run every day at 00:00
  cron.schedule('0 0 * * *', () => {
    this.deleteOldFiles()
  })
}

This code sets up a daily cron job to run the deleteOldFiles method every day at midnight (00:00) to perform automatic file cleanup.
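If you'd like to tweak the schedule, the five cron fields read minute, hour, day of month, month, and day of week. The alternatives below are illustrations only, not part of the tutorial:

// '0 0 * * *'  → minute 0, hour 0, every day: midnight daily (what we use above)
// '0 * * * *'  → minute 0 of every hour: hourly
// '0 3 * * 0'  → 03:00 every Sunday: weekly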

We also need to update the constructor to schedule the daily jobs when the background service class is instantiated.

constructor() {
  this.cacheDir = path.resolve(__dirname, '..', '..', 'cache')
  this.dailyJobs()
  console.log('Background jobs mounted...')
}

Perfect. Lastly, let’s add this background operation to the server process at initialization. Head to the backend/src/index.ts file, update the app listener, and import the background service file.

// ...Previous modules imported...
import BackgroundService from './services/background.service'

// ...The rest of the code...

app.listen(port, () => {
  console.log(`Server is running on port ${port}`)
  new BackgroundService() // Added the background service.
})

Rerun the backend service using $ yarn build && yarn start, and you should see a terminal printout like the one in the image below.

Observe the console log from the Background Service
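In case the image doesn't render for you, the printout should look roughly like this, based on the two log statements added above (the port comes from your .env):

Server is running on port 9000
Background jobs mounted...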

If you would rather watch how I coded the entire background service, the video below is for you; just ensure you stop at the 02:16:07 timestamp.

Next Step
Congratulations, you are now ready for the final part of this tutorial, which is Part 3. If you encounter any issues, refer to the following resources for troubleshooting.

About Author

I am a web3 developer and the founder of Dapp Mentors, a company that helps businesses and individuals build and launch decentralized applications. I have over 8 years of experience in the software industry, and I am passionate about using blockchain technology to create new and innovative applications. I run a YouTube channel called Dapp Mentors where I share tutorials and tips on web3 development, and I regularly post articles online about the latest trends in the blockchain space.
