
Connie Leung


Building a Video Generation Pipeline with Angular, Veo 3.1, and Firebase Cloud Functions


In modern web applications, offloading heavy generative AI logic from the frontend to the backend is a necessity. In my latest project, I refactored an Angular application to generate high-quality videos using the Veo 3.1 model, managed entirely through Firebase services.

By moving this logic to the server side, we secure our API keys and can update model parameters without redeploying the entire user interface.

Prerequisites

The technical stack of the project:

  • Angular 21: the latest version as of December 2025.
  • Node.js LTS: use the LTS version as of December 2025.
  • Firebase Remote Config: To manage dynamic parameters.
  • Firebase Cloud Functions: To be called by the frontend to either generate a video or interpolate a video between two images.
  • Firebase Cloud Storage: To host the generated video files in the default Firebase storage bucket.
  • Firebase Cloud Functions Emulator: To test the functions locally at http://localhost:5001.
  • Gemini in Vertex AI: Use Gemini in Vertex AI to generate videos and store them in Firebase Cloud Storage.

The public Google AI Studio API is restricted in my region (Hong Kong). However, Vertex AI (Google Cloud) offers enterprise access that works reliably here, so I chose Vertex AI for this demo.

npm i -g firebase-tools

Install firebase-tools globally using npm.

firebase logout
firebase login

Log out of Firebase and log back in to ensure the CLI is authenticated with the correct account.

firebase init

Execute firebase init and follow the screens to set up Firebase Cloud Functions, Firebase Emulator Suite, Firebase Cloud Storage, and Firebase Remote Config.

If you have an existing project or multiple projects, you can specify the project ID on the command line.

firebase init --project <PROJECT_ID>

In both cases, the Firebase CLI automatically installs firebase-admin and firebase-functions dependencies.

After completing the setup steps, the Firebase CLI generates the functions directory, a Storage rules file, a Remote Config template, the emulator configuration, and project files such as .firebaserc and firebase.json.

  • Dependency for the Angular application
npm i firebase

The Angular application requires the firebase dependency to initialize a Firebase app, load remote config, and invoke Firebase Cloud Functions to generate videos.

  • Firebase dependencies
npm i @cfworker/json-schema @google/genai @modelcontextprotocol/sdk

Install the above dependencies to access Gemini in Vertex AI. @google/genai depends on @cfworker/json-schema and @modelcontextprotocol/sdk; without them, the cloud functions cannot start.


Architecture

High level architecture of the video generation process

The frontend application is built with Angular. It relies on Firebase AI Logic to generate images using the Gemini 3 Image Pro Preview model. The text prompt and the image are then submitted to a Firebase Cloud Function to create a video. Firebase AI Logic does not support video generation, so the cloud function calls the Gemini API with the Veo 3.1 model instead. The Gemini API accepts an outputGcsUri parameter, which is a valid Google Cloud Storage path with the gs:// prefix. The function stores the generated video in the specified bucket and returns the GCS URI. The client resolves the GCS URI to an HTTP URL and plays the video in an HTML video element.
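For reference, here is how a gs:// URI maps onto a plain HTTPS object URL. This hypothetical helper is only an illustration of the two URI shapes; the Angular client in this article actually resolves the URI with Firebase's getDownloadURL, which also appends an access token.

```typescript
// Hypothetical helper (not part of the demo code): converts a gs:// URI into
// the raw public HTTPS form served by storage.googleapis.com.
function gsUriToHttpUrl(gcsUri: string): string {
  const match = gcsUri.match(/^gs:\/\/([^/]+)\/(.+)$/);
  if (!match) {
    throw new Error(`Not a valid gs:// URI: ${gcsUri}`);
  }
  const [, bucket, objectPath] = match;
  // Percent-encode each path segment, but keep the slashes that separate them.
  const encodedPath = objectPath.split("/").map(encodeURIComponent).join("/");
  return `https://storage.googleapis.com/${bucket}/${encodedPath}`;
}

console.log(gsUriToHttpUrl("gs://my-project.firebasestorage.app/clip.mp4"));
// → https://storage.googleapis.com/my-project.firebasestorage.app/clip.mp4
```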


Firebase Integration

1. Configure Environment Variables

I define the environment variables in the Firebase project. This ensures the functions know the storage and hosting regions and which Veo model to use for video generation.

.env.example

GOOGLE_CLOUD_LOCATION="us-central1"
GOOGLE_GENAI_USE_VERTEXAI=true
GEMINI_VIDEO_MODEL_NAME="veo-3.1-fast-generate-001"
IS_VEO31_USED="true"
POLLING_PERIOD_MS="10000"
GOOGLE_FUNCTION_LOCATION="us-central1"
WHITELIST="http://localhost:4200"
REFERER="http://localhost:4200/"
Variable descriptions:

  • GOOGLE_CLOUD_LOCATION: The location of the bucket. I chose us-central1 because the bucket is always free in this region.
  • GOOGLE_GENAI_USE_VERTEXAI: Whether or not Vertex AI is used.
  • GEMINI_VIDEO_MODEL_NAME: The name of the Gemini video model.
  • IS_VEO31_USED: Whether or not Veo 3.1 is used. If false, the function falls back to generating a video instead of interpolation.
  • POLLING_PERIOD_MS: The polling period of the video operation, in milliseconds.
  • GOOGLE_FUNCTION_LOCATION: The region of the cloud functions. I chose us-central1 so that the functions and the bucket are in the same region.
  • WHITELIST: Requests must come from http://localhost:4200.
  • REFERER: Requests must originate from http://localhost:4200.

2. Validating Environment Variables

Before the Cloud Function proceeds with any AI calls, it is critical to ensure that all necessary environment variables are present. I implemented a validateVideoConfigFields helper function to check whether Veo 3.1 is used, the polling period, whether Vertex AI is used, the Vertex AI location, the model name, and the project ID.

import logger from "firebase-functions/logger";

export function validate(value: string | undefined, fieldName: string, missingKeys: string[]) {
  const err = `${fieldName} is missing.`;
  if (!value) {
    logger.error(err);
    missingKeys.push(fieldName);
    return "";
  }

  return value;
}
import { validate } from "../validate";

export function validateVideoConfigFields() {
  process.loadEnvFile();

  const env = process.env;
  const isVeo31Used = (env.IS_VEO31_USED || "false") === "true";
  const pollingPeriod = Number(env.POLLING_PERIOD_MS || "10000");
  const vertexai = (env.GOOGLE_GENAI_USE_VERTEXAI || "false") === "true";

  const missingKeys: string[] = [];
  const location = validate(env.GOOGLE_CLOUD_LOCATION, "Vertex Location", missingKeys);
  const model = validate(env.GEMINI_VIDEO_MODEL_NAME, "Gemini Video Model Name", missingKeys);
  const project = validate(env.GOOGLE_CLOUD_QUOTA_PROJECT, "Project ID", missingKeys);

  if (missingKeys.length > 0) {
    throw new Error(`Missing environment variables: ${missingKeys.join(", ")}`);
  }

  return {
    genAIOptions: {
      project,
      location,
      vertexai,
    },
    aiVideoOptions: {
      model,
      storageBucket: `${project}.firebasestorage.app`,
      isVeo31Used,
      pollingPeriod,
    },
  };
}

I am using Node 24 as of December 2025. Since Node 20.12, we can use the built-in process.loadEnvFile function, which loads environment variables from the .env file.

If you are using a Node version that does not support process.loadEnvFile, the alternative is to install dotenv to load the environment variables.

npm i dotenv
import dotenv from "dotenv";

dotenv.config();

Firebase provides the GOOGLE_CLOUD_QUOTA_PROJECT variable, so it is not defined in the .env file.

When the missingKeys array is not empty, the function throws an error that lists all the missing variable names. If validation succeeds, genAIOptions and aiVideoOptions are returned: genAIOptions initializes GoogleGenAI, and aiVideoOptions contains the parameters for video generation and interpolation.
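As a standalone illustration of how validate accumulates missing keys before failing, here is a runnable copy, with console.error standing in for the firebase-functions logger so it runs outside a Cloud Function:

```typescript
// Standalone copy of the validate helper; console.error replaces the
// firebase-functions logger so the snippet runs anywhere.
function validate(value: string | undefined, fieldName: string, missingKeys: string[]): string {
  if (!value) {
    console.error(`${fieldName} is missing.`);
    missingKeys.push(fieldName);
    return "";
  }
  return value;
}

// Collect every missing variable first, so a single error can list them all.
const missingKeys: string[] = [];
const vertexLocation = validate(undefined, "Vertex Location", missingKeys);
const model = validate("veo-3.1-fast-generate-001", "Gemini Video Model Name", missingKeys);
console.log({ vertexLocation, model, missingKeys });
// → { vertexLocation: '', model: 'veo-3.1-fast-generate-001', missingKeys: [ 'Vertex Location' ] }
```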

3. Generating Video and Storing in Firebase Storage

The generateVideo cloud function passes the payload to the generateVideoFunction function.

All cloud functions enforce App Check and CORS, with a timeout of 180 seconds. If WHITELIST is unspecified, cors defaults to true. That is acceptable for a demo, but in production it is safer to default to false or to a specific domain.

const cors = process.env.WHITELIST ? process.env.WHITELIST.split(",") : true;
const options = {
  cors,
  enforceAppCheck: true,
  timeoutSeconds: 180,
};

export const generateVideo = onCall( options,
  ({ data }) => generateVideoFunction(data)
);
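To make the whitelist handling concrete, here is a small standalone sketch of how the WHITELIST variable becomes the cors option (the https://example.com origin below is purely illustrative):

```typescript
// A comma-separated WHITELIST turns into an array of allowed origins; an
// unset variable falls back to true (allow all), acceptable only for a demo.
function corsFromWhitelist(whitelist: string | undefined): string[] | boolean {
  return whitelist ? whitelist.split(",") : true;
}

console.log(corsFromWhitelist("http://localhost:4200,https://example.com"));
// → [ 'http://localhost:4200', 'https://example.com' ]
console.log(corsFromWhitelist(undefined)); // → true
```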

generateVideoFunction delegates to the generateVideoURL function to construct the video arguments and poll the video operation until it finishes. It then either returns the GCS URI or throws an error.

import { GoogleGenAI } from "@google/genai";
import { AIVideoBucket, GenerateVideoRequest } from "./types/video.type";
import { generateVideoByPolling, validateVideoConfigFields } from "./video.util";

export async function generateVideoFunction(data: GenerateVideoRequest) {
  const variables = validateVideoConfigFields();
  if (!variables) {
    return "";
  }

  const { genAIOptions, aiVideoOptions } = variables;

  try {
    const ai = new GoogleGenAI(genAIOptions);
    return await generateVideoURL({ ai, ...aiVideoOptions }, data);
  } catch (error) {
    console.error("Error generating video:", error);
    throw new Error("Error generating video");
  }
}

async function generateVideoURL(aiVideo: AIVideoBucket, imageParams: GenerateVideoRequest) {
  const args = constructVideoArguments(aiVideo.isVeo31Used, imageParams);
  return generateVideoByPolling(aiVideo, args);
}

Veo 3.1 supports a resolution property whose possible values are 1080p and 720p. For demo purposes, I hardcoded the resolution to 720p. For Veo 3 and older, I left out resolution.

function constructVideoArguments(isVeo31Used: boolean, imageParams: GenerateVideoRequest) {
  const veoConfig = isVeo31Used ? {
    aspectRatio: "16:9",
    resolution: "720p",
  } : {
    aspectRatio: "16:9",
  };

  return {
    prompt: imageParams.prompt,
    imageBytes: imageParams.imageBytes,
    mimeType: imageParams.mimeType,
    config: veoConfig,
  };
}
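A self-contained copy of constructVideoArguments shows the two resulting config shapes; the prompt and image bytes below are placeholder values:

```typescript
// Self-contained copy of constructVideoArguments; only Veo 3.1 accepts the
// resolution property, so it is attached conditionally.
type GenerateVideoRequest = {
  prompt: string;
  imageBytes: string;
  mimeType: string;
};

function constructVideoArguments(isVeo31Used: boolean, imageParams: GenerateVideoRequest) {
  const veoConfig = isVeo31Used
    ? { aspectRatio: "16:9", resolution: "720p" }
    : { aspectRatio: "16:9" };

  return {
    prompt: imageParams.prompt,
    imageBytes: imageParams.imageBytes,
    mimeType: imageParams.mimeType,
    config: veoConfig,
  };
}

const request = { prompt: "a cat surfing at sunset", imageBytes: "<base64>", mimeType: "image/png" };
console.log(constructVideoArguments(true, request).config);
// → { aspectRatio: '16:9', resolution: '720p' }
console.log(constructVideoArguments(false, request).config);
// → { aspectRatio: '16:9' }
```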

4. Asynchronous Polling

Both video generation and interpolation are long-running tasks. Because Vertex AI processes them asynchronously, the functions must poll the operation status until the done flag is true.

The Gemini API cannot see the Cloud Storage of the Firebase Emulator, so it requires a real output GCS URI, which is gs://${storageBucket}.

import { GenerateVideosConfig, GoogleGenAI } from "@google/genai";

export type AIVideoBucket = {
  ai: GoogleGenAI;
  model: string;
  storageBucket: string;
  isVeo31Used: boolean;
  pollingPeriod: number;
}

export type GenerateVideoRequest = {
  prompt: string;
  imageBytes: string;
  mimeType: string;
  config?: GenerateVideosConfig;
}
import { AIVideoBucket, GenerateVideoRequest } from "./types/video.type";
import { GenerateVideosParameters, GoogleGenAI } from "@google/genai";

export async function generateVideoByPolling(
  { ai, model, storageBucket, pollingPeriod }: AIVideoBucket,
  request: GenerateVideoRequest,
) {
  const genVideosParams: GenerateVideosParameters = {
    model,
    prompt: request.prompt,
    config: {
      ...request.config,
      numberOfVideos: 1,
      outputGcsUri: `gs://${storageBucket}`,
    },
    image: {
      imageBytes: request.imageBytes,
      mimeType: request.mimeType,
    },
  };

  return getVideoUri(ai, genVideosParams, pollingPeriod);
}

When the done flag is true, the operation ends and one of three outcomes occurs.
Outcome 1: The error field is set because the video failed to generate, and the function throws an error.
Outcome 2: The video was stored at the GCS URI, and the function returns it to the client application.
Outcome 3: Neither happened; with no error and no GCS URI, the function throws an error stating that no URI was provided.

async function getVideoUri(
  ai: GoogleGenAI,
  genVideosParams: GenerateVideosParameters,
  pollingPeriod: number,
): Promise<string> {
  let operation = await ai.models.generateVideos(genVideosParams);

  while (!operation.done) {
    await new Promise((resolve) => setTimeout(resolve, pollingPeriod));
    operation = await ai.operations.getVideosOperation({ operation });
  }

  if (operation.error) {
    const strError = `Video generation failed: ${operation.error.message}`;
    throw new Error(strError);
  }

  const uri = operation.response?.generatedVideos?.[0]?.video?.uri;
  if (uri) {
    return uri;
  }

  const strError = "Video generation finished but no uri was provided.";
  throw new Error(strError);
}

Note: For demo purposes, polling is a decent way to handle asynchronous video generation. However, it is expensive and creates unnecessary load and latency. For production usage, consider push-based mechanisms such as WebSockets or server-sent events.
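One inexpensive refinement is polling with exponential backoff and a retry cap, sketched below; fakeOperation is a stand-in for the real ai.operations.getVideosOperation call so the snippet runs locally:

```typescript
// Polling with exponential backoff: the wait doubles after each attempt, and
// the loop gives up after maxAttempts polls instead of spinning forever.
type Operation = { done: boolean; uri?: string };

async function pollWithBackoff(
  getOperation: () => Promise<Operation>,
  initialDelayMs: number,
  maxAttempts: number,
): Promise<Operation> {
  let delay = initialDelayMs;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const operation = await getOperation();
    if (operation.done) {
      return operation;
    }
    // Double the wait between polls: 10ms, 20ms, 40ms, ...
    await new Promise((resolve) => setTimeout(resolve, delay));
    delay *= 2;
  }
  throw new Error("Operation did not finish within the allotted attempts.");
}

// Fake operation that completes on the third poll.
let polls = 0;
const fakeOperation = async (): Promise<Operation> =>
  ++polls >= 3 ? { done: true, uri: "gs://bucket/video.mp4" } : { done: false };

pollWithBackoff(fakeOperation, 10, 5).then((op) => console.log(op.uri));
// → gs://bucket/video.mp4
```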

5. Video Interpolation between Frames

Veo 3.1 also introduces video interpolation where the model uses two images to infer what transpires in a video. In this mode, the function sends both a first and a last image frame. The AI generates the transition between them, effectively "animating" the sequence.

const cors = process.env.WHITELIST ? process.env.WHITELIST.split(",") : true;
const options = {
  cors,
  enforceAppCheck: true,
  timeoutSeconds: 180,
};

export const interpolateVideo = onCall( options,
  ({ data }) => generateVideoFromFramesFunction(data)
);
import { GenerateVideosConfig } from "@google/genai";

export type GenerateVideoRequest = {
  prompt: string;
  imageBytes: string;
  mimeType: string;
  config?: GenerateVideosConfig;
}

export type GenerateVideoFromFramesRequest = GenerateVideoRequest & {
  lastFrameImageBytes: string;
  lastFrameMimeType: string;
}
import { AIVideoBucket, GenerateVideoFromFramesRequest } from "./types/video.type";
import { GoogleGenAI } from "@google/genai";
import { generateVideoByPolling, validateVideoConfigFields } from "./video.util";

export async function generateVideoFromFramesFunction(data: GenerateVideoFromFramesRequest) {
  const variables = validateVideoConfigFields();
  if (!variables) {
    return "";
  }

  const { genAIOptions, aiVideoOptions } = variables;

  try {
    const ai = new GoogleGenAI(genAIOptions);
    return await interpolateVideo({ ai, ...aiVideoOptions }, data);
  } catch (error) {
    console.error("Error generating video:", error);
    throw new Error("Error generating video");
  }
}

Currently, only Veo 3.1 supports lastFrame. When isVeo31Used is true, the image bytes and MIME type are provided to lastFrame. Otherwise, the function falls back to generating the video from the first image only.

function constructVideoArguments(isVeo31Used: boolean, imageParams: GenerateVideoFromFramesRequest) {
  const veoConfig = isVeo31Used ? {
    aspectRatio: "16:9",
    resolution: "720p",
    lastFrame: {
      imageBytes: imageParams.lastFrameImageBytes,
      mimeType: imageParams.lastFrameMimeType,
    },
  } : {
    aspectRatio: "16:9",
  };

  return {
    prompt: imageParams.prompt,
    imageBytes: imageParams.imageBytes,
    mimeType: imageParams.mimeType,
    config: veoConfig,
  };
}

Similarly, the interpolateVideo function reuses generateVideoByPolling to poll until the operation completes. It either returns the GCS URI or throws an error.

async function interpolateVideo(aiVideo: AIVideoBucket, imageParams: GenerateVideoFromFramesRequest) {
  try {
    const args = constructVideoArguments(aiVideo.isVeo31Used, imageParams);
    return await generateVideoByPolling(aiVideo, args);
  } catch (e) {
    throw e instanceof Error ?
      e :
      new Error("An unexpected error occurred in video generation using the first and last frames.");
  }
}

6. Storage Security Rules

To ensure the generated content is handled correctly, Firebase Storage rules are configured to allow only MP4 files.

service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read: if resource.name.matches('.*\\.mp4');
      allow write: if request.resource.name.matches('.*\\.mp4')
                   && request.resource.contentType == 'video/mp4';
    }
  }
}
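The filename pattern in the rules can be sanity-checked in TypeScript. Storage rules' matches() performs full-string matching, so the equivalent regex is anchored here with ^ and $:

```typescript
// TypeScript equivalent of the rules' '.*\.mp4' full-string match.
const mp4Name = /^.*\.mp4$/;

console.log(mp4Name.test("videos/clip.mp4")); // → true
console.log(mp4Name.test("clip.mov"));        // → false
console.log(mp4Name.test("clip.mp4.txt"));    // → false
```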

7. Housekeeping Videos in Storage

Even though the us-central1 region has a free tier, videos should be purged after a period of time so that the bucket does not grow indefinitely.

I added a lifecycle rule in Google Cloud Storage to delete objects from the Firebase Storage bucket 5 days after they were created. I chose 5 days arbitrarily; you can choose any number that fits your scenario.

Google Cloud Console > Select the Firebase Project > Cloud Storage > Buckets > Select Bucket name > Lifecycle > Rules > Add a rule
  • Action: Delete object
  • Object Condition: 5+ days since object was created
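If you prefer the command line, the same rule can be applied with gsutil. Here, <BUCKET_NAME> is a placeholder for the Firebase storage bucket, and the JSON follows Cloud Storage's lifecycle configuration schema:

```shell
# Write the lifecycle configuration: delete objects older than 5 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": { "type": "Delete" },
      "condition": { "age": 5 }
    }
  ]
}
EOF

# Apply the configuration to the bucket.
gsutil lifecycle set lifecycle.json gs://<BUCKET_NAME>
```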

8. Firebase App Configuration and reCAPTCHA Site Key

getFirebaseConfig is a Firebase cloud function that returns both Firebase App configuration and reCAPTCHA site key.

const cors = process.env.WHITELIST ? process.env.WHITELIST.split(",") : true;
const whitelist = process.env.WHITELIST?.split(",") || [];
const refererList = process.env.REFERER?.split(",") || [];

export const getFirebaseConfig = onRequest( { cors },
  (request, response) => {
    if (request.method !== "GET") {
      response.status(405).send("Method Not Allowed");
      return;
    }

    try {
      const referer = request.header("referer");
      const origin = request.header("origin");
      if (!referer) {
        response.status(401).send("Unauthorized, invalid referer.");
        return;
      }

      if (!refererList.includes(referer)) {
        response.status(401).send("Unauthorized, invalid referer.");
        return;
      }

      if (!origin) {
        response.status(401).send("Unauthorized, invalid origin.");
        return;
      }

      if (!whitelist.includes(origin)) {
        response.status(401).send("Unauthorized, invalid origin.");
        return;
      }

      const config = {
        app: {
          apiKey: '<Firebase API Key>',
          appId: '<Firebase App ID>',
          projectId: '<Google Cloud Project ID>',
          storageBucket: '<Firebase Storage Bucket>',
          messagingSenderId: '<Firebase Messaging Sender ID>',
          authDomain: '<Firebase Auth Domain>',
        },
        recaptchaSiteKey: '<reCAPTCHA Site Key>',
      };

      response.set("Cache-Control", "public, max-age=3600, s-maxage=3600");
      response.json(config);
    } catch (err) {
      console.error(err);
      response.status(500).send("Internal Server Error");
    }
  }
);

9. Local Development with Emulators

For local development, I used the Firebase Emulator Suite. In the bootstrapFirebase process, the app calls connectFunctionsEmulator to link to the Cloud Functions running at http://localhost:5001.

The port number defaulted to 5001 when firebase init was executed.

Note: While the Cloud Function runs locally (at zero cost), the Storage emulator is not used. This is because the Gemini API requires an actual accessible GCS bucket to store the generated video.

loadFirebaseConfig is a helper function that makes a request to the cloud function to obtain the Firebase configuration and the reCAPTCHA site key.

{
  "appUrl": "<Firebase cloud function base URL>"
}
import { connectFunctionsEmulator, Functions, getFunctions } from "firebase/functions";
import { fetchAndActivate, getRemoteConfig, getValue, RemoteConfig } from 'firebase/remote-config';
import { FirebaseApp, initializeApp } from 'firebase/app';
import { initializeAppCheck, ReCaptchaEnterpriseProvider } from 'firebase/app-check';
import { HttpClient } from '@angular/common/http';
import { inject } from '@angular/core';
import { catchError, lastValueFrom, throwError } from 'rxjs';

async function loadFirebaseConfig() {
  const httpService = inject(HttpClient);
  // config.appUrl is the cloud function base URL from the JSON configuration above
  const firebaseConfig$ =
    httpService.get(`${config.appUrl}/getFirebaseConfig`)
      .pipe(
        catchError((e) => throwError(() => e))
      );
  return lastValueFrom(firebaseConfig$);
}

export async function bootstrapFirebase() {
    try {
      const firebaseConfig = await loadFirebaseConfig();
      const { app, recaptchaSiteKey } = firebaseConfig;
      const firebaseApp = initializeApp(app);

      initializeAppCheck(firebaseApp, {
        provider: new ReCaptchaEnterpriseProvider(recaptchaSiteKey),
        isTokenAutoRefreshEnabled: true,
      });

      const functions = getFunctions(firebaseApp, 'us-central1');
      if (location.hostname === 'localhost') {
        connectFunctionsEmulator(functions, 'localhost', 5001);
      }
    } catch (err) {
      console.error(err);
    }
}

The AppConfig remains unchanged.


import { ApplicationConfig, provideAppInitializer } from '@angular/core';
import { bootstrapFirebase } from './app.bootstrap';

export const appConfig: ApplicationConfig = {
  providers: [
    provideAppInitializer(async () => bootstrapFirebase()),
  ]
};

10. Frontend Integration (Angular)

The Angular frontend triggers the process using httpsCallable. Once the function returns the Cloud Storage path, the app fetches the download URL for playback.

The ConfigService stores the Firebase app and functions to be used throughout the application.

import { Injectable } from '@angular/core';
import { FirebaseApp } from 'firebase/app';
import { Functions } from 'firebase/functions';

@Injectable({
  providedIn: 'root'
})
export class ConfigService  {

    firebaseApp: FirebaseApp | undefined = undefined;
    functions: Functions | undefined = undefined;

    loadConfig(firebaseApp: FirebaseApp, functions: Functions) {
      this.firebaseApp = firebaseApp;
      this.functions = functions;
    }
}

The retrieveVideoUri method calls the cloud function directly to retrieve the GCS URI.

The downloadVideoAsUrl method resolves the URI to an HTTP URL that an HTML video player can play immediately.

import { inject, Injectable } from '@angular/core';
import { httpsCallable } from 'firebase/functions';
import { getDownloadURL, getStorage, ref } from 'firebase/storage';
import { GenerateVideoRequest } from '../types/video.type';
import { ConfigService } from './config.service';

@Injectable({
  providedIn: 'root'
})
export class GeminiService {
  private readonly storage = getStorage();
  private readonly configService = inject(ConfigService);

  async retrieveVideoUri(request: GenerateVideoRequest, methodName: string): Promise<string> {
    try {
      const functions = this.configService.functions;
      if (!functions) {
        throw new Error('Functions does not exist.');
      }

      const downloadGcsUri = httpsCallable<GenerateVideoRequest, string>(
        functions, methodName
      );
      const { data: gcsUri } = await downloadGcsUri(request);
      return gcsUri;
    } catch (err) {
        console.error(err);
        throw err;
    }
  }

  async downloadVideoAsUrl(request: GenerateVideoRequest, methodName='videos-generateVideo'): Promise<string> {
    const gcsUri = await this.retrieveVideoUri(request, methodName);

    if (!gcsUri) {
      throw new Error('Video operation completed but no URI was returned.');
    }

    return getDownloadURL(ref(this.storage, gcsUri))
      .catch((error) => {
        console.error(error);
        throw new Error("Unknown error occurred");
      });
  }
}

The VideoPlayerComponent has a required videoUrl signal input that is bound to the src of the video element to play the video.

import { ChangeDetectionStrategy, Component, input } from '@angular/core';

@Component({
  selector: 'app-video-player',
  template: `
<div>
    <video [src]="videoUrl()" controls autoplay loop class="w-full rounded-md"></video>
</div>`,
  changeDetection: ChangeDetectionStrategy.OnPush,
})
export class VideoPlayerComponent {
  isGeneratingVideo = input(false);
  videoUrl = input.required<string>();
}

This concludes the walkthrough of the demo. You should now be able to generate videos in a cloud function, store them securely in a bucket, and play them in a video player in the user interface.


Conclusion

Combining Veo 3.1 with the serverless scalability of Firebase is a powerful workflow.

First, the Angular application neither installs the @google/genai dependency nor maintains the Vertex AI environment variables in a .env file. The client application calls the cloud functions to perform the intensive tasks and waits for the results.

The cloud functions receive the arguments from the client, execute complex AI operations such as generation and interpolation, and write the videos securely to the dedicated bucket. During local development, the app calls the functions in the Firebase Emulator at http://localhost:5001 instead of the deployed ones on the Cloud Run platform.

The application can be further extended to explore other Veo 3.1 features such as extending videos and generating a new video from reference images. Since these functionalities are supported in Python only, a separate codebase in Python must be initialized without overwriting the TypeScript function definitions.

Extending videos is interesting because a Veo-generated video can be extended by 7 seconds at a time, up to 20 times. The final video can last at most 148 seconds (8s + 20 × 7s), or roughly 2.5 minutes.
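The arithmetic behind that maximum can be checked directly:

```typescript
// Maximum duration: an initial 8-second clip extended 20 times by 7 seconds.
const initialSeconds = 8;
const maxExtensions = 20;
const perExtensionSeconds = 7;

const totalSeconds = initialSeconds + maxExtensions * perExtensionSeconds;
console.log(totalSeconds);                          // → 148
console.log((totalSeconds / 60).toFixed(1), "min"); // → 2.5 min
```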
