<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: top duke</title>
    <description>The latest articles on DEV Community by top duke (@top_duke_7ca8148e5df75726).</description>
    <link>https://dev.to/top_duke_7ca8148e5df75726</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2969913%2F3ef188da-6234-468d-84f1-010e305451c5.jpg</url>
      <title>DEV Community: top duke</title>
      <link>https://dev.to/top_duke_7ca8148e5df75726</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/top_duke_7ca8148e5df75726"/>
    <language>en</language>
    <item>
      <title>Building an AI Video Generation Workflow With Queues, Webhooks, and Review States</title>
      <dc:creator>top duke</dc:creator>
      <pubDate>Fri, 15 May 2026 06:38:32 +0000</pubDate>
      <link>https://dev.to/top_duke_7ca8148e5df75726/building-an-ai-video-generation-workflow-with-queues-webhooks-and-review-states-4ec2</link>
      <guid>https://dev.to/top_duke_7ca8148e5df75726/building-an-ai-video-generation-workflow-with-queues-webhooks-and-review-states-4ec2</guid>
      <description>&lt;p&gt;AI video generation looks simple from the user interface: enter a prompt, upload an image, wait a bit, and get a video. The backend is not that simple.&lt;/p&gt;

&lt;p&gt;If you are building a product that calls an AI video API, you need to handle long-running jobs, retries, provider callbacks, file storage, user-facing status, and review before anything gets published. This article walks through one practical architecture for that workflow.&lt;/p&gt;

&lt;p&gt;I will use SeeVido as an example of the kind of external AI video service a product might call, but the pattern applies to any provider that accepts prompt-to-video or image-to-video requests.&lt;/p&gt;

&lt;h2&gt;The Problem: Video Generation Is Not a Normal Request&lt;/h2&gt;

&lt;p&gt;Most web requests are short. A user clicks a button, your server does some work, and the response comes back quickly.&lt;/p&gt;

&lt;p&gt;AI video generation is different. It can take time, fail halfway, produce an output that needs review, or return a callback after the user has left the page. Treating it like a normal synchronous request usually creates a poor user experience and a hard-to-debug system.&lt;/p&gt;

&lt;p&gt;Instead, model video generation as a job.&lt;/p&gt;

&lt;p&gt;A job has:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;an owner;&lt;/li&gt;
&lt;li&gt;an input type, such as text-to-video or image-to-video;&lt;/li&gt;
&lt;li&gt;a prompt or source asset;&lt;/li&gt;
&lt;li&gt;a provider request ID;&lt;/li&gt;
&lt;li&gt;a current status;&lt;/li&gt;
&lt;li&gt;an event history;&lt;/li&gt;
&lt;li&gt;one or more output artifacts;&lt;/li&gt;
&lt;li&gt;a review decision.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That gives your product a reliable way to answer the most common support question: "What happened to my video?"&lt;/p&gt;
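&lt;p&gt;As a plain record, the job above might look like this. This is a sketch; the field names are illustrative, not tied to any particular ORM or schema:&lt;/p&gt;

```javascript
// Sketch of a job record with the fields listed above.
// Field names are illustrative; adapt them to your schema.
function createJob({ userId, workflowType, prompt, sourceAssetId }) {
  return {
    userId,                              // an owner
    workflowType,                        // "text-to-video" or "image-to-video"
    prompt: prompt ?? null,              // a prompt, or...
    sourceAssetId: sourceAssetId ?? null, // ...a source asset
    providerRequestId: null,             // filled in after submission
    currentStatus: "queued",             // a current status
    events: [],                          // an event history
    outputs: [],                         // output artifacts
    reviewDecision: null,                // a review decision, once made
  };
}
```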

&lt;h2&gt;A Useful Architecture&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsq65rbn9z1tgywx7hb47.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsq65rbn9z1tgywx7hb47.png" alt="AI Video Generation Page Common Sample" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A minimal production-friendly system can look like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Frontend submits a prompt or source image.&lt;/li&gt;
&lt;li&gt;API server creates a job record in the database.&lt;/li&gt;
&lt;li&gt;Worker picks up the job from a queue.&lt;/li&gt;
&lt;li&gt;Worker sends the request to the AI video provider.&lt;/li&gt;
&lt;li&gt;Provider returns a request ID.&lt;/li&gt;
&lt;li&gt;Provider sends status updates to a webhook.&lt;/li&gt;
&lt;li&gt;Completed files are copied to object storage.&lt;/li&gt;
&lt;li&gt;A review dashboard approves or rejects the output.&lt;/li&gt;
&lt;li&gt;User sees the final status in the app.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The key decision is separation: the API request creates the job and returns immediately; it does not wait for the generated video.&lt;/p&gt;

&lt;h2&gt;Suggested Job States&lt;/h2&gt;

&lt;p&gt;Keep the status model boring and explicit.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;queued -&amp;gt; submitted -&amp;gt; processing -&amp;gt; completed -&amp;gt; approved
                                      -&amp;gt; rejected
                                      -&amp;gt; failed
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You may also need:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;canceled
expired
retrying
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Avoid using a single vague status such as &lt;code&gt;done&lt;/code&gt;. A generated video may be technically completed but still not approved for publishing.&lt;/p&gt;
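&lt;p&gt;One way to keep the model explicit is a transition table checked in one place. The sketch below is illustrative; the exact edges depend on your product, for example whether a failed job is allowed to retry:&lt;/p&gt;

```javascript
// Sketch of an explicit transition table for the states above.
// Checking every status change against it in one place stops
// out-of-order webhooks from moving a job backwards.
const ALLOWED_TRANSITIONS = {
  queued: ["submitted", "canceled", "failed"],
  submitted: ["processing", "failed", "retrying"],
  processing: ["completed", "failed", "expired"],
  completed: ["approved", "rejected"],
  retrying: ["submitted", "failed"],
  // approved / rejected / failed / canceled / expired are terminal here
};

function canTransition(from, to) {
  return (ALLOWED_TRANSITIONS[from] ?? []).includes(to);
}
```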

&lt;h2&gt;Database Tables&lt;/h2&gt;

&lt;p&gt;You do not need a complex event-sourcing system to start. A current-state table plus an event table is usually enough.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;video_generation_jobs&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;public_id&lt;/span&gt; &lt;span class="n"&gt;UNIQUEIDENTIFIER&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;user_id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;workflow_type&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;provider_name&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;80&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;provider_request_id&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;current_status&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;source_asset_id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;prompt_hash&lt;/span&gt; &lt;span class="nb"&gt;VARBINARY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;created_at_utc&lt;/span&gt; &lt;span class="n"&gt;DATETIME2&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;updated_at_utc&lt;/span&gt; &lt;span class="n"&gt;DATETIME2&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;completed_at_utc&lt;/span&gt; &lt;span class="n"&gt;DATETIME2&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;video_generation_events&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;job_id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;event_type&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;80&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;previous_status&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;new_status&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;event_source&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;message&lt;/span&gt; &lt;span class="n"&gt;NVARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;error_code&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;created_at_utc&lt;/span&gt; &lt;span class="n"&gt;DATETIME2&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The job table answers "where is it now?" The event table answers "how did it get there?"&lt;/p&gt;
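&lt;p&gt;The worker and webhook code later in this article call &lt;code&gt;markStatus&lt;/code&gt; and &lt;code&gt;addEvent&lt;/code&gt; helpers. Here is a sketch of what they could do, shown against in-memory stores instead of the two tables; in production the job update and the event insert belong in a single database transaction:&lt;/p&gt;

```javascript
// Sketch of markStatus/addEvent against in-memory stores. With the
// two tables above, the job update and the event insert should
// happen in one database transaction.
const jobs = new Map();
const events = [];

function addEvent(jobId, { type, previousStatus, newStatus, message, errorCode }) {
  events.push({
    jobId,
    eventType: type,
    previousStatus: previousStatus ?? null,
    newStatus: newStatus ?? null,
    message: message ?? null,
    errorCode: errorCode ?? null,
    createdAtUtc: new Date().toISOString(),
  });
}

function markStatus(jobId, newStatus, message, extra = {}) {
  const job = jobs.get(jobId);
  const previousStatus = job.currentStatus;
  job.currentStatus = newStatus;
  job.updatedAtUtc = new Date().toISOString();
  addEvent(jobId, { type: "status_changed", previousStatus, newStatus, message, ...extra });
}
```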

&lt;h2&gt;API Endpoint: Create the Job&lt;/h2&gt;

&lt;p&gt;The first endpoint should validate the request, create a job, and return immediately.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/api/video-jobs&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;workflowType&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;sourceAssetId&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;text-to-video&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;image-to-video&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;workflowType&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Unsupported workflow type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;db&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;videoJobs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;userId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;workflowType&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;providerName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;seevido&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;currentStatus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;queued&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;sourceAssetId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;promptHash&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;hashPrompt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;publish&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;video.generate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;jobId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;202&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;jobId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;publicId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;queued&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;202 Accepted&lt;/code&gt; response is intentional. It tells the frontend that the request has been accepted, not completed.&lt;/p&gt;
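&lt;p&gt;After the &lt;code&gt;202&lt;/code&gt;, the frontend can poll a status endpoint until the job reaches a terminal state. A sketch, with the HTTP call injected as a callback; the endpoint path and the interval defaults are assumptions, not part of any particular API:&lt;/p&gt;

```javascript
// Sketch of client-side polling until a terminal state. The
// fetchStatus callback stands in for a real HTTP call, e.g.
// GET /api/video-jobs/:id (an assumed endpoint).
const TERMINAL_STATES = new Set(["approved", "rejected", "failed", "canceled", "expired"]);

async function pollJob(fetchStatus, { intervalMs = 2000, maxAttempts = 150 } = {}) {
  for (let attempt = 0; attempt !== maxAttempts; attempt += 1) {
    const { status } = await fetchStatus();
    if (TERMINAL_STATES.has(status)) return status;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return "timeout"; // give the user a retry path instead of spinning forever
}
```

&lt;p&gt;Note that &lt;code&gt;completed&lt;/code&gt; is deliberately not terminal here: a completed video may still be waiting on review.&lt;/p&gt;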

&lt;h2&gt;Worker: Submit to the Provider&lt;/h2&gt;

&lt;p&gt;The worker owns provider communication. This keeps slow or unreliable external calls away from the request-response path.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;consume&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;video.generate&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;jobId&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;db&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;videoJobs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;findById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;jobId&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;currentStatus&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;queued&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;markStatus&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;submitted&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Worker picked up job&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;providerRequest&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;seevidoClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createVideo&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="na"&gt;workflowType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;workflowType&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;loadPrompt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="na"&gt;sourceAsset&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;loadSourceAsset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;sourceAssetId&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="na"&gt;webhookUrl&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;APP_URL&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/webhooks/video-provider`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;db&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;videoJobs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;providerRequestId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;providerRequest&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;currentStatus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;processing&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;addEvent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;provider_request_created&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;newStatus&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;processing&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Provider request created&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;markStatus&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;failed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;errorCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;code&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;provider_submit_failed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The provider client can point to SeeVido or any other AI video API. Keep it behind an interface so you can swap providers without rewriting the queue and review logic.&lt;/p&gt;
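&lt;p&gt;A fake provider that honors the same &lt;code&gt;createVideo&lt;/code&gt; contract is also useful in tests. The shape below is a sketch of that interface, not a real SDK:&lt;/p&gt;

```javascript
// Sketch of the provider interface the worker depends on. FakeProvider
// honors the same createVideo contract a real client would, which
// keeps queue and review logic testable without network calls.
class FakeProvider {
  constructor() {
    this.requests = [];
  }

  // Same shape the worker passes to the provider client above.
  async createVideo({ workflowType, prompt, sourceAsset, webhookUrl }) {
    const id = `req_${this.requests.length + 1}`;
    this.requests.push({ id, workflowType, prompt, sourceAsset, webhookUrl });
    return { id };
  }
}
```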

&lt;h2&gt;Webhooks Need Idempotency&lt;/h2&gt;

&lt;p&gt;Webhooks are often delivered more than once. Your handler should treat duplicate events as normal.&lt;/p&gt;

&lt;p&gt;Use a table like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;provider_webhook_receipts&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;provider_name&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;80&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;provider_event_id&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;provider_request_id&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;received_at_utc&lt;/span&gt; &lt;span class="n"&gt;DATETIME2&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;payload_hash&lt;/span&gt; &lt;span class="nb"&gt;VARBINARY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;CONSTRAINT&lt;/span&gt; &lt;span class="n"&gt;uq_provider_event&lt;/span&gt; &lt;span class="k"&gt;UNIQUE&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;provider_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;provider_event_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
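&lt;p&gt;The unique constraint is what makes an insert-first guard work. Here is a sketch of &lt;code&gt;tryInsertWebhookReceipt&lt;/code&gt;, using an in-memory set to stand in for the constraint; with a real database you would attempt the insert and treat a unique-violation error as a duplicate:&lt;/p&gt;

```javascript
// Sketch of the insert-first idempotency guard. An in-memory Set
// stands in for the UNIQUE (provider_name, provider_event_id)
// constraint; with a real database, attempt the INSERT and treat
// a unique-violation error as "already seen".
const seenEvents = new Set();

function tryInsertWebhookReceipt({ providerName, providerEventId }) {
  const key = `${providerName}:${providerEventId}`;
  if (seenEvents.has(key)) return false; // duplicate delivery
  seenEvents.add(key);
  return true;
}
```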



&lt;p&gt;Then insert the webhook event before processing it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/webhooks/video-provider&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;inserted&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;tryInsertWebhookReceipt&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;providerName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;seevido&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;providerEventId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;providerRequestId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;requestId&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;payloadHash&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;hashPayload&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;inserted&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;duplicate&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;db&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;videoJobs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;findByProviderRequestId&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;requestId&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;202&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;completed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;copyArtifactsToStorage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;outputs&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;markStatus&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;completed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Provider completed job&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;failed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;markStatus&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;failed&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;errorCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;errorCode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Returning &lt;code&gt;200&lt;/code&gt; for duplicates prevents unnecessary retries.&lt;/p&gt;
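The `insertIfNew` call in the handler above is easiest to make atomic with a unique constraint; a sketch in the SQL dialect this article uses elsewhere (the table and column names are assumptions, not a fixed schema):

```sql
-- A unique constraint makes duplicate webhook inserts fail fast and atomically:
-- the second insert for the same (request id, payload hash) pair is rejected
-- by the database, so two concurrent deliveries cannot both "win".
CREATE TABLE webhook_events (
  id BIGINT IDENTITY PRIMARY KEY,
  provider_request_id NVARCHAR(200) NOT NULL,
  payload_hash NVARCHAR(128) NOT NULL,
  received_at_utc DATETIME2 NOT NULL,
  CONSTRAINT uq_webhook_event UNIQUE (provider_request_id, payload_hash)
);
```

`insertIfNew` can then catch the unique-violation error and report the event as a duplicate rather than processing it twice.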

&lt;h2&gt;
  
  
  Store Artifacts, Not Videos, in the Database
&lt;/h2&gt;

&lt;p&gt;Generated videos can be large. Store them in object storage and keep metadata in your database.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;video_generation_artifacts&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;job_id&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;artifact_type&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;storage_uri&lt;/span&gt; &lt;span class="n"&gt;NVARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;mime_type&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;file_size_bytes&lt;/span&gt; &lt;span class="nb"&gt;BIGINT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;duration_seconds&lt;/span&gt; &lt;span class="nb"&gt;DECIMAL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;9&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;width&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;height&lt;/span&gt; &lt;span class="nb"&gt;INT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="n"&gt;created_at_utc&lt;/span&gt; &lt;span class="n"&gt;DATETIME2&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This makes the database useful for search, support, and reporting without turning it into a media file store.&lt;/p&gt;
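The `copyArtifactsToStorage` helper used in the webhook handler can be little more than a loop that streams each provider URL into your own bucket and records the metadata above. A sketch, where `storage.put` stands in for whatever object-storage client you use (it is not a specific SDK):

```javascript
// Sketch: copy provider outputs into our own object storage.
// "storage" is a hypothetical client with a put(key, bytes, contentType) method.
async function copyArtifactsToStorage(outputs) {
  const saved = [];
  for (const output of outputs) {
    // Download from the provider's (often short-lived) URL.
    const response = await fetch(output.url);
    if (!response.ok) {
      throw new Error(`Provider download failed: ${response.status}`);
    }
    const bytes = Buffer.from(await response.arrayBuffer());
    // Key by job and artifact type so files are easy to find later.
    const key = `videos/${output.jobId}/${output.type}`;
    await storage.put(key, bytes, output.mimeType);
    saved.push({ key, sizeBytes: bytes.length });
  }
  return saved;
}
```

Copy the files promptly: provider download URLs often expire, and your database should only ever point at storage you control.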

&lt;h2&gt;
  
  
  Add Human Review Before Publishing
&lt;/h2&gt;

&lt;p&gt;A generated video can be technically successful and still be wrong.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;product shape changed;&lt;/li&gt;
&lt;li&gt;label text became unreadable;&lt;/li&gt;
&lt;li&gt;face or hand details distorted;&lt;/li&gt;
&lt;li&gt;source image rights are unclear;&lt;/li&gt;
&lt;li&gt;output looks like real footage and needs disclosure;&lt;/li&gt;
&lt;li&gt;clip implies a product feature that was never approved.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Add a review state before public use.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;completed -&amp;gt; approved
completed -&amp;gt; rejected
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In many products, &lt;code&gt;completed&lt;/code&gt; should only mean "the provider returned a usable file." It should not mean "safe to publish."&lt;/p&gt;
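A small allow-list keeps these transitions honest in application code; a sketch using the statuses from this article (whether `failed` jobs may re-enter `queued` is a retry-policy decision, so `failed` is left terminal here):

```javascript
// Sketch: allow-list of legal status transitions, so "approved" can only
// ever follow a human review of a "completed" job.
const ALLOWED_TRANSITIONS = {
  queued: ["submitted", "failed"],
  submitted: ["processing", "failed"],
  processing: ["completed", "failed"],
  completed: ["approved", "rejected"],
  approved: [],
  rejected: [],
  failed: [],
};

function canTransition(from, to) {
  return (ALLOWED_TRANSITIONS[from] || []).includes(to);
}
```

Rejecting illegal transitions at this choke point means a buggy worker or replayed webhook cannot quietly publish an unreviewed video.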

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzgyp8drd6xpm1an1ten2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzgyp8drd6xpm1an1ten2.png" alt="Seevido AI Operation System" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What to Show the User
&lt;/h2&gt;

&lt;p&gt;Users do not need every internal status. Map technical states to simple UI states.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Internal Status&lt;/th&gt;
&lt;th&gt;User-Facing Copy&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;queued&lt;/td&gt;
&lt;td&gt;Waiting to start&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;submitted / processing&lt;/td&gt;
&lt;td&gt;Generating video&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;completed&lt;/td&gt;
&lt;td&gt;Ready for review&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;approved&lt;/td&gt;
&lt;td&gt;Ready to use&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;rejected&lt;/td&gt;
&lt;td&gt;Needs changes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;failed&lt;/td&gt;
&lt;td&gt;Generation failed&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;If a job fails, give the user a clear next step. "Try again with a shorter prompt" is better than "provider_error_409."&lt;/p&gt;
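The mapping in the table is worth centralizing so every surface shows the same copy; a minimal sketch:

```javascript
// Sketch: one place that maps internal statuses to user-facing copy.
const STATUS_COPY = {
  queued: "Waiting to start",
  submitted: "Generating video",
  processing: "Generating video",
  completed: "Ready for review",
  approved: "Ready to use",
  rejected: "Needs changes",
  failed: "Generation failed",
};

function userFacingStatus(internalStatus) {
  // Fall back to neutral copy rather than leaking internal state names.
  return STATUS_COPY[internalStatus] || "In progress";
}
```

An unknown internal status falls back to neutral copy instead of surfacing implementation details in the UI.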

&lt;h2&gt;
  
  
  Observability Queries
&lt;/h2&gt;

&lt;p&gt;Track operational health from the beginning.&lt;/p&gt;

&lt;p&gt;Useful metrics include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;average time from &lt;code&gt;queued&lt;/code&gt; to &lt;code&gt;completed&lt;/code&gt;;&lt;/li&gt;
&lt;li&gt;failure rate by provider and workflow type;&lt;/li&gt;
&lt;li&gt;rejection rate by review reason;&lt;/li&gt;
&lt;li&gt;duplicate webhook count;&lt;/li&gt;
&lt;li&gt;jobs stuck in &lt;code&gt;processing&lt;/code&gt;;&lt;/li&gt;
&lt;li&gt;storage size by generated artifact type.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These metrics help you decide whether the product needs better prompts, better provider handling, or better user guidance.&lt;/p&gt;
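The stuck-job check is the one teams most often forget; a sketch in the same SQL dialect as the artifact table above (the `video_jobs` table and its columns are assumptions consistent with this article's examples):

```sql
-- Jobs that have sat in "processing" for over 30 minutes with no webhook.
-- Anything this query returns needs either a provider status poll or a retry.
SELECT id, provider_request_id, updated_at_utc
FROM video_jobs
WHERE status = 'processing'
  AND DATEDIFF(minute, updated_at_utc, SYSUTCDATETIME()) > 30
ORDER BY updated_at_utc;
```

Running this on a schedule and alerting on non-empty results catches lost webhooks long before users file support tickets.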

&lt;h2&gt;
  
  
  Security and Privacy Notes
&lt;/h2&gt;

&lt;p&gt;Do not treat prompts as harmless strings. Prompts may include customer names, campaign plans, product information, or private context.&lt;/p&gt;

&lt;p&gt;Consider:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;hashing prompts in operational tables;&lt;/li&gt;
&lt;li&gt;storing raw prompts separately with tighter access controls;&lt;/li&gt;
&lt;li&gt;signing webhook requests;&lt;/li&gt;
&lt;li&gt;validating uploaded file types;&lt;/li&gt;
&lt;li&gt;scanning output files if users can download them;&lt;/li&gt;
&lt;li&gt;expiring unused generated assets;&lt;/li&gt;
&lt;li&gt;logging reviewer decisions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is especially important if your app supports business users.&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Checklist
&lt;/h2&gt;

&lt;p&gt;Before shipping an AI video workflow, make sure you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a job table;&lt;/li&gt;
&lt;li&gt;an event table;&lt;/li&gt;
&lt;li&gt;a queue;&lt;/li&gt;
&lt;li&gt;a worker;&lt;/li&gt;
&lt;li&gt;webhook idempotency;&lt;/li&gt;
&lt;li&gt;object storage;&lt;/li&gt;
&lt;li&gt;artifact metadata;&lt;/li&gt;
&lt;li&gt;review states;&lt;/li&gt;
&lt;li&gt;user-friendly status copy;&lt;/li&gt;
&lt;li&gt;failure and stuck-job monitoring.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;AI video generation is not just an API call. It is an asynchronous workflow with provider latency, callbacks, files, review decisions, and user expectations.&lt;/p&gt;

&lt;p&gt;Whether you use &lt;a href="https://seevido.com/" rel="noopener noreferrer"&gt;SeeVido&lt;/a&gt; or another provider, the backend pattern is the same: create a job, process it through a queue, handle webhooks idempotently, store artifacts outside the database, and review the output before publishing.&lt;/p&gt;

&lt;p&gt;That architecture gives your users a better experience and gives your team a system that can be debugged when something goes wrong.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>api</category>
      <category>webdev</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Bringing Photos to Life: The Magic of AI Dance Video Generators</title>
      <dc:creator>top duke</dc:creator>
      <pubDate>Wed, 04 Feb 2026 03:46:48 +0000</pubDate>
      <link>https://dev.to/top_duke_7ca8148e5df75726/bringing-photos-to-life-the-magic-of-ai-dance-video-generators-5amm</link>
      <guid>https://dev.to/top_duke_7ca8148e5df75726/bringing-photos-to-life-the-magic-of-ai-dance-video-generators-5amm</guid>
      <description>&lt;p&gt;Picture this: You've got an old photo of your grandma from her younger days, or maybe a goofy selfie with your best friend. Now imagine hitting a button and watching them bust out moves like they're on a dance floor—swaying to a waltz, popping and locking to hip-hop, or just grooving in a way that feels totally alive. That's the thrill of &lt;a href="https://aifacefy.com/ai-dance-video/" rel="noopener noreferrer"&gt;AIFacefy dance video generators&lt;/a&gt;. These aren't just gimmicks; they're game-changers for anyone who loves creating fun, shareable content without needing a film crew or fancy software. Tools like the one from AI Facefy are making it dead simple to turn static snaps into dynamic videos that capture attention and spark joy.&lt;/p&gt;

&lt;p&gt;I've dived into this tech, and it's fascinating how it's evolved from clunky animations to something that feels genuinely natural. It's not about robots taking over creativity—it's about giving everyday folks the power to tell stories through movement. Let's break it down, from how it works to why it's worth trying, and explore the real-world ways it's shaking things up.&lt;/p&gt;

&lt;h2&gt;
  
  
  How We Got Here: The Rise of Smart Video Tools
&lt;/h2&gt;

&lt;p&gt;Video creation used to be a hassle. You'd need hours of shooting, editing software that costs a fortune, and skills that take years to hone. But AI has flipped the script. Starting with basic image recognition, it's now at a point where it can understand human poses, predict movements, and blend them seamlessly into video. Think of it as teaching a computer to "see" a photo's body language and then choreograph a dance around it.&lt;/p&gt;

&lt;p&gt;What makes modern generators stand out is their smarts. They use things like pose detection to map out arms, legs, and expressions, then layer on user inputs to customize everything. It's built on years of machine learning progress, where algorithms learn from tons of real dance footage to make animations that don't look fake. And it's not stopping there—this tech ties into broader trends, like AI helping with everything from photo edits to virtual avatars. The result? Videos that pop with personality, all from a single upload.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Nuts and Bolts: What Makes AIFacefy Dance Videos Tick
&lt;/h2&gt;

&lt;p&gt;At its core, an &lt;a href="https://aifacefy.com/ai-dance-video/" rel="noopener noreferrer"&gt;AIFacefy dance video generator&lt;/a&gt; takes your photo and breathes life into it. You start with a clear shot—front-facing or half-body works best, with a natural pose so the AI can get a good read on the structure. It scans for key points like joints and contours, ensuring every twist and turn feels right.&lt;/p&gt;

&lt;p&gt;The real fun is in the customization. You type in a description—like "a slow, elegant ballet under starry skies" or "upbeat salsa with tons of energy"—and the AI interprets it to shape the style, speed, and mood. This isn't random; it's controlled to keep movements smooth and continuous, avoiding those awkward jerks that scream "computer-made." Features like continuity controls lock in the character's look from start to finish, making the whole thing flow like a pro edit.&lt;/p&gt;

&lt;p&gt;Under the hood, it's all about balance—lightweight enough for quick use but powerful for polished results. Unlike bulkier tools, these focus on character-driven animation, perfect for short clips that shine on social media. It's tech that's thoughtful, prioritizing what users actually want: easy, reliable ways to create without the headaches.&lt;/p&gt;

&lt;h2&gt;
  
  
  Your Quick Guide to Making a Dance Video
&lt;/h2&gt;

&lt;p&gt;Getting started is a breeze, even if you've never touched video editing before. Here's how it goes down:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Pick and Upload Your Photo: Grab a sharp image of one person—make sure the body's fully visible and the pose isn't too wild. This sets the stage for smooth animations.&lt;/li&gt;
&lt;li&gt;Describe the Dance: Jot down what you envision. Want a relaxed vibe? Lively jumps? The prompt guides the AI, turning your words into motion.&lt;/li&gt;
&lt;li&gt;Tweak and Generate: Set the video length, aspect ratio, or other basics. Double-check, then let the magic happen.&lt;/li&gt;
&lt;li&gt;Check It Out and Save: Watch the preview to see if it's spot-on. If not, tweak and regenerate. Once it's good, download and share.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;It's that straightforward. And with extras like history tracking for past creations, you can build on what you've done. No experience needed—just your imagination.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Bother? The Real Perks of AI Dance Videos
&lt;/h2&gt;

&lt;p&gt;Let's be real: in a world flooded with content, standing out matters. These generators cut through the noise by making creation effortless. Forget lugging cameras around or spending days editing—you're done in minutes, with results that look pro.&lt;/p&gt;

&lt;p&gt;The movements are a highlight: natural, coherent, and tailored to feel human. No floppy arms or weird distortions—just dances that match real-life flow, boosting watchability. For social media, that's gold; shorter, engaging clips mean more views and shares.&lt;/p&gt;

&lt;p&gt;But it's deeper than that. It opens doors for inclusivity—folks who can't dance due to age or ability can see themselves (or loved ones) moving freely. Creatively, it's a playground: mix cultures, revive old photos, or prototype ideas without commitment. Businesses love it too for quick promos or ads that pop. And environmentally? Less physical production means a smaller footprint.&lt;/p&gt;

&lt;p&gt;What sells me is the blend of tech and heart. AI handles the grunt work, but your prompts add the soul, making each video uniquely yours. It's convincing because it works—I've seen creators worldwide turning mundane pics into memorable moments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where It Shines: Everyday Wins with AI Dance
&lt;/h2&gt;

&lt;p&gt;The applications are endless, and that's what makes it exciting. On social platforms, people are transforming family shots into fun dances for posts that get likes pouring in. It's perfect for keeping things fresh without constant new photos.&lt;/p&gt;

&lt;p&gt;Dance lovers use it to showcase styles or test choreography virtually—no studio required. Educators demo moves for classes, while event planners create custom clips for weddings or parties. Imagine animating a historical figure in a modern routine for a history lesson, or preserving cultural dances from old images.&lt;/p&gt;

&lt;p&gt;For personal touches, it's unbeatable: commemorate milestones with dancing tributes, or just entertain friends with silly edits. Users rave about how it sparks conversations and brings smiles. One story I came across? A creator turned baby photos into imitations of viral dances, blending nostalgia with trends. It's not just useful—it's inspiring.&lt;/p&gt;

&lt;h2&gt;
  
  
  Looking Ahead: What's Next for Dance and AI
&lt;/h2&gt;

&lt;p&gt;The future's bright, with AI getting even smarter. Expect integrations like real-time music syncing or AR overlays, where your dances jump into the real world. Collaborative features could let friends co-create, or pros use it for film pre-vis.&lt;/p&gt;

&lt;p&gt;Sure, there are questions about originality, but the upside is huge: more access to the arts for everyone. It could redefine entertainment, from virtual concerts to personalized stories. As tools evolve, they'll keep focusing on what matters—fun, natural results that empower creators.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping It Up
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://aifacefy.com/ai-dance-video/" rel="noopener noreferrer"&gt;AIFacefy dance video generators&lt;/a&gt; aren't just tools; they're invitations to play with creativity in ways we couldn't before. From a simple photo to a full-on dance clip, they make the impossible easy and the ordinary extraordinary. If you're curious, give it a shot—upload that pic and see what happens. Trust me, once you start, you'll wonder how you ever lived without it.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>nanobanana</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Bringing Photos to Life: Exploring AI Facefy's Photo to Video Tool</title>
      <dc:creator>top duke</dc:creator>
      <pubDate>Fri, 16 Jan 2026 03:35:38 +0000</pubDate>
      <link>https://dev.to/top_duke_7ca8148e5df75726/bringing-photos-to-life-exploring-ai-facefys-photo-to-video-tool-4jhe</link>
      <guid>https://dev.to/top_duke_7ca8148e5df75726/bringing-photos-to-life-exploring-ai-facefys-photo-to-video-tool-4jhe</guid>
      <description>&lt;p&gt;In today's world, where everyone’s glued to their screens, turning old photos into short videos can really make your content pop. AI Facefy's Photo to Video tool does just that—it uses smart AI to add movement, feelings, and sound to your still images. Whether you're sharing family stories, posting on social media, or putting together something for work, this thing makes it simple to whip up videos from pictures. It's got quick processing and ways to add emotion, changing how we hang onto and share our moments. Let's dive into what this tool is all about, from how it works to where it shines in real life.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdcotcz3oaa4ic3g25jh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdcotcz3oaa4ic3g25jh.png" alt=" " width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Behind &lt;a href="https://aifacefy.com/photo-to-video/" rel="noopener noreferrer"&gt;AI Facefy's Photo to Video Tool&lt;/a&gt;?
&lt;/h2&gt;

&lt;p&gt;AI Facefy is one of those platforms that's all about AI for images and videos, with stuff like turning text into clips or fun effects like face dancing or hugging animations. Their Photo to Video tool is a standout—it's online, you can try it for free, and it turns plain photos into videos with sound.&lt;br&gt;
The way it works is pretty cool: upload a photo, and the AI adds motion, like making faces smile or backgrounds shift a bit, plus a soundtrack to match. It's more than just moving parts; it's about telling a story. They use different AI setups like Wan 2.5 Fast, Veo 3.1, or Kling Motion Control, so you get options and solid quality. You can make up to 300 videos, which is great if you're using it a lot. What I like is how it picks up on the vibe in your photo and amps up the emotions, making a quick snap feel like a mini-movie. Oh, and there's a 20% off deal going on right now if you want to jump in.&lt;/p&gt;

&lt;h2&gt;
  
  
  Standout Features of This Photo to Video Tool
&lt;/h2&gt;

&lt;p&gt;This tool comes with some handy bits that make it easy to use and effective. Here's what stands out:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Adding Movement and Emotion
&lt;/h3&gt;

&lt;p&gt;It takes your still shots and gives them life with natural-looking animations. The AI spots faces and other details, then adds things like gentle smiles or slight turns, making everything feel more real and engaging—perfect for breathing new life into old pictures.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Sharp Results with Personal Touches
&lt;/h3&gt;

&lt;p&gt;You end up with clear videos in 720p, and you can tweak the length (say, 5 seconds) or the shape (like 16:9). Throw in music, text, or effects to make it your own. It even lets you blend a few photos into one video, like stringing together family shots for a nice montage.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Quick Turnaround and Sound Options
&lt;/h3&gt;

&lt;p&gt;The AI crunches through it fast, so you're not waiting around—results show up in minutes. Adding audio is a big plus; pick a track or voiceover to really pull at the heartstrings. Whether it's a talking character or a basic clip, it comes out looking pro.&lt;br&gt;
The whole setup is straightforward, with a clean interface where you can pull from past videos or tweak prompts to get exactly what you want.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step-by-Step: How to Use the Photo to Video Tool
&lt;/h2&gt;

&lt;p&gt;It's dead simple to get going, even if you're not tech-savvy. Here's the rundown:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Upload Your Stuff: Pick some clear photos and upload them. You can do a bunch at once for videos with multiple images.&lt;/li&gt;
&lt;li&gt;Make It Yours: Choose effects, add music or text, and pick an AI model like Wan 2.6 or Vidu Q1. Set the resolution, length, and ratio. Give it a prompt, like "Make this family pic feel happy and lively."&lt;/li&gt;
&lt;li&gt;Create and Grab It: Click generate, wait a bit, and download your video or share it right away.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;No need for fancy editing know-how. If you're into other creative stuff, check out their text-to-video or niche tools like muscle growth effects or venom transformations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Bother with AI Facefy's Video Maker?
&lt;/h2&gt;

&lt;p&gt;It's not just convenient—there are real upsides:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Saves Time: Faster than messing with regular video software.&lt;/li&gt;
&lt;li&gt;Tells Better Stories: Motion and sound keep the original feel but make it more shareable and emotional.&lt;/li&gt;
&lt;li&gt;Works for Anyone: Good for personal stuff, online posts, or business needs, with results that don't look amateur.&lt;/li&gt;
&lt;li&gt;Easy on the Wallet: Free to start, with upgrades if you need more, and that discount helps.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When visuals are key to grabbing attention, this helps your content cut through the noise.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where This Tool Really Fits In
&lt;/h2&gt;

&lt;p&gt;You can use it in all sorts of ways:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Social Media Posts
&lt;/h3&gt;

&lt;p&gt;Make quick, fun videos for Instagram, TikTok, or Facebook. Animate product shots or your own pics to get more likes and shares—think viral dance clips.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Keeping Memories Alive
&lt;/h3&gt;

&lt;p&gt;Turn trips, parties, or family photos into little videos that capture the moment. Merge portraits for tributes, like handshake clips for events or kissing animations for sweet stuff.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Business Boosts
&lt;/h3&gt;

&lt;p&gt;Companies can create pro-looking videos to show off products. Add effects to highlight features, jazzing up marketing—like hug videos for feel-good campaigns or fun labubu generators.&lt;br&gt;
It mixes old-school charm with new tricks for stuff that sticks.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Much and How to Get It
&lt;/h2&gt;

&lt;p&gt;You can try the &lt;a href="https://aifacefy.com/photo-to-video/" rel="noopener noreferrer"&gt;Photo to Video tool&lt;/a&gt; for free on AI Facefy, no strings attached. For unlimited use and extras, they have plans (check their pricing page). That 20% off makes it a good deal to start.&lt;/p&gt;

&lt;h2&gt;
  
  
  Picking AI Facefy for Turning Images into Videos
&lt;/h2&gt;

&lt;p&gt;Among all the AI options out there, this one nails the balance of speed, quality, and fun. Whether you're freshening up old pics or making fashion content with their face tools, it delivers. With tech like Kling Motion Control and Veo 3.1, it's keeping up with the latest.&lt;br&gt;
Want to give your photos some motion? Check out &lt;a href="https://aifacefy.com/photo-to-video/" rel="noopener noreferrer"&gt;AI Facefy's Photo to Video tool&lt;/a&gt; and see how it turns stills into stories. The free trial and easy setup mean you can start playing around today.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>python</category>
      <category>productivity</category>
    </item>
    <item>
      <title>AI Facefy Beauty Test: The Gold Standard in Facial Analysis</title>
      <dc:creator>top duke</dc:creator>
      <pubDate>Mon, 19 May 2025 10:57:19 +0000</pubDate>
      <link>https://dev.to/top_duke_7ca8148e5df75726/ai-facefy-beauty-test-the-gold-standard-in-facial-analysis-4edg</link>
      <guid>https://dev.to/top_duke_7ca8148e5df75726/ai-facefy-beauty-test-the-gold-standard-in-facial-analysis-4edg</guid>
      <description>&lt;p&gt;AI Facefy’s Beauty Test leads the pack with its multidimensional approach to facial attractiveness scoring. Leveraging computer vision and deep learning models trained on over 100,000 global facial images, this tool evaluates facial symmetry, golden ratio alignment, skin quality, and even aura traits derived from lighting and expressions.&lt;/p&gt;

&lt;p&gt;Url: &lt;a href="https://aifacefy.com/ai-beauty-test/" rel="noopener noreferrer"&gt;https://aifacefy.com/ai-beauty-test/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Key Features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Facial Feature Mapping: The AI detects 78 facial landmarks, measuring distances between eyes, nose width, and lip proportions against classical aesthetic ratios like the Marquardt Phi Mask.&lt;/li&gt;
&lt;li&gt;Skin Analysis: Algorithms assess skin texture, detecting wrinkles, dark circles, and pigmentation with 94% accuracy compared to dermatological evaluations.&lt;/li&gt;
&lt;li&gt;Cultural Aesthetic Models: Users can compare their results against beauty standards from East Asian, Western, or multicultural datasets, including similarity scores to virtual idols and celebrities.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aitool</category>
      <category>ai</category>
    </item>
    <item>
      <title>Ghibli Style AI Art Generator - Your Personal Anime Creation</title>
      <dc:creator>top duke</dc:creator>
      <pubDate>Tue, 29 Apr 2025 08:53:23 +0000</pubDate>
      <link>https://dev.to/top_duke_7ca8148e5df75726/ghibli-style-ai-art-generator-your-personal-anime-creation-51pp</link>
      <guid>https://dev.to/top_duke_7ca8148e5df75726/ghibli-style-ai-art-generator-your-personal-anime-creation-51pp</guid>
      <description>&lt;p&gt;Ghibli Style AI Art Generator - Your Personal Anime Creation&lt;/p&gt;

&lt;p&gt;Url: &lt;a href="https://aifacefy.com/ghibli-style/" rel="noopener noreferrer"&gt;https://aifacefy.com/ghibli-style/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Ghibli Style AI Art Generator is an easy-to-use online tool that transforms your photos or text prompts into enchanting artwork inspired by Studio Ghibli films. Powered by advanced AI, it captures the whimsical, dreamy, and nostalgic essence of Ghibli’s animation style-complete with soft colors, magical landscapes, and expressive characters.&lt;/p&gt;

&lt;p&gt;Key Features:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Image/Text to Ghibli Style Art: Upload landscapes, portraits, pets or any elements to instantly generate Miyazaki-style artistic illustrations.&lt;/li&gt;
&lt;li&gt;Animation Atmosphere Enhancement: AI intelligently recognizes image content and automatically adds soft lighting, cinematic composition, and vintage colors to create an animated atmosphere.&lt;/li&gt;
&lt;li&gt;Emotional Detail Rendering: Enhance character expressions through eyes, micro-expressions, and layered lighting to fill the image with storytelling elements.&lt;/li&gt;
&lt;li&gt;High Resolution Output: Export high-resolution images suitable for poster printing, social media sharing, or as project materials.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Use Cases: &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Game Design: Indie game developers can turn their sketches or photos into beautiful Ghibli-style backgrounds and characters, ready to use in games. This saves time and makes games look magical.&lt;/li&gt;
&lt;li&gt;Travel Marketing: Hotels and travel businesses can transform their photos into charming, nostalgic Ghibli-inspired images to attract more visitors and create unique promotional materials.&lt;/li&gt;
&lt;li&gt;Memory Keepsakes: Families and caregivers can convert old or special photos into animated storybook-style pictures, creating touching keepsakes that bring memories to life.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;X: &lt;a href="https://x.com/aifacefy" rel="noopener noreferrer"&gt;https://x.com/aifacefy&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Youtube: &lt;a href="https://www.youtube.com/@aifacefy" rel="noopener noreferrer"&gt;https://www.youtube.com/@aifacefy&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>ai</category>
      <category>productivity</category>
      <category>google</category>
    </item>
  </channel>
</rss>
