don chen
How We Built an AI-Powered Packaging Design Workflow (And the Numbers Behind It)

TL;DR: We integrated AI image generation into our packaging design process in early 2025. After helping 46 brands, here's what actually works, what doesn't, and why AI is a communication tool — not a replacement for designers.

The Real Problem: It's Never Been About Design Skills

Let me start with something nobody talks about enough.

In custom luxury packaging — jewelry boxes, watch cases, perfume packaging — the bottleneck is almost never "the designer isn't skilled enough." The real problem is much more boring:

Clients can't describe what they want in words. And designers aren't mind readers.

```
Client: "I want something premium but also modern, maybe with an eco feel?"
Designer: *disappears for 3 days, returns with 2 sketches*
Client: "Yeah... not exactly what I had in mind."
*repeat for 6 weeks*
```

I've seen this pattern play out hundreds of times at Tancy Packaging, where we've been manufacturing custom packaging since 2008. The talent was never the issue. The issue was that visual ideas don't translate well into language, and every round of back-and-forth burns time and budget.

That's where AI actually helps. Not by replacing designers — but by acting as a rapid visual translation layer between client and designer.

Our Three-Stage Workflow (With Real Timelines)

After 18 months of iteration, here's the workflow we landed on:

Stage 1 — AI-Powered Ideation (Days 1-2)

Instead of a designer vanishing to produce 2-3 concept sketches over several days, we now generate 20-30 reference images in a single afternoon session — often with the client sitting right there with us.

The key enabler here is something I don't see discussed enough: a structured prompt library.

We built ours incrementally over more than a year. Every prompt is organized by:

| Category | Example Values |
| --- | --- |
| Product type | watch boxes, jewelry boxes, perfume boxes, cosmetic sets |
| Material | wood grain, PU leather, matte paperboard, velvet interior |
| Style vibe | minimalist, vintage luxe, eco-friendly, art deco |
| Finish | soft-touch lamination, foil stamping, embossing, spot UV |

A typical prompt from our library:

```
Rigid gift box with magnetic closure, soft-touch matte black exterior,
brushed gold foil stamping logo on lid, velvet-lined interior tray
for jewelry, photorealistic product photography lighting
```

Why the prompt library matters: Most teams treat AI prompts as throwaway experiments. Type something, see what comes out, move on. That's fine for exploration, but it's terrible for consistency. By saving and categorizing every prompt that produced good results, we built a reusable system where new projects start at ~80% of optimal instead of zero.

The client participation piece is critical too. Instead of saying "I want elegant" and hoping we guess correctly, clients browse prompt categories, pick combinations like "elegant + wood grain + warm tones", and immediately see what that looks like. It turns vague language into concrete visuals.
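To make the mechanics concrete, here is a minimal sketch of how a categorized prompt library can compose a full prompt from a client's category picks. The category names follow the table above; the dictionary contents, structure, and function name are illustrative assumptions, not our production tooling.

```python
# Hypothetical prompt library keyed by the categories from the table above.
# Each entry maps a human-friendly pick to a reusable prompt fragment.
PROMPT_LIBRARY = {
    "product_type": {
        "jewelry box": "rigid gift box with magnetic closure, velvet-lined interior tray for jewelry",
        "watch box": "hinged watch box with cushion insert",
    },
    "material": {
        "matte black": "soft-touch matte black exterior",
        "wood grain": "natural wood grain veneer exterior",
    },
    "finish": {
        "gold foil": "brushed gold foil stamping logo on lid",
        "embossing": "blind embossed logo on lid",
    },
}

def compose_prompt(picks: dict) -> str:
    """Join the saved fragment for each category pick into one prompt string."""
    fragments = [PROMPT_LIBRARY[category][choice] for category, choice in picks.items()]
    # Shared suffix applied to every render request.
    fragments.append("photorealistic product photography lighting")
    return ", ".join(fragments)

print(compose_prompt({
    "product_type": "jewelry box",
    "material": "matte black",
    "finish": "gold foil",
}))
```

The point of the structure is that a client browsing "jewelry box + matte black + gold foil" never types a prompt; they select, and the library supplies wording that has already produced good results.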

Stage 2 — Human Refinement (Days 3-7)

Here's the part most AI-in-design articles gloss over: those AI images never make it into production. Ever.

Once the client picks a direction from Stage 1, our human team takes over completely:

🔄 Workflow:

AI References → Direction Set → 2D Artwork (Illustrator/Photoshop) → 3D Modeling (Blender/KeyShot) → CAD Validation (Manufacturability Check) → Client Review → Physical Sample

(If client requests tweaks at review stage, loop back to 3D modeling step)

What happens in each step:

  • 2D artwork: Dielines, print files, color separation in Illustrator/Photoshop
  • 3D modeling: Photorealistic renders from multiple angles in Blender or KeyShot
  • CAD validation: Structural integrity checks — will this box actually hold together?
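The hand-off above, including the loop back to 3D modeling when the client requests tweaks, can be sketched as a tiny pipeline. The stage names follow the flow described here; the function and the outcome values are hypothetical simplifications for illustration.

```python
# Stage 2 as a linear pipeline with a client-review loop (illustrative only).
STAGES = [
    "2d_artwork",      # dielines, print files, color separation
    "3d_modeling",     # photorealistic renders from multiple angles
    "cad_validation",  # manufacturability check
    "client_review",
    "physical_sample",
]

def run_refinement(review_outcomes):
    """Walk the stages in order; each 'tweaks' outcome at client review
    loops back to the 3D modeling step, as in the flow above."""
    outcomes = iter(review_outcomes)
    i, history = 0, []
    while i < len(STAGES):
        stage = STAGES[i]
        history.append(stage)
        if stage == "client_review" and next(outcomes) == "tweaks":
            i = STAGES.index("3d_modeling")  # redo renders, revalidate, re-review
        else:
            i += 1
    return history

# One round of tweaks, then approval:
print(run_refinement(["tweaks", "approved"]))
```

Note that a loop-back re-runs CAD validation too: a tweaked tray insert can change wall thicknesses, so manufacturability gets rechecked every round.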

The 3D modeling step is where the magic happens for client communication. Once modeled, you can show exactly:

  • How foil stamping catches light from different angles
  • How the interior cushion sits when the lid opens
  • Whether the hinge mechanism feels smooth or clunky

AI cannot do this. Not for philosophical reasons, but because this level of detail control requires understanding physical material properties, manufacturing tolerances, and human-object interaction.

Stage 3 — Physical Prototyping (Days 8-14)

Because so much gets validated digitally first, the first physical sample now hits an 85-90% approval rate; historically it was closer to 60%.

Sample rounds dropped from an average of 3-4 to 1-2.

A Real Case Study

Earlier this year, a European jewelry brand came to us for holiday collection gift boxes.

Traditional timeline (our old workflow):

  • Concept development: 2 weeks
  • Design refinement: 2 weeks
  • Sample iterations: 4 rounds (~3 weeks)
  • Total: ~7 weeks

With AI-assisted workflow:

| Phase | Days | What Happened |
| --- | --- | --- |
| AI ideation | 1-2 | Client nailed down "brushed gold foil on matte black with soft interior reveal" — a specificity level that normally took 2 weeks of email ping-pong |
| 3D modeling + render | 3-6 | Rendered from all angles; client requested interior tray tweaks; updated same day |
| First sample | 10 | Client approved with only minor color adjustments |
| Production green-light | 14 | Made their shipping window |

Two weeks instead of seven.

What AI Still Can't Do (Honest Assessment)

After 46 brands and hundreds of design cycles, here's where AI consistently falls short:

1. Manufacturability Blindness

AI generates visually stunning designs that would be impossible to produce at any reasonable cost. It doesn't understand die-cut limitations, minimum wall thicknesses, or assembly constraints.

2. Brand Strategy Void

An AI doesn't know whether your positioning is "accessible luxury" vs "ultra-exclusive." Those two directions require fundamentally different approaches to materials, finishes, and unboxing experience.

3. Supply Chain Reality

```
Client: "Can we use this gorgeous textured paper?"
Reality: "12-week lead time from Vietnam, MOQ 5,000 units,
         and the texture disappears under soft-touch lamination."
```

AI has no concept of lead times, MOQs, or material interactions during finishing processes.

4. Tactile Judgment

You cannot determine from an image whether a finish feels premium or cheap when held. This sounds obvious, but it's the #1 reason AI-generated concepts need human reinterpretation before production.

The Numbers So Far

| Metric | Value |
| --- | --- |
| Brands served with this workflow | 46 |
| Time from brief to production-ready design | 10-14 days (was 4-6 weeks) |
| Average sample iterations | 1-2 rounds (was 3-4) |
| First-sample approval rate | ~85-90% (was ~60%) |
| Project types | Jewelry boxes, watch cases, perfume packaging, cosmetic sets, gift boxes |

Practical Takeaways

If you're working at the intersection of AI and physical product design, here are 3 things I'd suggest:

1. Use AI for volume, not quality

AI excels at generating lots of options quickly. It's terrible at picking the right one. Your value as a professional is in curating and refining, not in how many images you can generate.

2. Build a prompt library from day one

Every time you find a prompt that produces good results for your domain, save it. Categorize it. The advantage compounds surprisingly fast — after 6 months, our prompt library covered roughly 80% of incoming client requests without starting from scratch.

3. Be honest about AI's limits with clients

The vendors who oversell AI capabilities end up disappointing clients when the physical product doesn't match the AI render. We frame it explicitly: "AI helps us explore directions fast. Humans make sure it can actually be manufactured." Clients appreciate the transparency.

Wrapping Up

The headline number — 46 brands — sounds impressive until you realize each one ended up with packaging unique to them. Not because AI customized anything magically, but because AI handled the repetitive exploration phase, freeing our designers to focus on the parts requiring human judgment: brand alignment, material selection, manufacturability, and those tiny details that separate "good enough" from "wow."

If you're doing anything similar (or completely different) with AI in product design workflows, I'd love to hear about it — this field is moving fast and we're all still figuring out the best practices together.


Tags: ai, design, productivity, workflow


This article reflects our team's experience at Tancy Packaging since early 2025. We're a custom packaging manufacturer based in Guangzhou, serving 3,000+ brands across 30+ countries since 2008.
