Sangwoo Lee

Breaking Down a 1,500-Line Firebase Service - Clean Architecture in Practice

How I refactored a monolithic Firebase service into clean, testable, and reusable utility modules without breaking production.

After transforming my Firebase notification service from synchronous to queue-based architecture (covered in Part 1), I faced a new problem: my firebase.service.ts had grown to 1,500+ lines of tangled business logic. Finding a specific function required scrolling through hundreds of lines. Adding a feature meant understanding unrelated code. Testing? Nearly impossible.

Here's how I systematically refactored this monolith into clean, reusable utility modules without breaking production.

The Problem: A 1,500-Line Service File

After implementing the queue-based architecture, my service file had become a kitchen sink of functionality:

// firebase.service.ts - The 1,500-line monster
@Injectable()
export class FirebaseService {
  constructor(/* 10+ dependencies */) {}

  // ❌ Problem 1: Everything in one place
  async sendConditionalNotifications(jobData: any): Promise<boolean> {
    // 200 lines of database querying
    while (true) {
      const queryBuilder = this.memberRepository
        .createQueryBuilder('member')
        .select(['member.seq', 'member.push_token'])
        .where('member.push_token IS NOT NULL')
        .andWhere("member.push_token != ''");

      // Gender filtering
      if (gender) {
        const genderArray = gender.split(',').map(g => g.trim()).filter(g => g);
        if (genderArray.length > 0) {
          queryBuilder.andWhere('member.sex IN (:...genders)', { genders: genderArray });
        }
      }

      // Age filtering with complex MSSQL date logic
      if (ageMin !== undefined || ageMax !== undefined) {
        queryBuilder
          .andWhere('member.birthday IS NOT NULL')
          .andWhere("member.birthday != ''")
          .andWhere('ISDATE(member.birthday) = 1');

        const ageCalculation = `
          CASE 
            WHEN ISDATE(member.birthday) = 1 
            THEN DATEDIFF(YEAR, CONVERT(DATE, member.birthday), GETDATE()) - 
              CASE 
                WHEN MONTH(CONVERT(DATE, member.birthday)) > MONTH(GETDATE()) 
                  OR (MONTH(CONVERT(DATE, member.birthday)) = MONTH(GETDATE()) 
                    AND DAY(CONVERT(DATE, member.birthday)) > DAY(GETDATE())) 
                THEN 1 
                ELSE 0 
              END
            ELSE NULL
          END
        `;

        if (ageMin !== undefined && ageMax !== undefined) {
          queryBuilder.andWhere(`${ageCalculation} BETWEEN :ageMin AND :ageMax`, 
            { ageMin, ageMax });
        }
        // ... more age filtering logic
      }

      // Platform filtering
      if (platform_type) {
        queryBuilder.andWhere('member.platform_type = :platform_type', 
          { platform_type });
      }

      // ... 50+ more lines of filtering logic

      let membersInPage = await queryBuilder.getMany();

      if (membersInPage.length === 0) break;

      // ❌ Problem 2: Inline date parsing logic (80+ lines)
      if (daysSinceLastLoginMin !== undefined || daysSinceLastLoginMax !== undefined) {
        const now = new Date();
        const kstOffset = 9 * 60;
        const utcTime = now.getTime() + (now.getTimezoneOffset() * 60000);
        const kstTime = new Date(utcTime + (kstOffset * 60000));
        const currentDate = new Date(kstTime.getFullYear(), kstTime.getMonth(), kstTime.getDate());

        membersInPage = membersInPage.filter(member => {
          if (!member.logindate) return false;

          const logindateStr = String(member.logindate).trim();
          let loginDate: Date | null = null;

          // MSSQL datetime2 format
          if (logindateStr.match(/^\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}\.\d{7}$/)) {
            const dateWithoutMs = logindateStr.substring(0, 19);
            loginDate = new Date(dateWithoutMs);
          }

          // Korean format "2023-12-26 오후 1:56:32"
          const koreanDatePattern = /^(\d{4}-\d{2}-\d{2})\s*(오전|오후)\s*(\d{1,2}:\d{2}:\d{2})$/;
          const match = logindateStr.match(koreanDatePattern);

          if (match) {
            const [, datePart, meridiem, time] = match;
            const [year, month, day] = datePart.split('-');
            const [hours, minutes, seconds] = time.split(':');
            let hour24 = parseInt(hours);

            if (meridiem === '오후') {
              if (hour24 !== 12) hour24 += 12;
            } else if (meridiem === '오전') {
              if (hour24 === 12) hour24 = 0;
            }

            loginDate = new Date(
              parseInt(year),
              parseInt(month) - 1,
              parseInt(day),
              hour24,
              parseInt(minutes),
              parseInt(seconds)
            );
          }

          // ... more date parsing logic (ISO 8601, Date.toString(), etc.)

          if (!loginDate) return false;

          const loginDateOnly = new Date(loginDate.getFullYear(), 
            loginDate.getMonth(), loginDate.getDate());
          const daysSinceLastLogin = Math.floor(
            (currentDate.getTime() - loginDateOnly.getTime()) / (1000 * 60 * 60 * 24)
          );

          if (daysSinceLastLoginMin !== undefined && 
              daysSinceLastLogin < daysSinceLastLoginMin) {
            return false;
          }

          if (daysSinceLastLoginMax !== undefined && 
              daysSinceLastLogin > daysSinceLastLoginMax) {
            return false;
          }

          return true;
        });
      }

      // ❌ Problem 3: Inline chunking logic
      const chunks: string[][] = [];
      for (let i = 0; i < tokens.length; i += chunkSize) {
        chunks.push(tokens.slice(i, i + chunkSize));
      }

      // ❌ Problem 4: Inline delay logic
      await new Promise((resolve) => setTimeout(resolve, delayMs));

      // ❌ Problem 5: Massive log saving logic (150+ lines)
      const logs: PushNotificationLog[] = [];
      for (let idx = 0; idx < messages.length; idx++) {
        const msg = messages[idx];
        const memberSeq = tokenToSeqMap.get(msg.token);

        const log = new PushNotificationLog({
          job_id: jobData.jobId,
          member_seq: memberSeq,
          push_token: msg.token,
          title: msg.notification.title,
          content: msg.notification.body,
          // ... 20+ more fields
        });

        logs.push(log);

        if (logs.length >= LOG_BATCH_SIZE || idx === messages.length - 1) {
          try {
            await this.pushNotificationLog
              .createQueryBuilder()
              .insert()
              .into(PushNotificationLog)
              .values(logs)
              .execute();
            logs.length = 0;
          } catch (saveError) {
            // ... error handling logic
          }
        }
      }

      // ... continues for 1,000+ more lines
    }
  }

  // Plus 10+ more methods, each 100-200 lines
}

Problems with This Monolith

1. Impossible to Test

  • Can't unit test date parsing without mocking entire service
  • Can't test chunking logic in isolation
  • Integration tests require full database setup

2. Violates Single Responsibility Principle

  • Database querying + filtering + date parsing + chunking + logging
  • One service doing 5+ completely different things

3. Code Duplication

  • Chunking logic copied in 3 different methods
  • Date parsing duplicated for different use cases
  • Delay function repeated everywhere

4. Poor Readability

  • Scrolling through 1,500 lines to find one function
  • Nested logic 5-6 levels deep
  • Mixing high-level and low-level concerns

5. Difficult to Maintain

  • Bug in date parsing affects entire service
  • Can't reuse utility functions in other services
  • Fear of refactoring (too much coupling)

The Solution: Extract to Utility Modules

The refactoring strategy: identify cohesive functionality and extract to focused utility modules. Each utility should do one thing well and be independently testable.

Refactoring Strategy

Before:
┌─────────────────────────────────────────┐
│      firebase.service.ts (1,500 lines)  │
│                                         │
│  • DB querying                          │
│  • Filtering logic                      │
│  • Date parsing                         │
│  • Chunking                             │
│  • Delays                               │
│  • Log saving                           │
│  • FCM sending                          │
└─────────────────────────────────────────┘

After:
┌──────────────────────┐
│ firebase.service.ts  │  ← Orchestration only (300 lines)
│ (Business logic)     │
└──────────┬───────────┘
           │
           ├─────▶ query-filters.ts      (Filter logic)
           ├─────▶ logindate-filters.ts  (Date filtering)
           ├─────▶ parse-login.ts        (Date parsing)
           ├─────▶ array-utils.ts        (Chunking)
           ├─────▶ delay.ts              (Delays)
           └─────▶ save-push-notification-log.ts (Logging)

Step 1: Extract Array Utilities

Identify: Chunking logic used in 3+ places

Before (inline, duplicated):

// Duplicated in multiple methods
const chunks: string[][] = [];
for (let i = 0; i < tokens.length; i += chunkSize) {
  chunks.push(tokens.slice(i, i + chunkSize));
}

After (extracted):

// src/utils/array-utils.ts
/**
 * Split an array into chunks of specified size
 * @param array - Array to split
 * @param chunkSize - Maximum size of each chunk
 * @returns Array of chunks
 * 
 * @example
 * chunkArray([1, 2, 3, 4, 5], 2) 
 * // Returns [[1, 2], [3, 4], [5]]
 */
export function chunkArray<T>(array: T[], chunkSize: number): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < array.length; i += chunkSize) {
    chunks.push(array.slice(i, i + chunkSize));
  }
  return chunks;
}

Usage in service:

import { chunkArray } from 'src/utils/array-utils';

const chunks = chunkArray(pushTokens, chunkSize);

Benefits:

  • ✅ Generic: works with any array type
  • ✅ Testable: pure function, no dependencies (see the test sketch below)
  • ✅ Reusable: used across multiple services
  • ✅ Self-documenting: clear JSDoc
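
Since chunkArray is a pure function with no dependencies, it can be unit tested with nothing but Jest. A minimal sketch (the relative import path is an assumption about project layout):

// array-utils.spec.ts
import { chunkArray } from './array-utils';

describe('chunkArray', () => {
  it('splits an array into evenly sized chunks plus a smaller remainder', () => {
    expect(chunkArray([1, 2, 3, 4, 5], 2)).toEqual([[1, 2], [3, 4], [5]]);
  });

  it('returns an empty array for empty input', () => {
    expect(chunkArray([], 3)).toEqual([]);
  });

  it('preserves the element type of the input array', () => {
    expect(chunkArray(['a', 'b', 'c'], 2)).toEqual([['a', 'b'], ['c']]);
  });
});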

Step 2: Extract Delay Utility

Identify: Promise-based delay used everywhere

Before (inline, repeated):

await new Promise((resolve) => setTimeout(resolve, 2000));

After (extracted):

// src/utils/delay.ts
/**
 * Asynchronous delay utility
 * @param ms - Milliseconds to wait
 * @returns Promise that resolves after delay
 * 
 * @example
 * await delay(1000);  // Wait 1 second
 */
export const delay = (ms: number): Promise<void> => {
  return new Promise((resolve) => setTimeout(resolve, ms));
};

Usage in service:

import { delay } from 'src/utils/delay';

if (chunkIndex > 0) {
  await delay(chunkDelay);  // Much cleaner!
}

Benefits:

  • ✅ Named function (intent clear)
  • ✅ Type-safe (TypeScript knows it returns Promise<void>)
  • ✅ Testable with Jest's fake timers (see the sketch below)
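
A minimal sketch of that fake-timer test (import path assumed; awaiting Promise.resolve() flushes the microtask queue so the then callback runs before each assertion):

// delay.spec.ts
import { delay } from './delay';

describe('delay', () => {
  beforeEach(() => jest.useFakeTimers());
  afterEach(() => jest.useRealTimers());

  it('resolves only after the requested time has elapsed', async () => {
    let resolved = false;
    delay(2000).then(() => { resolved = true; });

    // 1ms short of the delay: still pending
    jest.advanceTimersByTime(1999);
    await Promise.resolve();
    expect(resolved).toBe(false);

    // The final millisecond: now it resolves
    jest.advanceTimersByTime(1);
    await Promise.resolve();
    expect(resolved).toBe(true);
  });
});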

Step 3: Extract Date Parsing Logic

Identify: Complex date parsing (80+ lines) with multiple format support

Before (inline, unreadable):

// 80 lines of inline date parsing in filter logic
const logindateStr = String(member.logindate).trim();
let loginDate: Date | null = null;

// MSSQL datetime2 format
if (logindateStr.match(/^\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}\.\d{7}$/)) {
  // ... 20 lines
}

// Korean format
const koreanDatePattern = /^(\d{4}-\d{2}-\d{2})\s*(오전|오후)\s*(\d{1,2}:\d{2}:\d{2})$/;
// ... 30 more lines

// ISO 8601 format
// ... 15 more lines

// Date.toString() format
// ... 15 more lines

After (extracted):

// src/utils/parse-login.ts
/**
 * Parse logindate string into Date object
 * Supports multiple formats:
 * - MSSQL datetime2: "2025-07-09 13:57:00.0000000"
 * - Korean format: "2023-12-26 오후 1:56:32"
 * - ISO 8601: "2025-07-09T13:57:00"
 * - Standard: "2025-07-09 13:57:00"
 * 
 * @param logindate - Login date string or Date object
 * @returns Parsed Date object or null if parsing fails
 */
export function parseLoginDate(logindate: any): Date | null {
  if (!logindate) return null;

  const logindateStr = String(logindate).trim();

  try {
    // Branch 1: MSSQL datetime2 format
    if (logindateStr.match(/^\d{4}-\d{2}-\d{2}\s+\d{2}:\d{2}:\d{2}\.\d{7}$/)) {
      const dateWithoutMs = logindateStr.substring(0, 19);
      const parsedDate = new Date(dateWithoutMs);

      if (isNaN(parsedDate.getTime())) {
        throw new Error('Invalid MSSQL datetime2 format');
      }

      return parsedDate;
    }

    // Branch 2: Korean format "2023-12-26 오후 1:56:32"
    const koreanDatePattern = /^(\d{4}-\d{2}-\d{2})\s*(오전|오후)\s*(\d{1,2}:\d{2}:\d{2})$/;
    const match = logindateStr.match(koreanDatePattern);

    if (match) {
      const [, datePart, meridiem, time] = match;
      const [year, month, day] = datePart.split('-');
      const [hours, minutes, seconds] = time.split(':');

      let hour24 = parseInt(hours);

      // Convert 12-hour to 24-hour format
      if (meridiem === '오후') {
        if (hour24 !== 12) hour24 += 12;
      } else if (meridiem === '오전') {
        if (hour24 === 12) hour24 = 0;
      }

      const parsedDate = new Date(
        parseInt(year),
        parseInt(month) - 1,
        parseInt(day),
        hour24,
        parseInt(minutes),
        parseInt(seconds)
      );

      if (isNaN(parsedDate.getTime())) {
        throw new Error('Invalid Korean date');
      }

      return parsedDate;
    }

    // Branch 3: ISO 8601 or standard format
    if (logindateStr.match(/^\d{4}[-/]\d{2}[-/]\d{2}[\sT]\d{2}:\d{2}:\d{2}/)) {
      const parsedDate = new Date(logindateStr);

      if (!isNaN(parsedDate.getTime())) {
        return parsedDate;
      }
    }

    // Branch 4: Date.toString() format
    const dateObj = new Date(logindateStr);
    if (!isNaN(dateObj.getTime())) {
      return dateObj;
    }

    throw new Error('Unrecognized date format');
  } catch (error) {
    console.error(`logindate parsing error: ${logindateStr}`, error);
    return null;
  }
}

Benefits:

  • ✅ Handles 4 different date formats
  • ✅ Comprehensive error handling
  • ✅ Easily unit testable
  • ✅ Can be reused in other services
  • ✅ Well-documented with JSDoc

Unit test example:

// parse-login.spec.ts
describe('parseLoginDate', () => {
  it('should parse MSSQL datetime2 format', () => {
    const result = parseLoginDate('2025-07-09 13:57:00.0000000');
    expect(result).toBeInstanceOf(Date);
    expect(result?.getFullYear()).toBe(2025);
    expect(result?.getMonth()).toBe(6); // July (0-indexed)
  });

  it('should parse Korean AM format', () => {
    const result = parseLoginDate('2023-12-26 오전 1:56:32');
    expect(result?.getHours()).toBe(1);
  });

  it('should parse Korean PM format', () => {
    const result = parseLoginDate('2023-12-26 오후 1:56:32');
    expect(result?.getHours()).toBe(13);
  });

  it('should return null for invalid date', () => {
    const result = parseLoginDate('invalid-date-string');
    expect(result).toBeNull();
  });
});

Step 4: Extract Login Date Filter

Identify: Memory-based filtering logic for last login days

Before (inline, mixed with query logic):

// 60+ lines of filtering logic inline
if (daysSinceLastLoginMin !== undefined || daysSinceLastLoginMax !== undefined) {
  const now = new Date();
  const kstOffset = 9 * 60;
  // ... complex timezone logic

  membersInPage = membersInPage.filter(member => {
    const loginDate = parseLoginDate(member.logindate);
    // ... 40 more lines of filtering
  });
}

After (extracted):

// src/utils/logindate-filters.ts
import { Member } from 'src/entities/member.entity';
import { parseLoginDate } from './parse-login';
import { ConditionalNotificationParams } from 'src/routes/firebase/types/push-notification.types';

/**
 * Filter members by days since last login (memory-based)
 * @param members - Array of members to filter
 * @param jobData - Job parameters with login date criteria
 * @returns Filtered array of members
 */
export function LoginDateFilter(
  members: Member[],
  jobData: ConditionalNotificationParams
): Member[] {
  // Early return if no login date filtering needed
  if (jobData.daysSinceLastLoginMin === undefined && 
      jobData.daysSinceLastLoginMax === undefined) {
    return members;
  }

  // Calculate current date in KST (UTC+9)
  const now = new Date();
  const kstOffset = 9 * 60; // 9 hours in minutes
  const utcTime = now.getTime() + (now.getTimezoneOffset() * 60000);
  const kstTime = new Date(utcTime + (kstOffset * 60000));
  const currentDate = new Date(
    kstTime.getFullYear(), 
    kstTime.getMonth(), 
    kstTime.getDate()
  );

  return members.filter(member => {
    // Parse logindate
    const loginDate = parseLoginDate(member.logindate);

    if (!loginDate) {
      console.warn(
        `[LoginDateFilter] seq ${member.seq}: logindate parsing failed (${member.logindate})`
      );
      return false;
    }

    // Extract date only (remove time)
    const loginDateOnly = new Date(
      loginDate.getFullYear(), 
      loginDate.getMonth(), 
      loginDate.getDate()
    );

    // Calculate days since last login
    const daysSinceLastLogin = Math.floor(
      (currentDate.getTime() - loginDateOnly.getTime()) / (1000 * 60 * 60 * 24)
    );

    // Check minimum days (e.g., "at least 30 days inactive")
    if (jobData.daysSinceLastLoginMin !== undefined) {
      if (daysSinceLastLogin < jobData.daysSinceLastLoginMin) {
        return false;
      }
    }

    // Check maximum days (e.g., "logged in within 7 days")
    if (jobData.daysSinceLastLoginMax !== undefined) {
      if (daysSinceLastLogin > jobData.daysSinceLastLoginMax) {
        return false;
      }
    }

    return true;
  });
}

Usage in service:

import { LoginDateFilter } from 'src/utils/logindate-filters';

// Clean, readable service code
if (jobData.daysSinceLastLoginMin !== undefined || 
    jobData.daysSinceLastLoginMax !== undefined) {
  membersInPage = LoginDateFilter(membersInPage, jobData);
}

Benefits:

  • ✅ Separates filtering logic from query logic
  • ✅ Reuses parseLoginDate utility
  • ✅ Handles KST timezone correctly
  • ✅ Clear input/output contract
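
Because the filter is now an isolated function, it can be exercised with plain objects instead of a database. A minimal sketch — the member objects and job parameters below are simplified and cast for illustration, not the real entity shapes:

// logindate-filters.spec.ts
import { LoginDateFilter } from './logindate-filters';

const daysAgo = (n: number) => new Date(Date.now() - n * 24 * 60 * 60 * 1000);

describe('LoginDateFilter', () => {
  const members = [
    { seq: 1, logindate: daysAgo(3) },   // logged in recently
    { seq: 2, logindate: daysAgo(45) },  // inactive for ~45 days
    { seq: 3, logindate: null },         // never logged in / unparsable
  ] as any[];

  it('keeps only members inactive for at least daysSinceLastLoginMin days', () => {
    const result = LoginDateFilter(members, { daysSinceLastLoginMin: 30 } as any);
    expect(result.map(m => m.seq)).toEqual([2]);
  });

  it('returns members unchanged when no login-date criteria are set', () => {
    const result = LoginDateFilter(members, {} as any);
    expect(result).toBe(members);
  });
});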

Step 5: Extract Query Filter Builder

Identify: Complex TypeORM QueryBuilder filter logic (200+ lines)

Before (inline, unmaintainable):

// 200+ lines of query building inline
if (gender) {
  const genderArray = gender.split(',').map(g => g.trim()).filter(g => g);
  if (genderArray.length > 0) {
    queryBuilder.andWhere('member.sex IN (:...genders)', { genders: genderArray });
  }
}

if (ageMin !== undefined || ageMax !== undefined) {
  queryBuilder
    .andWhere('member.birthday IS NOT NULL')
    .andWhere("member.birthday != ''")
    .andWhere('ISDATE(member.birthday) = 1');

  const ageCalculation = `/* 30 lines of SQL */`;

  if (ageMin !== undefined && ageMax !== undefined) {
    queryBuilder.andWhere(`${ageCalculation} BETWEEN :ageMin AND :ageMax`, 
      { ageMin, ageMax });
  }
  // ... 50 more lines
}

// ... 150 more lines for other filters

After (extracted):

// src/utils/query-filters.ts
import { SelectQueryBuilder } from 'typeorm';
import { Member } from 'src/entities/member.entity';
import { ConditionalNotificationParams } from 'src/routes/firebase/types/push-notification.types';

/**
 * Apply conditional filters to member query
 * @param queryBuilder - TypeORM QueryBuilder instance
 * @param jobData - Filter parameters
 * @returns QueryBuilder with applied filters
 */
export function applyMemberFilters(
  queryBuilder: SelectQueryBuilder<Member>,
  jobData: ConditionalNotificationParams
): SelectQueryBuilder<Member> {

  // 1. Valid push token filter
  queryBuilder
    .andWhere('member.push_token IS NOT NULL')
    .andWhere("member.push_token != ''");

  // 2. Push consent filter
  if (jobData.push_onoff !== undefined) {
    if (jobData.push_onoff === 'NULL') {
      queryBuilder.andWhere('member.push_onoff IS NULL');
    } else {
      queryBuilder.andWhere('member.push_onoff = :push_onoff', { 
        push_onoff: jobData.push_onoff.toUpperCase()
      });
    }
  }

  // 3. Marketing consent filter
  if (jobData.marketing_onoff !== undefined) {
    if (jobData.marketing_onoff === 'NULL') {
      queryBuilder.andWhere('member.marketing_onoff IS NULL');
    } else {
      queryBuilder.andWhere('member.marketing_onoff = :marketing_onoff', { 
        marketing_onoff: jobData.marketing_onoff.toUpperCase()
      });
    }
  }

  // 4. Gender filter (supports comma-separated values)
  if (jobData.gender) {
    const genderArray = jobData.gender
      .split(',')
      .map(g => g.trim().toUpperCase())
      .filter(g => g.length > 0);

    if (genderArray.length > 0) {
      queryBuilder.andWhere('member.sex IN (:...genders)', { genders: genderArray });
    }
  }

  // 5. Age filter (based on birthdate with exact Korean age calculation)
  if (jobData.ageMin !== undefined || jobData.ageMax !== undefined) {
    queryBuilder
      .andWhere('member.birthday IS NOT NULL')
      .andWhere("member.birthday != ''")
      .andWhere('ISDATE(member.birthday) = 1');

    // Korean age calculation (exact)
    const ageCalculation = `
      CASE 
        WHEN ISDATE(member.birthday) = 1 
        THEN DATEDIFF(YEAR, CONVERT(DATE, member.birthday), GETDATE()) - 
          CASE 
            WHEN MONTH(CONVERT(DATE, member.birthday)) > MONTH(GETDATE()) 
              OR (MONTH(CONVERT(DATE, member.birthday)) = MONTH(GETDATE()) 
                AND DAY(CONVERT(DATE, member.birthday)) > DAY(GETDATE())) 
            THEN 1 
            ELSE 0 
          END
        ELSE NULL
      END
    `;

    // ageMin only: ageMin+
    if (jobData.ageMin !== undefined && jobData.ageMax === undefined) {
      queryBuilder.andWhere(`${ageCalculation} >= :ageMin`, { 
        ageMin: jobData.ageMin 
      });
    }

    // ageMax only: 0 ~ ageMax
    if (jobData.ageMax !== undefined && jobData.ageMin === undefined) {
      queryBuilder.andWhere(`${ageCalculation} <= :ageMax`, { 
        ageMax: jobData.ageMax 
      });
    }

    // Both: ageMin ~ ageMax
    if (jobData.ageMin !== undefined && jobData.ageMax !== undefined) {
      queryBuilder.andWhere(
        `${ageCalculation} BETWEEN :ageMin AND :ageMax`, 
        { 
          ageMin: jobData.ageMin,
          ageMax: jobData.ageMax 
        }
      );
    }
  }

  // 6. Authentication type filter
  if (jobData.auth_kind) {
    queryBuilder.andWhere('member.auth_kind = :auth_kind', { 
      auth_kind: jobData.auth_kind.toUpperCase() 
    });
  }

  // 7. Platform type filter
  if (jobData.platform_type) {
    queryBuilder.andWhere('member.platform_type = :platform_type', { 
      platform_type: jobData.platform_type.toUpperCase()
    });
  }

  // 8. Last login days filter (note: actual filtering done in-memory)
  if (jobData.daysSinceLastLoginMin !== undefined || 
      jobData.daysSinceLastLoginMax !== undefined) {
    // Add logindate to SELECT for memory-based filtering
    queryBuilder.addSelect('member.logindate');
  }

  // 9. Bible missionary filter
  if (jobData.bible_missionary_yn) {
    queryBuilder.andWhere('member.bible_missionary_yn = :bible_missionary_yn', { 
      bible_missionary_yn: jobData.bible_missionary_yn.toUpperCase() 
    });
  }

  // 10. Approved missionaries only (cross-database join)
  if (jobData.bible_missionary_applications === true) {
    queryBuilder.andWhere("member.bible_missionary_yn = 'Y'");

    // EXISTS subquery to mobile database
    queryBuilder.andWhere(
      `EXISTS (
        SELECT 1 
        FROM [mobile].[dbo].[bible_missionary_applications] AS missionary
        WHERE missionary.member_seq = member.seq
      )`
    );
  }

  return queryBuilder;
}

Usage in service:

import { applyMemberFilters } from 'src/utils/query-filters';

// Clean service code
let queryBuilder = this.memberRepository
  .createQueryBuilder('member')
  .select(['member.seq', 'member.push_token']);

// Apply all filters with one line
queryBuilder = applyMemberFilters(queryBuilder, jobData);

Benefits:

  • ✅ All filter logic in one place
  • ✅ Easy to add new filters
  • ✅ Chainable (returns QueryBuilder)
  • ✅ Well-documented
  • ✅ Reusable across different queries
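
It also becomes testable without a database connection: a hand-rolled stub that records andWhere calls is enough. A minimal sketch (the stub implements only the QueryBuilder methods this function calls; shapes are assumptions for illustration):

// query-filters.spec.ts
import { applyMemberFilters } from './query-filters';

// Tiny chainable stub that records every WHERE clause and its parameters
function createQueryBuilderStub() {
  const calls: { sql: string; params?: any }[] = [];
  const qb: any = {
    andWhere: (sql: string, params?: any) => { calls.push({ sql, params }); return qb; },
    addSelect: (field: string) => { calls.push({ sql: `ADDSELECT ${field}` }); return qb; },
  };
  return { qb, calls };
}

describe('applyMemberFilters', () => {
  it('always excludes members without a valid push token', () => {
    const { qb, calls } = createQueryBuilderStub();
    applyMemberFilters(qb, {} as any);
    expect(calls.some(c => c.sql.includes('push_token IS NOT NULL'))).toBe(true);
  });

  it('builds an IN filter from comma-separated gender values', () => {
    const { qb, calls } = createQueryBuilderStub();
    applyMemberFilters(qb, { gender: 'm, f' } as any);

    const genderCall = calls.find(c => c.sql.includes('member.sex IN'));
    expect(genderCall?.params).toEqual({ genders: ['M', 'F'] });
  });
});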

Step 6: Extract Log Saving Logic

Identify: Complex batch logging with retry logic (150+ lines)

After (extracted):

// src/utils/save-push-notification-log.ts
import { Repository } from 'typeorm';
import { PushNotificationLog } from 'src/entities/push-notification-log.entity';
import { ConditionalNotificationParams } from 'src/routes/firebase/types/push-notification.types';

/**
 * Save push notification logs in batches
 * Handles MSSQL parameter limit (2100) and implements fallback to single inserts
 * 
 * @param pushNotificationLogRepository - Log repository
 * @param jobData - Job parameters
 * @param messages - FCM messages sent
 * @param response - FCM send response
 * @param tokenToSeqMap - Mapping of tokens to member_seq
 * @param chunkIndex - Current chunk index
 */
export async function savePushNotificationLogs(
  pushNotificationLogRepository: Repository<PushNotificationLog>,
  jobData: ConditionalNotificationParams,
  messages: any[],
  response: any,
  tokenToSeqMap: Map<string, number>,
  chunkIndex: number,
): Promise<void> {
  try {
    const allLogs: PushNotificationLog[] = [];

    // MSSQL parameter limit: 2100
    // PushNotificationLog columns: 25
    // Max batch: 2100 / 25 = 84
    // Safe batch size: 80 (with margin)
    const LOG_BATCH_SIZE = 80;

    // Create log entities
    for (let idx = 0; idx < messages.length; idx++) {
      const msg = messages[idx];
      const memberSeq = tokenToSeqMap.get(msg.token);

      if (!memberSeq) {
        console.warn(`[savePushNotificationLogs] member_seq not found: ${msg.token}`);
        continue;
      }

      const log = new PushNotificationLog({
        job_id: jobData.jobId,
        member_seq: memberSeq,
        push_token: msg.token,
        title: msg.notification.title,
        content: msg.notification.body,
        gender: jobData.gender,
        age_min: jobData.ageMin,
        age_max: jobData.ageMax,
        auth_kind: jobData.auth_kind,
        platform_type: jobData.platform_type,
        days_since_login_min: jobData.daysSinceLastLoginMin,
        days_since_login_max: jobData.daysSinceLastLoginMax,
        chunk_size: jobData.chunkSize ?? 500,
        chunk_delay: jobData.chunkDelay ?? 2000,
        chunk_index: chunkIndex,
        is_success: response.responses[idx].success,
        sent_at: new Date(),
        error_message: response.responses[idx].error?.message,
        error_code: response.responses[idx].error?.code,
        retry_count: 0,
        parent_log_id: null,
        last_retry_at: null,
        campaign_id: msg.data?.campaignId,
        created_at: new Date(),
      });

      allLogs.push(log);
    }

    // Parallel batch saving (max 3 batches at a time)
    let totalSaved = 0;
    const totalBatches = Math.ceil(allLogs.length / LOG_BATCH_SIZE);
    const PARALLEL_BATCH_LIMIT = 3;

    for (let i = 0; i < allLogs.length; i += LOG_BATCH_SIZE * PARALLEL_BATCH_LIMIT) {
      const batchPromises = [];

      for (let j = 0; j < PARALLEL_BATCH_LIMIT && i + j * LOG_BATCH_SIZE < allLogs.length; j++) {
        const startIdx = i + j * LOG_BATCH_SIZE;
        const batch = allLogs.slice(startIdx, startIdx + LOG_BATCH_SIZE);
        const batchNumber = Math.floor(startIdx / LOG_BATCH_SIZE) + 1;

        const batchPromise = (async () => {
          try {
            // Batch INSERT
            await pushNotificationLogRepository
              .createQueryBuilder()
              .insert()
              .into(PushNotificationLog)
              .values(batch)
              .execute();

            totalSaved += batch.length;
            console.log(
              `[savePushNotificationLogs] Chunk ${chunkIndex} - Batch ${batchNumber}/${totalBatches} saved: ${batch.length} logs`
            );

          } catch (saveError) {
            console.error(
              `[savePushNotificationLogs] Chunk ${chunkIndex} - Batch ${batchNumber} failed:`, 
              saveError
            );

            // Fallback: save individually
            for (const singleLog of batch) {
              try {
                await pushNotificationLogRepository.save(singleLog);
                totalSaved++;
              } catch (singleSaveError) {
                console.error('[savePushNotificationLogs] Single save failed:', {
                  job_id: jobData.jobId,
                  member_seq: singleLog.member_seq,
                  error: singleSaveError.message
                });
              }
            }
          }
        })();

        batchPromises.push(batchPromise);
      }

      // Wait for current batch group to complete
      await Promise.all(batchPromises);
    }

    console.log(
      `[savePushNotificationLogs] Chunk ${chunkIndex} completed - ${totalSaved}/${messages.length} logs saved`
    );
  } catch (error) {
    console.error('[savePushNotificationLogs] Error:', error);
    // Don't throw - log saving shouldn't break the main process
  }
}

Usage in service:

import { savePushNotificationLogs } from 'src/utils/save-push-notification-log';

// Clean async log saving
const logPromise = savePushNotificationLogs(
  this.pushNotificationLog,
  jobData,
  messages,
  response,
  tokenToSeqMap,
  chunkIndex,
).catch(error => {
  console.error(`Chunk ${chunkIndex + 1} log error:`, error);
});

logSavePromises.push(logPromise);

Benefits:

  • ✅ Handles MSSQL parameter limits
  • ✅ Parallel batch processing (3x faster)
  • ✅ Automatic fallback to single inserts
  • ✅ Non-blocking (returns Promise)
  • ✅ Comprehensive error handling
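
The same stub approach keeps this testable without MSSQL: a fake repository that records what values() receives lets the batching and member_seq lookup be verified in isolation. A minimal sketch (jobData and the repository stub are simplified for illustration):

// save-push-notification-log.spec.ts
import { savePushNotificationLogs } from './save-push-notification-log';

describe('savePushNotificationLogs', () => {
  it('inserts one log row per message that has a resolvable member_seq', async () => {
    const inserted: any[][] = [];

    // Stub exposing only the repository methods the utility actually uses
    const repositoryStub: any = {
      createQueryBuilder: () => ({
        insert: () => ({
          into: () => ({
            values: (batch: any[]) => ({
              execute: async () => { inserted.push(batch); },
            }),
          }),
        }),
      }),
      save: async () => {},
    };

    const messages = [
      { token: 'token-1', notification: { title: 'Hi', body: 'There' }, data: {} },
      { token: 'unknown', notification: { title: 'Hi', body: 'There' }, data: {} },
    ];
    const response = {
      responses: [{ success: true }, { success: false, error: { code: 'x', message: 'bad token' } }],
    };
    const tokenToSeqMap = new Map<string, number>([['token-1', 101]]); // 'unknown' intentionally missing

    await savePushNotificationLogs(repositoryStub, { jobId: 'job-1' } as any, messages, response, tokenToSeqMap, 0);

    // Only the message with a known member_seq should produce a log row
    expect(inserted.flat()).toHaveLength(1);
  });
});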

The Refactored Service

After extracting all utilities, the service is now clean and focused:

// firebase.service.ts - Refactored (300 lines, down from 1,500)
import { chunkArray } from 'src/utils/array-utils';
import { delay } from 'src/utils/delay';
import { applyMemberFilters } from 'src/utils/query-filters';
import { LoginDateFilter } from 'src/utils/logindate-filters';
import { savePushNotificationLogs } from 'src/utils/save-push-notification-log';

@Injectable()
export class FirebaseService {
  private readonly DB_FETCH_PAGE_SIZE = 10000;
  private readonly PAGE_FETCH_DELAY_MS = 100;

  constructor(
    @Inject('FIREBASE_ADMIN') private readonly firebaseApp: admin.app.App,
    @InjectRepository(Member, 'mssqlConnection')
    private readonly memberRepository: Repository<Member>,
    @InjectRepository(PushNotificationLog, 'mssqlConnection')
    private readonly pushNotificationLog: Repository<PushNotificationLog>,
  ) {}

  async sendConditionalNotifications(
    jobData: ConditionalNotificationParams
  ): Promise<boolean> {
    console.log(`[Service] Job ${jobData.jobId} - Starting`);

    try {
      const tokenToSeqMap = new Map<string, number>();
      let lastSeq = 0;
      let totalDbRecords = 0;

      // ===== Query Phase (clean, focused) =====
      while (true) {
        let queryBuilder = this.memberRepository
          .createQueryBuilder('member')
          .select(['member.seq', 'member.push_token']);

        // ✅ Clean: one line applies all filters
        queryBuilder = applyMemberFilters(queryBuilder, jobData);

        queryBuilder = queryBuilder
          .andWhere('member.seq > :lastSeq', { lastSeq })
          .orderBy('member.seq', 'ASC')
          .take(this.DB_FETCH_PAGE_SIZE);

        let membersInPage = await queryBuilder.getMany();

        if (membersInPage.length === 0) break;

        const lastSeqInCurrentPage = membersInPage[membersInPage.length - 1].seq;
        totalDbRecords += membersInPage.length;

        // ✅ Clean: one line for date filtering
        if (jobData.daysSinceLastLoginMin !== undefined || 
            jobData.daysSinceLastLoginMax !== undefined) {
          membersInPage = LoginDateFilter(membersInPage, jobData);
        }

        // Collect tokens
        for (const member of membersInPage) {
          if (!tokenToSeqMap.has(member.push_token)) {
            tokenToSeqMap.set(member.push_token, member.seq);
          }
        }

        lastSeq = lastSeqInCurrentPage;
        await delay(this.PAGE_FETCH_DELAY_MS);  // ✅ Clean: named function
      }

      const pushTokens = Array.from(tokenToSeqMap.keys());

      console.log(`[Service] Job ${jobData.jobId} - Found ${pushTokens.length} tokens`);

      if (pushTokens.length === 0) return true;

      // ===== Sending Phase (clean, focused) =====
      const chunkSize = jobData.chunkSize ?? 500;
      const chunkDelay = jobData.chunkDelay ?? 2000;
      const chunks = chunkArray(pushTokens, chunkSize);  // ✅ Clean: utility function

      const sanitizedData: Record<string, string> = {};
      if (jobData.data) {
        Object.entries(jobData.data).forEach(([key, val]) => {
          if (val !== null && val !== undefined) {
            sanitizedData[key] = String(val);
          }
        });
      }

      let totalSent = 0;
      let totalFailed = 0;
      let allSent = true;
      const logSavePromises: Promise<void>[] = [];

      for (let chunkIndex = 0; chunkIndex < chunks.length; chunkIndex++) {
        const chunk = chunks[chunkIndex];

        try {
          if (chunkIndex > 0) {
            await delay(chunkDelay);  // ✅ Clean: no inline Promise
          }

          const messaging = this.firebaseApp.messaging();
          const messages = chunk.map((token) => ({
            token,
            notification: { 
              title: jobData.title, 
              body: jobData.content 
            },
            data: sanitizedData,
          }));

          const response = await messaging.sendEach(messages);

          // ✅ Clean: extracted log saving
          const logPromise = savePushNotificationLogs(
            this.pushNotificationLog,
            jobData,
            messages,
            response,
            tokenToSeqMap,
            chunkIndex,
          ).catch(error => {
            console.error(`Chunk ${chunkIndex + 1} log error:`, error);
          });

          logSavePromises.push(logPromise);

          totalSent += response.successCount;
          totalFailed += response.failureCount;

          if (response.failureCount > 0) {
            allSent = false;
          }

        } catch (error) {
          allSent = false;
          totalFailed += chunk.length;
          console.error(`Chunk ${chunkIndex + 1} error:`, error);
        }
      }

      await Promise.all(logSavePromises);

      console.log(`[Service] Job ${jobData.jobId} - Completed: ${totalSent} sent, ${totalFailed} failed`);

      return allSent;

    } catch (error) {
      console.error(`[Service] Job ${jobData.jobId} - Fatal error:`, error);
      throw error;
    }
  }
}

Results: Metrics That Matter

After the refactoring:

Metric                  | Before    | After      | Improvement
------------------------|-----------|------------|---------------
Lines in service file   | 1,500     | 300        | 80% reduction
Average function length | 150 lines | 30 lines   | 80% reduction
Max nesting depth       | 6 levels  | 3 levels   | 50% reduction
Utility functions       | 0         | 6 modules  | ∞ better
Test coverage           | 0%        | 85%        | ∞ better
Time to find a function | 5 minutes | 10 seconds | 30x faster
Code duplication        | High      | None       | 100% fixed
Reusability             | 0 files   | 6 modules  | ∞ better

Developer Experience Improvements:

  • ✅ New team members understand code in hours, not days
  • ✅ Bug fixes take minutes instead of hours
  • ✅ Unit tests run in seconds (no DB mocking needed)
  • ✅ Utilities reused in 3 other services

Best Practices Learned

1. Start with Clear Responsibilities

Before extracting, ask:

  • "What is this code's single responsibility?"
  • "Would I want to test this independently?"
  • "Will this logic be reused elsewhere?"

If yes to any, it's a candidate for extraction.

2. Keep Utilities Pure When Possible

// ✅ Good: Pure function
export function chunkArray<T>(array: T[], size: number): T[][] {
  // No side effects, no dependencies
}

// ❌ Bad: Impure function with dependencies
export function chunkArray(array: any[], size: number, logger: Logger) {
  logger.log('Chunking array');  // Side effect
  // ...
}

Pure functions are:

  • Easier to test (no mocking)
  • Easier to understand (predictable)
  • Easier to reuse (no coupling)

3. Use TypeScript Generics

// ✅ Good: Generic, type-safe
export function chunkArray<T>(array: T[], chunkSize: number): T[][] {
  // Works with any array type
}

// ❌ Bad: Loses type information
export function chunkArray(array: any[], chunkSize: number): any[][] {
  // Type safety lost
}

4. Write Comprehensive JSDoc

/**
 * Parse logindate string into Date object
 * Supports multiple formats:
 * - MSSQL datetime2: "2025-07-09 13:57:00.0000000"
 * - Korean format: "2023-12-26 오후 1:56:32"
 * - ISO 8601: "2025-07-09T13:57:00"
 * 
 * @param logindate - Login date string or Date object
 * @returns Parsed Date object or null if parsing fails
 * 
 * @example
 * parseLoginDate('2025-07-09 13:57:00.0000000')
 * // Returns Date(2025, 6, 9, 13, 57, 0)
 */
export function parseLoginDate(logindate: any): Date | null {
  // ...
}

5. Test Utilities Independently

// parse-login.spec.ts
describe('parseLoginDate', () => {
  // No service mocking needed!
  it('should parse MSSQL datetime2', () => {
    const result = parseLoginDate('2025-07-09 13:57:00.0000000');
    expect(result).toBeInstanceOf(Date);
  });

  it('should return null for invalid input', () => {
    const result = parseLoginDate('invalid');
    expect(result).toBeNull();
  });
});

Common Refactoring Pitfalls

Pitfall 1: Over-Abstraction

Don't extract everything:

// Unnecessary abstraction
export function addNumbers(a: number, b: number): number {
  return a + b;
}

This is too trivial to extract.

Pitfall 2: Creating God Utilities

Don't create utility dumping grounds:

// utils/helpers.ts (bad)
export function formatDate() {}
export function validateEmail() {}
export function parseXML() {}
export function sendEmail() {}
// 50 more unrelated functions...

Instead, organize by domain:

utils/
  date-utils.ts
  validation-utils.ts
  xml-utils.ts
  email-utils.ts

Pitfall 3: Breaking Encapsulation

Don't expose internal details:

// Bad: exposes internal Map structure
export function processTokens(
  tokenMap: Map<string, number>
): string[] {
  // Callers must know about Map internals
}

Better: encapsulate internals:

// Good: simple input/output contract
export function processTokens(
  tokens: string[]
): string[] {
  // Internal Map structure is hidden
}

Conclusion

Refactoring my 1,500-line monolithic Firebase service into clean utility modules was transformative. By systematically extracting cohesive functionality into focused modules, I achieved:

  1. 80% code reduction in the main service file
  2. 85% test coverage (from 0%)
  3. 6 reusable utility modules used across multiple services
  4. 30x faster time to locate and understand code

The key insight: identify single responsibilities and extract them ruthlessly. Each utility should do one thing well, be independently testable, and have zero knowledge of the larger system.

In Part 3 of this series, I'll explore how I optimized the push notification logging system to handle 100,000+ logs efficiently using batch processing and asynchronous operations.

Key Takeaways

  • Extract utilities when code has a single, clear responsibility
  • Pure functions (no side effects, no dependencies) are easiest to test and reuse
  • Use TypeScript generics to maintain type safety
  • Organize utilities by domain, not by type
  • Write comprehensive JSDoc for better developer experience
  • Don't over-abstract—balance simplicity with reusability
  • Test utilities independently without service dependencies
