Hey team, Integra here. Let me say something that may ruffle a few feathers: your mock-centric test suite is giving you a false sense of security. Sure, mocks are fast, predictable, and easy to set up. But they lie about how your system actually behaves in production.
I've spent years watching well-tested applications fall apart in production because their integration points were only ever verified against fantasy mocks. That's what made me a strong advocate of testing against real services -- not out of purism, but pragmatism. I want tests that fail when it actually matters.
This guide lays out a systematic approach to integration testing with real services: tests where database queries run for real, API calls hit real endpoints, message queues deliver real messages -- tests that can genuinely fail. I'll cover environment setup, credential management, cleanup strategies, and how to reach 90-95% coverage without your CI/CD pipeline lying to you.
Why real services beat mocks (most of the time)
First, the elephant in the room. The testing pyramid Mike Cohn introduced in 2009 has steered generations of developers toward many unit tests and fewer integration tests. That's still sound advice. Where teams go wrong is in believing it's efficient to replace dependencies with mocks in all of their integration tests.
The problem with mock-first testing
When you mock your database, you're testing the mock, not the database. When you mock your HTTP client, you're verifying that fetch() was called correctly -- not that the remote API actually returns the data your code expects.
What mocks fail to teach you:
- Schema mismatches: the mock returns user.firstName, but the API actually sends user.first_name
- Network failures: timeouts, connection resets, DNS failures -- invisible in mock land
- Database constraints: the mock happily accepts a duplicate email; PostgreSQL throws a unique-constraint violation
- Auth flows: OAuth tokens expire, refresh tokens fail, API keys get rate-limited
- Serialization issues: that JavaScript Date object doesn't serialize the way you think it does
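The first bullet is worth seeing concretely. A minimal sketch (all names illustrative, not from any specific API): the mock returns the camelCase shape the code expects, so a unit test passes, while the real API sends snake_case -- drift that only a call against the real response reveals.

```typescript
interface User {
  firstName: string;
  lastName: string;
}

// What the mock returns (matches the code's expectations)
const mockedResponse = { firstName: 'Ada', lastName: 'Lovelace' };

// What the real API actually sends
const realResponse = { first_name: 'Ada', last_name: 'Lovelace' };

// A defensive parser that fails loudly on drift instead of silently
// propagating `undefined` through the application
function parseUser(raw: unknown): User {
  const obj = raw as Record<string, unknown>;
  if (typeof obj.firstName !== 'string' || typeof obj.lastName !== 'string') {
    throw new Error(`Unexpected user shape: ${Object.keys(obj).join(', ')}`);
  }
  return { firstName: obj.firstName, lastName: obj.lastName };
}

parseUser(mockedResponse); // passes -- the mock agrees with the code
try {
  parseUser(realResponse); // throws -- the drift a mock never surfaces
} catch (e) {
  console.log((e as Error).message);
}
```

A test against the mock alone would never exercise the failing branch; a test against the real response does.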
As Philipp Hauer put it bluntly in his 2019 article: integration tests exercise all classes and layers together, just like production, which makes it far more likely that bugs in the interplay between classes are detected -- and makes the tests far more meaningful.
When mocks are appropriate
I'm not a zealot. There are legitimate places for mocks in integration testing:
- Testing failure scenarios: network simulators like Toxiproxy can inject latency and failures in a controlled way
- Third-party services you can't control: if you integrate with Stripe, you want test mode, not real charges
- Slow or expensive operations: if training an ML model takes five minutes, mock the inference in most tests
- Isolating a specific component: to test how service A behaves when service B fails, mock B's response
The key principle: mock at the boundaries, integrate everything in between.
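To make "mock at the boundary" concrete, here is a sketch (names are illustrative, not from the codebase discussed later): the service under test runs its real logic, and only the uncontrollable third party sits behind an interface that the test replaces.

```typescript
// The external boundary: the only thing the test fakes
interface PaymentGateway {
  charge(amountCents: number): Promise<{ id: string; status: 'succeeded' | 'failed' }>;
}

class CheckoutFlow {
  constructor(private gateway: PaymentGateway) {}

  // Real business logic: validation and totaling run unmocked.
  // In a full integration test, persistence would too.
  async checkout(items: { price: number; qty: number }[]) {
    const total = items.reduce((sum, i) => sum + i.price * i.qty, 0);
    if (total <= 0) throw new Error('Empty cart');
    const payment = await this.gateway.charge(total);
    return { total, paymentId: payment.id, status: payment.status };
  }
}

// The fake lives only at the boundary -- everything in between is real
const fakeGateway: PaymentGateway = {
  async charge(amount) {
    return { id: `pay_${amount}`, status: 'succeeded' };
  },
};

const flow = new CheckoutFlow(fakeGateway);
flow.checkout([{ price: 1999, qty: 2 }]).then((r) => console.log(r.total)); // 3998
```

The seam is an interface, so swapping in the real Stripe-backed implementation for deeper integration tests requires no changes to CheckoutFlow itself.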
Setting up a test environment that doesn't lie
A test environment that mirrors production is essential for real-service testing. But "mirrors production" doesn't mean replicating your entire AWS infrastructure. It means running the same types of services with the same interfaces.
The container revolution
Thanks to Docker and Testcontainers, you can spin up real databases, message queues, and even complex services in seconds. A modern test environment looks like this:
// testSetup.ts - Environment bootstrapping
import { GenericContainer, StartedTestContainer } from 'testcontainers';
import { Pool } from 'pg';
import Redis from 'ioredis';
import { readFile } from 'fs/promises';
export class TestEnvironment {
private postgresContainer: StartedTestContainer;
private redisContainer: StartedTestContainer;
private dbPool: Pool;
private redisClient: Redis;
async setup(): Promise<void> {
// Start PostgreSQL with exact production version
this.postgresContainer = await new GenericContainer('postgres:15-alpine')
.withEnvironment({
POSTGRES_USER: 'testuser',
POSTGRES_PASSWORD: 'testpass',
POSTGRES_DB: 'testdb',
})
.withExposedPorts(5432)
.start();
// Start Redis with production configuration
this.redisContainer = await new GenericContainer('redis:7-alpine')
.withExposedPorts(6379)
.start();
// Initialize real clients
const pgPort = this.postgresContainer.getMappedPort(5432);
this.dbPool = new Pool({
host: 'localhost',
port: pgPort,
user: 'testuser',
password: 'testpass',
database: 'testdb',
});
const redisPort = this.redisContainer.getMappedPort(6379);
this.redisClient = new Redis({ host: 'localhost', port: redisPort });
// Run migrations on real database
await this.runMigrations();
}
async cleanup(): Promise<void> {
await this.dbPool.end();
await this.redisClient.quit();
await this.postgresContainer.stop();
await this.redisContainer.stop();
}
getDbPool(): Pool {
return this.dbPool;
}
getRedisClient(): Redis {
return this.redisClient;
}
private async runMigrations(): Promise<void> {
// Run your actual migration scripts
// This ensures test DB schema matches production
const migrationSQL = await readFile('./migrations/001_initial.sql', 'utf-8');
await this.dbPool.query(migrationSQL);
}
}
Key insight: note that we're using the exact same PostgreSQL version as production. Version mismatches are a common source of "works on my machine" bugs.
Environment configuration strategy
Your test environment needs different settings from production, but the same structure. The recommended pattern:
// config/test.ts
export const testConfig = {
database: {
// Provided by Testcontainers at runtime
host: process.env.TEST_DB_HOST || 'localhost',
port: parseInt(process.env.TEST_DB_PORT || '5432'),
// Safe credentials for testing
user: 'testuser',
password: 'testpass',
},
externalAPIs: {
// Use sandbox/test modes of real services
stripe: {
apiKey: process.env.STRIPE_TEST_KEY, // sk_test_...
webhookSecret: process.env.STRIPE_TEST_WEBHOOK_SECRET,
},
sendgrid: {
apiKey: process.env.SENDGRID_TEST_KEY,
// Use SendGrid's sandbox mode
sandboxMode: true,
},
},
// Feature flags for test scenarios
features: {
enableRateLimiting: true, // Test rate limits!
enableCaching: true, // Test cache invalidation!
enableRetries: true, // Test retry logic!
},
};
Managing API credentials: the right way
This is where many teams stumble. They either hardcode test API keys into the codebase or, even worse, use production keys in tests. Both are security nightmares.
The secret management hierarchy
- Local development: use a .env.test file containing test credentials (gitignored!)
- CI/CD pipelines: store secrets in your CI provider's tooling (GitHub Secrets, GitLab CI/CD variables, etc.)
- Shared test environments: use a dedicated secrets manager (AWS Secrets Manager, HashiCorp Vault)
A robust credential-loading pattern looks like this:
// lib/testCredentials.ts
import { config } from 'dotenv';
export class TestCredentialManager {
private credentials: Map<string, string> = new Map();
constructor() {
// Load from .env.test if present (local dev)
config({ path: '.env.test' });
// Override with CI environment variables if present
this.loadFromEnvironment();
// Validate required credentials
this.validate();
}
private loadFromEnvironment(): void {
const requiredCreds = [
'STRIPE_TEST_KEY',
'SENDGRID_TEST_KEY',
'AWS_TEST_ACCESS_KEY',
'AWS_TEST_SECRET_KEY',
];
requiredCreds.forEach((key) => {
const value = process.env[key];
if (value) {
this.credentials.set(key, value);
}
});
}
private validate(): void {
const missing: string[] = [];
// Check for essential credentials
if (!this.credentials.has('STRIPE_TEST_KEY')) {
missing.push('STRIPE_TEST_KEY');
}
if (missing.length > 0) {
console.warn(
`â ïž Missing test credentials: ${missing.join(', ')}\n` +
`Some integration tests will be skipped.\n` +
`See README.md for credential setup instructions.`
);
}
}
get(key: string): string | undefined {
return this.credentials.get(key);
}
has(key: string): boolean {
return this.credentials.has(key);
}
// Fail gracefully when credentials are missing
requireOrSkip(key: string, testFn: () => void): void {
if (!this.has(key)) {
console.log(`âïž Skipping test - missing ${key}`);
return;
}
testFn();
}
}
// Usage in tests
const credManager = new TestCredentialManager();
describe('Stripe Payment Integration', () => {
it('should process payment with real Stripe API', async () => {
await credManager.requireOrSkip('STRIPE_TEST_KEY', async () => {
const stripe = new Stripe(credManager.get('STRIPE_TEST_KEY')!);
const paymentIntent = await stripe.paymentIntents.create({
amount: 1000,
currency: 'usd',
payment_method_types: ['card'],
});
expect(paymentIntent.status).toBe('requires_payment_method');
});
});
});
Key principle: when credentials are missing, tests should degrade gracefully rather than crash the whole suite. Developers can run a partial suite locally while CI runs the complete matrix.
CI/CD integration pattern
In a GitHub Actions workflow:
# .github/workflows/test.yml
name: Integration Tests
on: [push, pull_request]
jobs:
integration-tests:
runs-on: ubuntu-latest
env:
# Inject secrets from GitHub Secrets
STRIPE_TEST_KEY: ${{ secrets.STRIPE_TEST_KEY }}
SENDGRID_TEST_KEY: ${{ secrets.SENDGRID_TEST_KEY }}
AWS_TEST_ACCESS_KEY: ${{ secrets.AWS_TEST_ACCESS_KEY }}
AWS_TEST_SECRET_KEY: ${{ secrets.AWS_TEST_SECRET_KEY }}
steps:
- uses: actions/checkout@v3
- name: Setup Node.js
uses: actions/setup-node@v3
with:
node-version: '18'
- name: Install dependencies
run: npm ci
- name: Run integration tests
run: npm run test:integration
- name: Upload coverage reports
uses: codecov/codecov-action@v3
with:
files: ./coverage/integration-coverage.json
Cleanup strategies: the idempotency imperative
Here's the hard truth: if your tests aren't idempotent, you can't trust them. An idempotent test produces the same result every run, regardless of previous runs.
The biggest threat to idempotency? Dirty state. Test A creates a user with email test@example.com; test B assumes that email is available. Test B fails, and you burn an hour before realizing test A never cleaned up.
The clean-before pattern (recommended)
Counterintuitively, cleaning up before each test is more reliable than cleaning up after:
// tests/integration/userService.test.ts
describe('UserService Integration', () => {
let testEnv: TestEnvironment;
let userService: UserService;
beforeAll(async () => {
testEnv = new TestEnvironment();
await testEnv.setup();
});
afterAll(async () => {
await testEnv.cleanup();
});
beforeEach(async () => {
// CLEAN BEFORE, not after
// This ensures tests start from known state
await cleanDatabase(testEnv.getDbPool());
userService = new UserService(testEnv.getDbPool());
});
it('should create user with unique email', async () => {
const user = await userService.createUser({
email: 'test@example.com',
name: 'Test User',
});
expect(user.id).toBeDefined();
expect(user.email).toBe('test@example.com');
});
it('should reject duplicate email', async () => {
await userService.createUser({
email: 'duplicate@example.com',
name: 'User One',
});
await expect(
userService.createUser({
email: 'duplicate@example.com',
name: 'User Two',
})
).rejects.toThrow('Email already exists');
});
});
async function cleanDatabase(pool: Pool): Promise<void> {
// Truncate tables in correct order (respecting foreign keys)
await pool.query('TRUNCATE users, orders, payments CASCADE');
}
Why clean before? If a test crashes mid-run, its after-cleanup never executes, the database stays dirty, and the next run fails mysteriously. With before-cleanup, every test starts from a known state.
The try-finally pattern for external services
For external APIs and services that can't easily be reset, use try-finally blocks:
it('should send email via SendGrid', async () => {
const testEmailId = `test-${Date.now()}@example.com`;
let emailSent = false;
// Arrange (declared outside try so the finally block can reach it)
const sendgrid = new SendGridClient(testConfig.sendgridApiKey);
try {
// Act
await sendgrid.send({
to: testEmailId,
from: 'noreply@example.com',
subject: 'Test Email',
text: 'This is a test',
});
emailSent = true;
// Assert
const emails = await sendgrid.searchEmails({
to: testEmailId,
limit: 1,
});
expect(emails).toHaveLength(1);
} finally {
// Cleanup - even if test fails
if (emailSent) {
await sendgrid.deleteEmail(testEmailId);
}
}
});
Handling parallel test execution
Modern test runners execute tests in parallel for speed. Which is wonderful -- until test A deletes the user test B is querying. The solution? Data isolation:
// testDataFactory.ts
export class TestDataFactory {
private static counter = 0;
static uniqueEmail(): string {
return `test-${process.pid}-${TestDataFactory.counter++}@example.com`;
}
static uniqueUserId(): string {
return `user-${process.pid}-${TestDataFactory.counter++}`;
}
static async createIsolatedUser(pool: Pool): Promise<User> {
const email = TestDataFactory.uniqueEmail();
const result = await pool.query(
'INSERT INTO users (email, name) VALUES ($1, $2) RETURNING *',
[email, `Test User ${TestDataFactory.counter}`]
);
return result.rows[0];
}
}
// Usage ensures no collisions between parallel tests
it('test A with isolated data', async () => {
const user = await TestDataFactory.createIsolatedUser(pool);
// Test uses user, no other test can access this user
});
it('test B with isolated data', async () => {
const user = await TestDataFactory.createIsolatedUser(pool);
// Runs in parallel with test A, zero conflicts
});
Testing error scenarios: where real services shine
Mocks make happy-path testing easy. Real services make failure testing possible -- and failure testing is where you find the bugs that crash production.
Network failure simulation
Tools like Toxiproxy let you inject network failures into real service calls:
import { Toxiproxy } from 'toxiproxy-node-client';
describe('Payment Service - Network Resilience', () => {
let toxiproxy: Toxiproxy;
let paymentService: PaymentService;
beforeAll(async () => {
toxiproxy = new Toxiproxy('http://localhost:8474');
// Create proxy for Stripe API
await toxiproxy.createProxy({
name: 'stripe_api',
listen: '0.0.0.0:6789',
upstream: 'api.stripe.com:443',
});
});
it('should retry on network timeout', async () => {
// Inject 5-second latency
await toxiproxy.addToxic({
proxy: 'stripe_api',
type: 'latency',
attributes: { latency: 5000 },
});
const start = Date.now();
await expect(
paymentService.processPayment({ amount: 1000 })
).rejects.toThrow('Request timeout');
const duration = Date.now() - start;
// Verify retry logic kicked in (3 retries = ~15 seconds)
expect(duration).toBeGreaterThan(15000);
});
it('should handle connection reset', async () => {
// Inject connection reset
await toxiproxy.addToxic({
proxy: 'stripe_api',
type: 'reset_peer',
attributes: { timeout: 0 },
});
await expect(
paymentService.processPayment({ amount: 1000 })
).rejects.toThrow('Connection reset');
});
afterEach(async () => {
// Remove toxics between tests
await toxiproxy.removeToxic({ proxy: 'stripe_api' });
});
});
Rate limiting and throttling
Test how your system handles API rate limits:
it('should respect rate limits', async () => {
const apiClient = new ExternalAPIClient(testConfig.apiKey);
const results: Array<'success' | 'throttled'> = [];
// Hammer the API with 100 requests
const requests = Array.from({ length: 100 }, async () => {
try {
await apiClient.getData();
results.push('success');
} catch (error) {
if (error.statusCode === 429) {
results.push('throttled');
} else {
throw error;
}
}
});
await Promise.allSettled(requests);
// Verify rate limiting kicked in
expect(results.filter(r => r === 'throttled').length).toBeGreaterThan(0);
// Verify some requests succeeded (we're not completely blocked)
expect(results.filter(r => r === 'success').length).toBeGreaterThan(0);
});
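When hammering a real third-party API with 100 requests isn't acceptable, the throttling behavior itself can still be exercised locally. Here is a sketch of a token bucket -- the classic rate-limiting algorithm -- that a test like the one above can run against; the class and its parameters are illustrative, not from a specific provider.

```typescript
class TokenBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private capacity: number,        // burst size
    private refillPerSecond: number, // sustained rate
    now: number = Date.now()
  ) {
    this.tokens = capacity;
    this.lastRefill = now;
  }

  // Returns true if the request is allowed, false if throttled (think HTTP 429).
  tryAcquire(now: number = Date.now()): boolean {
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSecond);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Burst of 10, refilling 1 token/second: the 11th immediate call is throttled.
const bucket = new TokenBucket(10, 1, 0);
const results = Array.from({ length: 12 }, () => bucket.tryAcquire(0));
console.log(results.filter(Boolean).length); // 10
```

Passing the clock in explicitly (the `now` parameter) keeps the limiter deterministic in tests -- the same trick that makes the timing-sensitive assertions above reliable.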
Reaching 90-95% coverage: a pragmatic target
Let's talk numbers. 100% coverage is wasted effort -- you end up spending more time maintaining tests than shipping features. Below 80%, you're flying blind. The sweet spot? 90-95% coverage from a strategic mix of test types.
The modern test distribution
Guillermo Rauch's famous line -- "Write tests. Not too many. Mostly integration." -- plays out in practice like this:
- 50-60% unit tests: fast, focused tests of isolated business logic
- 30-40% integration tests: tests of real services and interactions between components
- 5-10% E2E tests: full-system tests of critical user journeys
Graphic suggestion 1: a revised testing pyramid showing integration tests as a strategic middle layer, with callouts for "real database", "real APIs", and "real message queues".
Where to focus coverage
Concentrate integration tests on these high-value areas:
- Auth/authorization flows: token refresh, permission checks, session management
- Data persistence: database transactions, constraint violations, migrations
- External API integration: payment processing, email delivery, third-party data
- Message queue operations: event publishing, message consumption, dead-letter handling
- Cache invalidation: when does the cache refresh? What happens on a cache miss?
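The last bullet is easy to probe even without Redis running: any read-through cache exhibits the same two behaviors worth asserting -- a hit after a miss, and a stale read when invalidation is skipped. A minimal illustrative sketch (the class is hypothetical, not from the codebase above):

```typescript
class ReadThroughCache<V> {
  private cache = new Map<string, V>();
  constructor(private loader: (key: string) => V) {}

  get(key: string): V {
    // Cache miss: load from the backing store and remember the value
    if (!this.cache.has(key)) {
      this.cache.set(key, this.loader(key));
    }
    return this.cache.get(key)!;
  }

  invalidate(key: string): void {
    this.cache.delete(key);
  }
}

// Backing "database"
const db = new Map<string, string>([['user:1', 'Ada']]);
const cache = new ReadThroughCache((k) => db.get(k) ?? 'missing');

console.log(cache.get('user:1')); // 'Ada' (miss -> loaded from db)
db.set('user:1', 'Grace');
console.log(cache.get('user:1')); // still 'Ada' -- stale without invalidation
cache.invalidate('user:1');
console.log(cache.get('user:1')); // 'Grace' after invalidation
```

The same three assertions -- miss loads, stale without invalidation, fresh after invalidation -- carry over directly to an integration test against the real Redis container from earlier.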
Measure what matters
Code coverage tools lie. They tell you which lines executed, not which behaviors were verified. Track integration coverage separately:
// package.json
{
"scripts": {
"test:unit": "jest --coverage --coverageDirectory=coverage/unit",
"test:integration": "jest --config=jest.integration.config.js --coverage --coverageDirectory=coverage/integration",
"test:coverage": "node scripts/mergeCoverage.js"
}
}
// scripts/mergeCoverage.js
const { mergeCoverageReports } = require('coverage-merge');
const unitCoverage = require('../coverage/unit/coverage-summary.json');
const integrationCoverage = require('../coverage/integration/coverage-summary.json');
const merged = mergeCoverageReports([unitCoverage, integrationCoverage]);
console.log('Combined Coverage Report:');
console.log(`Lines: ${merged.total.lines.pct}%`);
console.log(`Statements: ${merged.total.statements.pct}%`);
console.log(`Functions: ${merged.total.functions.pct}%`);
console.log(`Branches: ${merged.total.branches.pct}%`);
// Fail if below threshold
if (merged.total.lines.pct < 90) {
console.error('â Coverage below 90% threshold');
process.exit(1);
}
Graphic suggestion 2: a mockup of a coverage dashboard showing the unit vs. integration breakdown per module, with integration tests highlighted on high-risk areas (database, external APIs).
CI/CD integration: tests that run everywhere
Integration tests in CI/CD are hard. They're slower than unit tests, they need infrastructure, and they need credentials. But they're your last line of defense before production.
A multi-stage pipeline
# .github/workflows/full-pipeline.yml
name: Full Test Pipeline
on:
push:
branches: [main, develop]
pull_request:
branches: [main]
jobs:
unit-tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: '18'
- run: npm ci
- run: npm run test:unit
- uses: codecov/codecov-action@v3
with:
files: ./coverage/unit/coverage-final.json
flags: unit
integration-tests:
runs-on: ubuntu-latest
# Only run on main/develop or when PR is marked ready
if: github.ref == 'refs/heads/main' || github.ref == 'refs/heads/develop' || github.event.pull_request.draft == false
services:
# GitHub Actions provides service containers
postgres:
image: postgres:15-alpine
env:
POSTGRES_USER: testuser
POSTGRES_PASSWORD: testpass
POSTGRES_DB: testdb
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 5432:5432
redis:
image: redis:7-alpine
options: >-
--health-cmd "redis-cli ping"
--health-interval 10s
--health-timeout 5s
--health-retries 5
ports:
- 6379:6379
env:
TEST_DB_HOST: localhost
TEST_DB_PORT: 5432
STRIPE_TEST_KEY: ${{ secrets.STRIPE_TEST_KEY }}
SENDGRID_TEST_KEY: ${{ secrets.SENDGRID_TEST_KEY }}
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: '18'
- run: npm ci
- run: npm run db:migrate:test
- run: npm run test:integration
- uses: codecov/codecov-action@v3
with:
files: ./coverage/integration/coverage-final.json
flags: integration
e2e-tests:
runs-on: ubuntu-latest
needs: [unit-tests, integration-tests]
# Only run E2E on main branch or when explicitly requested
if: github.ref == 'refs/heads/main' || contains(github.event.pull_request.labels.*.name, 'run-e2e')
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: '18'
- run: npm ci
- run: npm run test:e2e
The key patterns:
- Unit tests run on every commit (fast feedback)
- Integration tests run on main/develop and on ready-for-review PRs (catch integration bugs before merge)
- E2E tests run only on main or when explicitly requested (slow but comprehensive)
Graphic suggestion 3: a CI/CD pipeline flowchart showing the multi-stage approach and its conditions (which tests run when), including infrastructure setup (containers) and secret-injection points.
Optimization: cache your dependencies
Integration tests that rebuild Docker images on every run waste time. Cache aggressively:
- name: Cache Docker layers
uses: actions/cache@v3
with:
path: /tmp/.buildx-cache
key: ${{ runner.os }}-buildx-${{ hashFiles('**/Dockerfile') }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Pull Docker images
run: |
docker pull postgres:15-alpine
docker pull redis:7-alpine
Parallel execution in CI
Run independent integration test suites in parallel:
integration-tests:
strategy:
matrix:
test-suite: [database, api, messaging, cache]
steps:
- run: npm run test:integration:${{ matrix.test-suite }}
Graphic suggestion 4: a test-execution timeline comparing serial vs. parallel runs, highlighting the time saved by running the database, API, messaging, and cache suites concurrently.
A real-world integration test example
Let's put it all together with a realistic e-commerce checkout flow:
// tests/integration/checkout.test.ts
import { TestEnvironment } from '../testSetup';
import { CheckoutService } from '../../src/services/CheckoutService';
import { StripePaymentProcessor } from '../../src/payments/StripePaymentProcessor';
import { SendGridEmailService } from '../../src/email/SendGridEmailService';
import { TestDataFactory } from '../testDataFactory';
import { TestCredentialManager } from '../testCredentials';
describe('Checkout Integration', () => {
let testEnv: TestEnvironment;
let checkoutService: CheckoutService;
let credManager: TestCredentialManager;
beforeAll(async () => {
testEnv = new TestEnvironment();
await testEnv.setup();
credManager = new TestCredentialManager();
});
afterAll(async () => {
await testEnv.cleanup();
});
beforeEach(async () => {
// Clean state before each test
await testEnv.getDbPool().query('TRUNCATE orders, payments, users CASCADE');
});
it('should complete full checkout with real payment and email', async () => {
await credManager.requireOrSkip('STRIPE_TEST_KEY', async () => {
await credManager.requireOrSkip('SENDGRID_TEST_KEY', async () => {
// Arrange: Create test user with isolated data
const user = await TestDataFactory.createIsolatedUser(testEnv.getDbPool());
const paymentProcessor = new StripePaymentProcessor(
credManager.get('STRIPE_TEST_KEY')!
);
const emailService = new SendGridEmailService(
credManager.get('SENDGRID_TEST_KEY')!
);
checkoutService = new CheckoutService(
testEnv.getDbPool(),
paymentProcessor,
emailService
);
const cart = {
items: [
{ productId: 'prod_123', quantity: 2, price: 1999 },
{ productId: 'prod_456', quantity: 1, price: 4999 },
],
};
let orderId: string | undefined;
try {
// Act: Process checkout with REAL Stripe payment
const result = await checkoutService.processCheckout({
userId: user.id,
cart,
paymentMethod: {
type: 'card',
cardToken: 'tok_visa', // Stripe test token
},
});
orderId = result.orderId;
// Assert: Verify order created in REAL database
const orderResult = await testEnv.getDbPool().query(
'SELECT * FROM orders WHERE id = $1',
[orderId]
);
expect(orderResult.rows).toHaveLength(1);
expect(orderResult.rows[0].status).toBe('completed');
expect(orderResult.rows[0].total_amount).toBe(8997);
// Assert: Verify payment recorded
const paymentResult = await testEnv.getDbPool().query(
'SELECT * FROM payments WHERE order_id = $1',
[orderId]
);
expect(paymentResult.rows).toHaveLength(1);
expect(paymentResult.rows[0].status).toBe('succeeded');
expect(paymentResult.rows[0].provider).toBe('stripe');
// Assert: Verify email sent via REAL SendGrid
const emails = await emailService.searchEmails({
to: user.email,
subject: 'Order Confirmation',
limit: 1,
});
expect(emails).toHaveLength(1);
expect(emails[0].body).toContain(orderId);
} finally {
// Cleanup: Cancel order and refund payment
if (orderId) {
await checkoutService.cancelOrder(orderId);
}
}
});
});
});
it('should handle payment failure gracefully', async () => {
await credManager.requireOrSkip('STRIPE_TEST_KEY', async () => {
const user = await TestDataFactory.createIsolatedUser(testEnv.getDbPool());
const paymentProcessor = new StripePaymentProcessor(
credManager.get('STRIPE_TEST_KEY')!
);
checkoutService = new CheckoutService(
testEnv.getDbPool(),
paymentProcessor,
new SendGridEmailService(credManager.get('SENDGRID_TEST_KEY')!)
);
const cart = {
items: [{ productId: 'prod_789', quantity: 1, price: 9999 }],
};
// Act: Use Stripe's test token for declined card
await expect(
checkoutService.processCheckout({
userId: user.id,
cart,
paymentMethod: {
type: 'card',
cardToken: 'tok_chargeDeclined', // Stripe test token for declined
},
})
).rejects.toThrow('Payment declined');
// Assert: Verify order marked as failed
const orderResult = await testEnv.getDbPool().query(
'SELECT * FROM orders WHERE user_id = $1',
[user.id]
);
expect(orderResult.rows).toHaveLength(1);
expect(orderResult.rows[0].status).toBe('payment_failed');
// Assert: No successful payment recorded
const paymentResult = await testEnv.getDbPool().query(
'SELECT * FROM payments WHERE status = $1',
['succeeded']
);
expect(paymentResult.rows).toHaveLength(0);
});
});
});
This test verifies:
- Real PostgreSQL database operations (order creation, payment records)
- Real Stripe payment processing (using test mode)
- Real SendGrid email delivery (using sandbox mode)
- Proper error handling for declined payments
- Complete cleanup even when the test fails
Graphic suggestion 5: a sequence diagram of the checkout flow showing the interactions between test code, database, Stripe API, and SendGrid API, annotated with assertion points and cleanup steps.
Common pitfalls and how to avoid them
From years of real-service testing, here are the traps teams fall into:
Pitfall 1: flaky tests from timing
Problem: tests pass locally but fail randomly in CI.
Solution: never use arbitrary timeouts. Wait for explicit conditions:
// â Bad: arbitrary timeout
await sleep(1000);
expect(order.status).toBe('completed');

// â
 Good: wait for the condition
await waitFor(
async () => {
const order = await getOrder(orderId);
return order.status === 'completed';
},
{ timeout: 5000, interval: 100 }
);
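The `waitFor` above isn't from a particular library (Testing Library ships an equivalent helper); a minimal polling version with the same shape fits in a few lines:

```typescript
// Poll `condition` every `interval` ms until it returns true or `timeout`
// ms have elapsed; the point is to wait for observable state instead of
// guessing a sleep duration.
async function waitFor(
  condition: () => Promise<boolean> | boolean,
  opts: { timeout: number; interval: number }
): Promise<void> {
  const deadline = Date.now() + opts.timeout;
  while (Date.now() < deadline) {
    if (await condition()) return;
    await new Promise((resolve) => setTimeout(resolve, opts.interval));
  }
  throw new Error(`Condition not met within ${opts.timeout}ms`);
}
```

Because it throws on timeout, a hung external service produces a clear failure message rather than a silently wrong assertion after a fixed sleep.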
Pitfall 2: test data pollution
Problem: tests interfere with each other, causing random failures.
Solution: unique identifiers plus clean-before (as described above).
Pitfall 3: ignoring test performance
Problem: the integration suite takes 30 minutes, so developers stop running it.
Solution: parallelization, dependency caching, and time budgets:
// jest.integration.config.js
module.exports = {
testTimeout: 10000, // 10 seconds max per test
maxWorkers: '50%', // Use half CPU cores for parallel execution
setupFilesAfterEnv: ['<rootDir>/tests/testSetup.ts'],
};
If a test exceeds 10 seconds, it either needs optimization or it should be an E2E test.
Pitfall 4: over-testing the happy path
Problem: 90% of your 1,000 tests cover happy paths that all work.
Solution: use test matrices for edge cases:
describe.each([
{ input: 'valid@email.com', expected: true },
{ input: 'invalid', expected: false },
{ input: 'no@domain', expected: false },
{ input: '', expected: false },
{ input: null, expected: false },
])('Email validation', ({ input, expected }) => {
it(`should return ${expected} for "${input}"`, async () => {
const result = await validateEmail(input);
expect(result).toBe(expected);
});
});
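The `validateEmail` in the matrix stands in for whatever validator your service exposes; a deliberately simple sketch that satisfies the five cases above (illustrative, not a production-grade email validator):

```typescript
// Accepts a string (or null) and reports whether it looks like an email:
// non-empty local part, an @, and a dot-separated domain.
async function validateEmail(input: string | null): Promise<boolean> {
  if (!input) return false;
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(input);
}

validateEmail('valid@email.com').then((r) => console.log(r)); // true
```

The matrix drives out exactly the inputs a happy-path test skips: missing domain dots, empty strings, and null.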
Conclusion: tests that earn trust
Real-service testing isn't about perfection -- it's about trust. When your integration tests pass, you should feel comfortable deploying to production. When they fail, you need to trust that they caught a real bug, not a mock mismatch.
The systematic checklist for building that trust:
- Environment setup: use containers that mirror production services
- Credential management: secure secrets, graceful degradation when they're missing
- Cleanup strategy: clean before tests; use try-finally for external services
- Data isolation: unique identifiers to prevent test interference
- Error scenarios: test failures, timeouts, and rate limits with real-service simulation
- Coverage targets: aim for 90-95% across strategic test areas
- CI/CD integration: multi-stage pipelines with caching and parallelization
Integration testing with real services demands more setup than mocking. It's slower. It's more complex. But done right, it gives you something mocks never can: knowing that your system works, instead of hoping it does.
Now go forth and test -- with real databases, real APIs, and real confidence.
Integration test architecture
A revised testing pyramid for real services
The traditional testing pyramid emphasizes unit tests at the base, but real-service integration testing calls for a different balance:
When complex external-service interactions are under test, integration tests take a larger share.
Real-service test environment flow
A production-grade integration test follows this lifecycle:
This guarantees that tests are isolated and idempotent, and that they run reliably in CI/CD pipelines.
References
[1] Cohn, M. (2009). Succeeding with Agile: Software Development Using Scrum. (The testing pyramid)
[2] Hauer, P. (2019). Focus on Integration Tests Instead of Mock-Based Tests. https://phauer.com/2019/focus-integration-tests-mock-based-tests/
[3] Hauer, P. (2019). Integration testing tools and practices. In: Focus on Integration Tests Instead of Mock-Based Tests.
[4] Stack Overflow Community. (2018). Is it considered a good practice to mock in integration tests? https://stackoverflow.com/questions/52107522/
[5] Server Fault Community. Credentials management within CI/CD environment. https://serverfault.com/questions/924431/
[6] Rojek, M. (2021). Idempotence in Software Testing. https://medium.com/@rojek.mac/idempotence-in-software-testing-b8fd946320c5
[7] Software Engineering Stack Exchange. Cleanup & Arrange practices during integration testing to avoid dirty databases. https://softwareengineering.stackexchange.com/questions/308666/
[8] Stack Overflow Community. What strategy to use with xUnit for integration tests when knowing they run in parallel? https://stackoverflow.com/questions/55297811/
[9] LinearB. Test Coverage Demystified: A Complete Introductory Guide. https://linearb.io/blog/test-coverage-demystified
[10] Web.dev. Pyramid or Crab? Find a testing strategy that fits. https://web.dev/articles/ta-strategies
Originally published at kanaeru.ai

