🏦 Business Scenario (Very Common in FinTech)
Imagine a payment processing service.
Before processing a payment, the system must validate:
- Account Status (active / blocked)
- Balance Check
- Fraud Risk Check
Each validation:
- Calls a different internal service
- Takes 300–800 ms
- Is independent
❌ Bad approach (sequential, sketched below):
- Total time ≈ 2 seconds (roughly the sum of all three calls)
- High latency → SLA breach
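For contrast, a minimal sequential baseline, assuming the same simulated 500/700/800 ms delays used in the service code later in this post (SequentialValidation is a hypothetical class, not part of the project):

```java
public class SequentialValidation {

    // Each simulated check blocks the caller before the next one starts
    public boolean validatePayment() throws InterruptedException {
        Thread.sleep(500); // account status check
        Thread.sleep(700); // balance check
        Thread.sleep(800); // fraud risk check
        return true;       // total latency ≈ 500 + 700 + 800 = 2,000 ms
    }
}
```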
✅ Good approach (parallel using ExecutorService):
- All checks run in parallel
- Total time ≈ max(500, 700, 800) ms = 800 ms
🧠 Why ExecutorService Here?
- Controlled thread pool (avoids thread explosion)
- Parallel execution
- Better SLA compliance
- Clean error handling
This is exactly where ExecutorService shines. A more explicitly bounded pool is sketched below.
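If you want tighter control than Executors.newFixedThreadPool gives you (bounded queue, explicit rejection policy), a hand-built pool is one option. The sizes and queue capacity below are illustrative, not tuned values:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ValidationPoolFactory {

    // Bounded pool: at most 3 threads and 100 queued tasks;
    // CallerRunsPolicy applies back-pressure instead of silently dropping work
    public static ThreadPoolExecutor create() {
        return new ThreadPoolExecutor(
                3,                             // core pool size
                3,                             // max pool size
                0L, TimeUnit.MILLISECONDS,     // keep-alive for extra threads
                new ArrayBlockingQueue<>(100), // bounded task queue
                new ThreadPoolExecutor.CallerRunsPolicy());
    }
}
```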
Architecture Flow

```text
Client
  |
  v
Payment API
  |
  +-- Account Validation (Thread-1)
  +-- Balance Check      (Thread-2)
  +-- Fraud Check        (Thread-3)
  |
  v
Final Decision
```
1️⃣ ExecutorService Configuration
```java
package com.example.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

@Configuration
public class ExecutorConfig {

    @Bean
    public ExecutorService executorService() {
        // Controlled pool for validation tasks
        return Executors.newFixedThreadPool(3);
    }
}
```
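Spring can usually infer a shutdown()/close() destroy method for a @Bean, but making it explicit documents the intent. A minimal variant of the same bean, everything else unchanged:

```java
@Bean(destroyMethod = "shutdown")
public ExecutorService executorService() {
    // Pool is shut down explicitly when the Spring context closes
    return Executors.newFixedThreadPool(3);
}
```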
2️⃣ Validation Service (Parallel Tasks)
```java
package com.example.service;

import org.springframework.stereotype.Service;

import java.util.concurrent.Callable;

@Service
public class ValidationTasks {

    public Callable<Boolean> accountCheck() {
        return () -> {
            Thread.sleep(500); // simulate service call
            return true;       // account is active
        };
    }

    public Callable<Boolean> balanceCheck() {
        return () -> {
            Thread.sleep(700); // simulate service call
            return true;       // sufficient balance
        };
    }

    public Callable<Boolean> fraudCheck() {
        return () -> {
            Thread.sleep(800); // simulate service call
            return true;       // low risk
        };
    }
}
```
3️⃣ Payment Orchestrator Service
This is where ExecutorService is actually used.
```java
package com.example.service;

import org.springframework.stereotype.Service;

import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Future;

@Service
public class PaymentValidationService {

    private final ExecutorService executorService;
    private final ValidationTasks tasks;

    public PaymentValidationService(ExecutorService executorService,
                                    ValidationTasks tasks) {
        this.executorService = executorService;
        this.tasks = tasks;
    }

    public boolean validatePayment() throws Exception {
        List<Future<Boolean>> results = executorService.invokeAll(
                List.of(
                        tasks.accountCheck(),
                        tasks.balanceCheck(),
                        tasks.fraudCheck()
                )
        );

        // If any validation fails → reject payment
        for (Future<Boolean> result : results) {
            if (!result.get()) {
                return false;
            }
        }
        return true;
    }
}
```
🔍 Why invokeAll()?
- Submits multiple tasks together
- Waits until all complete
- Clean and readable for orchestration logic
- Has a timed overload, so validations cannot hang forever (sketched below)
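A hedged variant of validatePayment using that timed overload, assuming the same executorService and tasks fields plus imports for java.util.concurrent.TimeUnit and ExecutionException:

```java
public boolean validatePaymentWithTimeout() throws InterruptedException, ExecutionException {
    // Tasks still running after 2 seconds are cancelled automatically
    List<Future<Boolean>> results = executorService.invokeAll(
            List.of(tasks.accountCheck(), tasks.balanceCheck(), tasks.fraudCheck()),
            2, TimeUnit.SECONDS);

    for (Future<Boolean> result : results) {
        if (result.isCancelled() || !result.get()) {
            return false; // a check timed out or failed
        }
    }
    return true;
}
```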
4️⃣ REST Controller
```java
package com.example.controller;

import com.example.service.PaymentValidationService;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PaymentController {

    private final PaymentValidationService service;

    public PaymentController(PaymentValidationService service) {
        this.service = service;
    }

    @PostMapping("/validate-payment")
    public String validatePayment() throws Exception {
        boolean valid = service.validatePayment();
        return valid
                ? "Payment validation successful"
                : "Payment validation failed";
    }
}
```
5️⃣ curl Request

```bash
curl -X POST http://localhost:8080/validate-payment
```
6️⃣ Response

```text
Payment validation successful
```
⏱ Performance Comparison
| Approach | Approx Time |
|---|---|
| Sequential | ~2.0 sec |
| ExecutorService (parallel) | ~0.8 sec |
⚡ 60%+ latency reduction
This scenario demonstrates:
✅ A real business problem
✅ Parallelism (not async hype)
✅ Controlled concurrency
✅ Correct ExecutorService usage
✅ SLA-driven design
⚠️ Common Mistakes (Say This in an Interview)
❌ Creating a new ExecutorService per request
❌ Unbounded thread pools
❌ Blocking everything blindly
❌ Ignoring timeouts
❌ No graceful shutdown (see the sketch below)
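For the graceful-shutdown point, the usual JDK pattern is shutdown(), a bounded awaitTermination(), then shutdownNow() as a last resort. A minimal sketch (the 5-second wait is an arbitrary example value):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.TimeUnit;

public final class ExecutorShutdown {

    // Stop accepting new tasks, wait briefly for in-flight ones, then force-cancel
    public static void shutdownGracefully(ExecutorService pool) {
        pool.shutdown();
        try {
            if (!pool.awaitTermination(5, TimeUnit.SECONDS)) {
                pool.shutdownNow();
            }
        } catch (InterruptedException e) {
            pool.shutdownNow();
            Thread.currentThread().interrupt();
        }
    }
}
```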