A technical deep-dive into building bank-level infrastructure using modern Java, React, and multi-cloud architecture
Building a fintech platform that processes sensitive financial data for millions of users while maintaining bank-level security is one of the most challenging engineering problems you can tackle. Over the past year, my team and I have been building the core infrastructure for an AI-powered financial platform using Java Spring Boot, React, and Amazon Bedrock. I wanted to share the key architectural decisions and lessons learned.
The Technical Challenge
When users connect their financial accounts to our platform through Plaid, we need to process their entire financial life in real-time:
- Multiple data sources per user (average 4-6 financial accounts)
- Real-time transaction processing (millions of events daily)
- AI model inference via Amazon Bedrock for fraud detection and categorization
- 99.98% uptime requirement (this is people's money we're talking about)
- Global compliance across AWS, GCP, and Azure regions
Architecture Overview: Event-Driven Intelligence
We built our platform using an event-driven architecture that can scale horizontally across multiple cloud providers while maintaining data consistency and real-time processing capabilities.
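To make the event-driven part concrete, here is a sketch of how a normalized transaction event might be published. Kafka, the topic name, and the event fields are illustrative choices for this example, not a prescription; the same shape works with any durable event log.

import java.math.BigDecimal;
import java.time.Instant;
import java.util.UUID;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

// Immutable event payload; serialized to JSON by the configured Kafka serializer
record TransactionEvent(UUID userId, UUID transactionId, BigDecimal amount,
                        String merchant, Instant occurredAt) {}

@Component
public class TransactionEventPublisher {

    private final KafkaTemplate<String, TransactionEvent> kafkaTemplate;

    public TransactionEventPublisher(KafkaTemplate<String, TransactionEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(TransactionEvent event) {
        // Key by user ID so all of a user's events land on the same partition,
        // preserving per-user ordering for downstream consumers
        kafkaTemplate.send("transactions.raw", event.userId().toString(), event);
    }
}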
Core Components
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│   Data Sources   │     │   Intelligence   │     │   Application    │
│                  │     │      Layer       │     │      Layer       │
│ • Plaid API      │────▶│ • Amazon Bedrock │────▶│ • React Frontend │
│ • Bank APIs      │     │ • Fraud Models   │     │ • Spring Boot    │
│ • Stripe         │     │ • Categorization │     │ • REST APIs      │
│ • Manual Entry   │     │ • Notifications  │     │ • WebSockets     │
└──────────────────┘     └──────────────────┘     └──────────────────┘
Data Ingestion Layer:
- Plaid SDK for secure bank connections and transaction sync
- Stripe Connect for payment processing
- Amazon SES/SNS for email and push notifications
- Cloudflare for CDN and DDoS protection
Intelligence Layer:
- Amazon Bedrock for natural language processing and financial insights
- Custom Java services for business logic and data processing
- PostgreSQL with Supabase for managed database services
- Redis for caching and session management
Application Layer:
- React frontend with Redux for state management
- Spring Boot microservices for backend APIs
- GitHub Actions for CI/CD pipeline
- Multi-cloud deployment across AWS, GCP, and Azure
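The WebSockets entry in the diagram above is how fresh insights reach the browser without polling. A minimal STOMP broker sketch; the endpoint and destination prefixes here are illustrative:

import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.simp.config.MessageBrokerRegistry;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;

@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Override
    public void configureMessageBroker(MessageBrokerRegistry registry) {
        // Clients subscribe under /topic (e.g. /topic/insights/{userId})
        registry.enableSimpleBroker("/topic");
        registry.setApplicationDestinationPrefixes("/app");
    }

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        // SockJS fallback for browsers or proxies that block raw WebSockets
        registry.addEndpoint("/ws").withSockJS();
    }
}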
The AI Pipeline: From Transaction to Insight
Real-Time Transaction Processing with Spring Boot
@RestController
@RequestMapping("/api/transactions")
public class TransactionController {

    private static final Logger log = LoggerFactory.getLogger(TransactionController.class);

    private final TransactionService transactionService;
    private final BedrockService bedrockService;

    public TransactionController(TransactionService transactionService,
                                 BedrockService bedrockService) {
        this.transactionService = transactionService;
        this.bedrockService = bedrockService;
    }

    @PostMapping("/process")
    public ResponseEntity<TransactionResponse> processTransaction(
            @RequestBody TransactionRequest request) {
        try {
            // Step 1: Validate and normalize the transaction
            Transaction transaction = transactionService.normalizeTransaction(request);

            // Step 2: Fraud detection using Amazon Bedrock
            FraudScore fraudScore = bedrockService.detectFraud(transaction);

            // Step 3: Category classification
            Category category = bedrockService.categorizeTransaction(transaction);

            // Step 4: Update the user's financial profile
            transactionService.updateUserProfile(transaction, category);

            // Step 5: Generate insights if needed
            if (transactionService.shouldGenerateInsight(transaction)) {
                bedrockService.generatePersonalizedInsight(transaction.getUserId(), transaction);
            }

            return ResponseEntity.ok(new TransactionResponse(transaction, fraudScore, category));
        } catch (Exception e) {
            log.error("Error processing transaction", e);
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
        }
    }
}
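The TransactionResponse wrapper the controller returns is just a thin DTO. A minimal record-based sketch; the component names simply mirror the constructor call above:

// Minimal sketch of the response DTO used by the controller
public record TransactionResponse(Transaction transaction,
                                  FraudScore fraudScore,
                                  Category category) {}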
Amazon Bedrock Integration Service
@Service
public class BedrockService {

    private final BedrockRuntimeClient bedrockClient;
    private final RedisTemplate<String, Object> redisTemplate;
    private final ObjectMapper objectMapper = new ObjectMapper();

    public BedrockService(RedisTemplate<String, Object> redisTemplate) {
        this.redisTemplate = redisTemplate;
        this.bedrockClient = BedrockRuntimeClient.builder()
            .region(Region.US_EAST_1)
            .build();
    }

    public FraudScore detectFraud(Transaction transaction) {
        String cacheKey = "fraud_score:" + transaction.generateHash();

        // Check the Redis cache first
        FraudScore cached = (FraudScore) redisTemplate.opsForValue().get(cacheKey);
        if (cached != null) {
            return cached;
        }

        // Claude on Bedrock expects a JSON request body, not a bare prompt string
        String body;
        try {
            body = objectMapper.writeValueAsString(Map.of(
                "prompt", "\n\nHuman: " + buildFraudDetectionPrompt(transaction) + "\n\nAssistant:",
                "max_tokens_to_sample", 500));
        } catch (JsonProcessingException e) {
            throw new IllegalStateException("Failed to serialize Bedrock request", e);
        }

        InvokeModelRequest request = InvokeModelRequest.builder()
            .modelId("anthropic.claude-v2")
            .contentType("application/json")
            .body(SdkBytes.fromUtf8String(body))
            .build();

        InvokeModelResponse response = bedrockClient.invokeModel(request);
        FraudScore score = parseFraudResponse(response.body().asUtf8String());

        // Cache the result for 5 minutes
        redisTemplate.opsForValue().set(cacheKey, score, Duration.ofMinutes(5));
        return score;
    }

    private String buildFraudDetectionPrompt(Transaction transaction) {
        return String.format("""
            Analyze this financial transaction for fraud indicators:
            Amount: $%.2f
            Merchant: %s
            Location: %s
            Time: %s
            User spending patterns: %s

            Return a JSON object with:
            - score: 0-100 fraud likelihood
            - reasons: array of risk factors
            - confidence: 0-100 confidence level
            """,
            transaction.getAmount(),
            transaction.getMerchant(),
            transaction.getLocation(),
            transaction.getTimestamp(),
            getUserSpendingProfile(transaction.getUserId())
        );
    }
}
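The parseFraudResponse helper is the one piece elided above. A Jackson-based sketch, assuming the model honors the JSON shape the prompt requests and that FraudScore has a matching (score, reasons, confidence) constructor:

// Sketch of the parseFraudResponse helper called in detectFraud.
// Uses com.fasterxml.jackson.databind (ObjectMapper, JsonNode) and
// com.fasterxml.jackson.core.type.TypeReference.
private FraudScore parseFraudResponse(String responseBody) {
    try {
        JsonNode root = objectMapper.readTree(responseBody);
        // Claude v2 wraps its text output in a "completion" field; the JSON
        // we asked for is embedded inside that string
        JsonNode result = objectMapper.readTree(root.path("completion").asText());
        return new FraudScore(
            result.path("score").asInt(),
            objectMapper.convertValue(result.path("reasons"),
                new TypeReference<List<String>>() {}),
            result.path("confidence").asInt());
    } catch (JsonProcessingException e) {
        throw new IllegalStateException("Unparseable fraud response from Bedrock", e);
    }
}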
React Frontend with Financial Data Visualization
import React, { useState, useEffect } from 'react';
import { useSelector, useDispatch } from 'react-redux';
import { fetchTransactions, processTransaction } from '../store/transactionSlice';
import TransactionItem from './TransactionItem'; // assumed to live alongside the dashboard

const FinancialDashboard = () => {
  const dispatch = useDispatch();
  const { transactions, loading, insights } = useSelector(state => state.transactions);
  const [selectedTimeRange, setSelectedTimeRange] = useState('30d');

  useEffect(() => {
    dispatch(fetchTransactions({ timeRange: selectedTimeRange }));
  }, [dispatch, selectedTimeRange]);

  const handleTransactionSync = async () => {
    try {
      // Trigger a Plaid sync on the backend
      // (note: localStorage tokens are readable by any injected script;
      // httpOnly cookies are the safer option for production fintech apps)
      const response = await fetch('/api/plaid/sync', {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${localStorage.getItem('jwt')}`,
          'Content-Type': 'application/json'
        }
      });

      if (response.ok) {
        dispatch(fetchTransactions({ timeRange: selectedTimeRange }));
      }
    } catch (error) {
      console.error('Error syncing transactions:', error);
    }
  };

  return (
    <div className="financial-dashboard">
      <div className="dashboard-header">
        <h1>Financial Overview</h1>
        <button onClick={handleTransactionSync} disabled={loading}>
          {loading ? 'Syncing...' : 'Sync Accounts'}
        </button>
      </div>

      <div className="insights-section">
        {insights.map(insight => (
          <div key={insight.id} className="insight-card">
            <h3>{insight.title}</h3>
            <p>{insight.description}</p>
            <span className="confidence">Confidence: {insight.confidence}%</span>
          </div>
        ))}
      </div>

      <div className="transactions-list">
        {transactions.map(transaction => (
          <TransactionItem
            key={transaction.id}
            transaction={transaction}
            onCategorize={(category) =>
              dispatch(processTransaction({
                transactionId: transaction.id,
                category
              }))
            }
          />
        ))}
      </div>
    </div>
  );
};

export default FinancialDashboard;
Security Architecture: Bank-Level Standards
Data Encryption and JWT Authentication
@Configuration
@EnableWebSecurity
public class SecurityConfig {

    @Autowired
    private JwtAuthenticationEntryPoint jwtAuthenticationEntryPoint;

    @Bean
    public PasswordEncoder passwordEncoder() {
        return new BCryptPasswordEncoder();
    }

    @Bean
    public JwtAuthenticationFilter jwtAuthenticationFilter() {
        return new JwtAuthenticationFilter();
    }

    // Spring Security 6 style: a SecurityFilterChain bean replaces the old
    // WebSecurityConfigurerAdapter configure() override
    @Bean
    public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        // Stateless JWT auth: no sessions, no CSRF tokens
        http.csrf(csrf -> csrf.disable())
            .authorizeHttpRequests(auth -> auth
                .requestMatchers("/api/auth/**", "/api/public/**").permitAll()
                .anyRequest().authenticated())
            .exceptionHandling(ex -> ex.authenticationEntryPoint(jwtAuthenticationEntryPoint))
            .sessionManagement(session ->
                session.sessionCreationPolicy(SessionCreationPolicy.STATELESS))
            .addFilterBefore(jwtAuthenticationFilter(),
                UsernamePasswordAuthenticationFilter.class);
        return http.build();
    }
}
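The JwtAuthenticationFilter wired in above does the per-request token check. A minimal sketch; the JwtTokenService helper and its method names are assumed collaborators, not shown in the original:

import java.io.IOException;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.security.authentication.UsernamePasswordAuthenticationToken;
import org.springframework.security.core.context.SecurityContextHolder;
import org.springframework.web.filter.OncePerRequestFilter;

import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;

public class JwtAuthenticationFilter extends OncePerRequestFilter {

    @Autowired
    private JwtTokenService tokenService; // hypothetical helper that validates and parses tokens

    @Override
    protected void doFilterInternal(HttpServletRequest request,
                                    HttpServletResponse response,
                                    FilterChain chain) throws ServletException, IOException {
        String header = request.getHeader("Authorization");
        if (header != null && header.startsWith("Bearer ")) {
            String token = header.substring(7);
            if (tokenService.isValid(token)) {
                // Populate the security context so downstream authorization checks pass
                UsernamePasswordAuthenticationToken auth =
                    new UsernamePasswordAuthenticationToken(
                        tokenService.extractUsername(token), null, List.of());
                SecurityContextHolder.getContext().setAuthentication(auth);
            }
        }
        chain.doFilter(request, response);
    }
}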
Plaid Integration with Secure Token Management
@Service
public class PlaidService {

    private static final Logger log = LoggerFactory.getLogger(PlaidService.class);

    private final PlaidApi plaidClient;
    private final EncryptionService encryptionService;
    private final UserService userService;

    public PlaidService(@Value("${plaid.client-id}") String plaidClientId,
                        @Value("${plaid.secret}") String plaidSecret,
                        EncryptionService encryptionService,
                        UserService userService) {
        this.encryptionService = encryptionService;
        this.userService = userService;

        // plaid-java builds its client from an ApiClient with keyed credentials
        HashMap<String, String> apiKeys = new HashMap<>();
        apiKeys.put("clientId", plaidClientId);
        apiKeys.put("secret", plaidSecret);
        ApiClient apiClient = new ApiClient(apiKeys);
        apiClient.setPlaidAdapter(ApiClient.Sandbox); // use ApiClient.Production for live
        this.plaidClient = apiClient.createService(PlaidApi.class);
    }

    public LinkTokenCreateResponse createLinkToken(String userId) {
        LinkTokenCreateRequest request = new LinkTokenCreateRequest()
            .clientName("Cent Capital")
            .language("en")
            .countryCodes(List.of(CountryCode.US))
            .user(new LinkTokenCreateRequestUser().clientUserId(userId))
            .products(List.of(Products.TRANSACTIONS)); // account data comes along with any product
        try {
            // plaid-java is retrofit-based, so calls are executed explicitly
            return plaidClient.linkTokenCreate(request).execute().body();
        } catch (IOException e) {
            log.error("Error creating Plaid link token", e);
            throw new RuntimeException("Failed to create link token", e);
        }
    }

    public void exchangePublicToken(String publicToken, String userId) {
        ItemPublicTokenExchangeRequest request =
            new ItemPublicTokenExchangeRequest().publicToken(publicToken);
        try {
            ItemPublicTokenExchangeResponse response =
                plaidClient.itemPublicTokenExchange(request).execute().body();

            // Encrypt the access token before it ever touches the database
            String encryptedToken = encryptionService.encrypt(response.getAccessToken());
            userService.updatePlaidAccessToken(userId, encryptedToken);

            // Sync initial transactions
            syncTransactions(userId, response.getAccessToken());
        } catch (IOException e) {
            log.error("Error exchanging public token", e);
            throw new RuntimeException("Failed to exchange token", e);
        }
    }
}
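The EncryptionService that protects Plaid access tokens can be as simple as AES-GCM with a fresh IV per record. A sketch only; it assumes the key itself is loaded from a secrets manager such as AWS KMS, never hardcoded:

import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.security.GeneralSecurityException;
import java.security.SecureRandom;
import java.util.Base64;

import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

import org.springframework.stereotype.Service;

@Service
public class EncryptionService {

    private static final int GCM_TAG_BITS = 128;
    private static final int IV_BYTES = 12;

    private final SecretKey key; // assumed to be provided by key-management config
    private final SecureRandom random = new SecureRandom();

    public EncryptionService(SecretKey key) {
        this.key = key;
    }

    public String encrypt(String plaintext) {
        try {
            byte[] iv = new byte[IV_BYTES];
            random.nextBytes(iv); // fresh IV per encryption; never reuse with the same key
            Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
            byte[] ciphertext = cipher.doFinal(plaintext.getBytes(StandardCharsets.UTF_8));
            // Prepend the IV so the matching decrypt() can recover it
            ByteBuffer out = ByteBuffer.allocate(iv.length + ciphertext.length)
                .put(iv).put(ciphertext);
            return Base64.getEncoder().encodeToString(out.array());
        } catch (GeneralSecurityException e) {
            throw new IllegalStateException("Encryption failure", e);
        }
    }
}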
Scaling Challenges and Solutions
Challenge 1: Multi-Cloud Database Management
Problem: Maintaining data consistency across AWS, GCP, and Azure deployments.
Solution: PostgreSQL with read replicas and Supabase for managed services:
@Configuration
public class DatabaseConfig {

    // Property names are illustrative
    @Value("${db.primary.url}")
    private String primaryDbUrl;

    @Value("${db.replica.url}")
    private String readOnlyDbUrl;

    @Value("${db.username}")
    private String dbUsername;

    @Value("${db.password}")
    private String dbPassword;

    @Bean
    @Primary
    public DataSource primaryDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(primaryDbUrl);
        config.setUsername(dbUsername);
        config.setPassword(dbPassword);
        config.setMaximumPoolSize(20);
        config.setConnectionTimeout(30000);
        return new HikariDataSource(config);
    }

    @Bean
    public DataSource readOnlyDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(readOnlyDbUrl);
        config.setUsername(dbUsername);
        config.setPassword(dbPassword);
        config.setMaximumPoolSize(10);
        config.setReadOnly(true);
        return new HikariDataSource(config);
    }

    @Bean
    public JdbcTemplate readOnlyJdbcTemplate() {
        return new JdbcTemplate(readOnlyDataSource());
    }
}
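Nothing in this config routes queries automatically, so read-heavy services inject the replica template explicitly. A sketch; the query and row mapping are illustrative, not from our actual codebase:

import java.math.BigDecimal;
import java.util.UUID;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;

@Service
public class TransactionReadService {

    private final JdbcTemplate readOnlyJdbcTemplate;

    public TransactionReadService(
            @Qualifier("readOnlyJdbcTemplate") JdbcTemplate readOnlyJdbcTemplate) {
        this.readOnlyJdbcTemplate = readOnlyJdbcTemplate;
    }

    public BigDecimal monthlySpend(UUID userId) {
        // Aggregations hit the replica pool, keeping the primary free for writes
        return readOnlyJdbcTemplate.queryForObject(
            "SELECT COALESCE(SUM(amount), 0) FROM financial_transactions " +
            "WHERE user_id = ? AND transaction_date >= date_trunc('month', now())",
            BigDecimal.class, userId);
    }
}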
Challenge 2: Amazon Bedrock Rate Limiting
Problem: Managing API costs and rate limits with Amazon Bedrock.
Solution: Intelligent caching and request batching:
@Component
public class BedrockRateLimiter {

    // Guava RateLimiter: 10 requests per second across this instance
    private final RateLimiter rateLimiter = RateLimiter.create(10.0);

    // Guava in-memory cache as a first line of defense in front of Redis
    private final Cache<String, Object> responseCache = CacheBuilder.newBuilder()
        .maximumSize(1000)
        .expireAfterWrite(5, TimeUnit.MINUTES)
        .build();

    @SuppressWarnings("unchecked")
    public <T> T executeWithRateLimit(String cacheKey, Supplier<T> operation) {
        // Check the cache first
        T cachedResult = (T) responseCache.getIfPresent(cacheKey);
        if (cachedResult != null) {
            return cachedResult;
        }

        // Block until a permit is available, then execute and cache the result
        rateLimiter.acquire();
        T result = operation.get();
        responseCache.put(cacheKey, result);
        return result;
    }
}
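The intended usage is to wrap each model invocation, e.g. inside detectFraud. A sketch, assuming the limiter is injected into BedrockService:

// Sketch: routing the Bedrock call from detectFraud through the limiter
FraudScore score = bedrockRateLimiter.executeWithRateLimit(
    "fraud_score:" + transaction.generateHash(),
    () -> {
        InvokeModelResponse response = bedrockClient.invokeModel(request);
        return parseFraudResponse(response.body().asUtf8String());
    });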
Performance Optimizations
Database Optimization with JPA
@Entity
@Table(name = "financial_transactions")
@NamedEntityGraph(
    name = "Transaction.withUserAndCategory",
    attributeNodes = {
        @NamedAttributeNode("user"),
        @NamedAttributeNode("category")
    }
)
public class Transaction {

    @Id
    @GeneratedValue(strategy = GenerationType.UUID)
    private UUID id;

    @Column(nullable = false, precision = 10, scale = 2)
    private BigDecimal amount;

    @Column(nullable = false)
    private LocalDateTime transactionDate;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "user_id")
    private User user;

    @ManyToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "category_id")
    private Category category;

    // Getters and setters...
}

@Repository
public interface TransactionRepository extends JpaRepository<Transaction, UUID> {

    @EntityGraph("Transaction.withUserAndCategory")
    @Query("SELECT t FROM Transaction t WHERE t.user.id = :userId " +
           "AND t.transactionDate >= :startDate " +
           "ORDER BY t.transactionDate DESC")
    List<Transaction> findUserTransactionsWithDetails(
        @Param("userId") UUID userId,
        @Param("startDate") LocalDateTime startDate
    );
}
Redis Caching Strategy
@Service
public class FinancialInsightCacheService {

    @Autowired
    private UserService userService;

    // The 10-minute TTL for the "user_insights" cache is set on the
    // RedisCacheManager (see the configuration sketch below), not here
    @Cacheable(value = "user_insights", key = "#userId", unless = "#result == null")
    public List<FinancialInsight> getUserInsights(UUID userId) {
        return generateInsights(userId);
    }

    @CacheEvict(value = "user_insights", key = "#userId")
    public void invalidateUserInsights(UUID userId) {
        // Clear the cache when the underlying user data changes
    }

    @Scheduled(fixedRate = 300000) // Every 5 minutes
    public void refreshHighPriorityInsights() {
        // Pre-warm the cache for active users
        // (caveat: self-invocation bypasses the Spring caching proxy, so this
        // warm-up call should go through the proxied bean in real code)
        List<UUID> activeUsers = userService.getActiveUsers();
        activeUsers.parallelStream()
            .forEach(this::getUserInsights);
    }
}
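One subtlety: @Cacheable doesn't set a TTL by itself; the ten-minute lifetime comes from the cache manager. A configuration sketch:

import java.time.Duration;

import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.cache.RedisCacheConfiguration;
import org.springframework.data.redis.cache.RedisCacheManager;
import org.springframework.data.redis.connection.RedisConnectionFactory;

@Configuration
@EnableCaching
public class CacheConfig {

    @Bean
    public RedisCacheManager cacheManager(RedisConnectionFactory connectionFactory) {
        // Per-cache TTL: entries in "user_insights" expire after 10 minutes
        RedisCacheConfiguration tenMinutes = RedisCacheConfiguration.defaultCacheConfig()
            .entryTtl(Duration.ofMinutes(10));
        return RedisCacheManager.builder(connectionFactory)
            .withCacheConfiguration("user_insights", tenMinutes)
            .build();
    }
}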
Monitoring and Observability
Custom Metrics with Micrometer
@Component
public class FinancialMetrics {

    private final MeterRegistry meterRegistry;
    private final UserService userService;
    private final Counter transactionProcessedCounter;
    private final Timer fraudDetectionTimer;

    public FinancialMetrics(MeterRegistry meterRegistry, UserService userService) {
        this.meterRegistry = meterRegistry;
        this.userService = userService;

        this.transactionProcessedCounter = Counter.builder("transactions.processed")
            .description("Number of transactions processed")
            .tag("status", "success")
            .register(meterRegistry);

        this.fraudDetectionTimer = Timer.builder("fraud.detection.duration")
            .description("Time spent on fraud detection")
            .register(meterRegistry);

        // Gauges take the state object and extractor function up front
        Gauge.builder("users.active", this, FinancialMetrics::getActiveUserCount)
            .description("Number of active users")
            .register(meterRegistry);
    }

    public void recordTransactionProcessed() {
        transactionProcessedCounter.increment();
    }

    public Timer.Sample startFraudDetectionTimer() {
        // Samples are started against the registry and stopped against a timer
        return Timer.start(meterRegistry);
    }

    public void stopFraudDetectionTimer(Timer.Sample sample) {
        sample.stop(fraudDetectionTimer);
    }

    private double getActiveUserCount() {
        return userService.getActiveUserCount();
    }
}
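Callers pair the sample with a stop call once the fraud check returns; for example:

// Sketch: timing a fraud check end-to-end with the metrics component above
Timer.Sample sample = financialMetrics.startFraudDetectionTimer();
try {
    FraudScore score = bedrockService.detectFraud(transaction);
    financialMetrics.recordTransactionProcessed();
} finally {
    financialMetrics.stopFraudDetectionTimer(sample);
}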
Deployment with GitHub Actions
# .github/workflows/deploy.yml
name: Deploy to Production

on:
  push:
    branches: [ main ]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 17
        uses: actions/setup-java@v3
        with:
          java-version: '17'
          distribution: 'temurin'
      - name: Run tests
        run: ./mvnw test
      - name: Run security scan
        run: ./mvnw org.owasp:dependency-check-maven:check

  deploy:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build Docker image
        run: |
          docker build -t centcapital/api:${{ github.sha }} .
      - name: Deploy to AWS
        run: |
          aws eks update-kubeconfig --name prod-cluster
          kubectl set image deployment/api api=centcapital/api:${{ github.sha }}
          kubectl rollout status deployment/api
      - name: Deploy to GCP
        run: |
          gcloud container clusters get-credentials prod-cluster --zone us-central1-a
          kubectl set image deployment/api api=centcapital/api:${{ github.sha }}
      - name: Deploy to Azure
        run: |
          az aks get-credentials --resource-group prod-rg --name prod-cluster
          kubectl set image deployment/api api=centcapital/api:${{ github.sha }}
Key Takeaways for Fintech Developers
- Choose Proven Technologies: Java Spring Boot and React offer enterprise-grade reliability and a deep talent pool.
- Leverage Cloud AI Services: Amazon Bedrock gives you cutting-edge AI without the infrastructure overhead.
- Multi-Cloud Is Worth It: the redundancy and cost-optimization benefits outweigh the operational overhead.
- Security Is Non-Negotiable: implement JWT authentication, encrypt sensitive data, and audit everything.
- Cache Aggressively: most financial data changes infrequently, so Redis can cut API costs and latency dramatically.
- Monitor Everything: custom metrics reveal both user behavior and system health.
What's Next
We're currently processing transactions for our beta users and preparing to scale our infrastructure for millions of users. The technical challenges ahead include:
- Real-time fraud detection with sub-100ms response times
- Advanced financial modeling using Amazon Bedrock's latest models
- International expansion with multi-currency support
- Open banking integration for European markets
Tech Stack Summary:
- Frontend: React, Redux, Cloudflare
- Backend: Java, Spring Boot, Spring Security
- Database: PostgreSQL, Supabase, Redis
- AI/ML: Amazon Bedrock, Amazon SES/SNS
- Payments: Stripe, Plaid
- Infrastructure: AWS, GCP, Azure, Kubernetes
- DevOps: GitHub, GitHub Actions
- Monitoring: Micrometer, CloudWatch, Google Analytics
To learn more about Cent Capital and join our mission to democratize financial wellness, visit our official website. Stay connected with our journey and get daily insights on our social channels, including Twitter (X), LinkedIn, Facebook, Instagram, Threads, TikTok, and Tumblr. You can read our long-form content on Medium and Substack, and listen to The Smarter Cents Podcast on Spotify, Apple Podcasts, and Amazon Music. Watch our latest videos on YouTube and join our community conversation on Reddit. For a deeper dive into our business, track our progress on Crunchbase, AngelList, F6S, and Product Hunt. Explore our code on GitHub, see our tech stack on StackShare, and check out our reviews and listings on G2, Clutch, SaaSHub, Yelp, and Foursquare.