A fully serverless, AI-powered news app built with AWS Lambda, DynamoDB, Bedrock, and a SwiftUI client.
What if your news feed actually understood what you care about?
Not just broad categories like tech or sports, but specific interests like AI agents, Formula 1, or decentralized finance.
That question led me to build NewsAI, a fully serverless, AI-powered personalized news application running entirely on AWS, with a native iOS client built using SwiftUI.
In this article I walk through:
- The architecture
- The AWS services involved
- How the Strands Agents SDK connects everything to Claude on Amazon Bedrock
This project demonstrates how modern AI infrastructure can power real-world developer applications with almost zero operational overhead.
## Architecture Overview
- Everything is serverless
- No servers to manage
- No containers to orchestrate
Just Lambda functions, DynamoDB, and Claude doing the reasoning through Amazon Bedrock.
## AWS Services Used
| Service | Purpose |
|---|---|
| AWS Lambda | Handles the application logic including news retrieval and personalization |
| Amazon API Gateway | Exposes the REST API endpoints |
| Amazon DynamoDB | Stores user topic preferences |
| Amazon Bedrock | Provides Claude models for AI reasoning |
| Strands Agents SDK | Simplifies building AI agents on top of Bedrock |
| IAM | Provides secure permissions between services |
Infrastructure is deployed using the Serverless Framework (v3).
## Personalization Pipeline
### 1. User Selects Interests
The iOS application presents a set of topic categories. When a user selects a topic, subinterest chips appear beneath it.

These preferences are sent to the backend and stored in DynamoDB.
The handler distinguishes between new and returning users. New users get a full record created, while returning users receive a partial update that only patches the fields they changed.
```javascript
if (!existing) {
  const preferences = {
    userId,
    topics,
    interestDescription,
    language: "en",
    maxArticles: 10,
    updatedAt: new Date().toISOString(),
  }
  await putPreferences(preferences)
} else {
  await updatePreferences(userId, { topics, interestDescription })
}
```
When users select subinterests like LLMs or Agents under the AI topic, those selections are combined into an `interestDescription` field such as "Specifically interested in: LLMs, Agents". This description is later passed directly into the AI prompt and is what enables fine-grained relevance scoring beyond broad topic matching.
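As a sketch of how the selected chips might be turned into that stored description (the helper name is hypothetical, not from the project):

```javascript
// Hypothetical helper: join the selected subinterest chips into the
// interestDescription string stored with the user's preferences.
function buildInterestDescription(subinterests) {
  if (subinterests.length === 0) return ""
  return `Specifically interested in: ${subinterests.join(", ")}`
}

console.log(buildInterestDescription(["LLMs", "Agents"]))
// → "Specifically interested in: LLMs, Agents"
```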
### 2. Fetch Raw News
When the user opens the application, the client calls the `/news` API endpoint.
The backend retrieves preferences and fetches news articles from the NewsAPI service using axios. Requests for each topic run in parallel using Promise.allSettled, which means a failure in one topic does not break the entire request.
```javascript
const requests = preferences.topics.map(topic =>
  axios.get("https://newsapi.org/v2/everything", {
    params: { q: topic, language, pageSize, sortBy: "publishedAt", apiKey },
  })
)
const responses = await Promise.allSettled(requests)
```
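One detail worth noting: `Promise.allSettled` never rejects, so the results still have to be filtered down to the fulfilled responses before use. A minimal sketch (the helper name is mine, not the project's):

```javascript
// Keep only the topics whose NewsAPI request succeeded; a failed topic
// is skipped instead of failing the whole /news request.
function collectArticles(settledResponses) {
  return settledResponses
    .filter(result => result.status === "fulfilled")
    .flatMap(result => result.value.data.articles)
}

// Example with one successful and one failed topic request:
const settled = [
  { status: "fulfilled", value: { data: { articles: [{ title: "AI agents in 2025" }] } } },
  { status: "rejected", reason: new Error("timeout") },
]
console.log(collectArticles(settled).length) // → 1
```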
### 3. AI Personalization Using Strands Agents
The raw articles are sent to an AI agent powered by Claude.
The agent performs three tasks:
- Score each article from 0 to 100 based on relevance
- Generate a one-sentence summary
- Match each article to the user's topics
Example agent configuration:
```javascript
import { Agent, BedrockModel } from "@strands-agents/sdk"

const agent = new Agent({
  model: new BedrockModel({
    modelId: "eu.anthropic.claude-sonnet-4-6",
    region: "eu-north-1",
    maxTokens: 4096,
  }),
  systemPrompt: `You are a news personalization engine. Given a user's topic
preferences, their specific interest description, and a list of news articles,
you score each article (0-100) on relevance to the user, write a one-sentence
summary, and identify which of the user's topics it matches.

When the user provides a detailed interest description, use it to score more
precisely. Articles that closely match their specific interests should score
much higher than articles that only match the broad topic category.`,
})
```
The user prompt includes both the broad topics and the specific interest description, which is what allows the agent to differentiate between a general AI article and one specifically about LLM agents.
```
User preferences:
Topics of interest: technology, ai
Specific interests: Specifically interested in: LLMs, Agents

Articles to evaluate (30 total):
[...]

Only include articles with relevanceScore >= 30.
Sort by relevanceScore descending.
```
The agent instance is cached at the module level, so warm Lambda invocations reuse the same agent without reinitializing the Bedrock connection.
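The caching pattern is the standard module-scope trick for Lambda. A sketch, with a stub in place of the real Agent construction shown above:

```javascript
// Stand-in for the real Agent construction from the previous snippet.
function createAgent() {
  return { invoke: async prompt => `ranked results for: ${prompt}` }
}

// Module-scope cache: this variable survives across warm invocations of
// the same Lambda container, so the agent is only built on cold starts.
let cachedAgent = null

function getAgent() {
  if (!cachedAgent) {
    cachedAgent = createAgent()
  }
  return cachedAgent
}

console.log(getAgent() === getAgent()) // → true
```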
### 4. Return Ranked Results
- Articles scoring below 30 are filtered out
- Remaining articles are sorted by relevance and returned to the mobile client
- The SwiftUI interface displays them as a modern native news feed
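The threshold-and-sort step is simple enough to sketch directly (the field name follows the `relevanceScore` used in the prompt; the helper is illustrative):

```javascript
// Drop low-relevance articles and return the rest ranked best-first.
function rankArticles(scoredArticles, threshold = 30) {
  return scoredArticles
    .filter(article => article.relevanceScore >= threshold)
    .sort((a, b) => b.relevanceScore - a.relevanceScore)
}

const sample = [
  { title: "General AI news", relevanceScore: 45 },
  { title: "Local sports roundup", relevanceScore: 12 },
  { title: "LLM agents in production", relevanceScore: 92 },
]
console.log(rankArticles(sample).map(a => a.title))
// → [ 'LLM agents in production', 'General AI news' ]
```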
## Why Use the Strands Agents SDK
Many AI frameworks exist today. This project intentionally keeps the stack minimal.
Benefits include:
- Native Bedrock integration
- Automatic retry and streaming handling
- Agent reuse across Lambda warm invocations
- Very small API surface
In practice, the actual model invocation is a single call:

```javascript
const result = await agent.invoke(prompt)
```
The surrounding code handles prompt construction and response parsing, but the SDK abstracts away all Bedrock connection management, retries, and streaming.
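The parsing side is not shown in the article; one common approach (an assumption on my part, not the project's actual code) is to extract the first JSON array from the model's text output:

```javascript
// Hypothetical parser: the agent is asked to return a JSON array, but
// models occasionally wrap it in prose, so pull out the bracketed block.
function parseAgentResponse(text) {
  const match = text.match(/\[[\s\S]*\]/)
  if (!match) {
    throw new Error("No JSON array found in agent response")
  }
  return JSON.parse(match[0])
}

const raw = 'Here are the ranked articles:\n[{"title": "A", "relevanceScore": 80}]'
console.log(parseAgentResponse(raw)[0].relevanceScore) // → 80
```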
## Amazon Bedrock Inference Profiles
Claude models on Bedrock require inference profiles for scalable usage.
Instead of using the raw model identifier, a regional profile is used.
Example:

```
eu.anthropic.claude-sonnet-4-6
```
This allows AWS to automatically route inference requests across European regions while keeping data within the EU.
This is important for compliance sensitive applications.
## The SwiftUI Client
The client application is written in native SwiftUI and includes:
- A glass-style interface using `ultraThinMaterial`
- An interactive topic picker
- A built-in article reader using `SFSafariViewController`
- Automatic dark mode support
- Local preference persistence
Example SwiftUI interaction:

```swift
ForEach(Topic.all) { topic in
    TopicCard(topic: topic) {
        toggleSelection(topic)
    }
}
```
## Infrastructure as Code
The backend infrastructure is defined in a single Serverless configuration file.
```yaml
service: news-ai

provider:
  name: aws
  runtime: nodejs20.x
  region: eu-north-1
```
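The two endpoints would be wired into the same file along these lines (the handler paths and the preferences route name are illustrative assumptions, not taken from the project):

```yaml
functions:
  getNews:
    handler: src/handlers/news.handler        # illustrative path
    events:
      - http:
          path: news
          method: get
  savePreferences:
    handler: src/handlers/preferences.handler # illustrative path
    events:
      - http:
          path: preferences
          method: post
```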
Deployment requires only:

```shell
sls deploy
```
For local development, the project includes `serverless-offline`, which lets you run the full API locally without deploying:

```shell
sls offline start
```
This starts a local server at http://localhost:3000 with both endpoints available.
## Lessons Learned
- AI-powered personalization benefits significantly from detailed interest signals
- Broad categories such as AI are too vague
- Subinterest signals such as LLMs or Agents dramatically improve relevance scoring because they flow directly into the agent prompt as a specific interest description
- Warm Lambda reuse of agents also improves performance significantly
## Future Improvements
- Push notifications for breaking news
- Tracking reading history
- Additional sources such as RSS and Hacker News
- Agent tools for contextual search and fact verification
## Try It Yourself

- Clone the repository: https://github.com/rashwanlazkani/news-ai
- Set your `NEWS_API_KEY` environment variable (an API key can be retrieved from https://newsapi.org)
- Run `sls deploy`
- Enable Bedrock access from the AWS Console
- Set the `API_BASE_URL` in the iOS project's `Info.plist` to your API Gateway URL
- Build and run the iOS client
You now have a complete AI-powered application running entirely on AWS.
## Final Thoughts
Modern AI infrastructure has made it possible to build production-ready intelligent applications with minimal operational effort.
Combining:
- AWS Lambda
- Amazon DynamoDB
- Amazon Bedrock
- Strands Agents SDK
creates an extremely powerful developer stack.
Add SwiftUI on the client side and you have a modern AI application architecture.
If you enjoyed this article and want to see more deep dives into serverless AI architectures, follow along for future projects.