As a software engineer working in healthcare, I’ve seen first-hand how patients want instant answers but compliance makes everything tricky. We can’t just throw PHI (Protected Health Information) into ChatGPT and call it a day.
So I decided to build a HIPAA-friendly chatbot using AWS Lambda (serverless backend) and Amazon Bedrock (LLM service) — with data masking to keep sensitive info safe.
📌 This post walks you through:
- Why chatbots in healthcare are challenging 🚑
- How to design a compliant architecture 🔐
- Code snippets for AWS Lambda + Bedrock ⚙️
- Tips for keeping PHI secure 🛡️
🚨 The Challenge: AI + Healthcare = Risk
- Chatbots are great for FAQs, triage, and scheduling.
- But if you send raw PHI (like names, MRNs, diagnoses) to an LLM… that’s a compliance nightmare.
- HIPAA requires minimum necessary access and strict controls.
🏗️ Architecture at a Glance
Flow:
- Patient message → API Gateway
- Lambda pre-processor → scrubs PHI (names, DOB, SSNs)
- Bedrock LLM → processes the masked query
- Lambda post-processor → maps placeholders back to real values if needed
- Response → returned to patient securely
API Gateway → Lambda (mask PHI) → Bedrock → Lambda (restore real values) → Patient
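End to end, a client call would look roughly like this — the endpoint URL below is a placeholder for whatever invoke URL your own API Gateway stage gives you:

```python
import requests

# Placeholder URL; substitute the invoke URL from your API Gateway deployment.
resp = requests.get(
    "https://abc123.execute-api.us-east-1.amazonaws.com/prod/chat",
    params={"q": "John Smith has a fever since 09/21/2025. Should he see a doctor?"},
)
print(resp.json())
```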
⚙️ Step 1: Setting up AWS Lambda
A basic Python Lambda handler:
```python
import json
import re

import boto3

bedrock = boto3.client('bedrock-runtime')


def mask_phi(text):
    # Toy example: mask dates and a hard-coded list of first names
    # (optionally followed by a surname). Real PHI detection needs far
    # more than regex — see the Comprehend Medical note at the end.
    text = re.sub(r'\d{2}/\d{2}/\d{4}', '[DATE]', text)
    text = re.sub(r'\b(Alice|Bob|John)(\s[A-Z][a-z]+)?\b', '[NAME]', text)
    return text


def lambda_handler(event, context):
    user_input = event['queryStringParameters']['q']
    masked_input = mask_phi(user_input)

    # Claude v2's Bedrock text API expects the Human/Assistant prompt format
    # and a max_tokens_to_sample value; json.dumps escapes the user input
    # safely instead of hand-building the JSON string.
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        contentType="application/json",
        accept="application/json",
        body=json.dumps({
            "prompt": f"\n\nHuman: {masked_input}\n\nAssistant:",
            "max_tokens_to_sample": 500,
        }),
    )

    return {
        "statusCode": 200,
        "body": response['body'].read().decode('utf-8'),
    }
```
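To sanity-check the handler locally, you can feed it a minimal API Gateway-style event. The fake event below is just a test fixture I'm assuming for illustration, and running it requires AWS credentials with Bedrock model access:

```python
if __name__ == "__main__":
    # Hypothetical test event mimicking the API Gateway proxy shape.
    fake_event = {"queryStringParameters": {"q": "Bob was born 01/02/1990. Is a flu shot safe?"}}
    print(lambda_handler(fake_event, None))
```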
🤖 Step 2: Talking to Bedrock Safely
- Always send masked input only.
- Example:
Input:

```
John Smith has a fever since 09/21/2025. Should he see a doctor?
```

Masked:

```
[NAME] has a fever since [DATE]. Should he see a doctor?
```
🔐 Step 3: Post-Processing
If you need to restore placeholders (e.g., turning `Hello [NAME]` back into `Hello John`), keep the placeholder-to-value mapping in server-side session state and swap the values back after the model responds — the real PHI never leaves your boundary.
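Here's a minimal sketch of that mapping, assuming the map lives in session state (in production it would sit in encrypted storage keyed by session ID; `mask_phi_with_map` and `restore_phi` are illustrative names, not part of the handler above):

```python
import re


def mask_phi_with_map(text):
    """Mask names and remember what each placeholder replaced (toy version)."""
    session_map = {}
    match = re.search(r'\b(Alice|Bob|John)\b', text)
    if match:
        session_map['[NAME]'] = match.group(0)
        text = text.replace(match.group(0), '[NAME]')
    return text, session_map


def restore_phi(text, session_map):
    """Swap placeholders back to real values after the LLM responds."""
    for placeholder, original in session_map.items():
        text = text.replace(placeholder, original)
    return text


# Usage:
#   masked, mapping = mask_phi_with_map("Hello John")          # "Hello [NAME]"
#   restore_phi("Hello [NAME], please rest.", mapping)         # "Hello John, please rest."
```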
✅ Why This Matters
- Patients get instant responses
- Engineers stay HIPAA-compliant
- Serverless (Lambda) keeps costs low
- Bedrock provides enterprise-grade LLMs without exposing PHI
🚀 Next Steps
- Add DynamoDB to store chat history (encrypted)
- Plug in Cognito for authentication
- Expand PHI scrubbing with Amazon Comprehend Medical (see the sketch below)
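As a taste of that last item, here's a sketch of a stronger masker built on Comprehend Medical's `detect_phi` API — the call itself is real, but the offset-splicing replacement strategy is just one simple way to use its output:

```python
import boto3

comprehend_medical = boto3.client('comprehendmedical')


def mask_phi_ml(text):
    """Replace each PHI entity Comprehend Medical detects with its type, e.g. [NAME], [DATE]."""
    entities = comprehend_medical.detect_phi(Text=text)['Entities']
    # Replace from the end of the string so earlier offsets stay valid.
    for entity in sorted(entities, key=lambda e: e['BeginOffset'], reverse=True):
        text = text[:entity['BeginOffset']] + f"[{entity['Type']}]" + text[entity['EndOffset']:]
    return text
```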
🙌 Closing Thoughts
As engineers, we often think "just ship the feature" — but in healthcare, privacy is the feature.
This project taught me that it’s possible to marry AI innovation with compliance if we design carefully.
👉 Repo link: GitHub – hipaa-chatbot-bedrock