A fintech company in Bangladesh approached us with what seemed like a simple request: "Build us a customer support chatbot that speaks Bengali and English."
Simple, right? Just translate responses and you're done.
Wrong.
The real challenge wasn't translation—it was code-switching.
Bangladeshi users don't speak "pure Bengali" or "pure English" in conversations. They mix both languages naturally in a single sentence:
"Amar account e balance koto ache?" ("How much balance is in my account?")
"Transaction failed hoyeche, ki korbo?" ("The transaction failed, what should I do?")
"Minimum balance maintain korte hobe koto?" ("How much minimum balance do I have to maintain?")
When we deployed our first version with simple Bengali/English detection, the results were catastrophic.
Why Standard Translation Failed
The chatbot would:
Detect "Amar account" as Bengali → Respond fully in Bengali
User writes "transaction history" → Bot switches to full English
User mixes again → Bot gets confused and asks "Which language do you prefer?"
User frustration: Immediate.
Conversation abandonment rate: 47%
Support ticket escalation: 3x higher than expected
The Real Problem: Cultural Communication Patterns
This wasn't a translation problem—it was a cultural intelligence problem.
In Bangladesh, code-switching isn't random. It follows patterns:
Technical terms stay in English: "transaction," "balance," "OTP," "PIN"
Conversational structure stays in Bengali: "koto," "ki korbo," "kivabe"
Formal banking terms: English ("account statement," "interest rate")
Emotional/casual expressions: Bengali ("bujhlam na," "problem hocche")
Our chatbot needed to understand not just words, but cultural communication norms.
Failed Approaches: What Didn't Work
Attempt 1: Language Detection Per Message
Logic: Detect dominant language, respond in that language
Result: Constantly switching languages mid-conversation. Users confused.
Accuracy: 40%
Attempt 2: Ask User to Choose Language
Prompt: "Do you prefer Bengali or English?"
Result: Users don't want to choose—they want natural conversation. Increased drop-off.
Engagement: -35%
Attempt 3: Translate Everything to Bengali
Logic: Force Bengali-only responses for "local feel"
Result: Technical terms sounded awkward. "লেনদেন ব্যর্থ হয়েছে" ("the transaction has failed") instead of the natural "Transaction failed hoyeche."
User satisfaction: Low
The Breakthrough: Code-Switching Intelligence System
We realized the bot needed to mirror how Bangladeshi users actually speak—mixing languages naturally based on context, not forcing a single language.
We built a three-layer system:
Layer 1: Contextual Language Detection
Instead of detecting "is this Bengali or English," we detect:
Technical context (banking terms, features) → Prefer English
Conversational context (questions, confirmations) → Prefer Bengali
User's language pattern in current session → Match their style
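A minimal sketch of this layer, assuming a keyword-based heuristic (the term lists, scoring, and function names here are illustrative, not the production rules; it handles romanized input only):

```python
import re

# Contextual language-preference detection (illustrative sketch).
# Instead of classifying a whole message as "Bengali" or "English",
# score which register each token belongs to and derive a preference.

TECHNICAL_TERMS = {"transaction", "balance", "otp", "pin", "statement", "account"}
BENGALI_MARKERS = {"koto", "keno", "kivabe", "ki", "korbo", "ache", "hobe"}

def detect_context(message: str) -> dict:
    # Extract alphabetic tokens, dropping punctuation like "ache?".
    tokens = re.findall(r"[a-z]+", message.lower())
    technical = sum(t in TECHNICAL_TERMS for t in tokens)
    conversational = sum(t in BENGALI_MARKERS for t in tokens)
    return {
        "technical_terms": technical,
        "conversational_markers": conversational,
        # Mixed input -> mirror the mix instead of picking one language.
        "style": "mixed" if technical and conversational
                 else "english" if technical else "bengali",
    }
```

For "Amar account e balance koto ache?" this yields a "mixed" style, which is the signal to mirror rather than switch.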
Layer 2: Term Classification Engine
We categorized vocabulary into three types:
Type 1 - Always English:
Technical: transaction, balance, OTP, PIN, account number
Banking: interest rate, statement, loan, EMI
Type 2 - Always Bengali:
Questions: koto, keno, kivabe, ki korbo
Confirmations: bujhechi, thik ache, hobe
Type 3 - Flexible (Context-Dependent):
Numbers: can be Western digits (5,000) or Bengali numerals (৫,০০০)
Dates/Time: mixed format acceptable
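The three categories above can be sketched as a simple lookup; the vocabularies here are abbreviated stand-ins for the full term library:

```python
# Term classification engine (sketch; abbreviated vocabularies).

ALWAYS_ENGLISH = {"transaction", "balance", "otp", "pin", "account number",
                  "interest rate", "statement", "loan", "emi"}
ALWAYS_BENGALI = {"koto", "keno", "kivabe", "ki korbo",
                  "bujhechi", "thik ache", "hobe"}

def classify_term(term: str) -> str:
    """Return which language a term should be rendered in."""
    t = term.lower().strip()
    if t in ALWAYS_ENGLISH:
        return "english"
    if t in ALWAYS_BENGALI:
        return "bengali"
    return "flexible"  # numbers, dates/times: follow the user's format
```

The flexible bucket is deliberately the default: anything not pinned to one language mirrors whatever format the user used.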
Layer 3: Response Mirroring
The bot analyzes user input and mirrors their code-switching pattern:
User writes: "Amar account balance koto?"
Bot responds: "Apnar account balance হলো ৳5,000।" ("Your account balance is ৳5,000.")
User writes: "Transaction history kivabe dekhbo?"
Bot responds: "Transaction history দেখতে 'Statement' option এ click করুন।" ("To see your transaction history, click the 'Statement' option.")
The bot became a linguistic chameleon—adapting to each user's natural speech pattern.
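At its simplest, mirroring means selecting a response template that matches the detected style. A sketch, with an illustrative intent name and templates (the production system generates these rather than hard-coding them):

```python
# Response mirroring (sketch): pick the template matching the user's
# detected mixing style; fall back to "mixed", the most common register.

TEMPLATES = {
    "balance_query": {
        "mixed":   "Apnar account balance হলো ৳{amount}।",
        "english": "Your account balance is ৳{amount}.",
        "bengali": "আপনার একাউন্টে ৳{amount} আছে।",
    },
}

def mirror_response(intent: str, style: str, **slots) -> str:
    template = TEMPLATES[intent].get(style, TEMPLATES[intent]["mixed"])
    return template.format(**slots)
```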
The Implementation: Prompt Engineering Solution
We didn't need complex NLP models. We solved it at the prompt level:
System Instruction:
Code-Switching Protocol for Bengali-English:
Analyze user's language mixing pattern
Maintain these terms in English ALWAYS:
Technical: transaction, balance, OTP, PIN, account
Actions: login, statement, transfer, payment
Keep conversational structure in Bengali:
Questions: koto, ki, kivabe, keno
Responses: hobe, ache, dekhte parben
Mirror user's style:
If user writes "balance koto," respond "balance হলো..."
If user writes "কত টাকা আছে," respond "আপনার একাউন্টে..."
Never ask "Which language do you prefer?"
Never force full Bengali or full English
Keep it natural—like talking to a bilingual friend
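Wiring a protocol like this into a chat request can look roughly like the following. This assumes an OpenAI-style messages format; the prompt text is condensed from the protocol above, and the function name is a placeholder, not the production stack:

```python
# Prepend the code-switching protocol so every turn inherits the rules.
# Assumes an OpenAI-style chat messages format (role/content dicts).

SYSTEM_PROMPT = """Code-Switching Protocol for Bengali-English:
- Keep technical terms in English ALWAYS: transaction, balance, OTP, PIN, account.
- Keep conversational structure in Bengali: koto, ki, kivabe, keno.
- Mirror the user's mixing style; never ask which language they prefer.
- Never force full Bengali or full English."""

def build_messages(history: list[dict], user_message: str) -> list[dict]:
    """Assemble the message list for one chat-completion call."""
    return ([{"role": "system", "content": SYSTEM_PROMPT}]
            + history
            + [{"role": "user", "content": user_message}])
```

Keeping the protocol in the system role (rather than per-turn user text) is what stops the bot from drifting back to single-language replies mid-conversation.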
The Results
Before Fix:
Conversation abandonment: 47%
Language confusion complaints: Daily
Support escalation: 3x baseline
User satisfaction: 2.1/5
Avg conversation length: 3.2 messages
After Fix:
Conversation abandonment: 12%
Language confusion: Rare (less than 5%)
Support escalation: Normal baseline
User satisfaction: 4.3/5
Avg conversation length: 8.7 messages
Business Impact:
74% reduction in abandonment rate (from 47% to 12%)
60% fewer support escalations
Customer satisfaction doubled
Users completing transactions in chat (instead of calling support)
Technical Insights: What We Learned
Language Detection ≠ Language Understanding
Knowing if a sentence is "Bengali" or "English" is useless when users mix both. Context matters more than classification.
Cultural Communication Has Rules
Code-switching isn't random chaos. There are predictable patterns based on:
Topic formality (banking terms vs casual chat)
Technical complexity (English for tech, Bengali for conversation)
Emotional tone (Bengali for empathy, English for instructions)
Mirroring Builds Trust
When the bot matches the user's natural speech pattern, users feel understood. They don't notice the bot is "smart"—they just feel comfortable.
Don't Force Standards on Users
Asking users to "choose a language" or "speak clearly in one language" breaks the natural flow. Adapt to them, don't make them adapt to you.
Implementation Tips for Multilingual Chatbots
If you're building chatbots for bilingual or multilingual markets:
Map Language Patterns First
Research how your target users actually speak. Record real conversations. Identify which terms stay in which language.
Create Term Libraries
Build three categories: Always Language A, Always Language B, Flexible. This guides consistent responses.
Mirror, Don't Force
Let the bot adapt to user patterns rather than forcing users into rigid language choices.
Test with Real Users, Not Translators
Native speakers who naturally code-switch will catch issues that professional translators miss.
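Those real-user conversations also make a cheap regression suite. A sketch: pair each code-switched utterance with the terms the reply must keep in English (the cases and helper below are illustrative):

```python
# Regression check for code-switching rules (sketch).
# Each case: a mixed-language utterance + terms that must stay English.

CASES = [
    ("Amar account e balance koto ache?", ["balance"]),
    ("Transaction history kivabe dekhbo?", ["transaction"]),
]

def check_reply(reply: str, required_english: list[str]) -> bool:
    """Pass only if every required term appears in English in the reply."""
    lower = reply.lower()
    return all(term in lower for term in required_english)
```

Running this against every bot reply catches regressions like a model update that starts translating "transaction" into "লেনদেন".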
The Core Lesson
The biggest mistake in multilingual AI: Treating languages as separate systems.
In reality, bilingual users operate in a blended communication style. They don't "switch" between languages—they flow naturally using whichever words feel right for the context.
Our chatbot stopped being a "Bengali bot" or "English bot" and became a "Bangladeshi bot"—speaking the way real Bangladeshis actually communicate.
Accuracy jumped from 40% to 92% not because we built better translation—but because we stopped translating and started mirroring.
Your Turn
Are you building chatbots for bilingual markets?
Have you encountered code-switching challenges in your conversational AI projects?
What language patterns have you noticed in your target audience?
Written by Faraz Farhan
Senior Prompt Engineer and Team Lead at PowerInAI
Building AI automation solutions that understand cultural communication
www.powerinai.com
Tags: multilingual, chatbot, conversationalai, codeswitching, localization, promptengineering