How MongoDB Powers My Intelligent Job Matcher Application

Burra Sampath Mohan

Introduction

When I built Intelligent Job Matcher, I wanted one database that could handle flexible documents, support quick iteration, and serve multiple feature modules without a rigid schema migration every time I changed a field. MongoDB became the core data layer of the project.

In this blog, I explain:

  1. Why I chose MongoDB
  2. Where MongoDB sits in the architecture
  3. Real project code snippets
  4. End-to-end data flow from UI to API to MongoDB
  5. Lessons learned and next improvements

Why I Chose MongoDB for This Project

I selected MongoDB because my application stores multiple types of data that evolve over time:

  1. User profiles and authentication records
  2. Job documents with titles and descriptions
  3. Resume submissions
  4. Explainability reports with dynamic fields
  5. Analytics history records
  6. Job role taxonomy entries

A document database fits this use case very well because:

  1. Structure can vary between collections
  2. Development is fast for prototype-to-product flow
  3. JSON-like documents map naturally to API payloads
  4. Read and write operations are straightforward in Python with PyMongo

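As a concrete illustration of that last point, a job posting stored in MongoDB is just a nested JSON-like document, so it can travel from database to API response without an ORM mapping step. The field names below are illustrative, not the project's exact schema:

```python
import json

# Illustrative job document: nested JSON-like data, exactly
# the shape an API endpoint would return.
job_document = {
    "title": "Backend Engineer",
    "description": "Build and maintain REST APIs.",
    "skills": ["python", "django", "mongodb"],
    "experience_years": 3,
}

# The same structure serializes directly into an API payload.
api_payload = json.dumps(job_document)
print(api_payload)
```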
Where MongoDB Is Used in the Architecture (Higher-Level Usage)

MongoDB is not used only at the storage level. In my application, it is used at higher functional layers too.

1. Infrastructure/Data Access Layer

The MongoDB client is initialized once, and all collections are defined centrally.

2. Authentication Layer

User registration and login directly read/write user documents.

3. Core Matching Layer

The matching engine reads jobs and stores resume submissions.

4. Business/API Layer

Analysis history, explainability history, admin modules, and job roles all rely on MongoDB CRUD operations.

5. Analytics Layer

Charts are generated from analysis records stored in MongoDB.

6. Admin and Governance Layer

Admins can inspect users, analyses, and delete users with related cleanup.

This means MongoDB is the central system-of-record for the entire app, not just a passive backend component.


Code Snippet 1: MongoDB Connection and Collections

Source: utils.py

import os
from pymongo import MongoClient

mongo_uri = os.getenv("MONGODB_URI", "mongodb://localhost:27017/")
mongo_db_name = os.getenv("MONGODB_NAME", "intelligent_job_matcher")

client = MongoClient(mongo_uri)
db = client[mongo_db_name]

jobs_collection = db["jobs"]
resumes_collection = db["resumes"]
explainability_collection = db["explainability_reports"]
users_collection = db["users"]
analyses_collection = db["analyses"]
job_roles_collection = db["job_roles"]

What This Does

  1. Opens a single MongoDB client (PyMongo manages connection pooling internally)
  2. Selects the database named by an environment variable
  3. Exposes all domain collections for reuse across modules

Code Snippet 2: MongoDB in Authentication (Register/Login)

Source: auth_views.py

from datetime import datetime, timezone

from django.contrib.auth.hashers import check_password, make_password
from rest_framework.decorators import api_view
from rest_framework.response import Response

@api_view(['POST'])
def register_user(request):
    username = request.data.get("username")
    password = request.data.get("password")

    if not username or not password:
        return Response({"error": "Username and password are required"}, status=400)

    if users_collection.find_one({"username": username}):
        return Response({"error": "User already exists"}, status=409)

    users_collection.insert_one(
        {
            "username": username,
            "password_hash": make_password(password),
            # Note: deriving the role from the username is convenient
            # for a demo but unsafe for production
            "role": "admin" if "admin" in username.lower() else "user",
            "created_at": datetime.now(timezone.utc).isoformat(),
        }
    )

    return Response({"message": "User registered successfully"})
@api_view(['POST'])
def login_user(request):
    username = request.data.get("username")
    password = request.data.get("password")

    user = users_collection.find_one({"username": username})

    if user is None or not check_password(password, user.get("password_hash", "")):
        return Response({"error": "Invalid username or password"}, status=401)

    # JWT token creation happens after MongoDB validation

What This Does

  1. Uses MongoDB as user identity store
  2. Stores hashed passwords instead of plain text
  3. Authentication flow depends on MongoDB read/write operations

Code Snippet 3: MongoDB in Matching Engine (Read Jobs + Store Resume)

Source: hybrid_service.py

def run_hybrid_matching(resume_text):
    resume_text = remove_bias_terms(resume_text)

    resumes_collection.insert_one({
        "resume_text": resume_text
    })

    jobs = list(jobs_collection.find({}, {"_id": 0}))

    if not jobs:
        return []

    # semantic + skill + experience + role scoring
    # final ranking and top-5 output

What This Does

  1. Persists incoming resume text in MongoDB
  2. Fetches job documents from MongoDB as ranking input
  3. Produces scored recommendations

MongoDB directly feeds the ML/ranking logic here.
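The scoring step elided in the snippet above can be sketched as a weighted blend of per-job component scores. This is a simplified illustration: the `rank_jobs` helper, the weights, and the sample scores are assumptions, not the project's actual formula:

```python
def rank_jobs(jobs, component_scores, top_n=5):
    """Blend per-job component scores into a final ranking.

    `component_scores` maps each job title to scores in [0, 1];
    the weights below are illustrative placeholders.
    """
    weights = {"semantic": 0.5, "skill": 0.3, "experience": 0.2}
    ranked = []
    for job in jobs:
        scores = component_scores[job["title"]]
        final = sum(weights[key] * scores[key] for key in weights)
        ranked.append({**job, "final_score": round(final, 3)})
    # Highest final score first, keep only the top-N results
    ranked.sort(key=lambda item: item["final_score"], reverse=True)
    return ranked[:top_n]

jobs = [{"title": "Data Analyst"}, {"title": "Backend Engineer"}]
scores = {
    "Data Analyst": {"semantic": 0.4, "skill": 0.5, "experience": 0.6},
    "Backend Engineer": {"semantic": 0.9, "skill": 0.8, "experience": 0.7},
}
print(rank_jobs(jobs, scores))
```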


Code Snippet 4: MongoDB in Explainability and Analytics Persistence

Source: views.py

@api_view(["POST"])
@permission_classes([IsAuthenticated])
def save_explainability_record(request):
    payload = request.data or {}

    document = {
        "username": request.user.username,
        "job_title": payload.get("job_title") or "Untitled Role",
        "rank": payload.get("rank") or 1,
        "final_score": payload.get("final_score") or 0,
        "created_at": datetime.utcnow().isoformat(),
    }

    inserted = explainability_collection.insert_one(document)

    return Response(
        {"id": str(inserted.inserted_id)},
        status=201
    )
@api_view(["POST"])
@permission_classes([IsAuthenticated])
def save_analysis_record(request):
    payload = request.data or {}

    document = {
        "username": request.user.username,
        "recommended_jobs": payload.get("recommended_jobs") or [],
        "analyzed_at": payload.get("analyzed_at") or datetime.utcnow().isoformat(),
        "created_at": datetime.utcnow().isoformat(),
    }

    inserted = analyses_collection.insert_one(document)

    return Response(
        {"id": str(inserted.inserted_id)},
        status=201
    )

What This Does

  1. Stores explainability for transparency
  2. Stores analysis history for dashboards and charts
  3. Makes analytics reproducible from persistent data
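To turn those stored records into chart data, the history can be grouped per calendar day. The sketch below does the grouping client-side on already-fetched records (the sample documents are illustrative); the commented pipeline shows an equivalent server-side aggregation:

```python
from collections import Counter

# Sample documents shaped like the `analyses` records saved above;
# in the app these would come from analyses_collection.find(...).
records = [
    {"username": "mohan", "created_at": "2024-05-01T10:00:00"},
    {"username": "mohan", "created_at": "2024-05-01T18:30:00"},
    {"username": "mohan", "created_at": "2024-05-02T09:15:00"},
]

# ISO timestamps start with YYYY-MM-DD, so the first 10 characters
# give the calendar day for a per-day activity chart.
per_day = Counter(record["created_at"][:10] for record in records)

# Equivalent aggregation executed inside MongoDB:
# analyses_collection.aggregate([
#     {"$group": {"_id": {"$substrCP": ["$created_at", 0, 10]},
#                 "count": {"$sum": 1}}},
#     {"$sort": {"_id": 1}},
# ])
print(dict(per_day))
```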

Code Snippet 5: MongoDB in Admin and Job Roles Management

Source: views.py

@api_view(["DELETE"])
@permission_classes([IsAuthenticated])
def admin_delete_user(request, username):

    delete_user_result = users_collection.delete_one(
        {"username": username}
    )

    analyses_collection.delete_many(
        {"username": username}
    )

    explainability_collection.delete_many(
        {"username": username}
    )

    if delete_user_result.deleted_count == 0:
        return Response(
            {"error": "User not found"},
            status=404
        )

    return Response({"message": "User deleted"})
@api_view(["POST"])
@permission_classes([IsAuthenticated])
def add_job_role(request):

    role_name = (request.data.get("name") or "").strip()

    existing = job_roles_collection.find_one({
        "name": {
            "$regex": f"^{role_name}$",
            "$options": "i"
        }
    })

    if existing:
        return Response(
            {"error": "Role already exists"},
            status=409
        )

    inserted = job_roles_collection.insert_one({
        "name": role_name,
        "created_by": request.user.username,
        "created_at": datetime.utcnow().isoformat(),
    })

    return Response(
        {"id": str(inserted.inserted_id)},
        status=201
    )

What This Does

  1. Supports admin cleanup of linked user data
  2. Supports role taxonomy management with case-insensitive duplicate protection

Frontend to MongoDB Data Flow (Through API Layer)

The frontend never talks to MongoDB directly. It calls backend endpoints, and the backend performs MongoDB operations.

Example Frontend Calls

  1. Analyses history call: app.js:241
  2. Admin users call: app.js:1138
  3. Job roles call: app.js:1306

Data Flow

  1. User action in UI
  2. JavaScript calls API
  3. Django view executes business logic
  4. PyMongo reads/writes MongoDB
  5. API response returns to UI
  6. UI updates charts, cards, and tables

What This Proves About MongoDB Usage Level

In this application, MongoDB is used at:

  1. Data layer
  2. Authentication layer
  3. Core recommendation layer
  4. Explainability layer
  5. Analytics layer
  6. Admin operations layer
  7. Taxonomy/configuration layer

Therefore, MongoDB is used at a higher architectural level and is central to application behavior.


Lessons Learned

  1. Document model helped move fast during feature additions
  2. JSON-like records made API integration easy
  3. Flexible schema was useful for explainability payload evolution
  4. Collection-level separation improved module clarity

Next Improvements for Production

1. Add Indexes

  • users.username unique index
  • analyses.username + created_at
  • explainability_reports.username + created_at
  • job_roles.name normalized uniqueness
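These indexes could be created once at startup or in a small setup script. A sketch with PyMongo, assuming a running MongoDB and the same connection settings as utils.py; the `name_normalized` field is an assumption about how case-insensitive uniqueness would be stored:

```python
import os
from pymongo import MongoClient, ASCENDING, DESCENDING

client = MongoClient(os.getenv("MONGODB_URI", "mongodb://localhost:27017/"))
db = client[os.getenv("MONGODB_NAME", "intelligent_job_matcher")]

# Enforce unique usernames at the database level
db["users"].create_index([("username", ASCENDING)], unique=True)

# Speed up per-user history queries sorted by newest first
db["analyses"].create_index(
    [("username", ASCENDING), ("created_at", DESCENDING)]
)
db["explainability_reports"].create_index(
    [("username", ASCENDING), ("created_at", DESCENDING)]
)

# Unique index on a lowercased copy of the name, so duplicates are
# caught without the regex lookup used in add_job_role
db["job_roles"].create_index([("name_normalized", ASCENDING)], unique=True)
```

Since index creation is idempotent, running this script repeatedly is safe.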

2. Add Environment-Based Security Hardening

  • Move secret values to environment variables
  • Lock CORS origins in production

3. Add Token Refresh Endpoint

Support long-running authenticated sessions.

4. Add Archive Policy

Archive old resumes and analyses if the dataset grows large.
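A minimal archive sweep can lean on the fact that ISO-8601 timestamps sort lexicographically, so a plain string comparison selects old records. The retention window below is an illustrative choice, and the `delete_many` calls are shown commented out because they need the live collections:

```python
from datetime import datetime, timedelta, timezone

# Records older than this many days get archived or removed;
# the window is an illustrative choice, not a project setting.
RETENTION_DAYS = 180

cutoff = (datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)).isoformat()

# ISO-8601 strings sort lexicographically in timestamp order, so a
# $lt comparison on the stored strings selects the old records:
archive_filter = {"created_at": {"$lt": cutoff}}

# In the app this would run against the live collections, e.g.:
# resumes_collection.delete_many(archive_filter)
# analyses_collection.delete_many(archive_filter)
print(archive_filter)
```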


Conclusion

MongoDB is the backbone of Intelligent Job Matcher. It powers persistence, security-related account data, recommendation input data, explainability, analytics, admin governance, and role management.

This architecture demonstrates how a document database can effectively support both transactional workflows and analytical features in one cohesive system.


Team Credits

Developed by:

  • Burra Sampath Mohan
  • Suyash Ram
  • Kashif
  • Keran

Faculty Guidance

Special thanks to Chanda Rajkumar Sir for valuable guidance and support throughout the project.
