💡 The Spark
The idea that an ML model delivers exactly what you train it for was the spark I needed to take my first step into AI/ML.
While working on a web development project, I explored how AI algorithms could be integrated to generate meaningful predictions. That experience became the perfect opportunity for me to:
- Dive deeper into AI/ML,
- Polish my understanding,
- And start building real-world AI features.
The Project: Inspire Sphere
The project I worked on was Inspire Sphere, my first full-stack web app built with HTML5 + Node.js.
✨ It's a quote generator + literature-writing platform with user profiles.
🎯 The Idea Behind It
I wanted to predict the categories of quotes uploaded by users.
For this, I chose the Scikit-learn library and these building blocks:
- `sklearn.feature_extraction.text` to process the text content of quotes.
- `TfidfVectorizer` to turn each quote into a sparse vector of TF-IDF scores, which weight a word by how often it appears in a quote and how rare it is across all quotes (a short example follows this list).
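To make that concrete, here is a minimal sketch of what `TfidfVectorizer` produces, using two made-up quotes purely for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Two toy quotes, purely for illustration
sample_quotes = [
    "Dream big and work hard",
    "Hard work beats talent",
]

vectorizer = TfidfVectorizer()
tfidf_matrix = vectorizer.fit_transform(sample_quotes)  # sparse TF-IDF matrix

print(tfidf_matrix.shape)                   # (2, number_of_unique_words)
print(vectorizer.get_feature_names_out())   # the learned vocabulary
```

Each row corresponds to one quote; most entries are zero, which is why the matrix is stored in sparse form.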
The Algorithm Used: Naive Bayes
I used Naive Bayes (`MultinomialNB`), which works well with word-count and TF-IDF features like the ones produced above.
```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, accuracy_score
import joblib

from Dataset.load import quotes, labels

# Hold out 20% of the quotes for evaluation
X_train, X_test, y_train, y_test = train_test_split(
    quotes, labels, test_size=0.2, random_state=42
)
```
- The dataset (quotes + categories) was sourced from Kaggle.
- The classifier learned to assign categories to user-generated quotes.
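Continuing the training snippet above, the rest of the flow looked roughly like this. This is a minimal sketch: the saved file name is my placeholder, not the exact one in the repo.

```python
# Vectorizer + classifier chained into a single object
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(X_train, y_train)

# Quick evaluation on the held-out quotes
y_pred = model.predict(X_test)
print(accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))

# Persist the trained pipeline for the API (file name is a placeholder)
joblib.dump(model, "quote_classifier.pkl")
```

Because the vectorizer and classifier live in one pipeline, the API only needs to load a single `.pkl` file and can pass raw text straight to `predict`.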
Clean Code Structure
Even though Python is concise, I maintained a modular directory structure for clarity:
- Separate submodules for preprocessing, training, and model storage (an example layout is sketched after this list).
- This keeps the AI/ML branch of Inspire Sphere scalable and maintainable.
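For illustration only, such a layout might look like this. The folder names here are indicative, not the exact ones in the repository; only `Dataset/load` appears in the code above.

```text
ml-service/
├── Dataset/
│   └── load.py        # exposes `quotes` and `labels` from the Kaggle dataset
├── preprocessing/     # text-cleaning helpers
├── training/          # pipeline definition and train/evaluate script
└── models/            # saved .pkl files
```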
Integration with Node.js Using FastAPI
Once trained, the model was saved as a `.pkl` file using `joblib`.
For integration:
- I built a FastAPI POST endpoint in Python.
```python
from fastapi import FastAPI
from pydantic import BaseModel
import joblib

app = FastAPI()
model = joblib.load(model_path)  # model_path points to the .pkl pipeline saved during training


class Quote(BaseModel):
    text: str


@app.post("/predict")
def predict(quote: Quote):
    prediction = model.predict([quote.text])
    return {"category": prediction[0]}
```
- Node.js called this API to fetch predictions.
```javascript
// Inside the Node.js service that handles quotes (axios is imported at the top of the file)
async getCategory(quoteText) {
  try {
    const response = await axios.post(`${process.env.ML_SERVER_URL}/predict`, {
      text: quoteText,
    });
    return response.data.category;
  } catch (error) {
    console.error("Error while getting category", error);
    return null;
  }
}
```
- The API was hosted as a web service using Render.
This way, my Node.js app could consume AI predictions seamlessly.
✅ The Outcome
The ML model was able to:
- Predict categories of quotes uploaded by users.
- Enrich the HTML pages of Inspire Sphere with intelligent categorization.
This Is Just the Start
This was just my first step into ML/AI model integration.
⚠️ The model's accuracy is still low and needs improvement.
But it opened the door for me to explore more advanced:
- Models,
- Feature engineering,
- And AI-powered integrations.
Stay tuned: I'll be sharing more updates as I refine Inspire Sphere's AI/ML domain.
Over to You
Have you ever integrated an AI/ML model into a web app?
What stack did you use, and what challenges did you face?
Let's talk in the comments!