Nadim Chowdhury
MoodFeed: Building an AI-Powered Social Feed That Actually Gets You

Remember scrolling through your feed at 2 AM, feeling anxious, and the algorithm keeps showing you success stories that make you feel worse? Yeah, me too. That's exactly why I built MoodFeed.

The Problem Nobody's Talking About

Social media feeds are dumb. They think because you liked one post about productivity, you want to see 50 more. They don't care if you're having the worst day of your life or celebrating a major win. Same content, different mood, wrong vibe.

I spent three months building something different. A feed that actually adapts to how you're feeling right now. Not what you liked yesterday. Not what your demographic usually engages with. But what YOU need in THIS moment.

What Makes MoodFeed Different?

The app does something simple but powerful: it asks how you're feeling (or detects it through your camera), then curates content that matches or improves that mood. Feeling stressed? Here are calming memes and chill videos. Pumped up? Motivational content and hype music. Heartbroken? We've got wholesome stories and comfort food recipes.

But here's the kicker—it learns. The more you use it, the better it gets at understanding your emotional patterns. It knows you prefer dark humor when you're down, or that nature videos calm your anxiety better than meditation content.

The Tech Behind the Magic

I built this entirely with Flutter because I wanted one codebase for iOS, Android, and potentially web. No React Native drama, no separate teams. Just clean, fast, beautiful Flutter.

The Stack

Frontend: Flutter (obviously) with Provider for state management and dio for HTTP requests. I kept it simple because complexity kills side projects.

Backend: Node.js with Express because it's fast to prototype and scales well. MongoDB for the database—perfect for storing user preferences and content metadata without rigid schemas.

AI Integration: This is where it gets interesting. I'm using OpenAI's API for sentiment analysis on user input and Hugging Face models for facial emotion detection. The content tagging system uses a combination of pre-labeled data and real-time analysis.
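
To make that concrete, here's roughly how a classifier's output could get mapped to the app's mood buckets. The label set mirrors a typical Hugging Face facial-emotion model, but the labels, the mapping table, and the confidence floor below are illustrative, not the production code:

```javascript
// Sketch: mapping raw emotion-classifier output to app-level mood categories.
// EMOTION_TO_MOOD and the mood names are assumptions for illustration.
const EMOTION_TO_MOOD = {
  happy: 'energized',
  sad: 'down',
  angry: 'stressed',
  fear: 'stressed',
  surprise: 'energized',
  neutral: 'neutral',
};

// predictions: [{ label: 'sad', score: 0.72 }, ...] — the usual shape of a
// classification pipeline's output. Returns the mood for the top label.
function detectMood(predictions, minConfidence = 0.4) {
  const top = predictions.reduce((a, b) => (b.score > a.score ? b : a));
  // Below the confidence floor, fall back to asking the user directly.
  if (top.score < minConfidence) return 'unknown';
  return EMOTION_TO_MOOD[top.label] ?? 'neutral';
}

console.log(detectMood([
  { label: 'sad', score: 0.72 },
  { label: 'neutral', score: 0.2 },
  { label: 'happy', score: 0.08 },
])); // → down
```

The confidence floor matters: a shaky prediction should never silently drive the feed.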

Infrastructure: Docker containers for everything, deployed on AWS with CloudFront for CDN. Nothing fancy, just reliable.

System Architecture: Keeping It Real

I'm not going to pretend this is some enterprise-grade system with microservices and Kubernetes. It's a clean three-tier architecture that actually makes sense:

┌─────────────────────────────────────┐
│         Flutter Mobile App          │
│  (iOS, Android, Web - One Codebase) │
└─────────────┬───────────────────────┘
              │
              │ REST API / WebSocket
              │
┌─────────────▼───────────────────────┐
│         Express.js Server           │
│  • Auth & User Management           │
│  • Content Aggregation Engine       │
│  • Mood Processing Pipeline         │
│  • Real-time Mood Updates           │
└─────────────┬───────────────────────┘
              │
         ┌────┴────┐
         │         │
┌────────▼──┐  ┌───▼───────────┐
│  MongoDB  │  │  AI Services  │
│  Database │  │  • OpenAI     │
└───────────┘  │  • HuggingFace│
               └───────────────┘

The flow is dead simple:

  1. User opens app → mood check (manual selection or camera)
  2. Mood data sent to backend → AI processes emotion
  3. Content engine queries database → filters by mood tags
  4. Personalized feed sent back → user gets exactly what they need
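
In code, steps 2–4 boil down to something like this. The content shape and tag names are made up for the sketch, and the in-memory array stands in for the MongoDB collection:

```javascript
// A tiny in-memory stand-in for the content collection.
const contentPool = [
  { id: 1, type: 'video', moodTags: ['stressed', 'down'], score: 0.9 },
  { id: 2, type: 'meme',  moodTags: ['energized'],        score: 0.8 },
  { id: 3, type: 'story', moodTags: ['down'],             score: 0.7 },
];

// Step 3: filter by mood tag, best-scoring content first.
function queryByMood(pool, mood) {
  return pool
    .filter((item) => item.moodTags.includes(mood))
    .sort((a, b) => b.score - a.score);
}

// Step 4: shape the personalized feed payload sent back to the app.
function buildFeedResponse(mood, pool, limit = 20) {
  return { mood, items: queryByMood(pool, mood).slice(0, limit) };
}

console.log(buildFeedResponse('down', contentPool).items.map((i) => i.id)); // → [ 1, 3 ]
```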

Folder Structure: No BS Organization

I hate messy codebases. Here's how I organized the Flutter project:

moodfeed/
├── lib/
│   ├── main.dart
│   ├── config/
│   │   ├── theme.dart
│   │   ├── routes.dart
│   │   └── constants.dart
│   ├── core/
│   │   ├── api/
│   │   │   ├── api_client.dart
│   │   │   └── endpoints.dart
│   │   ├── models/
│   │   │   ├── user_model.dart
│   │   │   ├── mood_model.dart
│   │   │   ├── content_model.dart
│   │   │   └── feed_item_model.dart
│   │   └── utils/
│   │       ├── mood_detector.dart
│   │       └── camera_helper.dart
│   ├── features/
│   │   ├── auth/
│   │   │   ├── screens/
│   │   │   │   ├── login_screen.dart
│   │   │   │   └── signup_screen.dart
│   │   │   ├── widgets/
│   │   │   └── providers/
│   │   │       └── auth_provider.dart
│   │   ├── mood/
│   │   │   ├── screens/
│   │   │   │   ├── mood_selector_screen.dart
│   │   │   │   └── mood_camera_screen.dart
│   │   │   ├── widgets/
│   │   │   │   ├── mood_card.dart
│   │   │   │   └── emotion_indicator.dart
│   │   │   └── providers/
│   │   │       └── mood_provider.dart
│   │   ├── feed/
│   │   │   ├── screens/
│   │   │   │   ├── home_feed_screen.dart
│   │   │   │   └── content_detail_screen.dart
│   │   │   ├── widgets/
│   │   │   │   ├── feed_card.dart
│   │   │   │   ├── video_player_widget.dart
│   │   │   │   └── meme_viewer.dart
│   │   │   └── providers/
│   │   │       └── feed_provider.dart
│   │   └── profile/
│   │       ├── screens/
│   │       │   ├── profile_screen.dart
│   │       │   └── mood_stats_screen.dart
│   │       └── widgets/
│   │           ├── mood_chart.dart
│   │           └── stats_card.dart
│   └── shared/
│       ├── widgets/
│       │   ├── custom_button.dart
│       │   └── loading_indicator.dart
│       └── theme/
│           └── app_colors.dart
├── assets/
│   ├── images/
│   ├── icons/
│   └── animations/
├── test/
└── pubspec.yaml

Backend structure is equally clean:

moodfeed-backend/
├── src/
│   ├── config/
│   │   ├── database.js
│   │   └── environment.js
│   ├── models/
│   │   ├── User.js
│   │   ├── Content.js
│   │   └── MoodLog.js
│   ├── routes/
│   │   ├── auth.routes.js
│   │   ├── mood.routes.js
│   │   ├── feed.routes.js
│   │   └── user.routes.js
│   ├── controllers/
│   │   ├── authController.js
│   │   ├── moodController.js
│   │   └── feedController.js
│   ├── services/
│   │   ├── aiService.js
│   │   ├── contentService.js
│   │   └── recommendationEngine.js
│   ├── middleware/
│   │   ├── auth.middleware.js
│   │   └── errorHandler.js
│   └── utils/
│       ├── logger.js
│       └── validators.js
├── tests/
├── package.json
└── server.js

The UI/UX Philosophy: Less is More

I'm tired of apps that assault your eyes with gradients and animations. MoodFeed's design is brutally simple:

Color Psychology: Each mood has a subtle color theme. Not in-your-face, just enough to create the right atmosphere. Calm blues for relaxation, warm oranges for energy, soft purples for reflection.

Micro-interactions: Every tap, swipe, and scroll feels intentional. No random animations that slow you down. Just smooth, purposeful transitions that guide you through the experience.

Content-First: The UI gets out of the way. Large, readable cards with clear typography. No clutter, no distractions. Your mood content is the star.

Dark Mode by Default: Because most people use this app late at night, and nobody wants to be flashbanged by a white screen at 1 AM.

Features That Make People Come Back

1. Mood Camera

Open the camera, look at your phone for 3 seconds, and the AI detects your emotion. No typing, no selections. It uses facial landmarks and expression analysis to determine if you're happy, sad, stressed, or neutral. Sounds creepy, feels magical.

2. Mood Journal

Every time you log a mood, it's saved with a timestamp and context. Over time, you get insights like "You're usually stressed on Monday mornings" or "Nature content improves your mood by 40%." It's like therapy, but cheaper.
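
Under the hood, those insights are just aggregation. A minimal sketch, assuming logs look like `{ mood, loggedAt }` (the real schema may differ):

```javascript
// Group mood logs by weekday and report the most frequent mood for each —
// the kind of aggregation behind "you're usually stressed on Monday mornings".
const DAYS = ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat'];

function dominantMoodByWeekday(logs) {
  const counts = {}; // weekday → mood → count
  for (const { mood, loggedAt } of logs) {
    const day = DAYS[new Date(loggedAt).getUTCDay()];
    counts[day] = counts[day] ?? {};
    counts[day][mood] = (counts[day][mood] ?? 0) + 1;
  }
  const result = {};
  for (const [day, moods] of Object.entries(counts)) {
    // Pick the mood with the highest count for this weekday.
    result[day] = Object.entries(moods).sort((a, b) => b[1] - a[1])[0][0];
  }
  return result;
}

console.log(dominantMoodByWeekday([
  { mood: 'stressed', loggedAt: '2025-01-06T09:00:00Z' }, // a Monday
  { mood: 'stressed', loggedAt: '2025-01-13T09:30:00Z' }, // next Monday
  { mood: 'calm',     loggedAt: '2025-01-11T20:00:00Z' }, // a Saturday
])); // → { Mon: 'stressed', Sat: 'calm' }
```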

3. Smart Content Mix

The algorithm doesn't just show you one type of content. It creates a balanced mix—some matching your mood, some to gently shift it, and occasional surprises to keep things interesting. Because staying sad isn't healthy, and being hyped 24/7 isn't realistic.

4. Social Sharing Without the Pressure

You can share your daily mood stats—not the content itself. It's a conversation starter without the performative pressure of regular social media. "Had an 80% calm day" hits different than "Look at my perfect life."

5. Offline Mode

Downloaded content stays available offline. Because bad moods don't wait for good WiFi.

The Challenges Nobody Warns You About

Content Moderation: Building a mood-based feed means you CANNOT mess this up. Showing triggering content to someone in a vulnerable state can be harmful. I spent weeks implementing safety filters and content warnings.
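
The core of a filter like that can be small. In this sketch (tag names and the vulnerable-mood list are placeholders), sensitive content is excluded outright for vulnerable moods and gets a content warning otherwise:

```javascript
const SENSITIVE_TAGS = new Set(['self-harm', 'grief', 'violence']);
const VULNERABLE_MOODS = new Set(['down', 'anxious']);

function applySafetyFilter(items, mood) {
  return items.flatMap((item) => {
    const sensitive = item.tags.some((t) => SENSITIVE_TAGS.has(t));
    if (!sensitive) return [item];
    // Never surface sensitive content to someone in a vulnerable state.
    if (VULNERABLE_MOODS.has(mood)) return [];
    // Otherwise keep it, but attach a warning the UI must show first.
    return [{ ...item, contentWarning: true }];
  });
}

const items = [
  { id: 1, tags: ['nature'] },
  { id: 2, tags: ['grief', 'story'] },
];
console.log(applySafetyFilter(items, 'down').map((i) => i.id));      // → [ 1 ]
console.log(applySafetyFilter(items, 'energized').map((i) => i.id)); // → [ 1, 2 ]
```

The real system layers human review and reporting on top — a tag filter alone is the floor, not the ceiling.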

Privacy Concerns: People are weird about facial analysis. I made everything opt-in, with clear explanations. The camera feature is optional—manual mood selection works just fine.

Content Licensing: You can't just scrape the internet for memes. I built partnerships with content creators and used APIs from platforms that allow redistribution. Legal stuff is boring but necessary.

AI Accuracy: Emotion detection isn't perfect. The model gets it wrong sometimes. I added a "This isn't right" button so users can correct it, which also helps train the model.

Performance Optimizations That Matter

Flutter is fast, but you can still screw it up. Here's what I did:

  • Lazy loading: Feed items load as you scroll. Never load everything at once like an amateur.
  • Image caching: Using cached_network_image package. Load once, cache forever.
  • Video optimization: Videos start at low quality, upgrade as they buffer. Nobody notices, everyone's happy.
  • State management efficiency: Provider updates only what needs updating. No unnecessary rebuilds.
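
Lazy loading only works if the API is paginated. Here's a cursor-based sketch of that contract — my choice of shape, not necessarily the real endpoint:

```javascript
// Serve one page of the feed at a time; the client passes back nextCursor
// as the user scrolls, instead of loading everything at once.
function getFeedPage(allItems, cursor = 0, pageSize = 10) {
  const items = allItems.slice(cursor, cursor + pageSize);
  const nextCursor = cursor + items.length;
  return {
    items,
    nextCursor: nextCursor < allItems.length ? nextCursor : null, // null → no more pages
  };
}

const all = Array.from({ length: 25 }, (_, i) => `item-${i}`);
let page = getFeedPage(all, 0);           // first 10 items
page = getFeedPage(all, page.nextCursor); // items 10–19
page = getFeedPage(all, page.nextCursor); // items 20–24, nextCursor = null
console.log(page.items.length, page.nextCursor); // → 5 null
```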

Monetization (Because Servers Aren't Free)

I hate ads as much as you do. MoodFeed uses a freemium model:

Free tier: 50 mood checks per month, basic feed
Premium ($4.99/month): Unlimited checks, advanced analytics, custom mood categories, priority content

The conversion rate sits around 8%, which is solid for a consumer app. Turns out people will pay for something that genuinely helps them.

What's Next?

The roadmap is ambitious but focused:

Q1 2025: Apple Watch integration for quick mood logging
Q2 2025: Group mood circles—share vibes with close friends
Q3 2025: AI-generated personalized content based on mood patterns
Q4 2025: Therapist integration for users who want professional support

Lessons Learned (The Real Ones)

1. Start with MVP: My first version had 1/10th of current features. Ship fast, iterate faster.

2. User feedback is gold: Half my best features came from users DMing me suggestions.

3. AI isn't magic: It's a tool. The real magic is how you apply it to solve real problems.

4. Design matters more than you think: I rebuilt the UI three times. The difference between good and great is those tiny details.

5. Mental health tech is sensitive: Be responsible. Add disclaimers. Partner with professionals. Don't overpromise.

Want to Build This?

The full codebase isn't open source yet (still deciding), but I've written detailed docs on my approach. The tech stack I mentioned isn't the only way—you could swap Flutter for React Native, MongoDB for PostgreSQL, OpenAI for local models. The core concept matters more than the tools.

If you're building something in this space, hit me up. I'm always down to chat about mood tech, AI applications, or why Flutter is superior to React Native (fight me).

Final Thoughts

Building MoodFeed taught me that the best apps solve real problems, not imaginary ones. Social media makes people feel worse, not better. We can do better. We should do better.

This isn't about creating another dopamine slot machine. It's about using technology to actually understand and support people's emotional needs. If I can help even a few thousand people have better days through better content, that's a win.

The app's live on both stores. Go check it out, break it, tell me what sucks. That's how we make it better.


That's a wrap 🎁

Now go touch some code 👨‍💻

Catch me here → LinkedIn | GitHub | YouTube
