Brad Hankee

AI is now my Health Coach ... and it's working 🏋️

Connecting My Smart Scale to Gemini LLM for Custom Daily Meal Plans

Recently, I built a local app that connects my Withings smart scale to Google's Gemini LLM and uses my real-time daily weight data to generate custom daily meal plans. The goal: hit my target body weight while preserving as much lean mass as possible.

This article breaks down the idea, the tech stack, and some key learnings, including how temperature values in LLMs can drastically affect the type of outputs you get.

Why I Built This

I wanted a way to:

  • Automatically collect my weight data every morning.
  • Feed that into an LLM with a dynamic and customized prompt.
  • Get back a set of meals tailored to my caloric needs for that day, with a focus on protein intake to preserve muscle.

Instead of manually calculating macros and searching for recipes, the LLM now does all of this for me.
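To make that concrete, the daily prompt gets assembled from the morning's numbers plus a few preferences. Here is a rough sketch of the idea; the helper name, the calorie math, and the example numbers are illustrative placeholders, not the exact prompt from my app:

```python
# Hypothetical sketch of assembling the daily prompt. The calorie and protein
# math here is a rough placeholder, not the exact formula or prompt from my app.

def build_prompt(weight_lbs: float, target_lbs: float, proteins: list[str]) -> str:
    calorie_target = int(weight_lbs * 12)    # crude "maintenance minus a deficit" estimate
    protein_target = int(target_lbs * 1.0)   # ~1 g protein per pound of target weight

    return (
        f"My weight this morning is {weight_lbs} lbs and my goal is {target_lbs} lbs.\n"
        f"Build a one-day meal plan around {calorie_target} kcal "
        f"with at least {protein_target} g of protein.\n"
        f"Proteins I have on hand today: {', '.join(proteins)}.\n"
        "Give 2 options per meal and list calories and protein for each."
    )

print(build_prompt(185.0, 175.0, ["85/15 ground beef", "chicken thighs"]))
```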

The Tech Stack

Here's the setup:

  • Withings Smart Scale – Syncs weight data daily (total, lean mass, fat mass).
  • Python – The language the app is built with; pulls the data and calls the LLM.
  • Gemini LLM – Takes weight, target weight, and some user preferences as input.

NOTE: The Withings API is, let's say... not great to work with. After seeing that opinion all over, I decided to make my life easy and use an IFTTT applet to get my data into a Google Sheet. That made sense since this is a local app built around one very specific scale.
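If you go the same route, pulling the latest row back out of that sheet only takes a few lines with gspread. This is just a sketch; the sheet name and column headers are assumptions, not my exact setup:

```python
# Minimal sketch: read the most recent weight entry that IFTTT appended to a Google Sheet.
# Sheet name, worksheet layout, and column headers are assumptions for illustration.
import gspread

gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("Withings Weight Log").sheet1

rows = sheet.get_all_records()   # list of dicts keyed by the header row
latest = rows[-1]                # IFTTT appends rows, so the last one is today's weigh-in

weight_lbs = float(latest["Weight"])
lean_mass = float(latest["Lean Mass"])
fat_mass = float(latest["Fat Mass"])
print(weight_lbs, lean_mass, fat_mass)
```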

Sample Output

The only thing I give the LLM every day is the proteins I have and plan to eat. In this case you can see I told it 85/15 ground beef and chicken thighs.

[Image: chatbot output 1]

[Image: chatbot output 2]

Progress so Far

So far it's been great and I am down about 3 pounds. I added an instruction to the prompt to give me 2 options per meal so the plan stays somewhat flexible. I also added history, so if it gives me a menu item I do not have I can tell it and it will come back with a better option, such as this...

[LLM history excerpt]
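Under the hood this is just multi-turn chat. With the google-generativeai Python SDK, a correction round-trip looks roughly like this; the model name, key, and messages are placeholders:

```python
# Rough sketch of the correction loop using a chat session, so the model
# remembers the plan it already gave. Model name, key, and messages are placeholders.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")
model = genai.GenerativeModel("gemini-1.5-flash")

chat = model.start_chat(history=[])
plan = chat.send_message(
    "Here is today's weight and the proteins I have... build my meal plan."
)
print(plan.text)

# If the plan includes something I don't have, push back in the same session:
fix = chat.send_message("I don't have one of those sides on hand. Swap it for something simpler.")
print(fix.text)
```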

Working With LLM Temperature

One of the most interesting discoveries was how temperature affects the LLM's output.

  • Low temperature (0.0–0.3) → Very consistent results. The LLM mostly gives the same meal plan with minor variations. Good for keeping macros stable.
  • Medium temperature (0.5–0.7) → Adds more variety to meals. You get different recipes day to day, but they still hit macro targets.
  • High temperature (0.8–1.0) → Very creative. Sometimes too creative: you get exotic meals or unusual combinations that technically hit macros but might not be practical.

I found that 0.5 was the sweet spot: balanced creativity without going off the rails.
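For reference, temperature is just a field on the generation config. With the google-generativeai Python SDK the call looks roughly like this; the model name, key, and prompt are placeholders:

```python
# Sketch of setting temperature on a Gemini call (google-generativeai SDK).
# Model name, key, and prompt are placeholders; 0.5 is the value I settled on.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")
model = genai.GenerativeModel("gemini-1.5-flash")

response = model.generate_content(
    "Build today's meal plan...",             # the daily prompt goes here
    generation_config={"temperature": 0.5},   # 0.0-0.3 = stable, 0.8-1.0 = creative but erratic
)
print(response.text)
```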

Here is a good example of the differences in temperature. The first image is at .9 and the second at .1. You can see that in the .1 output the numbers are more consistent, almost always referencing 5 oz for the portion sizes.

Temp @ .9

[Image: meal plan output at temperature 0.9]

Temp @ .1

[Image: meal plan output at temperature 0.1]

Key Challenges

  • Consistency: Sometimes the LLM would under- or over-estimate calories. I had to add a check to ensure the total calories matched my target before saving the plan (see the sketch after this list).
  • Variety vs. Adherence: Too much meal variety made grocery shopping harder. I added an option to limit variety to a weekly rotation.
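Here is a minimal sketch of that calorie check, assuming the plan has already been parsed into per-meal calorie counts; the 5% tolerance is an arbitrary choice for illustration:

```python
# Minimal sketch of the calorie sanity check before a plan is saved.
# Assumes the LLM response has already been parsed into per-meal calorie counts;
# the 5% tolerance is an arbitrary choice for illustration.

def plan_is_within_target(meal_calories: list[int], target_kcal: int, tolerance: float = 0.05) -> bool:
    total = sum(meal_calories)
    return abs(total - target_kcal) <= target_kcal * tolerance

# Example: three meals plus a snack against a 2,200 kcal day.
print(plan_is_within_target([650, 700, 600, 250], target_kcal=2200))  # True
```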

Takeaways

  1. LLM temperature tuning matters a lot: find the right level for your use case.
  2. Automating meal plans saves mental energy and reduces decision fatigue.
  3. Preserving history enables deeper personalization and makes it easier to see what works over time.

You can check out the code in this Repo if you would like. Thanks for reading!

