Building Calex: How I'm Engineering an AI Wellness Companion
When I started building Calex, the goal was simple: make it genuinely easy for people to understand their own health. Not another calorie counter. Not another step tracker. Something that actually thinks with you.
The Core Problem
Most health apps are data graveyards. You log your meals, your sleep, your workouts — and then what? A number goes up or down and you're left to figure out what it means.
Calex is built around a different idea: multimodal input + LLM reasoning = actionable context.
The Architecture
The stack is Next.js on the frontend, Python (FastAPI) for the AI layer, and Firebase for real-time sync. Here's the rough shape:
User Input (text / voice / image / barcode)
↓
Input Normalization Layer (Python)
↓
LLM Reasoning (OpenAI / Gemini)
↓
Personalization Engine (user history + goals)
↓
Response + Dashboard Update
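To make the flow concrete, here is a minimal runnable sketch of those stages. Everything here is illustrative: the function names, dict shapes, and the stubbed-out LLM call are my own stand-ins, not the actual Calex code.

```python
# Hypothetical sketch of the request flow above; names and shapes
# are illustrative, not the real Calex implementation.

def normalize(raw: dict) -> dict:
    # Input Normalization Layer: collapse text/voice/image/barcode
    # into one shape before any reasoning happens.
    return {"source": raw["type"], "description": raw["content"]}

def personalize(user: dict) -> dict:
    # Personalization Engine: attach user history and goals to the call.
    return {"goal": user["goal"], "history": user["history"]}

def llm_reason(item: dict, context: dict) -> str:
    # LLM Reasoning: in production this would call OpenAI or Gemini;
    # stubbed here so the whole flow runs locally.
    return (
        f"Given your goal ({context['goal']}) and recent context "
        f"({'; '.join(context['history'])}), here's a thought on "
        f"{item['description']}: prioritize protein at your next meal."
    )

def handle_input(raw: dict, user: dict) -> str:
    item = normalize(raw)
    context = personalize(user)
    return llm_reason(item, context)  # Response + Dashboard Update follow

reply = handle_input(
    {"type": "text", "content": "bowl of biryani for lunch"},
    {"goal": "cut to 2000 kcal", "history": ["slept 5h", "walked 2000 steps"]},
)
```

The useful property is that every stage only depends on the output shape of the stage before it, so swapping the LLM provider or adding a new input modality touches exactly one function.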
The hardest part wasn't the AI — it was the input normalization. A photo of a plate of biryani and a barcode scan of a protein bar need to end up in the same data shape before the LLM sees them.
Google Fit Integration
Wearable data changes the quality of every recommendation. When the AI knows you slept 5 hours and walked 2,000 steps, its nutritional advice shifts accordingly. I pull daily activity and sleep summaries from Google Fit via OAuth and feed that context into every LLM call.
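Here is roughly how that context injection might look. This assumes the daily summary has already been fetched from Google Fit over OAuth; the field names and prompt wording are my own illustration, not the production code.

```python
# Illustrative only: assumes a daily summary was already fetched from
# the Google Fit REST API via OAuth. Field names are my assumptions.

def fit_context(summary: dict) -> str:
    # Turn raw wearable numbers into a sentence the LLM can condition on.
    return (
        f"The user slept {summary['sleep_hours']}h last night and has "
        f"taken {summary['steps']} steps today."
    )

def build_system_prompt(summary: dict) -> str:
    # Prepend the wearable context to every LLM call's system prompt.
    return (
        "You are a wellness assistant. Adjust nutrition advice to the "
        "user's recovery state.\n" + fit_context(summary)
    )

prompt = build_system_prompt({"sleep_hours": 5, "steps": 2000})
```

Putting the wearable state in the system prompt rather than the user message means every response, even an unrelated food question, is conditioned on how recovered the user actually is.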
What's Next
- Habit streaks with adaptive reminders
- Meal plan generation based on weekly patterns
- Mental wellness check-ins tied to sleep and activity data
If you're building in health tech, I'd love to compare notes. Reach out at contact@shuence.com.