Vibe is a wardrobe management app with AI outfit recommendations. I built it at Tamba from nothing to production in about 3 months. Here's the honest version.
The stack decision
React Native with Expo. Some teams avoid Expo because of the managed workflow's constraints, but for an MVP it's the right call. The over-the-air (OTA) updates alone are worth it: no waiting on App Store review for every bug fix.
For backend: Node.js, PostgreSQL, AWS S3 for image storage. Nothing exotic.
The AI part
The outfit recommendations are the core value prop. We use a combination of:
- User's wardrobe items (stored with metadata: color, category, formality)
- Weather data (pulled from API at recommendation time)
- Occasion context (provided by user)
All of this is fed into a prompt to GPT-4. The engineering trick is structured output: we force the model to return specific wardrobe item IDs rather than free-text descriptions, so the UI can render the actual item images.
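The validation side of that trick can be sketched as a pure function. This is a minimal sketch, not our actual code: it assumes a response shape of `{"itemIds": [...]}` and hypothetical IDs like `"w1"`. The point is that whatever the model returns, only IDs that actually exist in the user's wardrobe survive, so the UI never tries to render a hallucinated item.

```typescript
interface WardrobeItem {
  id: string;
  category: string; // e.g. "top", "bottom", "shoes"
  color: string;
  formality: "casual" | "business" | "formal";
}

// Parse the model's raw JSON reply and keep only IDs that map to real
// wardrobe items. Anything malformed degrades to an empty outfit.
function validateOutfit(raw: string, wardrobe: WardrobeItem[]): WardrobeItem[] {
  const known = new Map(wardrobe.map((item) => [item.id, item]));
  let itemIds: unknown;
  try {
    itemIds = JSON.parse(raw).itemIds;
  } catch {
    return [];
  }
  if (!Array.isArray(itemIds)) return [];
  return itemIds
    .filter((id): id is string => typeof id === "string")
    .map((id) => known.get(id))
    .filter((item): item is WardrobeItem => item !== undefined);
}
```

The filter-by-known-ID step matters more than the prompt wording: it turns "the model usually behaves" into "the UI can never crash on a bad ID".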
What actually slowed us down
Image handling. Users upload photos of their clothes. Processing, compressing, storing, and serving them at the right resolution for every screen size took longer than the entire AI integration.
Lesson: if your app is image-heavy, design the image pipeline first.
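One small piece of such a pipeline, sketched as pure logic (the rendition widths here are illustrative, not our real values): if you pre-generate a few sizes at upload time, serving "the right resolution for every screen size" reduces to picking the smallest stored rendition that covers the requested slot in physical pixels.

```typescript
// Rendition widths generated at upload time (hypothetical values).
const RENDITION_WIDTHS = [320, 640, 1080, 1920];

// Pick the smallest stored rendition at least as wide as the display
// slot in physical pixels; fall back to the largest if none covers it.
function pickRendition(displayWidth: number, pixelRatio: number): number {
  const needed = displayWidth * pixelRatio;
  for (const w of RENDITION_WIDTHS) {
    if (w >= needed) return w;
  }
  return RENDITION_WIDTHS[RENDITION_WIDTHS.length - 1];
}
```

Deciding this table of sizes up front is exactly the "design the image pipeline first" lesson: everything downstream (compression settings, S3 key layout, CDN caching) hangs off it.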
What Expo saved
Push notifications, OTA updates, and EAS Build. Without EAS, our CI/CD for iOS and Android would have been a project in itself. With it, merging to main triggers a build and submits to both stores automatically.
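The merge-to-main hook reduces to roughly one EAS CLI invocation (the profile name is an assumption; the exact CI wiring varies by setup):

```shell
# On merge to main: build both platforms and submit to both stores.
eas build --platform all --profile production --non-interactive --auto-submit
```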
What I'd change
Start with a design system earlier. We made component decisions on the fly and paid for it with inconsistency. Even a simple token file (spacing, typography, colors) would have saved us.
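The "simple token file" meant here is something like the following sketch (file name and values are illustrative, not what we shipped):

```typescript
// tokens.ts — a minimal design-token file: one place components
// import spacing, typography, and colors from, instead of hardcoding.
export const spacing = { xs: 4, sm: 8, md: 16, lg: 24, xl: 32 } as const;

export const typography = {
  body: { fontSize: 16, lineHeight: 24 },
  title: { fontSize: 24, lineHeight: 32, fontWeight: "600" },
} as const;

export const colors = {
  background: "#FFFFFF",
  text: "#1A1A2E",
  accent: "#E94560",
} as const;
```

Even this much forces every new component to answer "which token?" instead of inventing a one-off value, which is where the inconsistency crept in.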
Shipped anyway. Good enough.