
Overview
Munchbase is an upcoming iOS app launching on the App Store: an AI-powered recipe discovery tool that turns your fridge into a personalized cookbook. Scan your fridge with your phone camera, and the app identifies every ingredient on your shelves. Set preferences like meal type, cooking time, and dietary needs, then get tailored recipes based on what you actually have. Beyond the scan, a full recipe database with filters, categories, and curated chef articles makes finding and learning new things effortless. Less waste, fewer grocery runs, more creativity with what's already in your kitchen.
Client
Melonloop
What I did
Product Design
Timeline
Q1 2026
The Problem
Most recipe apps start from the wrong place. They show you what you could cook, then leave you to figure out whether you have the ingredients. The result is a familiar cycle: you find a recipe that looks great, check your kitchen, realize you're missing three things, and either abandon the idea or make a trip to the store for items you'll use once.
On the other side of the same problem, people regularly throw away food because they couldn't think of something to make with it in time. This disconnect between what's in the fridge and what's on screen means most cooking apps actually add friction to the process instead of removing it.
There was also a gap in how these apps served different types of users. Someone with 15 minutes and a picky toddler has completely different needs than someone exploring a new cuisine on a quiet Sunday. Existing tools treated recipe discovery as a single, uniform experience, forcing users to dig through irrelevant results to find something that matched their actual situation.
The Solution
I designed Munchbase around the principle that the best recipe suggestion starts with what you already have, not what you need to buy.
The core interaction is a fridge scan. You open the app, point your camera at your fridge, and the AI identifies the ingredients on your shelves. From there, a refinement step lets you confirm or adjust what was detected, set your meal type, available cooking time, and any dietary restrictions. The app then generates recipe suggestions that match your real constraints, not an idealized pantry.
But the scan is just one entry point. I designed a full recipe database alongside it, with layered filtering for cuisine type, difficulty, prep time, and dietary preferences. The two paths serve different mindsets: the scan answers "what can I make right now," while the database supports "I want to explore something new." Both lead to the same recipe experience, so the app feels cohesive regardless of how you got there.
Beyond recipes, I included a curated articles section featuring content from professional chefs: techniques, seasonal tips, ingredient guides, and cooking fundamentals. This gives the app a learning dimension that keeps users coming back even when they're not actively cooking.
The interface prioritizes speed and clarity at every step. The scan flow is designed to take seconds, not minutes. Recipe cards surface the most decision-relevant information first (time, difficulty, ingredient match percentage) so users can commit quickly. Filters behave predictably and combine without friction. The visual language stays warm and food-forward, using photography and generous whitespace to make browsing feel inviting rather than transactional.
The overall goal was to close the gap between "what's in my kitchen" and "what's for dinner" with as few steps as possible, while giving users who want to go deeper a reason to stay.

