
Overview
HyNote is an AI-powered note-taking app that turns meetings, lectures, and conversations into structured, actionable notes. Users can record audio, snap photos, import PDFs, or paste links, and the app instantly generates summaries, flashcards, and study guides. I designed the end-to-end product experience across iOS, iPadOS, macOS, Apple Watch, and Apple Vision Pro, focusing on making AI output feel useful rather than overwhelming, and on giving users full control over how their notes are structured and consumed.
What I did
Product Design
Timeline
2025
The Problem
People sit through hours of meetings, lectures, and conversations every day, then spend even more time trying to make sense of what was said. The existing note-taking tools on the market fell into two camps: manual apps that required constant typing, and AI recorders that dumped walls of unstructured text with no clear way to act on it.
Users needed to capture everything without doing anything, but they also needed the output to match the way they actually think, study, and work. A raw transcript is not useful. A generic summary often misses what matters most. And every user has a different definition of "what matters."
The Solution
I designed HyNote around the idea that capturing information and making sense of it are two separate problems that need two separate design approaches.
For capture, the experience had to be invisible: one-tap recording with live transcription, support for multiple input types (audio, images, PDFs, web pages, YouTube links), and an Apple Watch companion for recording on the go. The goal was to remove every possible barrier between "something important is happening" and "I have it saved."
For output, I flipped the approach entirely. Instead of giving users a single AI-generated summary, I designed a template system that lets them choose how notes get structured: brief summary, study guide, meeting minutes, lecture takeaways, literature review, or a fully custom format. The same recording can be transformed into completely different outputs depending on the context.
The template picker surfaces right after a note is processed, keeping the interaction lightweight. Users see their raw content alongside AI-generated key terms and definitions, then choose (or build) the format that fits. Flashcards and quizzes can be generated directly from any note, turning passive content into active learning material.
The interface stays minimal throughout. Clean typography, generous spacing, and a muted color palette keep the focus on content rather than chrome. Navigation follows iOS conventions closely, so the learning curve stays near zero despite the depth of AI features underneath.

