Import your design taste into AI coding tools.
Most AI coding tools default to generic Tailwind and shadcn. They have no idea what you think good design looks like, so they hand back a product that could have been designed by anyone. Taste fixes that at the source by turning the UI you already admire into a structured profile that AI coding agents can read and design from.
The mechanic is deliberately boring. You save screenshots of UI you love. Under the hood, a GPT-4o vision pipeline extracts more than 40 structured signals from each capture: colors with semantic roles, the full type scale, the spacing rhythm, the shape language, composition and density, WCAG contrast on real text/background pairs, and bounding-boxed UI elements. The server also reads the image bytes directly to ground the model's platform guess in actual device class and aspect ratio. Every save becomes a labeled example.
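To make the extraction concrete, here is a minimal sketch of what one capture's signal record could look like, with the WCAG contrast math written out. The field names are illustrative, not Taste's actual schema; only the luminance and contrast formulas are the standard WCAG 2.x definitions.

```typescript
// Illustrative shape for one capture's extracted signals (names assumed).
interface ExtractedSignals {
  platform: "ios" | "android" | "web" | "desktop"; // grounded by image bytes
  colors: { hex: string; role: "background" | "surface" | "accent" | "text" }[];
  typeScale: { px: number; weight: number; usage: string }[];
  spacingRhythm: number[]; // observed spacing steps, in px
  elements: { label: string; box: [x: number, y: number, w: number, h: number] }[];
  contrastPairs: { fg: string; bg: string; ratio: number }[];
}

// Relative luminance of an sRGB hex color, per the WCAG 2.x definition.
function luminance(hex: string): number {
  const channel = (i: number) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * channel(1) + 0.7152 * channel(3) + 0.0722 * channel(5);
}

// Contrast ratio for a real text/background pair, e.g. "#1A1A1A" on "#FAFAF7".
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05); // >= 4.5 passes WCAG AA for body text
}
```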
Across your library, those signals collapse into a compact, versioned Taste Profile: a qualitative summary of your sensibility, plus per-category scoped profiles (navigation, dashboards, forms, onboarding...), plus an explicit anti-preferences block built from designs you've rejected. Rate any AI-generated screen "nailed it", "close", or "missed", and that feedback pipes directly into the next synthesis pass: prompt-level RLHF for your design taste, with no model fine-tuning. Your personal taste is modeled as a 6-dimensional taste vector with a confidence score per dimension, and every swipe on a community design becomes a stored preference pair with an Elo score. It's a structured preference-learning system where the product happens to be the dataset.
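The Elo bookkeeping behind those swipes is the standard update rule. A minimal sketch, assuming a flat K-factor and simple in-memory records (Taste's actual storage and parameters aren't shown here):

```typescript
// Standard Elo update applied to a community-design swipe: the design you
// preferred "wins" the pair. Field names and the K-factor are assumptions.
interface RatedDesign {
  id: string;
  elo: number;
}

// Expected score of a rating `a` playing against a rating `b`.
function expectedScore(a: number, b: number): number {
  return 1 / (1 + 10 ** ((b - a) / 400));
}

function recordSwipe(winner: RatedDesign, loser: RatedDesign, k = 32): void {
  const expWin = expectedScore(winner.elo, loser.elo);
  winner.elo += k * (1 - expWin); // gains more for an upset win
  loser.elo -= k * (1 - expWin);  // zero-sum: the loser gives up the same
}
```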
What makes Taste different is the output. The profile exports natively as a Claude or Codex skill, so when Cursor, Claude Code, or Codex designs a screen for you, it already knows what your buttons should feel like, what spacing is yours, and what you consider slop. For tools that speak MCP, Taste also runs a first-class MCP server with full OAuth 2.0, rotating access and refresh tokens, per-token rate limits, and a complete audit log, so the taste profile becomes live context on every prompt. It's calibration, not prompting.
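In miniature, the MCP side could look like the sketch below, written against the public TypeScript MCP SDK and using stdio transport for brevity. The tool name, parameter schema, and loadProfile helper are assumptions; the real server also layers OAuth 2.0, token rotation, rate limits, and audit logging on top.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "taste", version: "1.0.0" });

// Expose the profile as a tool so any MCP client can pull it as context.
server.tool(
  "get_taste_profile",
  "Return the current versioned taste profile for a category scope",
  { category: z.string().optional() }, // e.g. "dashboards", "forms"
  async ({ category }) => ({
    content: [{ type: "text", text: await loadProfile(category) }],
  }),
);

// Placeholder: the real server would read the synthesized profile instead.
async function loadProfile(category?: string): Promise<string> {
  return JSON.stringify({ scope: category ?? "global", version: 1 });
}

await server.connect(new StdioServerTransport());
```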
A design decision I kept coming back to: Taste should never be the thing you interact with. It should sit quietly in the background and make every other tool you use smarter. No tagging, no folders, no curation. You just save what you like, and the profile lives where the AI reads it, not where you stare at it.
Taste runs on every surface designers and builders actually work on. A native macOS app handles system-wide capture on a single global hotkey. A native iOS app ingests inspiration from your phone, with APNs push and the full StoreKit 2 subscription lifecycle. A Figma plugin ingests frames directly. The web app at buildwithtaste.com is where you review your library, curate per-project profiles, swipe-rate community designs, and export to your AI tools. Everything runs on Supabase: Postgres with row-level security on every table, pgvector for similarity search, and twelve Deno edge functions covering vision, synthesis, push, StoreKit, Figma ingestion, and MCP auth.
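For a feel of the edge-function layer, here is a minimal sketch of one such function: authenticated pgvector similarity search from a Deno edge function. The match_captures RPC and its parameters are hypothetical stand-ins for a Postgres function wrapping an `ORDER BY embedding <=> query` search; row-level security means the caller's JWT scopes every query.

```typescript
import { createClient } from "jsr:@supabase/supabase-js@2";

Deno.serve(async (req) => {
  // Forward the caller's JWT so RLS policies apply to every query.
  const supabase = createClient(
    Deno.env.get("SUPABASE_URL")!,
    Deno.env.get("SUPABASE_ANON_KEY")!,
    { global: { headers: { Authorization: req.headers.get("Authorization")! } } },
  );

  const { embedding } = await req.json(); // pgvector query embedding

  // Hypothetical RPC wrapping a pgvector cosine-distance search.
  const { data, error } = await supabase.rpc("match_captures", {
    query_embedding: embedding,
    match_count: 10,
  });

  return new Response(JSON.stringify(error ?? data), {
    headers: { "Content-Type": "application/json" },
  });
});
```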
The meta-flex: the taste profile my AI tools design from, the one that tells them what a button should feel like, what spacing is mine, and what a good list looks like, was produced by this app. Taste is the feedback loop between the UI I admire and the software I make.