Operationalizing an AI‑Amplified UX Workflow
Over six months, I rebuilt my product design practice around AI, not as a gimmick but as a system. I used Google AI Studio to vibe-code my portfolio and VS Code + Copilot to harden that code for production, and I used Figma Make to build a complete enterprise web app prototype along with generated specs and markdown requirements. By treating prompts as requirements and contracts rather than clever one-liners, I created a repeatable workflow that turns design ideas into functioning interfaces development can use as a starting point for production code.
Client
Independent Study
Start
01 May 2025
Complete
01 October 2025
Services
AI-First Design
Description
The project was a six-month effort to rebuild a product design practice around AI as a system rather than a gimmick. The goal was to operationalize an AI-amplified UX workflow that increases velocity without sacrificing quality. This included using Google AI Studio for "vibe-coding" a portfolio website and Figma Make for rapid application prototyping, all while developing structured "precision protocols" to maintain quality control over AI outputs.
Key Features
- AI-Accelerated Vibe Coding in Google AI Studio: I built a version of my portfolio website from scratch in Google AI Studio, using high-token conversational history and markdown requirements to translate abstract aesthetic requests into working code.
- Vibe Coding Lesson 1: Define What and Why: AI is just a tool, and it is a tool that will generate content endlessly if given no direction. As UX professionals, we must first know WHAT and WHY we are building something BEFORE we build it.
- Vibe Coding Lesson 2: Evaluate How to Build: Once we have a clear plan and requirements, we must evaluate HOW to build it. I am not a professional software developer, and I learned the hard way that AI does not generate production-ready code. After spending far too much time trying to force it, I decided to build the final version of my portfolio website on professionally written code.
- Rapid Prototyping in Figma Make: I created BrowserFence, a full application prototype built in Figma Make, taking it from concept to demo-ready in 13 days, 70%+ faster than the typical 4–6 week baseline. I maxed out a month's worth of Make tokens in those 13 days. View the live BrowserFence prototype ↗
- Precision Protocols (The Human Filter): I developed systematic Human-in-the-Loop (HITL) prompting steps, including strict markdown files that provide system instructions and project rules to the LLM (an illustrative example follows this list). Using markdown files helped counter frequent AI hallucinations, ensure accuracy, and validate code integrity.
- Design System Consistency (Human-Driven): The final prototype maintained visual consistency of roughly 97%. This was achieved not by the AI models but through my own visual QA expertise and careful curation and refinement of the prompt-generated designs.
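To make the precision protocols concrete, here is a minimal sketch of the kind of strict markdown rules file described above. The section names and rules are hypothetical examples rather than the actual files used on this project; the point is the structure: system instructions, design constraints, and human review gates the LLM must respect.

```markdown
# Project Rules (illustrative example, not the actual project file)

## System Instructions
- Act as a prototyping assistant. Build only what this file and the linked requirements describe.
- If a requirement is ambiguous or missing, ask a clarifying question instead of inventing a solution.

## Design Constraints
- Use only the components, colors, and type styles defined in the approved design system.
- Never introduce new tokens, spacing values, or copy without an explicit instruction.

## Human-in-the-Loop Gates
- Every generated screen or code change is reviewed against this file before it is kept.
- Flag any output that references data, flows, or features not listed in the requirements.
```

A file like this can be supplied as system instructions at the start of each session so the model's context always carries the same ground rules.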
Technologies Used
- AI Code Generation: Google AI Studio, VS Code + Copilot, Cursor, Claude, ChatGPT 5
- AI Code Refinement/Debugging: VS Code + Copilot, Cursor AI, Google AI Studio, ChatGPT 5
- AI Prototyping: Figma Make, Google Stitch, Lovable, Framer
- Design/Component Libraries: Figma, Preline + Preline for Figma (for design system integration), Adobe XD, Google AI Studio
- Image Generation: Adobe Creative Cloud (Firefly, Photoshop AI, etc.), Google Nano Banana image gen, Pixlr, Canva
Project Highlights
- The Illusion of One-Shot Magic: AI delivered some genuinely delightful moments that quickly showed me what was possible with a single prompt. The downside is that going any deeper than the surface reveals the weaknesses and structural fragility of AI-generated work.
- AI is Just a Tool: Digging deep into AI-led projects shows (rather disappointingly) that AI is just a new tool in the product design toolbelt, though it is a really good one. All of the key human factors that UX and product design are built on (empathy, judgment, taste, and context) are even more relevant in the AI era.
- Velocity and Quality: The last six months showed me that an AI-amplified design process can move up to 3× faster and produce code-based deliverables polished enough for executive review and immediate user testing.
- Conclusion: This six-month AI upskilling journey proved that velocity and quality don't have to compete. The designer's role is no longer defined by execution but by curation, taste, and strategic guidance: knowing which prompts to craft, which outputs to keep, and what feels right, relevant, and human. AI can generate endlessly, but it's our ability to curate, refine, and translate vision into value that makes the work resonate with people and drive business forward.
Detailed Case Study
Enterprise SaaS platforms · Revenue-generating products shipped · % delivery velocity improvement · Fortune 500 positions held