Rebuilding a Learning Platform From the Ground Up

BigID had a learning platform. What it didn't have was a system. The catalog was a dump, learning paths were confusing, and badging hadn't been touched in years. I was brought in to fix all of it across 11 surfaces in 3 months.


Phase 1 was the skeleton: foundation first, everything else second. That meant saying no to a personalization questionnaire we didn't have the user research to justify, and designing gamification hooks into the homepage without surfacing them yet. The detailed decisions are in the sections below.


At the end of this case study, I also ran an experiment: what would it look like to ship parts of this without an engineering team at all?

Role

Senior Product Designer (CONTRACT)

Timeline

3 months

Company

BIGID

Scope

UNAUTHENTICATED HOMEPAGE · AUTHENTICATED HOMEPAGE · CATALOG · LEARNING PATHS · EVENTS · CONTENT (LEGACY FORMAT) · CONTENT (EVOLVE INLINE) · CERTIFICATION EXAM · LAB MANAGEMENT · COMMUNITY · USER PROFILE

Collaborators

DIRECTOR · 3 ENGINEERS · CONTENT CREATOR

Tools

FIGMA · CLAUDE + CHATGPT (COMPETITIVE RESEARCH ONLY) · V0 (AI EXPERIMENT, POST-PROJECT)

Impact

FIRST TIME BIGID UNIVERSITY WAS DESIGNED AS A PRODUCT, NOT ASSEMBLED AS A TOOL · DESIGNED FOR THOUSANDS OF BIGID LEARNERS · 40 SCREENS ACROSS 11 SURFACES · LAUNCH MAY 2026

Key decision

DESIGNED HOMEPAGE TO ACCOMMODATE GAMIFICATION IN FUTURE PHASES WITHOUT REQUIRING IT NOW. SKIPPED ONBOARDING QUESTIONNAIRE: BIGID DOESN'T YET HAVE THE RESEARCH TO PERSONALIZE MEANINGFULLY.

VIEW THE DESIGNS

Flow 1: The learning journey (24 screens)

Unauthenticated homepage → authenticated homepage → catalog → learning path → course → lab → badge

Flow 2: Life of a learner (14 screens)

Authenticated homepage → events → profile → community

Results

The designs were presented to the full company. For a contract engagement, that's not a given.

Engineering and the PM responded most to the structural decisions: separating the catalog from learning paths, and untangling events into distinct surfaces for office hours, labs, and exams. These were problems the team had lived with for a long time but hadn't had a design solution for.


The director noted the designs were the most coherent version of BigID University the team had seen. The layout gives content creators a repeatable format for building future courses and makes learner progress easier to track at scale.

The platform launches May 2026, and initial metrics will be added to this case study at the end of May.


This project was designed in August 2025, before AI was a meaningful part of the design workflow. The AI experiment at the end of this case study explores what that would look like today.

Context & Problem Space

This is what happens when a platform is built by engineers and content creators without a product designer. Every surface added in isolation.

Context

BigID University had grown by addition, never by design. Every time a new learning surface was needed, someone bolted it on. The result was a platform that worked fine if you already knew how to use it, and was nearly impenetrable if you didn't.

Core problems

Three things were broken. Discovery had no clear entry point: content was dense, unscannable, and identical whether you were logged in or not. Continuity was missing: labs, certification, and learning paths lived in separate silos with no thread connecting them. Progression was invisible: learners had no way to know where they stood or what to do next.

How I diagnosed this

I audited the existing platform with the director and content creator before touching any screens. The engineering team flagged the calendar and events confusion specifically. They had heard it from learners but had no design solution for it. That input shaped where I focused first.

The problem wasn’t a lack of content, it was a lack of cohesion.

Constraints & Design Realities

BigID University needed to support the full learning lifecycle, from exploration to certification, inside a first-party product ecosystem with no Learning Management System (LMS) foundation to build on.

Platform constraints

BigID University is a first-party platform, not built on any LMS. That meant designing everything from scratch: onboarding, discovery, labs, exams, and certification, all within an existing product ecosystem, with legacy and new content formats coexisting, and an incremental rollout that couldn't disrupt active learners.

Design implications

The constraints shaped every decision:

  • Prioritized systemic consistency over isolated page redesigns

  • Designed reusable patterns that could scale across content types

  • Balanced ideal user experience with engineering constraints and an incremental rollout strategy

  • Focused on clarity and guidance rather than feature expansion


The engineering team needed designs they could ship incrementally without breaking what was already live. Rather than redesigning individual pages, the work focused on establishing a flexible foundation that could support future learning formats and growth.

The goal was not to design a perfect Learning Management System, but to build a flexible foundation that could evolve without re-architecture.

Goals & Success Criteria

Before touching any screens, I established what good would actually look like. Not aspirationally. Measurably.

Experience goals

The priority was one thing: eliminate "what should I do next?" for first-time learners. Everything else followed from that. Progress visible early. Logged-out and logged-in states that actually connect. Courses, labs, events, and certification unified into one system instead of four separate ones.

Success criteria

For learners: a clear starting point, relationships between content types that made immediate sense, and progress visible at a glance.

For the platform: new content formats could be added without restructuring navigation, and engineering could reuse patterns and ship incrementally.

Success meant making a complex learning system feel understandable, not smaller.

Research & Iterative System Design

Formal user studies would have taken time I didn't have. So I made a different call: structural analysis and competitive benchmarking over interviews. The system's problems were already visible. I didn't need more data, I needed to understand the patterns.

Left - Salesforce Trailhead: Role-based entry points, gamified progression

Right - Udemy: Search first, topic browsing, no structured paths

BigID needed structure without rigidity.

The old system had seven disconnected surfaces. The new one has one spine with everything connected to it.

Research inputs

I looked at Salesforce Trailhead and Udemy specifically because they solve opposite problems. Trailhead uses role-based entry points and gamified progression. Udemy is search-first with no structured paths. BigID needed something in between: structured enough for certification, flexible enough for self-directed learners. That gap became the design brief. I used ChatGPT to accelerate the competitive analysis, surfacing patterns across platforms faster than manual review alone.

Iteration in practice

I iterated in high fidelity early because abstract wireframes wouldn't surface the real constraints. Every round was a working session with engineering, not a handoff. That's how we caught the labs entry point problem, the events calendar confusion, and the logged-out to logged-in disconnect before anything was built.

This approach allowed design decisions to be tested against real constraints before committing to system-level patterns.

Entry Points & Momentum

Most platforms show a watered-down homepage that tries to serve both new and returning learners at once. I designed two distinct experiences: one that sells, one that guides. Same surface, completely different jobs.

Logged-out (exploration)

New learners don't know what BigID University offers or whether it's worth their time. The logged-out homepage answers that before asking for anything. Value proposition first, learning paths and certification outcomes second, no sign-in required to understand what you're getting into.

Logged-in (commitment)

Once a learner is in, they don't need to be sold. They need to know where they left off and what to do next. Progress, upcoming priorities, and next steps surface immediately. The homepage becomes a dashboard, not a marketing page. The authenticated homepage is also designed to accommodate gamification in a future phase without requiring restructuring. The hooks are built in, just not surfaced yet.

Before: one homepage for everyone.

After: two distinct experiences that serve different learner needs without compromise.

Discovery & Structure

The old catalog was a dump. Everything in one place, no hierarchy, no way to know where to start. I rebuilt discovery around learner intent rather than content inventory.

Navigation was reorganized around what learners are trying to do, not how BigID internally categorizes its content. Role, topic, and certification tier replaced a flat alphabetical list.

The catalog unified courses, labs, learning paths, and events into one filterable surface. Previously each lived in a separate place. A learner shouldn't need to know the platform's architecture to find what they're looking for.

Every content type follows the same structural model: scope, effort, and outcomes visible before clicking in. The decision to standardize this across all content types was deliberate. Learners make better decisions when they can compare options on equal footing.

Design intent

The catalog is organized around what a learner is trying to achieve, not where content happens to live. Every item surfaces the same information in the same order so comparison is effortless.

The catalog evolved into a flexible discovery framework that unified content formats while clarifying scope, effort, and progression.

Before: a content dump with no hierarchy.

After: a unified catalog organized around what learners are trying to achieve.

Progression & Motivation

The biggest risk with a certification platform is making learners feel behind before they've started. Every progression decision was made to prevent that.

Left: Completed modules are marked clearly but quietly. The focus stays on what's next, not what's left.

Center: Progress updates in real time. Locked states are framed as readiness signals, not failure. "You're not ready yet" is a different message than "you can't do this."

Right: Certification pages surface eligibility requirements and expected outcomes before a learner commits. Self-assessment before sign-up, not after.

Learning paths are designed as guided journeys, not checklists. The distinction matters: a checklist creates anxiety about completion, a journey creates momentum toward a goal.

Platform constraints navigated

I advocated for automatic advancement between activities in a path, but constraints from the content creator's workflow and the existing BigID content structure made that impossible to implement cleanly in phase 1. The breadcrumb became the fallback: not the ideal solution, but a deliberate one that keeps learners oriented without requiring a backend overhaul.


Badges are earned through exam completion only. Proof of completion covers everything else. I kept that distinction visible throughout so learners always know what they're working toward and what they've actually earned. (visible in Flow 1 of the prototype above)

Progress was designed to motivate forward movement, not pressure completion.

Before: no visibility into where you stood or what came next.

After: progress, requirements, and next steps visible at every stage.

Hands-On Learning (Labs)

Labs are time-limited, stateful, and technically complex. The design job was to make all of that invisible so learners could focus on practicing, not managing.

Lab management — active and expiring states

Platform constraints navigated

Labs can't be launched directly from within a course activity. I added a clear CTA from the course screen that opens the lab in a new tab, with a breadcrumb on the lab page so learners always know where they came from and how to get back. (visible in Flow 1 of the prototype)

The biggest UX risk with labs isn't complexity. It's losing your place. Every decision here was made to prevent that.

Before: labs lived separately with no connection to courses.

After: integrated into the learning journey with clear entry and return paths.

System Consistency & Reusable Patterns

The fastest way to make a platform feel broken is inconsistency. A learner who recognizes a pattern doesn't have to think. That was the goal.

One card pattern adapts across courses, labs, paths, and events. Modality changes. Structure doesn't.

The same card scales from logged-out exploration to logged-in action. State changes. Familiarity doesn't.

Booking and credit patterns extend across events and office hours. A learner who has booked a course already knows how to book an event.

Learners don't experience pages, they experience patterns.


Shared components across all 11 surfaces reduced cognitive load and let the platform scale without fragmenting the experience. Engineering could reuse patterns and ship incrementally. Content creators could add new formats without breaking familiarity.

Before: four disconnected surfaces with no shared patterns.

After: one design language across 11 surfaces.

Supporting Surfaces

These surfaces don't drive the core learning journey. But they're where learners decide whether the platform feels coherent or cobbled together.

Events were the most confusing surface in the old platform. Office hours, instructor-led, and events all lived in one undifferentiated calendar. I separated them into distinct types with consistent card patterns so a learner always knows what they're signing up for before they click.

Left - The profile surfaces progress, state, and next steps outside the core learning flow. A learner who returns after two weeks shouldn't have to reconstruct where they left off.

Right - Community follows the same structural patterns as the rest of the platform. The goal was for learners to feel like they're still inside BigID University, not redirected to a separate tool.

Supporting surfaces are where design consistency either pays off or falls apart. Every pattern established in the core learning flow extends here, so learners never feel like they've left the system.

Before: three surfaces that felt like different products. Events, community, and profile each had their own logic with no shared patterns.

After: one coherent system. Events have distinct types so learners know what they're signing up for. Community follows the same patterns as the rest of the platform. Profile surfaces progress and next steps from anywhere. Learners never feel like they've left BigID University.

Impact & Next Steps

Impact

The designs were presented to the full company. For a contract engagement, that doesn't happen unless the work lands.

Engineering and the PM responded most to the structural decisions: separating the catalog from learning paths, untangling events into distinct types, and connecting progression across surfaces. These were problems the team had lived with for years but hadn't had a design solution for. The platform serves thousands of BigID learners. This was the first time their entire learning experience was designed as a coherent system.


The director noted these were the most coherent designs BigID University had ever had. The platform launches May 2026; metrics will be added to this case study at the end of May.

What I would have measured

Onboarding: percentage of new learners who reach their first course within one session.

Catalog: search-to-enrollment rate and time to first click from catalog landing.

Learning paths: completion rate per path and drop-off point by activity type.

Badges and certification: exam attempt rate among learners who reach the prerequisite stage.

Overall: return visit rate within 30 days of first session.

Reflection & Next Steps

If I had another month, I'd focus on three things. First, onboarding personalization: BigID doesn't yet have enough learner research to justify a questionnaire, but that research should happen before phase 2. Second, smarter recommendations based on what learners have already completed. Third, drop-off measurement across the full learning funnel so the team knows where the system is working and where it isn't.


The foundation is built to support all three without restructuring. That was the point.

EXPLORE THE DESIGNS


Now that you know the decisions behind them, here's the full experience.

Flow 1: The learning journey (24 screens)

Unauthenticated homepage → authenticated homepage → catalog → learning path → course → lab → badge


Flow 2: Life of a learner (14 screens)

Authenticated homepage → events → profile → community


IF I WERE STARTING THIS TODAY


This project was designed in August 2025, before AI was a meaningful part of my workflow. The section below is a retrospective experiment: I took the core learning journey from this project and rebuilt it as an interactive prototype using v0 by Vercel. This isn't a production-ready product. It's a proof of concept that took one session to build. The point isn't that it's shippable. It's that a designer can now get from idea to interactive prototype without an engineering team, in the time it used to take just to set up a handoff doc.


I chose v0 specifically because I had already stress-tested Lovable in my PetPals project. I wanted to see how a different tool handled complex product logic rather than repeat the same experiment.


THE EXPERIMENT


I prompted v0 from scratch with no Figma files, no design system, no handoff. Just a description of the platform and its logic. In one session I had a navigable 6-screen learning journey: homepage, catalog, learning path, course activity, certification exam, and a badge-earned screen with proof of completion.


Figma Designs Flow (24 screens)

3 months, full engineering team, complete product logic


v0 Prototype Flow (6 screens)

one session, no engineering team, interactive proof of concept


Top left: Self-paced vs instructor-led toggle, generated correctly on first prompt.

Top right: Locked exam with badge reward, strong on first attempt.

Bottom: The learning path with distinct activity states (completed, in progress, locked).


WHAT WORKED


v0 handled visual structure quickly. Navigation hierarchy, card patterns, progress states, and the self-paced versus instructor-led toggle all came through cleanly from a text description. The certification exam detail page with the locked state and badge reward panel was particularly strong on the first attempt.

Instructor-led scheduling: UI generated correctly but session state didn't persist between screens.

Left: My Figma designs

Right: v0


WHERE IT BROKE DOWN


The tool struggled with stateful logic. Instructor-led scheduling, lab management flows, and credit-based enrollment required multiple prompts to get right and never fully resolved. These interactions are straightforward to design in Figma because you're defining behavior, not implementing it. In v0 the line between design and code means every interaction has to actually work, which surfaces complexity that wireframes can ignore.


The visual output is also intentionally generic. v0 defaults to clean but characterless UI. The brand decisions, typographic hierarchy, and component specificity that make the Figma designs feel like a real product don't transfer from a text prompt alone.

THE HONEST TAKEAWAY


v0 compressed a multi-week engineering effort into one session. That's genuinely significant. But it's a starting point, not a finished product. The gap between what v0 generates and what the Figma designs specify is exactly where design judgment lives. The tool doesn't replace that judgment. It just changes when and where you apply it.
