−75% Campaign deployment time
−90% Manual error rate
100% Marketing team autonomy
10× Adoption increase

The opportunity hiding
inside a cost center

The Rewards Rule Engine powered our entire loyalty ecosystem — automating how incentives were triggered, stacked, and distributed. It worked. But it was a black box: opaque to stakeholders, inaccessible to growth teams, and entirely dependent on engineering for even the simplest campaign changes.

For a tool designed to reward users, the experience of creating a rule felt like punishment. The system was built for engineers, not for the growth teams who actually needed to move fast.

— Pooja Kudesia, UX Lead

This wasn't just a UX problem — it was a business velocity problem. Every campaign change required developer time. Every logic conflict surfaced after launch. Every new reward type needed months of backend rearchitecting. I reframed the brief: not "redesign the interface" but redesign the system so non-technical teams could own the entire campaign lifecycle.

Three compounding friction points

Before scoping a solution, I led a cross-functional diagnostic to understand the true cost of the status quo.

Operational Bottlenecks

Every rule change — no matter how minor — required developer intervention. Marketing waited days for campaigns that should have launched in hours. Engineering was firefighting instead of building.

The Trust Gap

Stakeholders had zero visibility into how rewards were triggered or resolved. When campaigns underperformed, nobody could diagnose why. The black-box nature eroded confidence in the data it produced.

Inability to Scale

As the business grew, the rigid architecture couldn't accommodate new reward types — points, cashback, partner vouchers. Every new format required custom backend work. The system was a ceiling, not a platform.

Before vs. After:
The experience transformation

Before — The status quo

Developer-gated, opaque, fragile

  • Campaign launches averaged 4 days — dev, QA, and deploy all required
  • 60% of rule failures were logic conflicts caused by confusing UI
  • Marketing had zero visibility into active rules or campaign performance
  • New reward types took months to implement in the backend
  • Engineering and growth teams operated in completely separate silos

After — The redesigned system

Self-serve, transparent, scalable

  • Campaign launch reduced to under 1 hour — no engineering required
  • Predictive validation catches logic conflicts before they go live
  • Real-time analytics dashboard gives marketing full performance visibility
  • Modular rule builder supports unlimited reward type combinations
  • Marketing owns the full end-to-end campaign lifecycle

Before — Legacy Interface
Screenshot of legacy rule engine interface: dense data tables with no visual hierarchy, all text same weight, no colour coding, requiring developer knowledge to navigate.
The legacy interface: functional for engineers, inaccessible for growth teams
Screenshot showing complex workflow diagram of legacy system with multiple manual handoff steps between marketing, developer, and QA teams before any campaign goes live.
Complex, undifferentiated data display with no visual hierarchy or logic guidance

I didn't just redesign screens.
I architected a workflow.

My approach was diagnostic before it was generative. I built shared understanding across stakeholders before a single wireframe was drawn.

  1.

    Cross-functional audit & stakeholder alignment

    I initiated a structured audit involving Marketing, Product, and Data Engineering — creating a shared language for the problem before any solutions were proposed. This prevented the redesign from being perceived as a UI facelift and positioned design as a strategic driver of operational change.

  2.

    Deep-dive user research with 10+ power users

    I personally led interviews, asking participants to walk me through their last 3 campaign attempts — mapping friction, workarounds, and failure points in real time rather than asking them what they wanted.

    Key finding: 60% of rule failures were not bugs — they were logic conflicts caused by a UI with no guardrails.

  3.

    12 months of quantitative campaign data analysis

    Rather than designing for edge cases, we analysed historical data to identify the most frequent rule combinations. This gave us a principled basis for prioritisation: build the modular builder around the 20% of patterns covering 80% of real-world use cases.

    Prioritisation insight: 80% of use cases covered with zero custom coding in the new system.

  4.

    Iterative concept testing & logic validation design

    I designed and facilitated multiple rounds of concept testing — specifically to stress-test the logic-building experience. We tested whether non-technical users could correctly configure complex rule stacks without introducing conflicts. The fail-safe validation system emerged directly from observing users in early prototypes.

  5.

    Engineering partnership for scalable architecture

    I embedded with Data Engineering throughout the build — not to dictate implementation, but to ensure the modular design system was technically sustainable. The resulting component architecture enabled future reward types to be added without redesign.

    Result: Unlimited reward type scalability built into the architecture from day one.
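To make the architecture idea concrete, here is a minimal sketch of the registry pattern that enables this kind of extensibility. All names and payload shapes are illustrative assumptions, not the production system: the point is that a new reward type becomes a registration rather than a backend rewrite.

```python
from typing import Callable, Dict

# Hypothetical registry: each reward type registers a handler function,
# so adding a new type never touches existing code paths.
REWARD_HANDLERS: Dict[str, Callable[[float], dict]] = {}

def reward_type(name: str):
    """Decorator that registers a reward handler under a type name."""
    def register(fn: Callable[[float], dict]):
        REWARD_HANDLERS[name] = fn
        return fn
    return register

@reward_type("points")
def grant_points(purchase_amount: float) -> dict:
    # Illustrative conversion rate: 10 points per currency unit.
    return {"type": "points", "points": int(purchase_amount * 10)}

@reward_type("cashback")
def grant_cashback(purchase_amount: float) -> dict:
    # Illustrative rate: 5% cashback.
    return {"type": "cashback", "value": round(purchase_amount * 0.05, 2)}

def resolve_reward(kind: str, purchase_amount: float) -> dict:
    """Look up and apply the handler; unknown types fail fast."""
    return REWARD_HANDLERS[kind](purchase_amount)
```

Under this sketch, a partner-voucher reward would be one more decorated function, which is the property that let future reward types ship without a redesign.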

Design as business strategy

I reframed the project around three strategic objectives aligned to business KPIs — not just usability metrics. This framing secured cross-functional buy-in and executive sponsorship.

Democratize Rule Creation

Enable non-technical Marketing teams to independently launch campaigns within 1 hour — removing engineering from the critical path entirely.

Ensure Logic Integrity

Build a predictive validation layer that prevents conflicting reward rules from going live simultaneously — eliminating the root cause of 60% of campaign failures.
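The core of that validation layer can be sketched in a few lines. This is a simplified assumption of how such a check might work, with invented names: a rule is modelled as a trigger plus a target audience, and two live rules conflict when they fire on the same trigger for an overlapping audience.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Rule:
    name: str
    trigger: str            # e.g. "purchase_completed"
    segments: frozenset     # audience segments the rule targets

def find_conflicts(candidate: Rule, live_rules: List[Rule]) -> List[Rule]:
    """Return live rules that would double-fire alongside the candidate."""
    return [
        r for r in live_rules
        if r.trigger == candidate.trigger and r.segments & candidate.segments
    ]

live = [Rule("Summer Cashback", "purchase_completed", frozenset({"new_users"}))]
draft = Rule("Welcome Points", "purchase_completed",
             frozenset({"new_users", "vip"}))
```

Running `find_conflicts(draft, live)` flags the overlap before activation, which is the moment the real system surfaces its inline warning rather than letting the conflict reach production.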

Drive Measurable Business Impact

Increase campaign frequency by removing the bottleneck — targeting a 40%+ increase in targeted campaign launches per quarter.

After — Redesigned Experience
Redesigned rule engine interface showing a clean step-by-step rule builder with clearly labelled dropdowns for conditions and rewards, colour-coded validation indicators, and a live preview panel.
The redesigned Modular Rule Builder — intuitive for marketing, robust for engineering
Analytics dashboard screen showing campaign performance cards with success rates, active rule counts, and a bar chart comparing campaign frequency before and after the redesign.
Real-time analytics — marketing owns performance visibility
Rule creation flow screen with a guided wizard showing three steps: Select Trigger, Define Conditions, Attach Reward. An inline warning banner highlights a detected logic conflict.
Guided rule creation with live conflict detection
Full system overview diagram showing the journey from campaign setup through rule configuration, validation, activation, and analytics reporting — all accessible to non-technical users.
System-wide overview — from campaign setup to rewards activation
Chart showing 12 months of campaign pattern data analysis. A bar chart highlights the top 10 rule combinations by frequency, with the top 3 combinations accounting for over 80% of all campaign configurations.
Quantitative validation: 12 months of campaign pattern analysis that informed prioritisation

Results that moved the business

These aren't just UX metrics — each outcome maps directly to a business lever: operational cost, revenue velocity, data reliability, and platform longevity.

−75%

Campaign Deployment Time

Reduced from 4 days to under 1 hour. Marketing can now respond to market events in real time — a capability that didn't exist before.

−90%

Manual Error Rate

Predictive logic validation catches conflicts before they go live — eliminating the root cause of the majority of historical campaign failures.

100%

Stakeholder Autonomy

Marketing now owns the full end-to-end campaign lifecycle. Engineering was entirely removed from the operational path for routine campaign work.

10×

System Adoption

The modular architecture and intuitive builder drove a 10-fold increase in active usage — from a tool people tolerated to one they depended on.

What this project proved
about design leadership

Reframing the brief is a leadership act

The original ask was to "clean up the interface." I reframed it as a strategic opportunity to unlock business velocity. That reframe — and the ability to articulate it in business terms — was what secured executive sponsorship and cross-functional participation.

Research is a stakeholder alignment tool

The 60% statistic — that most failures were UI-induced logic conflicts, not technical bugs — didn't just inform the design. It changed how engineering and product thought about the problem. Research presented well is the most persuasive thing a design leader has.

Design for the system, not the screen

The modular architecture wasn't a UX decision — it was a business continuity decision. By designing the component system to be extensible, we avoided a rewrite cycle every time a new reward type was introduced. That's the difference between solving today's problem and building tomorrow's platform.

Democratisation is a measurable outcome

Giving non-technical teams full ownership of a previously gated workflow is a business transformation, not just a usability win. When 100% of campaigns can be launched without engineering, you've fundamentally changed what's possible for the business.

Continuing the roadmap

This redesign created a foundation. The next phase builds on it — using AI and advanced analytics to reduce cognitive load and increase campaign intelligence.

AI-Assisted Rule Creation

Exploring how LLMs can reduce the rule-building workflow to a single structured prompt — letting marketers describe intent in natural language.

Predictive Campaign Intelligence

Using historical performance data to surface recommendations before launch — nudging users toward rule configurations with higher success rates.

Advanced Analytics Layer

Expanding the performance dashboard to support cohort analysis — so marketing can evaluate which rule types drive retention across user segments.

Ongoing User Feedback Loop

Embedding a lightweight feedback mechanism inside the builder itself — so the design team gets a continuous signal on friction points as the system scales.

Interested in how I approach
complex design leadership?

I'm open to Design Director, UX Director, and Head of UX opportunities.