BRIEF

A research-driven overhaul of a secure client portal, where confusing menus and labels become clear paths to the tasks customers use every day.

MY CONTRIBUTIONS

User Research

User Testing

Information Architecture

MY TEAM

1 UX designer

2 Developers

DURATION

10 Weeks

Improving Navigation Clarity for a Client Portal

Overview

What does it take to navigate a financial system that was never designed for you?

DataWagon’s secure client portal grew fast as new features were added to support an expanding customer base. But as the product grew, navigation became inconsistent, and labels no longer matched how customers actually thought about their work.


Over ten weeks, our goal was to uncover and resolve key navigation issues through user testing and design iteration, all while working within the constraints of the existing portal platform.


Due to the confidential nature of the work, I’m not able to share detailed screens or data publicly in my portfolio. I can, however, walk through a general overview of my process!

Content

DISCOVERY

The portal is used by a range of customers who rely on the company’s servers to run their projects and businesses.

Who’s Using the Client Portal?

Most users fall into three main groups:

Individuals

Run a few servers for personal or side projects, checking status, usage, and payments.

Resellers

Manage servers for multiple clients, switching between accounts and resolving issues.

Small companies

Run their services on these servers, monitoring infrastructure and handling invoices and account changes.

What I Achieved

During this freelance engagement, I worked with two developers and another freelance designer to make a structurally complex client portal easier to navigate and understand.

Some highlights include:

Planning and running usability sessions

I was able to uncover where users were getting lost in the existing navigation and page structure.

Designing 3 Lo-fi navigation solutions

I redesigned key parts of the navigation structure, reducing the time it took to complete certain tasks during the re-test.

Advocating for the User

I used real user evidence to show where people were getting stuck, and pushed for navigation and language changes.


My Approach

I used a mix of desk and primary research to explore what needed to be addressed.

Existing signals

Going through existing information helped me build a better understanding of the current landscape. I started by:

  • Reviewing support tickets that revealed emerging pain points


  • Talking with my team about these common pain points


  • Inspecting the existing navigation to see where key elements and tasks lived

While the portal had everything customers needed, its navigation and labeling didn’t match how they actually thought about their work.

Moderated Usability Testing

To find the right participants and avoid making assumptions, I ran a brief survey and led 10 remote usability sessions on the initial design.

What I Measured

Qualitative signals: hesitation cues like long pauses and moments of backtracking.

Task success rate: whether participants could complete each task without assistance.

Time on task: how long it took to complete each task end-to-end.

I framed the session tasks around our initial research goals and existing pain points within the portal (recorded from support tickets) and looked at how people tried to move through the navigation structure to get things done.
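As a simple illustration of how metrics like these can be rolled up after sessions, here is a sketch using entirely hypothetical data and task names (not DataWagon's real participants or results):

```python
# Illustrative sketch with made-up session data: computing per-task
# success rate and median time on task from usability observations.
from statistics import median

# Each record: (participant_id, task_id, completed_without_help, seconds)
sessions = [
    ("P1", "find_invoice", True, 48),
    ("P2", "find_invoice", False, 112),
    ("P3", "find_invoice", True, 65),
    ("P1", "switch_account", True, 30),
    ("P2", "switch_account", True, 41),
    ("P3", "switch_account", False, 95),
]

def summarize(records):
    """Return per-task success rate and median time on task."""
    tasks = {}
    for _, task, ok, secs in records:
        tasks.setdefault(task, []).append((ok, secs))
    return {
        task: {
            "success_rate": sum(ok for ok, _ in results) / len(results),
            "median_seconds": median(secs for _, secs in results),
        }
        for task, results in tasks.items()
    }

print(summarize(sessions))
```

Comparing these numbers before and after a redesign is what makes the re-test round meaningful.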

Lo-fi Solutions and Iteration

Once the team aligned on priorities, I translated the top-priority issues into three low-fidelity wireframes focused on navigation layout, while the other designer focused on the content for each page.

Why Lo-fi?

Because there were conflicting opinions about what mattered, lo-fi helped us stay rooted in user intent and task flow (where people got confused, backtracked, or lost confidence) while keeping implementation constraints visible for the developers.

*Due to confidentiality agreements with DataWagon, I’m unable to share visuals of design iterations or final outcomes.

Synthesis & Prioritization

After the sessions, I highlighted key user quotes and consolidated them into a simple prioritization framework to determine which issues to tackle first.

For each usability quote, I synthesized findings through iterative affinity mapping:

  • Grouping quotes into thematic patterns


  • Refining clusters across a few rounds


  • Relabeling them into actionable themes

After clustering our findings into themes, I prioritized them using the MoSCoW method, weighing factors such as:

  • Frequency — how often the issue showed up across participants


  • Impact — based on task success
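To make the prioritization logic concrete, here is a sketch of how frequency and impact could feed MoSCoW buckets. The findings, numbers, and thresholds are all hypothetical; in practice the final buckets were a team judgment, not a formula:

```python
# Illustrative sketch (hypothetical findings and thresholds): bucketing
# usability themes into MoSCoW categories from frequency and impact.

def moscow_bucket(frequency, impact):
    """frequency: fraction of participants affected (0-1);
    impact: fraction of related task attempts that failed (0-1)."""
    score = frequency * impact  # simple product as a starting point
    if score >= 0.4:
        return "Must have"
    if score >= 0.2:
        return "Should have"
    if score >= 0.05:
        return "Could have"
    return "Won't have (this round)"

# Made-up example themes with (frequency, impact) estimates
findings = {
    "Billing buried under a secondary menu": (0.8, 0.6),
    "Ambiguous top-level label": (0.5, 0.5),
    "Missing breadcrumbs on sub-pages": (0.3, 0.2),
}

for name, (freq, imp) in findings.items():
    print(f"{name}: {moscow_bucket(freq, imp)}")
```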



We received some pushback, so I documented each key finding with a severity rating, supporting user quotes from the sessions, and a recommended design solution. That shifted the conversation from opinions to shared evidence and gave the team a clear framework they could implement confidently.

Re-test and Feedback

After the first round of changes was implemented in a clickable prototype, I ran a lightweight re-test to validate that our high priority fixes actually resolved the original breakdowns.

What Changed


  • We saw noticeably less hesitation on the targeted tasks and fewer backtracks.


  • Time on task decreased for the same tasks because users spent less time searching, second-guessing, or backtracking.


Research Goals

Rather than jumping straight into redesigning pages, I framed the work around a few clear research objectives:

01

Findability — Can users easily find the actions and information they rely on most?

02

Navigation & labels — Where do the navigation structure and wording conflict with the users' mental models?

03

Experience quality — When do people feel confident vs. when do they feel like they’re guessing?

The goal wasn’t just to collect “problems,” but to create a prioritized understanding of where navigation and labeling were creating pain points.

What I Learned Along the Way

01

Design Has to Respect Technical Reality — Working closely with two developers taught me to ask about constraints early and to scope my ideas to what was feasible on the existing platform.

02

Small Changes, Large Impact — Small language and IA changes can have a major effect, especially at navigation decision points. When labels and structure match how users naturally think about a task, they stop second-guessing themselves and avoid frustration.