

Optimizing Digital Credential Testing
PROJECT TYPE
UX Design, UI Design, UX Research
DELIVERABLES
Design System, Wireframes, Hi-fi Prototypes
TIMELINE
4 Months
Solution
DTT: The Key To Digital Credential Interoperability
The Digital Trust Test Bench (DTT) is a cloud-based quality assurance platform that enables issuers, holders, and verifiers to collaborate effortlessly on interoperability testing. I meticulously developed the UX to ensure that users—whether product testers, developers, or stakeholders—can navigate complex processes with ease, reducing barriers to adoption and fostering seamless integration across diverse technologies.
Problem
Breaking Down Digital Credential Barriers
Governments and organizations across Canada face significant technical challenges in ensuring the digital credentials they issue are interoperable, not just nationally but globally. Without a unified approach, the risk of a fragmented, incompatible system looms large, creating inefficiencies and hindering trust in digital credential solutions.
Process

Information Architecture
DTT was groundbreaking in nature—there were no existing platforms to draw comparisons or inspiration from. This meant the team had to navigate uncharted territory, identifying for the first time what information users would need as they progressed through the testing flow. Tackling this ambiguous problem required a highly iterative approach to ensure the architecture was intuitive and supported a seamless flow.

Concept Development
The greatest challenge was determining the most effective way to present complex test results to users. At the outset of the concept development phase, I explored a variety of interfaces designed for complex data, identifying successful approaches to visualization and usability.
Exploratory Research

Lo-Fi Mockups
I developed multiple low-fidelity mockups, enabling the team to explore and compare various approaches to visualizing intricate data.

Prototyping
This iterative process led us to break the data down into two formats: a high-level overview for users less familiar with digital credential concepts, and a detailed table for developers and testers.
I further developed our chosen design direction into three interactive prototypes, each exploring a different approach to navigation:
Prototype 1: Horizontal step tracker at top of screen (shown to the right)
Prototype 2: Breadcrumb navigation
Prototype 3: Vertical step tracker on the right-hand side of the screen

Design Critique Workshop
I facilitated a design critique workshop in Miro to gather various stakeholder perspectives and refine the user experience. The session brought together stakeholders, developers, and designers to review key elements such as the testing flow and data visualization concepts. Through a dot voting activity and collaborative discussions, we identified areas for improvement, clarified user priorities, and aligned on the most intuitive solutions.


Key Workshop Takeaways
Overall, stakeholders wanted to see a simpler journey that would place less mental load on the user. Key opportunities included:
- Prevent mistakes by enabling Auto-Save instead of requiring it as an action by the user.
- Shorten the journey by renaming Step 1 to Step 0 and allowing users to skip it via a settings toggle.
- Simplify the start process by offering multiple ways to begin.
- Relocate help tips to reduce visual distraction and keep the main focus on the task.
- Provide clear reminders of the testing goals to set expectations during the process.
- Streamline navigation by ensuring step names are consistent and action-oriented.
Analysis & Prioritization
Following the first workshop, I analyzed stakeholder feedback and categorized it into three distinct groups for clarity and actionability:
- Simple UI Changes: feedback that could be addressed quickly, such as adjusting text size or colors
- Mockup Testing Needed: suggestions requiring further exploration and validation through mockups
- Potentially Out of Scope: feedback that might exceed the current project's scope and required prioritization

Prioritization Workshop
I facilitated a prioritization workshop to address the suggestions requiring further evaluation. The process was structured as follows:
- Impact-Effort Matrix: We used this tool to evaluate each suggestion based on its potential impact and the effort required to implement it.
- MVP Focus: The matrix helped identify which items should be prioritized for inclusion in the Minimum Viable Product (MVP).
This approach ensured a balanced and strategic selection of features, aligning with project goals and resource constraints.

Iterating the Design
Following the prioritization workshop, I created a list of action items that emerged from the analysis process and addressed each one in the design prototypes.
I developed a high-fidelity, clickable prototype in Figma to visualize the final changes.


Final Design
Developer Handoff
Developing A Design System in Figma
The Digital Trust Test Bench was an innovative platform, which meant there was little to no existing design documentation. As a major part of the UX design, I developed a documentation process in Figma, setting up and documenting new styles, guides, and components to craft a design system that was:
- Sustainable for future teams: A robust foundation for designers to build upon.
- A single source of truth for developers: Comprehensive documentation ensured clarity and alignment.
- Adaptable to evolving needs: Allowing for easy updates as user requirements and business objectives change.

I documented every screen in Figma, covering variable component states, breakpoints for all screen sizes, and unique edge cases and behaviors. This effort resulted in a scalable and sustainable design system that empowered both design and development teams.