
Optimizing Digital Credential Testing

PROJECT TYPE

UX Design, UI Design, UX Research

DELIVERABLES

Design System, Wireframes, Hi-fi Prototypes

TIMELINE

4 Months

Solution

DTT: The Key To Digital Credential Interoperability

The Digital Trust Test Bench (DTT) is a cloud-based quality assurance platform that enables issuers, holders, and verifiers to collaborate effortlessly on interoperability testing. I meticulously developed the UX to ensure that users—whether product testers, developers, or stakeholders—can navigate complex processes with ease, reducing barriers to adoption and fostering seamless integration across diverse technologies. 

Problem

Breaking Down Digital Credential Barriers

Governments and organizations across Canada face significant technical challenges in ensuring the digital credentials they issue are interoperable, not just nationally but globally. Without a unified approach, the risk of a fragmented, incompatible ecosystem looms large, creating inefficiencies and undermining trust in digital credential solutions.

Process


Information Architecture

DTT was groundbreaking in nature: there were no existing platforms to draw comparisons or inspiration from. The team had to navigate uncharted territory, identifying for the first time what information users would need as they progressed through the testing flow. Tackling this ambiguous problem required a highly iterative approach to ensure the architecture was intuitive and supported a seamless flow.


Concept Development

The greatest challenge was determining the most effective way to present complex test results to users. At the outset of the concept development phase, I explored a variety of interfaces built around dense data, identifying successful approaches to visualization and usability.

Exploratory Research


Lo-Fi Mockups

I developed multiple low-fidelity mockups, enabling the team to explore and compare various approaches to visualizing intricate data. 


Prototyping

This iterative process led us to break the data down into two formats: a high-level overview for users less familiar with digital credential concepts, and a detailed table for developers and testers.

I further developed the chosen design direction into three interactive prototypes, each exploring a different approach to navigation.

Prototype 1: Horizontal step tracker at the top of the screen

Prototype 2: Breadcrumb navigation

Prototype 3: Vertical step tracker on the right-hand side of the screen


Design Critique Workshop

I facilitated a design critique workshop in Miro to gather various stakeholder perspectives and refine the user experience. The session brought together stakeholders, developers, and designers to review key elements such as the testing flow and data visualization concepts. Through a dot voting activity and collaborative discussions, we identified areas for improvement, clarified user priorities, and aligned on the most intuitive solutions. 


Key Workshop Takeaways

Overall, stakeholders wanted a simpler journey that would reduce the mental load on the user. Key opportunities included:

  1. Prevent mistakes by enabling auto-save instead of requiring users to save manually.

  2. Shorten the journey by renaming Step 1 to Step 0 and allowing users to skip it via a settings toggle.

  3. Simplify the start process by offering multiple ways to begin.

  4. Relocate help tips to reduce visual distraction and ensure the main focus remains on the task.

  5. Provide clear reminders of the testing goals to set expectations during the process.

  6. Streamline navigation by ensuring step names are consistent and action-oriented.


Analysis & Prioritization

Following the first workshop, I analyzed stakeholder feedback and categorized it into three distinct groups for clarity and actionability:

  • Simple UI Changes

    • Feedback that could be addressed quickly, such as adjusting text size or colors

  • Mockup Testing Needed

    • Suggestions requiring further exploration and validation through mockups 

  • Potentially Out of Scope

    • Feedback that might exceed the current project's scope and therefore needed prioritization


Prioritization Workshop

I facilitated a prioritization workshop to address the suggestions requiring further evaluation. The process was structured as follows:

  • Impact-Effort Matrix

    • We used this tool to evaluate each suggestion based on its potential impact and the effort required to implement it.

  • MVP Focus

    • The matrix helped identify which items should be prioritized for inclusion in the Minimum Viable Product (MVP).

This approach ensured a balanced and strategic selection of features, aligning with project goals and resource constraints.


Iterating the Design

Following the prioritization workshop, I compiled the action items that emerged from the analysis and addressed each one in the design prototypes.

I developed a high-fidelity, clickable prototype in Figma to visualize the final changes.


Final Design

Developer Handoff

Developing A Design System in Figma

The Digital Trust Test Bench was an innovative platform, which meant there was little to no existing design documentation. As a major part of the UX work, I developed a documentation process in Figma, setting up and documenting new styles, guides, and components to craft a design system that was:

  • Sustainable for future teams: A robust foundation for designers to build upon.

  • A single source of truth for developers: Comprehensive documentation ensured clarity and alignment.

  • Adaptable to evolving needs: Allowing for easy updates as user requirements and business objectives change.


I documented every screen in Figma, covering variable component states, breakpoints for all screen sizes, and unique edge cases and behaviors. This effort resulted in a scalable and sustainable design system that empowered both design and development teams.
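The "single source of truth" idea carries over directly to handoff: teams often export Figma styles as design tokens that developers consume in code. As a minimal sketch of that pattern, the token names and values below are invented for illustration and are not the actual DTT styles:

```typescript
// Hypothetical design tokens mirroring a Figma style library.
// All names and values here are illustrative, not the real DTT palette.
const tokens = {
  color: {
    primary: "#1A5FB4",   // e.g. the documented brand color style
    surface: "#FFFFFF",
    textBody: "#2E3436",
  },
  spacing: { sm: 4, md: 8, lg: 16 }, // px, matching a 4px grid
  font: { body: "16px/1.5 'Inter', sans-serif" },
} as const;

// Components read from the tokens instead of hard-coding values,
// so a style update in the design system propagates everywhere.
function buttonStyle(): Record<string, string> {
  return {
    background: tokens.color.primary,
    color: tokens.color.surface,
    padding: `${tokens.spacing.sm}px ${tokens.spacing.md}px`,
    font: tokens.font.body,
  };
}
```

Because every component derives its styles from one token object, changing a value in the design system updates all consumers at once, which is what keeps the documentation a source of truth rather than a reference that drifts out of date.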
