Sevco Security

11 Usability Insights

TL;DR

My UX research team from the University of Washington partnered with Sevco Security to ensure their new Exposure Management dashboard helps security teams prioritize vulnerabilities effectively.

Through usability testing and stakeholder interviews, we identified key issues around users' ability to filter exposure data, understand dashboard terminology, and take action directly within the interface.

MY ROLE

I worked as a UX Researcher on a 3-month research project with Sevco's designers and customer success manager. I led analysis, delivered research findings, and co-authored the final report, which we presented to the entire Sevco team—directly informing their roadmap and planned feature improvements.

IMPACT

50+ clients benefited
11 usability insights
14 actionable recommendations

Duration

3 months

(Jan. 2025 - Mar. 2025)

Team

Chang Zeng

Jen Zhang

Lori Cai

Key Findings.

01

Users lacked the ability to tailor the Exposure Management dashboard to their roles and workflows—slowing down vulnerability analysis.

02

Users had to switch pages to take actions (e.g., create tickets). We recommended a redesigned information architecture to streamline their workflow.

03

Inconsistent icons, unclear filters, and vague severity labels made it hard for users to quickly interpret key information.

RESEARCH TOPIC

Improving the Exposure Management dashboard.

Sevco Security's upcoming feature, Exposure Management, helps cybersecurity experts track the full lifecycle of security vulnerabilities—from discovery and prioritization to remediation and validation.

THE PROBLEM

Unclear user experience of a new feature.

Sevco had released Exposure Management to early users but didn’t know if it fit their workflows or met their needs.

RESEARCH GOAL

What we wanted to learn.

01

How do users incorporate Exposure Management into their current workflows?

02

How valuable do users find the information on the Exposure Management dashboard?

03

How successfully can users identify, prioritize, and address vulnerabilities?

OUR APPROACH

Moderated remote testing + think-aloud protocol + post-test survey.

We used moderated think-aloud sessions to observe real-time behaviors and uncover users’ thought processes as they navigated the dashboard.

PARTICIPANT SUMMARY

Who we spoke with.

We selected participants with a mix of seniority and hands-on experience. Our group included CISOs, senior security leaders, and junior analysts—with 3 participants already familiar with the Exposure Management feature.

METHODOLOGY

Task 1: Interpret the EM dashboard.

We evaluated how well users could interpret dashboard data such as exposure types, severity levels, and device-user links. We observed hesitation and misinterpretation around iconography and labels.

01

"It takes me some time to realize what these bars are and where they are on the right side." - P5

02

"I suggested allowing users to hide or prioritize issues based on their responsibilities (e.g., IT vs. security teams)." - P4

03

"We've used some of these graphics with senior leadership to show current state month to month." - P3

METHODOLOGY

Task 2: Find the most affected devices.

Participants struggled to assess the significance of affected devices, interpret status categories, and filter high-priority issues quickly. This highlighted a need for clearer labeling, better grouping logic, and visibility into device importance.

01

"What I want to know when thinking of the devices with a problem, is how important those machines are." - P1

02

"It's not clear about the visibility on status changes (e.g., snoozing, in-progress tracking)." - P2

03

"Does the Open count include "In Progress"? Unclear. It'd be helpful to define the stats further and distinguish Open, In Progress, and Total." - P3

RESEARCH ANALYSIS

Step 1: Turning 500+ notes into recurring themes.

I synthesized more than 500 raw observations from usability testing into clear, actionable insights, starting by organizing the individual notes from each session into a structured format.

RESEARCH ANALYSIS

Step 2: Collaborative affinity mapping and journey mapping.

I then led the team through a collaborative affinity mapping session to cluster insights and align on key pain points, and we co-created a user journey map that highlighted friction at each stage of the workflow.

RECOMMENDATIONS #1

Let users define what matters.

Security teams have different priorities based on role, risk appetite, and workflow. We recommended enabling custom dashboards and saved queries so users can focus on what’s most relevant to them.

RECOMMENDATIONS #2

Help users track and act without disruption.

Analysts lost focus switching pages to check statuses or start remediation. We recommended clearer status visibility and in-dashboard ticket creation to keep actions seamless.

RECOMMENDATIONS #3

Clarify the language of the interface.

Users struggled with unclear icons and a cramped layout. We recommended adding severity text labels, consistent hover tooltips, and better spacing to improve clarity.

TAKEAWAYS

Actionability Drives Usability: Users prioritize features that let them take immediate action (e.g., ticket creation, status updates) directly within the Exposure Management page, reducing the need to navigate to other sections such as Live Inventory.

WHAT I LEARNED
Cross-Functional Collaboration

Working with a multidisciplinary team and presenting to Sevco stakeholders taught me how to synthesize diverse perspectives, balance user needs with product priorities, and communicate research findings clearly to drive actionable decisions.
NEXT STEP

Sevco is currently implementing key recommendations, including customizable dashboard views, tag-based filtering, and enhanced device criticality scoring.
