
Case study · B2B SaaS · Fleet operations

Custom Reports Builder

Enhanced analytics & reporting for fleet management — turning massive operational data into actionable insights for fleet managers.

Role
UX research lead
Timeline
Discovery Jan–Mar 2024 · Evaluative Apr–Jun 2024 · Beta Q2 2025
Team
Sr. UX designer, Staff PM; regular exec readouts
Methods
Desk research, competitive analysis, analytics, interviews, concept & usability testing
Product experience for custom reporting — datasets, editor, and workflows informed by the research below.

Background & challenge

User need

Customers needed more actionable insights to streamline workflows. Data was buried and hard to manage at scale.

Business challenge

Alongside 10+ legacy dashboards, leadership planned 15–20 net-new dashboards by 2027. Without a strategy for surfacing data coherently, UX risked becoming even more fragmented.

My role

As UXR lead I clarified the problem space and business goals, planned and ran foundational and evaluative research, and partnered with stakeholders from insight through launch.

Physical operations run on enormous telemetry and event data — the product had to make that data discoverable, trustworthy, and useful for fleet managers who are not analytics specialists.

Outcomes at a glance

  • Leadership approved and funded design and engineering work on new reporting capabilities based on critical needs surfaced in discovery.
  • Evaluative research helped the team iterate toward general availability; Advanced Custom Reports (ACR) launched to Beta customers in Q2 2025.
  • The release was positioned as a major contributor to annual contract value (ACV) within the broader product portfolio.

What we shipped toward

  • More comprehensive, discoverable datasets
  • Simplified reporting workflows
  • AI-assisted report creation, dynamic charts, and trend tracking

Research process

A phased approach moved from understanding the problem and business goals through evaluative cycles that de-risked launch.

  1. Foundational · Understand problem & goals
  2. Discovery · Desk research & synthesis
  3. Evaluative · Concept & usability testing
  4. Build + iterate · Heatmaps, refinement, Beta

Desk research

Goals

  • Evaluate existing research to inform new discovery
  • Consolidate dispersed customer feedback for stakeholders

Methodology

  • Synthesized prior user research, support tickets, sales calls, and feature requests
  • Organized findings by themes to focus downstream interviews

Outcomes

  • Avoided duplicative research
  • Improved speed to insight
  • Sharpened interview plans

Competitive analysis

Goals

  • Benchmark UX against SaaS leaders and industry competitors
  • Find opportunities for differentiation

Methodology

  • Compared UI and feature sets across 3 SaaS leaders and 3 competitors using public materials (sites, blogs, knowledge bases, and video)

Outcomes

  • Established a bar for best-in-class UX
  • Highlighted feature gaps to match or surpass competitors

Product analytics

Goals

  • Understand adoption of existing reporting tools
  • Track usage patterns and which report types delivered value

Methodology

  • SQL queries, charts, and month-over-month usage analysis across verticals
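As an illustration, the month-over-month portion of this analysis reduces to comparing active-usage counts between adjacent months. A minimal Python sketch of that calculation, using hypothetical numbers rather than actual product data:

```python
# Hypothetical monthly counts of users who ran a custom report (illustrative only).
monthly_active = {
    "2024-01": 1200,
    "2024-02": 1260,
    "2024-03": 1134,
}

def mom_change(counts: dict[str, int]) -> dict[str, float]:
    """Percent change in usage from each month to the next."""
    months = sorted(counts)
    return {
        months[i]: round(
            (counts[months[i]] - counts[months[i - 1]]) / counts[months[i - 1]] * 100, 1
        )
        for i in range(1, len(months))
    }

print(mom_change(monthly_active))
# → {'2024-02': 5.0, '2024-03': -10.0}
```

In practice this kind of rollup would come from SQL against product event tables, segmented by vertical; the sketch just shows the arithmetic behind the trend lines.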

Outcomes

  • Dashboards were underutilized by eligible users
  • Users created custom reports but often did not return to them
  • Standardized reports remained heavily used across segments

In-depth interviews

Goals

  • Explain the “why” behind quantitative patterns
  • Map journeys, needs, and pain points across personas

Methodology

  • 10 customers across geographies (US, Canada, EMEA) and industries (retail, transit, field services, construction, wholesale, and more)
  • Remote sessions with stakeholder debriefs; coding in Dovetail; readouts to product and exec leadership

Outcomes

  • Silos, steep learning curves, and inconsistent UX surfaced as core pain points
  • Many teams relied on external analytics tools due to limits in customization and available datasets

How I built consensus

I translated discovery findings into three strategic tenets that unified cross-functional partners under a single, research-backed roadmap.

  • Consistency

    A coherent information architecture, shared templates, and UI components so users can find the right data.

  • Customization

    Robust custom tools for a more personalized experience as needs diverge.

  • Consolidation

    Move toward a single data ecosystem — trends surfaced across touchpoints, not only in reports.

Evaluative research

Concept testing

Goals

  • Test a range of concepts grounded in discovery
  • Decide which concepts to advance

Methodology

  • Interactive Figma prototypes
  • Remote interviews with 5 users on usability and desirability
  • Feasibility conversations with PM and engineering leads

Outcomes

  • Highest-scoring directions: high-level metrics in-report, low-click drill-down, templates plus AI prompts to build reports

Usability testing

Goals

  • Evaluate end-to-end UX of the new report editor
  • Refine terminology and technical depth of content

Methodology

  • Sessions with 10 customers who had previously reported pain points with reporting
  • Mix of open-ended and task-based prompts; task success rates
  • Maze heatmaps; affinity mapping with PM and designer
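Task success rate itself is simple arithmetic: the share of participants who completed each task. A short Python sketch of how per-task rates could be tallied (task names and outcomes here are hypothetical, not the study's actual data):

```python
# Hypothetical outcomes: task name -> pass/fail per participant (n=10, matching the study size).
results = {
    "locate_dataset": [True, True, False, True, True, False, True, True, True, False],
    "preview_report": [True, False, True, True, False, True, True, False, True, True],
}

def success_rate(outcomes: list[bool]) -> float:
    """Share of participants who completed the task, as a percentage."""
    return round(100 * sum(outcomes) / len(outcomes), 1)

for task, outcomes in results.items():
    print(f"{task}: {success_rate(outcomes)}%")
```

With a sample of 10, rates move in 10-point increments, which is why the team paired them with qualitative observation and heatmaps rather than reading them as precise metrics.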

Outcomes

  • Concrete refinements to flows, language, and editor patterns ahead of Beta

Key findings

  1. Most users are not data analytics specialists.
  2. Users struggle to locate datasets independently.
  3. Users have difficulty visualizing how a report will look before they run it.
  4. Customers are excited about richer datasets and new AI- and manual-assisted ways to build reports faster.

Where we landed

Build with AI

Describe the report you need; the system generates a starting point you can edit and customize — lowering the burden for non-specialists.

Richer visuals

Add bar, line, donut, or metric-card visualizations so trends are visible at a glance inside custom reports.

Impact

  • De-risked launch: concept and usability testing drove tactical fixes for a successful Beta.
  • Revenue: significant net-new ACV through enterprise acquisition and expansion.
  • Alignment: exec and cross-functional partners stayed grounded in research for on-time delivery.
“I actually can't stress enough how much of a game changer this product will be for Samsara.”
— Customer, Closed Beta

“This research is going to form the basis for a lot of my planning in the next few quarters.”
— Engineering manager

The work supported external storytelling as well — product marketing published the release and it was featured at the company's 2025 flagship conference.

Reflections

  1. Stay curious — avoid confirmation bias, embrace surprising findings, and stay intellectually honest.
  2. Tie findings to business strategy so stakeholders stay engaged and impact stays measurable.
  3. Pair qualitative and quantitative methods; the combination produced the most actionable, defensible insights.